MPNowPlayingInfoCenter throwing EXC_BAD_ACCESS - iOS

I am making an app that plays back audio and I have set it up so that the lock screen gets updated through MPNowPlayingInfoCenter, but I've run into a problem.
At seemingly random times, I get an EXC_BAD_ACCESS error when trying to update the now playing info.
Here's the code that does so:
- (void)updatePlayback
{
    if (!active)
        return;

    NowPlayingController *npc = [AudioController nowPlayingController];

    CMTime elapsed = player.currentTime;
    Float64 elInterval = CMTimeGetSeconds(elapsed);
    [npc setElapsed:elInterval];

    CMTime duration = player.currentItem.duration;
    Float64 durInterval = CMTimeGetSeconds(duration);
    [npc setRemaining:ceilf(durInterval - elInterval)];
    [npc setPlayPauseValue:isPlaying];

    if (durInterval > 0)
    {
        [npc setProgressValue:elInterval / durInterval];
        [npc setAudioDuration:durInterval];
    }

    _activeMetadata[MPMediaItemPropertyPlaybackDuration] = @(durInterval);
    _activeMetadata[MPNowPlayingInfoPropertyPlaybackRate] = @(isPlaying);
    _activeMetadata[MPNowPlayingInfoPropertyElapsedPlaybackTime] = @(elInterval);

    MPNowPlayingInfoCenter *npInfoCenter = [MPNowPlayingInfoCenter defaultCenter];
    if (npInfoCenter && _activeMetadata)
    {
        if ([npInfoCenter respondsToSelector:@selector(setNowPlayingInfo:)])
        {
            ////////// THE FOLLOWING LINE TRIGGERS EXC_BAD_ACCESS SOMETIMES //////////
            [npInfoCenter setNowPlayingInfo:_activeMetadata];
        }
    }
}
99.9% of the time this works, but sometimes (when the app resigns to the background, when changing audio files, or just at random),
[npInfoCenter setNowPlayingInfo:_activeMetadata];
throws EXC_BAD_ACCESS.
Also, _activeMetadata is declared as:
@property (atomic, strong) NSMutableDictionary *activeMetadata;
It is instantiated when the AVPlayer is created:
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
player = [AVPlayer playerWithPlayerItem:playerItem];

CMTime duration = player.currentItem.duration;
NSTimeInterval durInterval = CMTimeGetSeconds(duration);
NSLog(@"%f", durInterval);

MPMediaItemArtwork *albumArtwork = [[MPMediaItemArtwork alloc] initWithImage:[downloader useCachedImage:CacheKeySeriesBanners withName:nil withURL:info[@"image"]]];

NSDictionary *nowPlayingInfo = @{MPMediaItemPropertyTitle: ptString,
                                 MPMediaItemPropertyArtist: spString,
                                 MPMediaItemPropertyArtwork: albumArtwork,
                                 MPMediaItemPropertyAlbumTitle: info[@"title"],
                                 MPMediaItemPropertyPlaybackDuration: @(durInterval),
                                 MPNowPlayingInfoPropertyPlaybackRate: @(1),
                                 MPNowPlayingInfoPropertyElapsedPlaybackTime: @(0)};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:nowPlayingInfo];

_activeMetadata = [nowPlayingInfo mutableCopy];
updatePlayback is called via a CADisplayLink on every frame.
Any ideas what could be causing the exception?

I think you're calling setNowPlayingInfo way too often. Granted, it really shouldn't crash, but there's no need to use a CADisplayLink to call it 60 times a second.
So why call it so often? If it's because you want the progress bar to track smoothly, there's still no need. From the MPNowPlayingInfoPropertyElapsedPlaybackTime declaration:
// The elapsed time of the now playing item, in seconds.
// Note the elapsed time will be automatically extrapolated from the previously
// provided elapsed time and playback rate, so updating this property frequently
// is not required (or recommended.)
P.S. I tried the code with an m4a file and found that durInterval was NaN. With the correct duration, and calling setNowPlayingInfo only once, the progress bar tracked fine and nothing crashed.
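For illustration, a one-shot update along those lines might look like this (a sketch reusing the question's player, isPlaying and _activeMetadata; the method name is hypothetical, and it would be called from the play/pause/seek handlers rather than a display link):

- (void)pushNowPlayingInfo
{
    Float64 durInterval = CMTimeGetSeconds(player.currentItem.duration);
    if (!isnan(durInterval) && durInterval > 0)
        _activeMetadata[MPMediaItemPropertyPlaybackDuration] = @(durInterval);
    _activeMetadata[MPNowPlayingInfoPropertyElapsedPlaybackTime] = @(CMTimeGetSeconds(player.currentTime));
    // Given an accurate rate, the system extrapolates elapsed time by itself.
    _activeMetadata[MPNowPlayingInfoPropertyPlaybackRate] = isPlaying ? @(1.0) : @(0.0);
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = _activeMetadata;
}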

Apple fixed this crash in iOS 10.3 and above.
So if you want to support iOS 10.2.1 and below, be sure to throttle how often you set the [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo property, perhaps to once per second at most.
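If the CADisplayLink has to stay for other reasons, a minimal throttle at the top of updatePlayback might look like this (a sketch; the _lastInfoUpdate CFTimeInterval ivar is hypothetical):

// Push now-playing info at most once per second.
CFTimeInterval now = CACurrentMediaTime();
if (now - _lastInfoUpdate < 1.0)
    return;
_lastInfoUpdate = now;
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:_activeMetadata];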

Related

AVAudioPlayer currentTime lag issue

When I change the currentTime of an AVAudioPlayer after pausing it, the player ends up slightly off the requested time (sometimes ahead, sometimes behind).
[self.bookAudioPlayer pause];
[self.bookAudioPlayer setCurrentTime:[currentPage.audioStartTime doubleValue]];
[self.bookAudioPlayer prepareToPlay];
When I print the currentTime and the audioStartTime, the values differ slightly. For example:
audioStartTime : 2.203665, currentTime : 2.194286
audioStartTime : 137.521347, currentTime : 137.508571
I have tried to fix it using the following code, but the results stay the same.
- (void)fixSetCurentTime:(NSTimeInterval)newTime {
    self.currentTime = newTime;
    if (self.currentTime != newTime) {
        [self prepareToPlay];
        self.currentTime = newTime;
    }
}
Has anyone experienced this issue? Any pointers for a possible fix?
Note that MP3 audio frames are about 26 milliseconds long. Block-based audio compression formats are only (re)startable on time-quantized block boundaries, and the nearest block start might be earlier (or later) than the requested time.
Credit: Apple Developer Forum User (Reference)
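As a rough sanity check of that math, assuming a 44.1 kHz MPEG-1 Layer III file (1152 samples per frame, about 26.12 ms each), you can quantize a requested seek time down to a frame boundary and compare it with the currentTime values above:

NSTimeInterval frameDuration = 1152.0 / 44100.0; // ~0.026122 s per MP3 frame
NSTimeInterval requested = 2.203665;             // audioStartTime from above
NSTimeInterval quantized = floor(requested / frameDuration) * frameDuration;
NSLog(@"requested %f -> previous frame boundary %f", requested, quantized);
// logs 2.194286, exactly the currentTime observed above

The same calculation turns 137.521347 into 137.508571, so the player appears to snap back to the previous frame boundary.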

Update UISlider from AVPlayer (iOS / Objective-C)

I would like my AVPlayer object to automatically update my UISlider during playback.
I found some code on this forum that seems to work for others, but I'm stuck at one point:
CMTime interval = CMTimeMakeWithSeconds(1.0, NSEC_PER_SEC); // 1 second
self.playbackTimeObserver = [self.player addPeriodicTimeObserverForInterval:interval queue:NULL usingBlock:^(CMTime time) {
    // update slider value here...
}];
I inserted this code in my viewDidLoad, but I removed "self.playbackTimeObserver" because I couldn't figure out what type of object it is. I guess that's why it's not working correctly.
Can you please tell me what type it is and where/how to declare it?
Here is my current code:
- (void)viewDidLoad
{
    [super viewDidLoad];

    CMTime interval = CMTimeMakeWithSeconds(1.0, NSEC_PER_SEC); // 1 second
    [songPlayer addPeriodicTimeObserverForInterval:interval queue:NULL usingBlock:^(CMTime time) {
        NSLog(@"seconds = %f", CMTimeGetSeconds(songPlayer.currentTime));
    }];

    self.mmContainerSearch.hidden = NO;
    self.mmContainerDownload.hidden = YES;
    self.mmContainerLibrary.hidden = YES;
}
Its type is id. It's right there in the documentation:
Return Value
An opaque object that you pass as the argument to removeTimeObserver: to cancel observation.
If you never need to remove the time observer, then you don't really need to save the return value. My guess is that at some point you will want to clean up; at that point, you will need to call -removeTimeObserver:.
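For completeness, a sketch of the declaration and eventual cleanup, assuming the observer should live as long as the view controller (songPlayer is the player from the question):

// In the class extension or header; the return type is just id:
@property (strong, nonatomic) id playbackTimeObserver;

// Wherever you tear the player down (dealloc, viewWillDisappear:, etc.):
if (self.playbackTimeObserver) {
    [songPlayer removeTimeObserver:self.playbackTimeObserver];
    self.playbackTimeObserver = nil;
}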

How do I crossfade audio using cocoalibspotify?

I'd like to crossfade from one track to the next in a Spotify-enabled app. Both tracks are Spotify tracks, and since only one data stream at a time can come from Spotify, I suspect I need to buffer the last few seconds of the first track (I believe I can read ahead at 1.5x playback speed), start the stream for track two, then fade out track one and fade in track two using an Audio Unit.
I've reviewed sample apps:
Viva - https://github.com/iKenndac/Viva
SimplePlayer with EQ - https://github.com/iKenndac/SimplePlayer-with-EQ
I've also tried to get my mind around the SPCircularBuffer, but I still need help. Could someone point me to another example, or help bullet-point a track-crossfade game plan?
Update: Thanks to iKenndac, I'm about 95% there. I'll post what I have so far:
In SPPlaybackManager.m, in initWithPlaybackSession:(SPSession *)aSession, I added:

self.audioController2 = [[SPCoreAudioController alloc] init];
self.audioController2.delegate = self;
and in:

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    ...
    self.audioController.audioOutputEnabled = self.playbackSession.isPlaying;
    // for crossfade, add:
    self.audioController2.audioOutputEnabled = self.playbackSession.isPlaying;

I also added a new method based on playTrack:
- (void)crossfadeTrack:(SPTrack *)aTrack callback:(SPErrorableOperationCallback)block {
    // switch audio controller from current to other
    if (self.playbackSession.audioDeliveryDelegate == self.audioController)
    {
        self.playbackSession.audioDeliveryDelegate = self.audioController2;
        self.audioController2.delegate = self;
        self.audioController.delegate = nil;
    }
    else
    {
        self.playbackSession.audioDeliveryDelegate = self.audioController;
        self.audioController.delegate = self;
        self.audioController2.delegate = nil;
    }

    if (aTrack.availability != SP_TRACK_AVAILABILITY_AVAILABLE) {
        if (block) block([NSError spotifyErrorWithCode:SP_ERROR_TRACK_NOT_PLAYABLE]);
        self.currentTrack = nil;
    }

    self.currentTrack = aTrack;
    self.trackPosition = 0.0;

    [self.playbackSession playTrack:self.currentTrack callback:^(NSError *error) {
        if (!error)
            self.playbackSession.playing = YES;
        else
            self.currentTrack = nil;
        if (block) {
            block(error);
        }
    }];
}
This starts a timer for the crossfade:

crossfadeTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                  target:self
                                                selector:@selector(crossfadeCountdown)
                                                userInfo:nil
                                                 repeats:YES];
And in order to keep the first track playing after its data has loaded, in SPCoreAudioController.m I increased the target buffer length:

static NSTimeInterval const kTargetBufferLength = 20;

and in SPSession.m, in end_of_track(sp_session *session), I commented out:

// sess.playing = NO;

I call preloadTrackForPlayback: about 15 seconds before the end of the track, then crossfadeTrack: at 10 seconds before.
Then I set crossfadeCountdownTime = [however many seconds you want the crossfade] * 2;
I fade the volume over the crossfade with:

- (void)crossfadeCountdown
{
    [UIAppDelegate.playbackSPManager setVolume:(1 - (((float)crossfadeCountdownTime / (thisCrossfadeSeconds * 2.0)) * 0.2))];
    crossfadeCountdownTime -= 0.5;
    if (crossfadeCountdownTime == 1.0)
    {
        NSLog(@"Crossfade countdown done");
        crossfadeCountdownTime = 0;
        [crossfadeTimer invalidate];
        crossfadeTimer = nil;
        [UIAppDelegate.playbackSPManager setVolume:1.0];
    }
}
I'll keep working on it, and update if I can make it better. Thanks again to iKenndac for his always spot-on help!
There isn't a pre-written crossfade example that I'm aware of that uses CocoaLibSpotify. However, a (perhaps not ideal) game plan would be:
Make two separate audio queues. SPCoreAudioController is an encapsulation of an audio queue, so you should just be able to instantiate two of them.
Play music as normal to one queue. When you're approaching the end of the track, call SPSession's preloadTrackForPlayback:callback: method with the next track to get it ready.
When all audio data for the playing track has been delivered, SPSession will fire the sessionDidEndPlayback: delegate method. However, since CocoaLibSpotify buffers the audio from libspotify, there's still some time before the audio actually stops.
At this point, start playing the new track but divert the audio data to the second audio queue. Start ramping down the volume of the first queue while ramping up the volume of the next one. This should give a pleasing crossfade.
A few pointers:
In SPCoreAudioController.m, you'll find the following line, which defines how much audio CocoaLibSpotify buffers, in seconds. If you want a bigger crossfade, you'll need to increase it.
static NSTimeInterval const kTargetBufferLength = 0.5;
Since you get audio data at a maximum of 1.5x actual playback speed, be careful not to do, for example, a 5 second crossfade when the user has just skipped near to the end of the track. You might not have enough audio data available to pull it off.
Take a good look at SPPlaybackManager.m. This class is the interface between CocoaLibSpotify and Core Audio. It's not too complicated, and understanding it will get you a long way. SPCoreAudioController and SPCircularBuffer are pretty much implementation details of getting the audio into Core Audio, and you shouldn't need to understand their implementations to achieve what you want.
Also, make sure you understand the various delegates SPSession has. The audio delivery delegate only has one job - to receive audio data. The playback delegate gets all other playback events - when audio has finished being delivered to the audio delivery delegate, etc. There's nothing stopping one class being both, but in the current implementation, SPPlaybackManager is the playback delegate, which creates an instance of SPCoreAudioController to be the audio delivery delegate. If you modify SPPlaybackManager to have two Core Audio controllers and alternate which one is the audio delivery delegate, you should be golden.
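One possible refinement to the linear fade in the update above: an equal-power curve avoids the perceived volume dip in the middle of a crossfade. A sketch of the gain math, where progress runs from 0 to 1 over the fade (how you apply each gain depends on the volume control your two queues expose):

// progress: 0.0 at crossfade start, 1.0 at crossfade end
float outGain = cosf(progress * (float)M_PI_2); // outgoing track: 1 -> 0
float inGain  = sinf(progress * (float)M_PI_2); // incoming track: 0 -> 1
// outGain^2 + inGain^2 == 1 throughout, so the summed power stays constant.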

Playing sounds in sequence with SimpleAudioEngine

I'm building an iOS app with cocos2d 2, and I'm using SimpleAudioEngine to play some effects.
Is there a way to sequence multiple sounds to be played after the previous sound is complete?
For example in my code:
[[SimpleAudioEngine sharedEngine] playEffect:@"yay.wav"];
[[SimpleAudioEngine sharedEngine] playEffect:@"youDidIt.wav"];
When this code is run, yay.wav and youDidIt.wav play at the exact same time, over each other.
I want yay.wav to play and once complete, youDidIt.wav to play.
If not with SimpleAudioEngine, is there a way with AudioToolbox, or something else?
Thank you!
== UPDATE ==
I think I'm going to go with this method using AVFoundation:
AVPlayerItem *sound1 = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"yay" ofType:@"wav"]]];
AVPlayerItem *sound2 = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"YouDidIt" ofType:@"wav"]]];
AVQueuePlayer *player = [[AVQueuePlayer alloc] initWithItems:@[sound1, sound2]];
[player play];
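One caveat with this approach: under ARC, a locally declared AVQueuePlayer can be released before the sounds finish playing, which silently stops playback, so it's safer to hold it in a strong property (the sfxPlayer name here is just illustrative):

@property (strong, nonatomic) AVQueuePlayer *sfxPlayer;

// then, where the sounds are queued:
self.sfxPlayer = [[AVQueuePlayer alloc] initWithItems:@[sound1, sound2]];
[self.sfxPlayer play];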
The easy way would be to get the track duration using -[CDSoundSource durationInSeconds] and then schedule the second effect to play after a matching delay:

[[SimpleAudioEngine sharedEngine] performSelector:@selector(playEffect:) withObject:@"youDidIt.wav" afterDelay:duration];
An easier way to get the audio duration would be to patch SimpleAudioEngine and add a method that queries its CDSoundEngine (a static global) for the audio duration:

- (float)soundDuration {
    return [_engine bufferDurationInSeconds:_soundId];
}
The second approach would be to poll the audio engine's status and wait for it to stop playing:

alGetSourcei(sourceId, AL_SOURCE_STATE, &state);
if (state == AL_PLAYING) {
    ...
The sourceId is the ALuint returned by playEffect.
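A sketch of that polling approach with a repeating NSTimer (the effectSource ivar, the selector name, and the 0.1 s poll interval are all illustrative; requires <OpenAL/al.h>):

// Kick off the first effect and remember its OpenAL source id (an ALuint ivar):
effectSource = [[SimpleAudioEngine sharedEngine] playEffect:@"yay.wav"];
[NSTimer scheduledTimerWithTimeInterval:0.1
                                 target:self
                               selector:@selector(pollEffectState:)
                               userInfo:nil
                                repeats:YES];

- (void)pollEffectState:(NSTimer *)timer
{
    ALint state = 0;
    alGetSourcei(effectSource, AL_SOURCE_STATE, &state);
    if (state != AL_PLAYING) {  // the first effect has finished
        [timer invalidate];
        [[SimpleAudioEngine sharedEngine] playEffect:@"youDidIt.wav"];
    }
}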
It's simple; tested with Cocos2D 2.1.
Create a durationInSeconds property in SimpleAudioEngine.h:

@property (readonly) float durationInSeconds;

Synthesize it:

@synthesize durationInSeconds;

Modify playEffect like this:
- (ALuint)playEffect:(NSString *)filePath pitch:(Float32)pitch pan:(Float32)pan gain:(Float32)gain {
    int soundId = [bufferManager bufferForFile:filePath create:YES];
    if (soundId != kCDNoBuffer) {
        durationInSeconds = [soundEngine bufferDurationInSeconds:soundId];
        return [soundEngine playSound:soundId sourceGroupId:0 pitch:pitch pan:pan gain:gain loop:false];
    } else {
        return CD_MUTE;
    }
}
As you can see, durationInSeconds is set directly from soundEngine's bufferDurationInSeconds, using the buffer that bufferManager returned.
Now just read [SimpleAudioEngine sharedEngine].durationInSeconds and you have the duration of the sound in seconds.
Set a timer for that many seconds, and when it fires, play the next sound (or clear a flag, or whatever you need).
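With that patch in place, chaining the two effects from the question is short (a sketch using the patched engine above):

SimpleAudioEngine *engine = [SimpleAudioEngine sharedEngine];
[engine playEffect:@"yay.wav"];
// durationInSeconds was captured by the patched playEffect: above
[engine performSelector:@selector(playEffect:)
             withObject:@"youDidIt.wav"
             afterDelay:engine.durationInSeconds];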

iOS addPeriodicTimeObserverForInterval fires too many times

I have noticed a strange thing happening in my app.
It's a video app that uses the AVFoundation classes, and I need to fire some events at given times.
I'll post some code and then comment on it:
/* I prepare the movie clip */
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
NSBundle *bundle = [NSBundle mainBundle];
NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
NSString *path = [bundle pathForResource:@"13.VIDEO_A (BAULE)" ofType:@"mp4"];
NSURL *videoUrl = [NSURL fileURLWithPath:path];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:videoUrl options:optionsDictionary];
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]) ofAsset:sourceAsset atTime:currentTime error:NULL];
In my viewDidLoad I prepare the clip. I use an AVURLAsset so I can pass an options dictionary with AVURLAssetPreferPreciseDurationAndTimingKey, for more precise timing.
/* I create the player */
AVPlayer *mPlayer = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:composition]];
AVPlayerLayer *mPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:mPlayer];
mPlayerLayer.frame = CGRectMake(0.00, 96.00, 1024.00, 576.00);
[self.view.layer addSublayer: mPlayerLayer];
I create a player with an item from my AVURLAsset, then add a player layer to my view.
/* I set the observer */
[mPlayer addPeriodicTimeObserverForInterval:CMTimeMake(5, 25) queue:NULL usingBlock:^(CMTime time) {
    NSLog(@"Event : value: %lld, timescale %d, seconds: %f",
          time.value, time.timescale, (float)time.value / time.timescale);
}];
I set the observer to fire every 5/25 of a second, i.e. 0.2 seconds (25 is the frame rate of the movie).
In the block I only write a log for now.
/* Play the movie */
[mPlayer play];
Finally, I play.
Everything seems to work, except that my log is wrong:
2012-11-15 16:43:05.382 PerfectCircle Beta[6680:707] Evento : value: 0, timescale 1, seconds: 0.000000
2012-11-15 16:43:05.410 PerfectCircle Beta[6680:707] Evento : value: 0, timescale 1, seconds: 0.000000
2012-11-15 16:43:05.563 PerfectCircle Beta[6680:707] Evento : value: 0, timescale 1, seconds: 0.000000
2012-11-15 16:43:05.580 PerfectCircle Beta[6680:707] Evento : value: 0, timescale 1, seconds: 0.000000
2012-11-15 16:43:05.747 PerfectCircle Beta[6680:707] Evento : value: 5489807, timescale 1000000000, seconds: 0.005490
2012-11-15 16:43:05.751 PerfectCircle Beta[6680:707] Evento : value: 8949705, timescale 1000000000, seconds: 0.008950
2012-11-15 16:43:05.753 PerfectCircle Beta[6680:707] Evento : value: 10679967, timescale 1000000000, seconds: 0.010680
2012-11-15 16:43:05.990 PerfectCircle Beta[6680:707] Evento : value: 248121672, timescale 1000000000, seconds: 0.248122
2012-11-15 16:43:06.169 PerfectCircle Beta[6680:707] Evento : value: 426865945, timescale 1000000000, seconds: 0.426866
After a random number of fires it starts counting correctly, but it fires the event 5 or 6 extra times at the start. I tried different movies and codecs.
If I raise the interval (e.g. CMTimeMake(25, 25)), nothing changes.
I originally started with addBoundaryTimeObserverForTimes, like this:

NSArray *starts = [NSArray arrayWithObjects:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(0.2, 25)], nil];
[_player addBoundaryTimeObserverForTimes:starts queue:NULL usingBlock:^{ log_function }];

But I had the same problem. Here, raising the interval does make the problem disappear, but that doesn't work for my purposes.
My problem is that I must count precisely how many times the movie passes a precise moment, and I can't test it with if (currentTime == 0.3) because it's not precise.
Is this a bug? Am I missing something? Have you ever heard of anything similar?
Thanks for helping.
Daniele
UPDATE :
It seems to be an issue at the start and the end.
2012-11-15 16:43:05.747 PerfectCircle Beta[6680:707] Evento : value: 0, timescale 1, seconds: 0.000000
2012-11-15 16:43:05.747 PerfectCircle Beta[6680:707] Evento : value: 5489807, timescale 1000000000, seconds: 0.005490
The wrong logs have a different timescale from the correct ones. The same happens at the end of playback. It seems that at the start and end the observer fires, but the movie isn't loaded yet or is already closed.
I tried adding the observer after play, but nothing changed.
I also tried a different, higher timescale in my CMTimeMake, but with no effect.
I know this question is a bit old, but anyway...
First things first:
If you check the documentation, you'll see the following statement:
The block is invoked periodically at the interval specified,
interpreted according to the timeline of the current item. The block
is also invoked whenever time jumps and whenever playback starts or
stops. If the interval corresponds to a very short interval in real
time, the player may invoke the block less frequently than requested.
Even so, the player will invoke the block sufficiently often for the
client to update indications of the current time appropriately in its
end-user interface.
This explains why the block is invoked at the start and end of playback.
As for the block being called several times, I guess it's happening because of the internal state changes the player goes through. I've checked the 'rate', 'status' and 'playerItem' properties of the player, but nothing seems to explain what's happening.
One thing you can do to work around this is to only consider events after the player has really started playing.
Add the following code before you call the play method.
__block AVPlayer *blockPlayer = self.player;
__block typeof(self) blockSelf = self;

// Setup boundary time observer to trigger when audio really begins,
// specifically after 1/3 of a second of playback
self.startObserver = [self.player addBoundaryTimeObserverForTimes:
                      @[[NSValue valueWithCMTime:CMTimeAdd(self.player.currentTime, CMTimeMake(1, 3))]]
                                                             queue:NULL
                                                        usingBlock:^{
    blockSelf.isPlaying = YES;
    // Remove the boundary time observer
    [blockPlayer removeTimeObserver:blockSelf.startObserver];
}];
Now, in the addPeriodicTimeObserverForInterval block, you just need to check the flag we've just set:

__block typeof(self) blockSelf = self;
[self.player addPeriodicTimeObserverForInterval:CMTimeMake(60, 1)
                                          queue:dispatch_get_main_queue()
                                     usingBlock:^(CMTime time)
{
    if (blockSelf.isPlaying) {
        // ... do some stuff here
    }
}];
Well, it's not the cleanest solution, but it worked fine for me. If I find something better, I'll come back and edit.
Perhaps you need greater precision in your CMTime call. At the moment you are using a timescale of 25 (25ths of a second), which leaves no leeway for rounding of float values. Try using 25000 as your timescale and see if that helps.
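For example, this is still a 0.2 s interval, just expressed with more headroom:

CMTime interval = CMTimeMake(5000, 25000); // 0.2 s on a finer timescale
[mPlayer addPeriodicTimeObserverForInterval:interval
                                      queue:dispatch_get_main_queue()
                                 usingBlock:^(CMTime time) {
    NSLog(@"seconds: %f", CMTimeGetSeconds(time));
}];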
