I'm having difficulty stopping an AVPlayer's time observer.
I have an AVPlayer player running like this:
player = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:path]];
then I add an observer
[player addPeriodicTimeObserverForInterval:CMTimeMake(3, 10) queue:NULL usingBlock:^(CMTime time) {
    NSTimeInterval seconds = CMTimeGetSeconds(time);
    NSLog(@"observer called");
    for (NSDictionary *item in robotR33) {
        NSNumber *time = item[@"time"];
        if (seconds > [time doubleValue] && [time doubleValue] >= [lastTime doubleValue]) {
            // NSLog(@"LastTime: %qi", [lastTime longLongValue]);
            lastTime = @(seconds);
            NSString *str = item[@"line"];
            [weakSelf nextLine:str];
            // NSLog(@"item: %qi", [time longLongValue]);
            // NSLog(@"Seconds: %f", seconds);
        }
    }
}];
[player play];
once I am finished with the player I do this:
[player pause];
[player removeTimeObserver:self.timeObserver]
player = nil;
The weird thing is that when I set a breakpoint and step through the code in Xcode, it works: I can see the block stop printing "observer called".
But when I run the code normally, with no breakpoints, I can see that the observer is still firing at the same interval after [player removeTimeObserver] has been called.
Any ideas?
Glad to see weakSelf worked for you ...
In the above I don't see you assigning the result of
[player addPeriodicTimeObserverForInterval ...
to self.timeObserver. It may be a typo, but it should be:
self.timeObserver = [player addPeriodicTimeObserverForInterval ...
if you intend to call
[player removeTimeObserver:self.timeObserver];
You also missed the ";" above.
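Putting it together, a minimal sketch of the intended pattern (this assumes a timeObserver property of type id on self; the handleTime: helper is hypothetical):

__weak typeof(self) weakSelf = self;
self.timeObserver = [player addPeriodicTimeObserverForInterval:CMTimeMake(3, 10)
                                                         queue:NULL
                                                    usingBlock:^(CMTime time) {
    [weakSelf handleTime:time]; // hypothetical handler for each tick
}];

// Later, when you are done with the player, hand the same token back:
[player pause];
if (self.timeObserver) {
    [player removeTimeObserver:self.timeObserver];
    self.timeObserver = nil;
}
player = nil;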
I am working on an iOS app where I am using AVPlayer to play different mp4 videos. Most of the time it works fine, except sometimes my app completely freezes the phone. I haven't been able to catch exactly where this happens, but I think it is usually right after the line below. I verified this by placing a bunch of NSLog calls that print [[NSDate date] timeIntervalSince1970]:
mylayer = [AVPlayerLayer playerLayerWithPlayer:myplayer];
The freeze lasts for a few seconds (sometimes much longer).
Even if I press the Home button or the lock button, the phone is unresponsive.
I end up having to hold the lock button for about 6-10 seconds, which hard-restarts the entire phone.
Note that the CPU and memory usage doesn't spike during this.
I understand my code might be buggy and all, but shouldn't the OS be intelligent enough not to let a single app completely freeze the entire phone? Would this be considered an OS bug? If so, I might log a DTS incident with Apple.
EDIT: added code
Note the comment which says "// this is the line which freezes"
dispatch_queue_t LOADQUEUE = dispatch_queue_create("com.yolo.LOADQUEUE", DISPATCH_QUEUE_CONCURRENT);
dispatch_async(LOADQUEUE, ^{
    AVURLAsset *avAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
    NSLog(@"current time 4: %f", [[NSDate date] timeIntervalSince1970]);
    if ([avAsset tracksWithMediaType:AVMediaTypeVideo] && [avAsset tracksWithMediaType:AVMediaTypeVideo].count > 0) {
        NSLog(@"current time 4.5: %f", [[NSDate date] timeIntervalSince1970]);
        CGSize size = [[[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
        NSLog(@"current time 5: %f", [[NSDate date] timeIntervalSince1970]);
        CGRect r = self.topHeader.frame;
        r.size.height = ((size.height * self.view.frame.size.width) / size.width) + self.topheaderBottomView.frame.size.height + self.topheadertopview.frame.size.height + self.itemTitle.frame.size.height;
        howMuchToScrollToShowCommentButton = r.size.height;
        dispatch_async(dispatch_get_main_queue(), ^{
            self.topHeader.frame = r;
            [UIView animateWithDuration:0 animations:^{
                [self.mytableview setTableHeaderView:self.topHeader];
            } completion:^(BOOL finished) {
                NSArray *keys = @[@"playable"];
                NSLog(@"current time 6: %f", [[NSDate date] timeIntervalSince1970]);
                [avAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
                    dispatch_async(dispatch_get_main_queue(), ^{
                        NSLog(@"current time 7: %f", [[NSDate date] timeIntervalSince1970]);
                        AVPlayerItem *newItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
                        if (!myplayer) {
                            myplayer = [[AVPlayer alloc] initWithPlayerItem:newItem];
                        } else {
                            [myplayer replaceCurrentItemWithPlayerItem:newItem];
                        }
                        NSLog(@"current time 7.5: %f", [[NSDate date] timeIntervalSince1970]);
                        [myplayer addObserver:self forKeyPath:@"status" options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial) context:nil];
                        [myplayer addObserver:self forKeyPath:@"rate" options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial) context:nil];
                        [[NSNotificationCenter defaultCenter] addObserver:self
                                                                 selector:@selector(playerItemDidReachEnd:)
                                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                                   object:[myplayer currentItem]];
                        myplayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
                        NSLog(@"current time 7.6: %f", [[NSDate date] timeIntervalSince1970]);
                        mylayer = [AVPlayerLayer playerLayerWithPlayer:myplayer]; // this is the line which freezes
                        NSLog(@"current time 7.7: %f", [[NSDate date] timeIntervalSince1970]);
                        [playerView.layer addSublayer:mylayer];
                        mylayer.videoGravity = AVLayerVideoGravityResize;
                        [mylayer setFrame:playerView.bounds];
                        [myplayer seekToTime:kCMTimeZero];
                        NSLog(@"current time 8: %f", [[NSDate date] timeIntervalSince1970]);
                    });
                }];
            }];
        });
    } else {
        NSLog(@"ITEM doesn't exist");
    }
});
Output:
Note the 21-second gap between time 7.6 and 7.7:
2016-06-04 01:27:20.853 XYZ[402:49072] current time 7: 1465018040.853897
2016-06-04 01:27:20.875 XYZ[402:49072] current time 7.5: 1465018040.875220
2016-06-04 01:27:20.875 XYZ[402:49072] current time 7.6: 1465018040.875871
2016-06-04 01:27:41.841 XYZ[402:49072] current time 7.7: 1465018061.841419
2016-06-04 01:27:41.841 XYZ[402:49072] current time 8: 1465018061.841863
Edit 2:
I paused the app in Xcode and looked at what the threads were doing on the left. Here's a screenshot:
This could be a symptom of debugging and running through Xcode. You're right: normally you should always be able to hit the Home button and exit the application.
Edit your scheme and change the build configuration from Debug to Release. Run a build once through Xcode. Kill the app, then launch it without Xcode from the Home screen of the device.
How can I save the time at which audio was stopped in a session and continue playback from that stop point in the next session?
My code:
- (void)initPlayer:(NSString *)audioFile fileExtension:(NSString *)fileExtension
{
    NSURL *audioFileLocationURL = [[NSBundle mainBundle] URLForResource:audioFile withExtension:fileExtension];
    NSError *error;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileLocationURL error:&error];
    if ([audioFile isEqualToString:@"2"]) {
        _index = 1;
    }
    else if ([audioFile isEqualToString:@"3"]) {
        _index = 2;
    }
    [self song];
}
- (void)playAudio {
    [self.audioPlayer play];
}

- (void)pauseAudio {
    [self.audioPlayer pause];
}

- (BOOL)isPlaying {
    return [self.audioPlayer isPlaying];
}

- (NSString *)timeFormat:(float)value {
    float minutes = floor(lroundf(value) / 60);
    float seconds = lroundf(value) - (minutes * 60);
    int roundedSeconds = lroundf(seconds);
    int roundedMinutes = lroundf(minutes);
    NSString *time = [[NSString alloc] initWithFormat:@"%d:%02d", roundedMinutes, roundedSeconds];
    return time;
}

- (void)setCurrentAudioTime:(float)value {
    [self.audioPlayer setCurrentTime:value];
}

- (NSTimeInterval)getCurrentAudioTime {
    return [self.audioPlayer currentTime];
}

- (float)getAudioDuration {
    return [self.audioPlayer duration];
}
You can use AVPlayer's currentTime property. It returns the playback time of the current AVPlayerItem.
To restore the playback time in the next session, you can pass the stored time to AVPlayer's seekToTime:
[self.player seekToTime:storedPlaybackTime];
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/doc/uid/TP40009530-CH1-SW2
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/occ/instm/AVPlayer/seekToTime%3a
To persist the CMTime returned by currentTime, you can use the AVFoundation convenience methods provided by NSValue.
To wrap CMTime in an NSValue, use valueWithCMTime:
[NSValue valueWithCMTime:player.currentTime];
To get a CMTime struct back from the persisted value, use:
CMTime persistedTime = [storedValue CMTimeValue];
After you've wrapped the CMTime struct in an NSValue instance, you can use a keyed archiver and NSData to write the time to disk.
NSHipster has a good article about that topic: http://nshipster.com/nscoding/
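Putting those pieces together, a minimal sketch of the save/restore round trip (the @"playbackTime" defaults key is just an illustrative name):

// Persist the current playback time, e.g. when the session ends.
NSValue *timeValue = [NSValue valueWithCMTime:self.player.currentTime];
NSData *timeData = [NSKeyedArchiver archivedDataWithRootObject:timeValue];
[[NSUserDefaults standardUserDefaults] setObject:timeData forKey:@"playbackTime"];

// In the next session, restore the stored time and seek to it.
NSData *storedData = [[NSUserDefaults standardUserDefaults] objectForKey:@"playbackTime"];
if (storedData) {
    NSValue *storedValue = [NSKeyedUnarchiver unarchiveObjectWithData:storedData];
    CMTime storedPlaybackTime = [storedValue CMTimeValue];
    [self.player seekToTime:storedPlaybackTime];
}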
The easiest way would be to keep a local DB with the song name, and when playback is stopped, add that data to the DB. Then, when playback resumes later, check the local DB first for any entries from a past session. If there are none, continue from the beginning.
Also make sure that no entry is made if the song finishes.
Hope this idea helps you...
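As a sketch of that idea, using NSUserDefaults as a stand-in for the local DB (the @"resumePositions" key and the method names are illustrative):

// Save the stop point for a song when playback stops.
- (void)saveStopPointForSong:(NSString *)songName time:(NSTimeInterval)time {
    NSMutableDictionary *positions = [[[NSUserDefaults standardUserDefaults]
        dictionaryForKey:@"resumePositions"] mutableCopy] ?: [NSMutableDictionary dictionary];
    positions[songName] = @(time);
    [[NSUserDefaults standardUserDefaults] setObject:positions forKey:@"resumePositions"];
}

// Look up the resume point; 0.0 means no past entry, so start from the beginning.
- (NSTimeInterval)resumePointForSong:(NSString *)songName {
    return [[[NSUserDefaults standardUserDefaults]
        dictionaryForKey:@"resumePositions"][songName] doubleValue];
}

// Remove the entry when the song finishes, so it restarts next time.
- (void)clearStopPointForSong:(NSString *)songName {
    NSMutableDictionary *positions = [[[NSUserDefaults standardUserDefaults]
        dictionaryForKey:@"resumePositions"] mutableCopy];
    [positions removeObjectForKey:songName];
    [[NSUserDefaults standardUserDefaults] setObject:positions forKey:@"resumePositions"];
}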
I have an AVPlayer class all set up that streams an audio file. It's a bit long, so I can't post the whole thing here. What I am stuck on is how to allow the user to replay the audio file after they have finished listening to it once. When it finishes the first time, I correctly receive a notification AVPlayerItemDidPlayToEndTimeNotification. When I go to replay it, I immediately receive the same notification, which blocks me from replaying it.
How can I reset this such that the AVPlayerItem doesn't think that it has already played the audio file? I could deallocate everything and set it up again, but I believe that would force the user to download the audio file again, which is pointless and slow.
Here are some parts of the class that I think are relevant. The output that I get when attempting to replay the file looks like this. The first two lines are exactly what I would expect, but the third is a surprise.
is playing
no timer
audio player has finished playing audio
- (id)initWithURL:(NSString *)urlString
{
    self = [super init];
    if (self) {
        self.isPlaying = NO;
        self.verbose = YES;
        if (self.verbose) NSLog(@"url: %@", urlString);
        NSURL *url = [NSURL URLWithString:urlString];
        self.playerItem = [AVPlayerItem playerItemWithURL:url];
        self.player = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];
        [self determineAudioPlayTime:self.playerItem];
        self.lengthOfAudioInSeconds = @0.0f;
        [self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
    }
    return self;
}
// this is what gets called when the user clicks the play button after they have
// listened to the file and the AVPlayerItemDidPlayToEndTimeNotification has been received
- (void)playAgain {
    [self.playerItem seekToTime:kCMTimeZero];
    [self toggleState];
}

- (void)toggleState {
    self.isPlaying = !self.isPlaying;
    if (self.isPlaying) {
        if (self.verbose) NSLog(@"is playing");
        [self.player play];
        if (!timer) {
            NSLog(@"no timer");
            CMTime audioTimer = CMTimeMake(0, 1);
            [self.player seekToTime:audioTimer];
            timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                     target:self
                                                   selector:@selector(updateProgress)
                                                   userInfo:nil
                                                    repeats:YES];
        }
    } else {
        if (self.verbose) NSLog(@"paused");
        [self.player pause];
    }
}

- (void)itemDidFinishPlaying:(NSNotification *)notification {
    if (self.verbose) NSLog(@"audio player has finished playing audio");
    [[NSNotificationCenter defaultCenter] postNotificationName:@"audioFinished" object:self];
    [timer invalidate];
    timer = nil;
    self.totalSecondsPlayed = [NSNumber numberWithInt:0];
    self.isPlaying = NO;
}
You can call the seekToTime method when your player receives the AVPlayerItemDidPlayToEndTimeNotification:
func itemDidFinishPlaying() {
    self.player.seek(to: CMTime.zero)
    self.player.play()
}
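Since the question's class is Objective-C, the equivalent sketch there, reusing the itemDidFinishPlaying: handler the question already registers, would be:

- (void)itemDidFinishPlaying:(NSNotification *)notification {
    // Rewind to the start and play again.
    [self.player seekToTime:kCMTimeZero];
    [self.player play];
}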
Apple recommends using AVQueuePlayer with an AVPlayerLooper.
Here's Apple's (slightly revised) sample code:
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] init];
AVAsset *asset = // AVAsset with its 'duration' property value loaded
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];

// Create a new player looper with the queue player and template item
self.playerLooper = [AVPlayerLooper playerLooperWithPlayer:queuePlayer
                                              templateItem:playerItem];

// Begin looping playback
[queuePlayer play];
The AVPlayerLooper does all the event listening and playing for you, and the queue player is used to create what they call a "treadmill pattern". This pattern essentially chains multiple instances of the same AVPlayerItem in a queue player, moving each finished item back to the beginning of the queue.
The advantage of this approach is that it lets the framework preroll the next item (the same asset in this case, but its start still needs prerolling) before it arrives, reducing latency between the asset's end and the looped start.
This is described in greater detail at ~15:00 in the video here: https://developer.apple.com/videos/play/wwdc2016/503/
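Apple's snippet leaves the asset-loading step implicit; a minimal sketch of it (videoURL is a hypothetical URL, and queuePlayer/playerLooper come from the sample above) could be:

AVAsset *asset = [AVAsset assetWithURL:videoURL];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // Only hand the asset to the looper once 'duration' is loaded.
            AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
            self.playerLooper = [AVPlayerLooper playerLooperWithPlayer:queuePlayer
                                                          templateItem:playerItem];
            [queuePlayer play];
        });
    }
}];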
AVPlayerItem has this property forwardPlaybackEndTime
The value indicates the time at which playback should end when the playback rate is positive (see AVPlayer's rate property). The default value is kCMTimeInvalid, which indicates that no end time for forward playback is specified. In this case, the effective end time for forward playback is the item's duration.
But I don't know why it does not work. I tried setting it in AVPlayerItemStatusReadyToPlay, in the duration-available callback, ... but it does not have any effect; it just plays to the end.
I think forwardPlaybackEndTime is used to restrict the playhead, right?
In my app, I want to play only from the beginning to the halfway point of the movie.
My code looks like this
- (void)playURL:(NSURL *)URL
{
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:URL];
    if (self.avPlayer) {
        if (self.avPlayer.currentItem && self.avPlayer.currentItem != playerItem) {
            [self.avPlayer replaceCurrentItemWithPlayerItem:playerItem];
        }
    } else {
        [self setupAVPlayerWithPlayerItem:playerItem];
    }
    playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);

    // Play
    [self.avPlayer play];
}
How to make forwardPlaybackEndTime work?
Try this
AVPlayerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
Here, 5 is the time (in seconds) up to which the AVPlayerItem will play.
Set the following on your AVPlayer
AVPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
Then set your notification
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidPlayToEndTime:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
with the method obviously
- (void)playerItemDidPlayToEndTime:(NSNotification *)notification
{
    // do something here
}
then set your forwardPlaybackEndTime
AVPlayer.currentItem.forwardPlaybackEndTime = CMTimeAdd(AVPlayer.currentItem.currentTime, CMTimeMake(5, 1));
and start your avplayer playing
AVPlayer.rate = 1.0;
the notification will be triggered and your track will continue playing. In your handler you can stop it and do a seekToTime or whatever.
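For example, the handler body could pause and rewind (a sketch; self.avPlayer is assumed to be your player):

- (void)playerItemDidPlayToEndTime:(NSNotification *)notification
{
    // Stop at the forward playback boundary, then rewind to the start.
    [self.avPlayer pause];
    [self.avPlayer seekToTime:kCMTimeZero];
}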
Alternatively you can just set a boundary observer
NSArray *array = [NSArray arrayWithObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(5.0, 1)]];
__weak OTHD_AVPlayer *weakself = self;
self.observer_End = [self addBoundaryTimeObserverForTimes:array queue:NULL usingBlock:^{
    if (weakself.rate >= 0.0) [weakself endBoundaryHit];
}];
I have checked the below code: it runs, and streaming stops at the specified time:
- (void)playURL
{
    NSURL *url = [NSURL URLWithString:@"http://clips.vorwaerts-gmbh.de/VfE_html5.mp4"];
    self.playerItem = [AVPlayerItem playerItemWithURL:url];
    self.playerItem.forwardPlaybackEndTime = CMTimeMake(10, 1);
    self.avPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];
    [videoView setPlayer:self.avPlayer];

    // Play
    [self.avPlayer play];
}
I hope this will help you.
Also please check these tutorials: AVFoundation Framework
I think that you have to update avPlayer's current playerItem.
So,
playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
it should be:
self.avPlayer.currentItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
I thought maybe the fastest way was to go with System Sound Services. It is quite efficient, but I need to play sounds in a sequence, not overlapped. Therefore I used a callback method to check when a sound has finished. This cycle produces around 0.3 seconds of lag. I know this sounds very strict, but it is basically the main axis of the program.
EDIT: I have now tried using AVAudioPlayer, but I can't play sounds in a sequence without using audioPlayerDidFinishPlaying:, which would put me in the same situation as with the callback method of Sound Services.
EDIT2: I think that if I could somehow join the parts of the sounds I want to play into one large file, the whole audio file would play continuously.
EDIT3: I thought this would work, but the audio overlaps:
waitTime = player.deviceCurrentTime;
for (int k = 0; k < [colores count]; k++)
{
    player.currentTime = 0;
    [player playAtTime:waitTime];
    waitTime += player.duration;
}
Thanks
I just tried a technique that I think will work well for you. Build an audio file with your sounds concatenated, then build some metadata about your sounds, like this:
@property (strong, nonatomic) NSMutableDictionary *soundData;

@synthesize soundData = _soundData;

- (void)viewDidLoad {
    [super viewDidLoad];
    _soundData = [NSMutableDictionary dictionary];
    NSArray *sound = [NSArray arrayWithObjects:[NSNumber numberWithFloat:5.0], [NSNumber numberWithFloat:0.5], nil];
    [self.soundData setValue:sound forKey:@"soundA"];
    sound = [NSArray arrayWithObjects:[NSNumber numberWithFloat:6.0], [NSNumber numberWithFloat:0.5], nil];
    [self.soundData setValue:sound forKey:@"soundB"];
    sound = [NSArray arrayWithObjects:[NSNumber numberWithFloat:7.0], [NSNumber numberWithFloat:0.5], nil];
    [self.soundData setValue:sound forKey:@"soundC"];
}
The first number is the offset of the sound in the file, the second is the duration. Then get your player ready to play like this...
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/audiofile.mp3", [[NSBundle mainBundle] resourcePath]]];
    NSError *error;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    audioPlayer.numberOfLoops = -1;
    if (audioPlayer == nil)
        NSLog(@"%@", [error description]);
    else {
        [audioPlayer prepareToPlay];
    }
}
Then you can build a low-level sound playing method like this ...
- (void)playSound:(NSString *)name withCompletion:(void (^)(void))completion {
    NSArray *sound = [self.soundData valueForKey:name];
    if (!sound) return;
    NSTimeInterval offset = [[sound objectAtIndex:0] floatValue];
    NSTimeInterval duration = [[sound objectAtIndex:1] floatValue];
    audioPlayer.currentTime = offset;
    [audioPlayer play];
    // After the sound's duration has elapsed, pause the player and invoke
    // the completion block on the main queue.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, duration * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
        [audioPlayer pause];
        completion();
    });
}
And you can play sounds in rapid combination like this ...
- (IBAction)playAB:(id)sender {
    [self playSound:@"soundA" withCompletion:^{
        [self playSound:@"soundB" withCompletion:^{}];
    }];
}
Rather than nesting blocks, you could build a higher-level method that takes a list of sound names and plays them one after the other, that would look like this:
- (void)playSoundList:(NSArray *)soundNames withCompletion:(void (^)(void))completion {
    if (![soundNames count]) {
        completion();
        return;
    }
    NSString *firstSound = [soundNames objectAtIndex:0];
    NSRange remainingRange = NSMakeRange(1, [soundNames count] - 1);
    NSArray *remainingSounds = [soundNames subarrayWithRange:remainingRange];
    [self playSound:firstSound withCompletion:^{
        [self playSoundList:remainingSounds withCompletion:completion];
    }];
}
Call it like this...
NSArray *list = [NSArray arrayWithObjects:@"soundB", @"soundC", @"soundA", nil];
[self playSoundList:list withCompletion:^{ NSLog(@"done"); }];
I'm assuming you want to change the sequence or omit sounds sometimes. (Otherwise you would just build the asset with all three sounds in a row and play that.)
There might be a better idea out there, but to get things very tight, you could consider producing that concatenated asset and pre-loading it, moving all the latency up into that one load, then seeking around in it to change the sound.