AVPlayer pauses for no obvious reason - iOS

While playing a video, I'm seeing rate change notifications from AVPlayer that don't seem to be connected to app activity.
When my app receives a UIApplicationDidEnterBackgroundNotification notification, I tell the AVPlayer to pause. The logic is that it should come back to the foreground at the same place the user left off. If I do not call pause when going to the background, the problem doesn't appear.
The sequence of events sent to the player is pause, seekToTime:, play.  Generally, this works fine but, after the app has been sent to the background and then returned to the foreground, each play invocation results in two rate changes from the AVPlayer.  The first is to 1 and the second, immediately following, is to 0.  This pattern continues for each call to  -[AVPlayer play] as long as that player instance is in use.
I'm able to put a breakpoint on -[AVPlayer pause] and I do not see it being hit when the rate changes to 0.  If I comment out the seekToTime: call, the problem goes away.  If I use seekToTime:completionHandler:, I also get the same problem although my block's finished parameter is YES.
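To make the sequence concrete, the resume path looks roughly like this (a simplified sketch, not my exact code; savedTime is just a placeholder CMTime property for wherever the position is stored):

// Simplified sketch of the pause / seek / play sequence described above.
- (void)applicationDidEnterBackground:(NSNotification *)note {
    self.savedTime = [self.player currentTime];
    [self.player pause];
}

- (void)resumePlaybackAfterForeground {
    [self.player pause];
    [self.player seekToTime:self.savedTime completionHandler:^(BOOL finished) {
        // finished comes back YES here, yet the rate still drops from 1 to 0.
        [self.player play];
    }];
}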
Apart from "how do I fix this", I'm interested in any details about how to detect the reason for rate changes in AVPlayer that aren't connected to play/pause.  (Putting a breakpoint on -[AVPlayer setRate:] never seems to trigger.)
(One workaround that almost works is to save the player position when entering the background, let it play, and fix the position when returning to the foreground.  This also requires some manipulation of audio levels, which is probably doable, but another problem is that not all background notifications indicate that the view has been obscured (e.g. double-tap home button).  This leads to cases where my workaround shows a distracting moving image when the app is interrupted but still visible.)
Suggestions?
(A last bit of extra information: in all the cases I've tried, I eventually get to a state where the AVPlayer changes the rate from 1 to 0 moments after I invoke play, if I've returned from the background and then performed a seek. There are things I can do to make it less frequent, but none that eliminate it except getting rid of the AVPlayer and creating a new instance. That results in very long delays, but it's better than a complete malfunction... I guess.
I have some evidence that the seek distance affects the result, which suggests that the error might be in the underlying buffering mechanism. Without knowing what causes the rate change (other than play/pause) I don't see a way to investigate further.)

You are probably getting an AVPlayerItemPlaybackStalledNotification.
Try this:
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVPlayerItemPlaybackStalledNotification
                object:cell.avPlayerItem
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                DDLogVerbose(@"%@", @"AVPlayerItemPlaybackStalledNotification");
            }];
However, the Apple docs say "Playback will continue once a sufficient amount of media has subsequently been delivered." (https://developer.apple.com/library/iOS/documentation/AVFoundation/Reference/AVPlayerItem_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009532-CH1-SW83)
That is not happening for me (or, it seems, for you).
I am still tracking down the reason.
EDIT:
This was the problem for me. When AVPlayerItem.playbackLikelyToKeepUp was YES, the stall didn't happen.
I'm not sure why playback wasn't resuming on its own.
EDIT:
From the Apple docs, there is a situation where playback will not resume:
"This property communicates a prediction of playability. Factors considered in this prediction include I/O throughput and media decode performance. It is possible for playbackLikelyToKeepUp to indicate NO while the property playbackBufferFull indicates YES. In this event the playback buffer has reached capacity but there isn't the statistical data to support a prediction that playback is likely to keep up in the future. It is up to you to decide whether to continue media playback."
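If you end up in that state, one option is to watch the keep-up flag yourself and restart playback manually. A minimal sketch (assuming self.player and self.playerItem properties, and that resuming is always the right choice for your app):

// Register once, e.g. right after creating the player item.
- (void)watchForStallRecovery {
    [self.playerItem addObserver:self
                      forKeyPath:@"playbackLikelyToKeepUp"
                         options:NSKeyValueObservingOptionNew
                         context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (object == self.playerItem &&
        [keyPath isEqualToString:@"playbackLikelyToKeepUp"]) {
        // Resume if the item thinks it can keep up again, or if the buffer
        // is simply full (the no-prediction case the docs describe above).
        if (self.playerItem.playbackLikelyToKeepUp ||
            self.playerItem.playbackBufferFull) {
            [self.player play];
        }
    }
}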

Related

AVPlayer Callback (Observer) for Frame Changed

Is there a way to set up an observer / callback on an AVPlayer to get notified when the frame changes?
I am aware of both addBoundaryTimeObserver and addPeriodicTimeObserver, but these are approximations that require me to estimate the frame rate, etc.
There is a note that:
General State Observations: You can use Key-value observing (KVO) to observe state changes to many of the player’s dynamic properties, such as its currentItem or its playback rate. You should register and unregister for KVO change notifications on the main thread. This avoids the possibility of receiving a partial notification if a change is being made on another thread. AV Foundation invokes observeValue(forKeyPath:of:change:context:) on the main thread, even if the change operation is made on another thread.
However, currentTime on AVPlayerItem is a method, not a property, so I cannot use KVO for it.
You could add an AVPlayerItemVideoOutput to your AVPlayerItem and periodically poll the output with hasNewPixelBufferForItemTime:, which will tell you when a new frame has arrived. However, you then need to acquire the frame with copyPixelBufferForItemTime:itemTimeForDisplay:, so you should probably release it immediately if all you care about is the timing. Here's an example of setting up AVPlayerItemVideoOutput. Note that this is polling, so you could notice a frame change late or even miss one.
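Roughly, the polling setup could look like this (a sketch only; self.playerItem, self.videoOutput and self.displayLink are properties you'd add yourself):

- (void)startWatchingFrames {
    self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:nil];
    [self.playerItem addOutput:self.videoOutput];

    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(displayLinkFired:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)link {
    CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        // A new frame has been vended since the last copy; grab it and
        // release it immediately if you only care that the frame changed.
        CVPixelBufferRef buffer =
            [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        if (buffer) {
            CVPixelBufferRelease(buffer);
        }
        // ...react to the frame change here...
    }
}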
You could also quickly preprocess the video file (if it is a file) without decompressing the frames, to determine the frame presentation time stamps. You could feed those timestamps one at a time to addBoundaryTimeObserver to decide when you'd crossed a frame boundary. Here's an example of parsing a video file.
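A sketch of that preprocessing step, using AVAssetReader in pass-through mode (outputSettings:nil, so no decoding) and assuming fileURL and self.player already exist:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                               outputSettings:nil]; // pass-through, no decode
[reader addOutput:output];
[reader startReading];

NSMutableArray *boundaryTimes = [NSMutableArray array];
CMSampleBufferRef sample = NULL;
while ((sample = [output copyNextSampleBuffer])) {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sample);
    [boundaryTimes addObject:[NSValue valueWithCMTime:pts]];
    CFRelease(sample);
}

// Samples arrive in decode order, so the presentation timestamps may need sorting.
id observer = [self.player addBoundaryTimeObserverForTimes:boundaryTimes
                                                     queue:dispatch_get_main_queue()
                                                usingBlock:^{
    // Crossed a frame boundary.
}];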
AVSampleBufferDisplayLayer, a lower-level alternative to AVPlayerLayer that lets you feed it video-frame CMSampleBuffers, looks like a promising way to find out when a frame changes, but it doesn't seem to tell you when it has displayed one of the sample buffers you gave it. And I don't think AVSampleBufferDisplayLayer handles audio, either.
You could also reimplement the AVPlayer playback system yourself - then you'd be painfully (and accurately) aware of frame changes (and audio changes, and OpenGL/Metal). Surely that kind of effort is not required here. What kind of feature are you trying to implement exactly?

VoiceOver stops announcing UIAccessibilityPostNotification messages

I'm working on an application that is designed to speak some information about the data it captures from video as the video is being captured. Right now I'm using UIAccessibilityPostNotification with UIAccessibilityAnnouncementNotification to get VoiceOver to say what I want it to say. This typically works great until the user attempts to navigate between my UI controls. After swiping back and forth along the elements (a menu and an info button) a little bit, the application stops speaking the persistent announcements. I also find that posted notifications are not announced if I background the app and then return it to the foreground.
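For context, the announcements are posted in the usual way, roughly like this (the string is just a placeholder):

// Sketch: posting a spoken announcement to VoiceOver.
UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                @"Detected value: 42"); // placeholder text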
I have a magic tap handler that pauses and resumes that persistent announcement. Once it is triggered (which also triggers speech about the last rendered info), manually triggered speech works again, and after resuming from a second magic tap the announcements continue as if nothing had stopped.
Is there a mechanism to get VoiceOver to reliably resume speaking without requiring some other kind of user input in between?
Fortunately this was, following the rule of "it's probably your own code's fault first", my own fault. A timer was inadvertently changing state, and the value I was using to track when to announce was being set to an invalid state, which stopped the announcements.

iPodMusicPlayer playbackState goes wrong after queue updates

I'm currently developing a music player app and ran into a couple of interesting behaviours regarding the playback state of MPMusicPlayerController:
When I update the player queue, I also set the nowPlayingItem, because the music was playing and I want it to continue. These changes obviously change the playbackState. But if the updates happen rapidly, then after some time (sometimes almost immediately) the playbackState goes wrong: it says paused, but the music is still playing!
Another problem regarding these updates: when I update the queue, set the nowPlayingItem, and call the play method, I would expect one or two playback state changes, but the playbackStateDidChange notification fires 4-5 times in quick succession. I really don't get why there are so many playback state changes, and I suspect they may be what causes the playbackState to go wrong.
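For reference, the player and the notification are wired up roughly like this (a simplified sketch, not my exact code):

// Sketch: generating and observing playback-state notifications.
MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
[player beginGeneratingPlaybackNotifications];

[[NSNotificationCenter defaultCenter]
    addObserverForName:MPMusicPlayerControllerPlaybackStateDidChangeNotification
                object:player
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // Log each state change as it arrives to see the burst of callbacks.
                NSLog(@"playbackState is now %ld", (long)player.playbackState);
            }];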
I would very much appreciate any help!
Thanks

Is there any scenario that can cause ViewDidLoad to be called before didBecomeActive?

I know it sounds silly, but just to clear up a point:
Is there any chance that viewDidLoad will be called before didBecomeActive?
Is it totally impossible?
EDIT
We have a crash that happens when a user comes back to the app from the background and we start to use OpenGL. The crash report indicates that we are trying to use OpenGL in the background.
It is important to say that our app lives in the background as a VoIP app.
We are trying to figure out whether we are somehow triggering something in the background that causes the app to restart OpenGL while backgrounded.
In the stack we see:
[VideoCallViewController viewDidLoad] (VideoCallViewController.m:283)
And a few lines after that:
[GPUImageContext createContext]
And finally:
gpus_ReturnNotPermittedKillClient + 10
We are trying to figure out whether there is a way that [VideoCallViewController viewDidLoad] was called in the background, or whether we must assume that we were in the foreground and somehow moved to the background right after viewDidLoad.
Second option
The second option is that we are indeed moving to the background right after viewDidLoad. The point here is that we are listening for applicationWillResignActive and we pause GPUImage, so we cannot understand why we get the crash.
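For reference, the resign-active handling is wired up roughly like this (a sketch; pauseVideoProcessing is a placeholder for whatever GPUImage work we actually pause):

// Sketch only: pausing GL/GPUImage work when the app resigns active.
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIApplicationWillResignActiveNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                [self pauseVideoProcessing]; // placeholder for our real pause logic
            }];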
Thanks
When / Where do you instantiate the various GPUImage objects that you are using? Is it within viewDidLoad or possibly within init:?
It's just pure rampant speculation here since you didn't really post any code...
but if you are disposing of objects when the app heads to the background that are not then re-created when it comes back to the foreground (perhaps because the view controller was retained by a parent, so init: was not called again but viewDidLoad was...), then you may be trying to send OpenGL messages to objects that don't actually exist anymore.
On the (probably unlikely) chance that my speculation is right, you could easily fix it with the common "getter" pattern of:
- (GPUImageObjectOfInterest *)instanceOfObject {
    if (!_classVariableOfThisType) {
        _classVariableOfThisType = [[GPUImageObjectOfInterest alloc] init];
        // custom configuration, etc...
    }
    return _classVariableOfThisType;
}
and then use [self instanceOfObject]; wherever you used to use _classVariableOfThisType
It's a low-overhead but reasonably foolproof way of making sure a key object exists under a wide range of app interruption, background/foreground, and low-memory conditions.
Don't be shy to post too much code though, we can read through an entire class if needed. Some of us like reading code! (and it will really help the quality of response you get...)

How to prevent Nuance's DragonMobile from turning off VoiceOver?

Nuance's DragonMobile component apparently turns off VoiceOver announcements between the initial call to SKRecognizer's initWithType:detection:language:delegate: and the component's call to recognizerDidFinishRecording:. It makes some sense that they do this, since they don't want the VoiceOver announcements to be picked up by the mic and transcribed.
The problem is that there's usually a 1-2 second gap between the initialization of the recognizer and the initial call to recognizerDidBeginRecording:. In order to prevent the user's first few words from getting cut out of the transcription, it's necessary to use recognizerDidBeginRecording: to indicate to the user that they should start speaking (i.e. you can't just have them hit the mic button and start speaking immediately).
My problem is that since DragonMobile turns off VoiceOver as soon as initWithType: is called, I have no way of indicating to a VoiceOver user that they should begin talking at the appropriate time.
Found something of a workaround: DragonMobile allows you to specify SKEarcons, which are audio files that play whenever recording is started, stopped or canceled. I'm going to record VoiceOver making the announcements that I need and then use these recordings as the earcons, so that it will sound like the rest of VoiceOver.
According to a Nuance technical rep I just spoke to, DragonMobile does indeed take over the audio layer and suppress any output during recording, and they don't expose any way around this other than the earcons.
