Transitioning to background, AVPlayer video has small audio gap - iOS

I have followed the many helpful previous questions to get my AVPlayer successfully streaming video when my app goes to the background. There are two methods described in Apple's QA1668, and they both work for my stream URLs.
The problem is that there is a noticeable audio gap during the transition that is identical for both methods. On my iPhone 6 in release mode the gap is less than 0.5 seconds, which may not seem terrible, but if I'm playing something like a music video it is very distracting.
After more testing, it looks like the gap actually occurs when I remove the AVPlayerLayer (or, with the other method, when I disable the AVMediaCharacteristicVisual tracks): it still happens if I hook those actions up to a button instead of the backgrounding transition.
My guess is that it has something to do with the audio re-syncing to the new video state of the AVPlayer, but really I have no clue. Any help would be greatly appreciated!
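For reference, a minimal sketch of the AVPlayerLayer variant I'm using (it assumes playerLayer and player properties, and that these handlers are registered for the UIApplication background/foreground notifications); the gap seems to appear exactly at the point where the layer is detached:

// Detach the video layer on backgrounding so only audio keeps playing
// (the AVPlayerLayer variant from QA1668); the audio gap shows up here.
- (void)applicationDidEnterBackground:(NSNotification *)note
{
    self.playerLayer.player = nil;
}

// Reattach the player when coming back to the foreground.
- (void)applicationWillEnterForeground:(NSNotification *)note
{
    self.playerLayer.player = self.player;
}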

Related

Playing interactive videos with finger gestures in iOS

I am working on a kids' ABC learning app which will be somewhat like this app:
Petting Zoo
The user can do these gestures: swipe up, down, left, right, and touch. Each gesture has a small animation clip (approx. 1-3 seconds each) linked to it, like the character jumping on swipe up. There is also a loopable IDLE movie that plays continuously when there is no input from the user.
So I am trying to use videos in MP4 and M4V format for these gestures, but the problem is that the videos lag just before playing: they don't play instantly upon a gesture, but take a fraction of a second to load and start.
I am looking for output like the video above, where the animations are responsive and never hang, even briefly.
My developer once achieved such smooth output with MP4 clips, but those clips didn't have audio embedded in them; when he used videos with embedded audio, they lagged again.
Can audio be the cause of the lag here, or is there anything else you experts would suggest?
Please help, guys. Your input will be very valuable to me.
You can use - (void)replaceCurrentItemWithPlayerItem:(AVPlayerItem *)item of AVPlayer.
This method loads a new item into an existing AVPlayer.
Here is an important reference for playing multiple videos using a single AVPlayer.
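A rough sketch of that approach, assuming the clips are bundled MP4 files (the file names and property names here are placeholders): build the assets and a single AVPlayer once, then swap items when a gesture fires.

#import <AVFoundation/AVFoundation.h>

// Build the player and assets once (e.g. in viewDidLoad) so a gesture only swaps items.
- (void)setUpPlayer
{
    // Placeholder file names for the idle loop and the swipe-up clip.
    NSURL *idleURL = [[NSBundle mainBundle] URLForResource:@"idle" withExtension:@"mp4"];
    NSURL *jumpURL = [[NSBundle mainBundle] URLForResource:@"jump" withExtension:@"mp4"];
    self.idleAsset = [AVURLAsset URLAssetWithURL:idleURL options:nil];
    self.jumpAsset = [AVURLAsset URLAssetWithURL:jumpURL options:nil];
    self.player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:self.idleAsset]];
}

- (void)handleSwipeUp:(UISwipeGestureRecognizer *)gesture
{
    // Swap a fresh item built from the preloaded asset into the existing player
    // instead of creating a new AVPlayer/AVPlayerLayer per clip.
    AVPlayerItem *jumpItem = [AVPlayerItem playerItemWithAsset:self.jumpAsset];
    [self.player replaceCurrentItemWithPlayerItem:jumpItem];
    [self.player play];
}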

AVPlayer or AVQueuePlayer during iPhone Call

I have an app that uses AVPlayer (or AVQueuePlayer) to play local files that were recorded by the app. All works great. But I also want this to work on iPhone when a call is in progress (the videos are event recordings). What I found is that during a phone call, the video feed to the AVPlayerLayer goes blank, the AVPlayer rate changes to 0 (stop), and all attempts to change the rate to non-zero (play) are ignored (the rate stays at 0). There does not appear to be any documentation on this, and the only way to detect this condition in the player is that the player is stopped and will not start playback. Of course, I also check for audio interruptions and for call center calls in progress.
Obviously, in this case the interruption is caused by a call, so there is always an inactive/resume or an inactive/background/foreground/resume transition, as well as audio route and audio interruption notifications. So indirectly I know the condition is probably occurring.
So my questions are:
(1) Is there any direct method (specific to AVPlayer/AVPlayerLayer) to be notified that the AVPlayer is in this non-playing mode? I currently rely on "avplayer.rate failed to change from 0 to non-zero", but this seems hacky (and too much "crossing the streams"!). I want to notify the user that video temporarily cannot be played or previewed, so they do not think the app is broken, and also inform them or automatically continue playback when the call ends (without a loop that keeps trying to start playback every 500 ms!).
(2) Can AVPlayer play anything while an iPhone call (green bar) is in progress, or is this just the way Apple designed the AVPlayer SDK? (If so, there is no documentation on it.) Obviously, other apps can play video during a phone call, but I suspect they are using a lower-level SDK and not AVPlayer.
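For reference, this is roughly how I detect the condition indirectly today, via the audio session rather than AVPlayer itself (a sketch; showCallBanner is a placeholder for my own UI, and the observer is registered during setup):

#import <AVFoundation/AVFoundation.h>

// Register once, e.g. when the player screen appears.
- (void)startObservingInterruptions
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(audioInterruption:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:nil];
}

- (void)audioInterruption:(NSNotification *)note
{
    NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // A call (or other interruption) started: tell the user so the frozen
        // AVPlayerLayer doesn't look like a bug. showCallBanner is a placeholder.
        [self showCallBanner];
    } else if ([note.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue]
               & AVAudioSessionInterruptionOptionShouldResume) {
        // Interruption ended and the system says playback may resume.
        [self.player play];
    }
}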

How does Vine looping video playback work on iOS?

Vine loops videos without any pauses in between loops. I've tried creating an AVPlayer that plays the video from the beginning whenever it ends. This introduces a slight lag between every loop. I'm looking for suggestions of ways to avoid this lag.
I have considered creating much longer videos out of repeating short clips. Am I missing some obvious solution?
Thanks.
You can use AVQueuePlayer to queue up the videos (in this case the same one multiple times), and it will handle playing them in sequence.
For example
iPhone Smooth Transition from One Video To Another
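A rough sketch of the queue approach with a single local clip (the file name and property are placeholders; to loop indefinitely you would keep appending items, e.g. from an AVPlayerItemDidPlayToEndTimeNotification observer):

#import <AVFoundation/AVFoundation.h>

- (void)startLoopingPlayback
{
    // Placeholder file name; any short local clip works.
    NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"clip" withExtension:@"mp4"];

    // Enqueue several copies of the same clip so the next one is already
    // prepared when the current one ends; this hides the restart gap.
    NSMutableArray *items = [NSMutableArray array];
    for (int i = 0; i < 10; i++) {
        [items addObject:[AVPlayerItem playerItemWithURL:videoURL]];
    }

    self.queuePlayer = [AVQueuePlayer queuePlayerWithItems:items];
    [self.queuePlayer play];
}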

AVComposition breaks on AirPlay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't comment yet, so I had to post this as an answer, although it might not fully address the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself by listening for UIScreen connection notifications.
I have to say that it all worked pretty well. I first check whether there is more than one screen; if so, I simply move the AVPlayer to that screen while displaying a simple message on the device's screen that the content is playing on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
That was fine until I noticed that the composition plays really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering the resolution, lowering the renderScale, and so on, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note it's still not rendered, so it must be using a composition to display it), then right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, the player closes and the project starts rendering; after that it plays, I think, using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with some issues.
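For completeness, the screen handling I described looks roughly like this (a sketch; it assumes an observer registered for UIScreenDidConnectNotification, and externalWindow / playerLayer are my own properties):

- (void)screenDidConnect:(NSNotification *)note
{
    UIScreen *external = note.object;

    // Put a window on the external screen and move the player layer there,
    // leaving a "playing on <AirPlay device>" message on the device itself.
    self.externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
    self.externalWindow.screen = external;
    self.externalWindow.hidden = NO;

    self.playerLayer.frame = self.externalWindow.bounds;
    [self.externalWindow.layer addSublayer:self.playerLayer];
}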
In the app's .plist, create a new item called:
Required background modes
Then add a new array element called:
App plays audio or streams audio/video using AirPlay
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!

MPMoviePlayerController blank frame after seeking to particular time-line

I am developing an iPhone application in which I play a video using MPMoviePlayerController. I use custom controls to play the video.
I have a slider that shows the video timeline. Using this, the user can seek to any point in the movie.
When the user continuously moves the slider:
Pause the video, only the first time: [MPMoviePlayerController-obj pause]
Set MPMoviePlayerController-obj.currentPlaybackTime = slider.value
When the slider action ends:
Play the video: [MPMoviePlayerController-obj play]
This plays the movie from the position where the user left the slider, but it leads to a blank frame when the movie finishes playing. This defect occurs randomly, i.e. not for every seek time.
What is the reason for getting the blank frame? How do I solve this?
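For reference, the slider handling is roughly this (a sketch; moviePlayer and the isScrubbing flag are my own properties, wired to the slider's value-changed and touch-up actions):

#import <MediaPlayer/MediaPlayer.h>

// Slider is being dragged: pause once, then keep scrubbing the playback position.
- (void)sliderValueChanged:(UISlider *)slider
{
    if (!self.isScrubbing) {
        self.isScrubbing = YES;
        [self.moviePlayer pause];
    }
    self.moviePlayer.currentPlaybackTime = slider.value;
}

// Slider released: resume playback from the seeked position.
- (void)sliderDidEndEditing:(UISlider *)slider
{
    self.isScrubbing = NO;
    [self.moviePlayer play];
}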
I'm not sure if this will work, but try setting the initialPlaybackTime to either slider.value or currentPlaybackTime.
To make sure your content is not flawed (and hence possibly triggering that issue), you should try to reproduce your faulty MPMoviePlayerController results using Apple's reference video content:
HTTP streaming: bipbop.m3u8
Progressive download & local playback: sample_mpeg4.mp4
I have personally observed many issues in connection with improper encoding. Weird things tend to happen when working with lossy compressed content. This is true for video (i-frames vs. p-frames) as well as audio (variable bitrate).
One is an improper playback duration being reported. Such an issue may result in an unexpected finished state. I have seen cases where MPMoviePlayerController still shows a bunch of seconds left to play even though the actual video has obviously finished. Those cases occur frequently once the user seeks around within the video.
Once you have made sure that the issue also occurs with the given sample files, you should file a bug report.
