I have two AVPlayer instances playing videos of the same duration (10 seconds). The goal is to have them loop and stay in sync with one another. I add them as sublayers of the same UIView and then call play() on each one.
The problem is that, because one play() call inevitably executes slightly after the other, the videos end up out of sync. The offset is only a few milliseconds, but it is noticeable.
I do not have the option to create an AVMutableComposition, as I have seen other posts suggest, so is there any way to have two separate players truly stay in sync and play EXACTLY at the same time?
Thank you!
If you want to achieve sync, you should load the videos separately with AVPlayer and observe the status property of each player's AVPlayerItem. Only when all of the items have reached .readyToPlay should you loop through the players and set the rate property.
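For example, a minimal sketch of that approach, assuming two players named playerA and playerB (hypothetical names) whose items are already set:
import AVFoundation

let players = [playerA, playerB]
var observations: [NSKeyValueObservation] = []
var started = false

for player in players {
    let observation = player.currentItem!.observe(\.status, options: [.initial, .new]) { _, _ in
        // Start only once every item has reached .readyToPlay.
        guard !started,
              players.allSatisfy({ $0.currentItem?.status == .readyToPlay }) else { return }
        started = true
        players.forEach { $0.rate = 1.0 } // set the rate on both in the same pass
    }
    observations.append(observation)
}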
Edit:
You can also synchronize them by using setRate(_:time:atHostTime:). Don't forget to begin loading media data with preroll(atRate:completionHandler:) before calling setRate. Basically:
wait for readyToPlay
preroll(atRate:completionHandler:) when all players are ready
setRate(_:time:atHostTime:) when all players were prerolled
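Putting those steps together, a minimal sketch (assuming players is an array of AVPlayer instances whose items have already reported .readyToPlay):
import AVFoundation
import CoreMedia

func startInSync(_ players: [AVPlayer]) {
    let group = DispatchGroup()

    for player in players {
        // Required when scheduling playback against the host clock;
        // otherwise setRate(_:time:atHostTime:) is not allowed.
        player.automaticallyWaitsToMinimizeStalling = false

        group.enter()
        player.preroll(atRate: 1.0) { _ in group.leave() }
    }

    group.notify(queue: .main) {
        // Pick a host time slightly in the future so every player
        // receives the command before that time arrives.
        let start = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                              CMTimeMakeWithSeconds(0.5, preferredTimescale: 600))
        for player in players {
            player.setRate(1.0, time: .zero, atHostTime: start)
        }
    }
}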
I found myself in a situation where I need to simulate audio playback to trick the OS controls and MPNowPlayingInfoCenter into thinking that audio is being played. This is because I am building a player that plays multiple audio tracks with pauses in between, creating one continuous "audio" track. I already have everything set up inside the app itself, and the lock screen controls work correctly. The only problem I am facing is that while the actual audio stops and a pause is being "played", the lock screen info center stops its timer, and it only resumes showing the correct time and overall state once another audio track starts playing.
Here is the example of my audio track built from audio files and pause items:
let items: [AudioItem] = [
    .audio("part-1.mp3"),
    .pause(duration: 5), // value of type: TimeInterval
    .audio("part-2.mp3"),
    .pause(duration: 3),
    ... // the list goes on
]
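where AudioItem would be a simple enum along these lines (my assumption, inferred from the cases used above):
import Foundation

enum AudioItem {
    case audio(String)                  // bundled audio file name
    case pause(duration: TimeInterval)  // silence between tracks
}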
Then, in my custom player, once AVAudioPlayer finishes its job with the current item, I get the next one from the array and play either a .pause with a scheduled Timer or another .audio with AVAudioPlayer.
extension Player: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNextItem()
    }
}
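playNextItem is roughly along these lines (simplified; items, index, audioPlayer and pauseTimer are properties of my Player class):
func playNextItem() {
    guard index < items.count else { return }
    let item = items[index]
    index += 1

    switch item {
    case .audio(let name):
        // Play the next file with AVAudioPlayer.
        if let url = Bundle.main.url(forResource: name, withExtension: nil) {
            audioPlayer = try? AVAudioPlayer(contentsOf: url)
            audioPlayer?.delegate = self
            audioPlayer?.play()
        }
    case .pause(let duration):
        // Nothing is rendered during the pause, which is exactly when
        // the Now Playing info center stops advancing its timer.
        pauseTimer = Timer.scheduledTimer(withTimeInterval: duration, repeats: false) { [weak self] _ in
            self?.playNextItem()
        }
    }
}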
And here lies the problem: once the AVAudioPlayer stops, the Now Playing info center automatically stops too, even though I keep feeding it fresh nowPlayingInfo. Then, when it hits another .audio item, it resumes correctly and shows the current time, etc.
And here lies the question:
How do I trick the MPNowPlayingInfoCenter into thinking that audio is being played while I "play" my .pause item?
I realise that it may still not be clear what I am trying to achieve, but I am happy to share more insight if needed. Thanks!
Some solutions I am currently thinking about:
A. Keeping a one-second-long silent audio track that plays on a loop for as long as the pause needs to last.
B. Programmatically creating a silent audio track of the appropriate length and playing it instead of using a Timer to track the pause duration/progress, relying completely on AVAudioPlayer for both .audio and .pause items. Not sure this is possible though.
C. Maybe there is a way to tell the MPNowPlayingInfoCenter that audio keeps playing without using AVAudioPlayer, through some API I am not familiar with?
AVAudioPlayer is probably the wrong tool here. You want AVAudioPlayerNode, which is slightly lower-level. Create an AVAudioEngine, and attach an AVAudioPlayerNode. You can then call scheduleFile(_:at:completionHandler:) to play the audio at the times you want.
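A rough sketch of that setup, reusing two of the file names and the 5-second pause from your question (keep the engine and node alive, e.g. as properties):
import AVFoundation

final class GapPlayer {
    private let engine = AVAudioEngine()
    private let node = AVAudioPlayerNode()

    func start() throws {
        engine.attach(node)

        let file1 = try AVAudioFile(forReading: Bundle.main.url(forResource: "part-1", withExtension: "mp3")!)
        let file2 = try AVAudioFile(forReading: Bundle.main.url(forResource: "part-2", withExtension: "mp3")!)

        engine.connect(node, to: engine.mainMixerNode, format: file1.processingFormat)
        try engine.start()

        // The first file plays as soon as the node starts.
        node.scheduleFile(file1, at: nil, completionHandler: nil)

        // The second file starts after the first file's length plus 5 seconds of silence,
        // expressed in samples on the player node's own timeline.
        let sampleRate = file1.processingFormat.sampleRate
        let startSample = file1.length + AVAudioFramePosition(5.0 * sampleRate)
        node.scheduleFile(file2,
                          at: AVAudioTime(sampleTime: startSample, atRate: sampleRate),
                          completionHandler: nil)

        node.play()
    }
}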
Much of the Apple documentation on AVAudioEngine appears to be broken at the moment, but the links under Audio Engine Building Blocks will hopefully be available again shortly. (If it stays down and you have trouble finding docs, leave a comment and I'll hunt down the WWDC videos and other tutorials on using AVAudioEngine. It's not particularly difficult for simple problems.)
If you know in advance how you want to compose these items (and it looks like you may), see also AVMutableComposition, which lets you glue together assets very efficiently, including adding empty segments of silence. See Media Composition and Editing for the various tools in that space.
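If you do go the composition route, it might look roughly like this (same hypothetical file names as above):
import AVFoundation

func makeComposition() throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    var cursor = CMTime.zero

    let part1 = AVURLAsset(url: Bundle.main.url(forResource: "part-1", withExtension: "mp3")!)
    try composition.insertTimeRange(CMTimeRange(start: .zero, duration: part1.duration),
                                    of: part1, at: cursor)
    cursor = CMTimeAdd(cursor, part1.duration)

    // An empty time range plays back as silence, replacing the Timer-driven pause.
    let gap = CMTimeMakeWithSeconds(5, preferredTimescale: 600)
    composition.insertEmptyTimeRange(CMTimeRange(start: cursor, duration: gap))
    cursor = CMTimeAdd(cursor, gap)

    let part2 = AVURLAsset(url: Bundle.main.url(forResource: "part-2", withExtension: "mp3")!)
    try composition.insertTimeRange(CMTimeRange(start: .zero, duration: part2.duration),
                                    of: part2, at: cursor)
    return composition
}
// Play it with: AVPlayer(playerItem: AVPlayerItem(asset: try makeComposition()))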
I'm using FSAudioController to play MP3s from my service. I have a URL list and I'm playing songs by shuffling the list.
What I want to do is start playing the next song 5 seconds before the current song ends, so there is a soft transition between the two. But I couldn't find anything about how to play two songs at the same time.
I thought about using 2 different FSAudioController objects, but I wanted to ask if there is smarter way to do it first.
Thanks in advance!
Unfortunately FSAudioController does not support a crossfade option, so you will have to use two different FSAudioController objects.
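I can't vouch for FSAudioController's API beyond that, but the general two-player crossfade idea looks like this when sketched with plain AVAudioPlayer (local files; the same timing logic would apply to two FSAudioController instances):
import AVFoundation

final class Crossfader {
    private var current: AVAudioPlayer?

    // Call this roughly 5 seconds before the current song ends.
    func crossfade(to url: URL, over duration: TimeInterval = 5) throws {
        let next = try AVAudioPlayer(contentsOf: url)
        next.volume = 0
        next.prepareToPlay()
        next.play()
        next.setVolume(1, fadeDuration: duration)      // fade the new song in
        current?.setVolume(0, fadeDuration: duration)  // fade the old song out
        current = next
    }
}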
I have an issue with AVQueuePlayer: the app freezes for 2-3 seconds as each item in the queue ends. I am using the queue player to play recordings one after the other without a gap. To achieve gapless transitions between player items, I need to load the assets for a player item in advance, so I load the assets for the currently playing item and the next item so there is no gap at the end of the first recording. As the first recording ends, I add the second player item to the queue player to play and also load the assets for the third player item.
In this way my player queue always contains a single player item, but asset loading is done for the currently playing item and the next one. This keeps rolling forward as new recordings are added.
I found that a freeze of about half a second is observed on new iOS devices, and around 3-4 seconds on old devices like the iPod 4.
How can we achieve gapless playback and still keep the UI from freezing?
Thanks
Try lifting your requirement to only have a single player item in the queue.
An AVQueuePlayer will operate more effectively at item transitions if there is an item in its queue beyond what's currently playing.
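A concrete sketch of keeping one item ahead, assuming recordingURLs holds your recordings in playback order (hypothetical name):
import AVFoundation

final class GaplessQueue: NSObject {
    private let player = AVQueuePlayer()
    private var pending: [URL]

    init(recordingURLs: [URL]) {
        self.pending = recordingURLs
        super.init()
        // Enqueue the first two items so the queue player can prepare
        // the transition before the current item finishes.
        enqueueNextIfNeeded()
        enqueueNextIfNeeded()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(itemDidFinish(_:)),
                                               name: .AVPlayerItemDidPlayToEndTime,
                                               object: nil)
        player.play()
    }

    @objc private func itemDidFinish(_ note: Notification) {
        // Top the queue back up so there is always one item beyond
        // the one that is currently playing.
        enqueueNextIfNeeded()
    }

    private func enqueueNextIfNeeded() {
        guard !pending.isEmpty else { return }
        let item = AVPlayerItem(url: pending.removeFirst())
        if player.canInsert(item, after: nil) {
            player.insert(item, after: nil) // nil appends to the end of the queue
        }
    }
}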
"Loading" media can mean a few different things, and it's not clear what you're doing. Maybe you're downloading media outside of AVFoundation, building an AVAsset or building an AVPlayerItem. The work you're doing or deferring will vary depending on the kind of media (e.g an mp4 vs an HLS stream). Even if you "preload" everything (which is much more complicated for HLS media), AVFoundation still has more work to do that it won't start until the AVPlayerItem is put in a player. It has to set up a render pipeline specific to the media it needs to play back, and then start the rendering process. AVQueuePlayer can achieve gapless playback by starting some of that process before playback for that item needs to start. The queue player is also efficient about its use of resources in that it won't start loading items far down the queue until it needs to.
I am using AVPlayer to play videos. They are short, 2-5 seconds each, and they are played in a random order. The problem is that when the video changes and a new one starts to play, the device lags for a very short time, but I want the change to be fluid. Is there a way to preload videos with AVPlayer?
Try using AVQueuePlayer. I am assuming that what you described as a lag is in fact the pre-buffering delay. This should be minimized, or eliminated entirely, when using AVQueuePlayer, as that baby will buffer the next AVPlayerItem while playing the current one.
From the AVFoundation documentation:
On iOS 4.1 and later, you can use an AVQueuePlayer object to play a number of items in sequence (AVQueuePlayer is a subclass of AVPlayer).
Also see Mihai's answer on Pre-buffering-for-avqueueplayer.
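A minimal sketch of the idea, assuming videoURLs is your (already shuffled) list of clip URLs:
import AVFoundation

let items = videoURLs.map { AVPlayerItem(url: $0) }
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play() // the next item is buffered while the current one plays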
Can someone please tell me if there is a way to selectively pre-buffer the AVPlayerItems in the AVQueuePlayer array, rather than leaving it to AVQueuePlayer's automatic behaviour of only loading the next item in as the first item finishes playing.
I'm loading a sequence of 4 short movie clips and I'd like to pre-cache them before telling the AVQueuePlayer to play the array. Is there actually a way of getting under the bonnet of AVQueuePlayer and controlling the pre-buffering as desired?
Right now, with its default lazy-loading behaviour, I'm getting some chugging in the playback, with the clips not even playing out properly, because the AVQueuePlayer is trying to load in the next clip while the current one is playing. I'm doing this on an iPad deployed to the actual device, not in the simulator.
You can do this with MPMoviePlayerController by calling [player prepareToPlay], which manually initiates the loading of each video file you want. You can then check for completion of loading by watching for the MPMoviePlayerLoadStateDidChangeNotification and testing the loadState value to see if the movie has fully loaded, then tell the movie player to play. How can you effectively do a similar thing with AVQueuePlayer?
Is this even possible or have I discovered one of the major drawbacks of the AVQueuePlayer?
Nice suggestion with the playerObserver, Stephen, but what is needed is a way to explicitly get individual items to load into memory and then tell the AVQueuePlayer "do not play the first item in the array until ALL items in the array are loaded into memory". There currently seems to be no way to even start the second item in the array loading until the first one is coming to an end!
As a slightly separate issue, I've also noticed some weirdness in AVQueuePlayer: if you load the same source video file into the array twice (referenced as two completely separate AVPlayerItems, as you should), the first time the clip plays through fine, but when it comes to playing that same clip again (as a separate AVPlayerItem) it races through very quickly until a certain point in the video, then finally starts playing at normal speed from there.
Has anyone else noticed this behaviour?
Apple Developer Support just confirmed to me that AVQueuePlayer does not buffer video items.
I have the same question. I wish AV Foundation had something like an "asset fully loaded" notification.
The following code may partially solve the problem:
Float64 durationSeconds = CMTimeGetSeconds([<#An asset#> duration]);
// A boundary two-thirds of the way through the item.
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds * 2.0 / 3.0, 1);
NSArray *times = [NSArray arrayWithObjects:[NSValue valueWithCMTime:secondThird], nil];
self.playerObserver = [<#A player#> addBoundaryTimeObserverForTimes:times queue:NULL usingBlock:^{
    // The current item is most of the way through; start preparing the next item here.
}];