AVPlayer does not play back AVComposition with more than 2 clips - iOS

I am attempting to stitch together video assets using AVComposition based on the code here:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
On OS X it works perfectly; however, on iOS, when playing back via AVPlayer, it only works with one or two input clips. If I attempt to add a third, nothing plays back on the AVPlayerLayer. Weirdly, if I observe the AVPlayer's playback time using addPeriodicTimeObserverForInterval, the video appears to play for the correct duration, but nothing shows on the layer.
Does anyone have any insight into why this would be?

It turns out I was creating CMTime objects with differing timescale values, which caused rounding errors and created gaps in my tracks. If a track had a gap, it would simply fail to play. Ensuring that all my CMTime objects had the same timescale made everything work perfectly.
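For anyone hitting the same thing, here is a minimal sketch of the fix in Swift (the function, its names, and the choice of 600 as the timescale are mine, not from the original code): express every CMTime in one shared timescale so each clip's start time and duration agree tick-for-tick, leaving no gaps.

import AVFoundation

// Minimal sketch: stitch clips end-to-end with every CMTime built against
// a single shared timescale, so insertion points line up exactly.
func stitch(_ clips: [AVAsset]) throws -> AVMutableComposition {
    let timescale: CMTimeScale = 600   // the one timescale used everywhere
    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { return composition }

    var cursor = CMTime.zero
    for clip in clips {
        guard let source = clip.tracks(withMediaType: .video).first else { continue }
        // Re-express the clip's duration in the shared timescale before
        // using it, so the inserted range and the next insertion point
        // are guaranteed to match with no rounding gap between them.
        let duration = CMTimeConvertScale(clip.duration,
                                          timescale: timescale,
                                          method: .default)
        try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: duration),
                                       of: source, at: cursor)
        cursor = CMTimeAdd(cursor, duration)
    }
    return composition
}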

Related

Possible bug tracking buffer progress with AVPlayerItem.loadedTimeRanges

I'm running into some strange issues with AVQueuePlayer. I'll first explain the purpose of my implementation, then go over the issues I am hitting.
Running: iPhone 5s, iOS 10.1.1, LTE connection
Video: progressively downloaded .mp4, 5 MB, 4-second duration.
The purpose of the implementation is to play a progressively downloaded video that loops seamlessly. The video won't contain any sound.
I'm using an AVQueuePlayer (supporting iOS 9 and up) to loop videos, set up the way Apple recommends: listen for the current player item to change, then move that item to the end of the queue.
https://developer.apple.com/library/content/samplecode/avloopplayer/Listings/Projects_VideoLooper_VideoLooper_QueuePlayerLooper_swift.html#//apple_ref/doc/uid/TP40014695-Projects_VideoLooper_VideoLooper_QueuePlayerLooper_swift-DontLinkElementID_11
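For context, a condensed sketch of that approach (the class and property names are mine, not from the sample): keep a queue of items backed by the same asset, and append a fresh one each time the player advances.

import AVFoundation

// Condensed sketch of Apple's queue-based looping technique.
final class LoopingPlayer {
    let player = AVQueuePlayer()
    private let asset: AVAsset
    private var observation: NSKeyValueObservation?

    init(url: URL) {
        asset = AVAsset(url: url)
        // Prime the queue so there is always a "next" item ready
        // when the current one finishes.
        player.insert(AVPlayerItem(asset: asset), after: nil)
        player.insert(AVPlayerItem(asset: asset), after: nil)
        // Each time the current item changes, push another copy of the
        // same asset onto the back of the queue.
        observation = player.observe(\.currentItem) { [weak self] player, _ in
            guard let self = self else { return }
            player.insert(AVPlayerItem(asset: self.asset), after: nil)
        }
        player.play()
    }
}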
I am hitting 2 issues.
Issue 1: My designer gave me a video that contains a video track and an audio track of equal duration. I am able to track the buffer progress by checking the current player item's loadedTimeRanges. However, when the video loops, it isn't seamless. So we tried a video without the audio track and hit Issue 2.
Issue 2: Testing the same video, but this version contains only a video track. The video loops amazingly; it's seamless. However, when checking loadedTimeRanges to track the buffer progress, the duration remains 0 until the video has completely loaded, and only then reports the total duration of the video.
Is Issue 2 a bug? I also find it strange that removing the audio track creates a much more seamless loop.
I've provided my code below that is used to check the seconds buffered. Note that it returns a duration of 0 if playerItem.loadedTimeRanges.first?.timeRangeValue doesn't exist. I can confirm that the value does exist and the duration is properly returned when testing both issues.
public var secondsBuffered: Float64 {
    if let playerItem = self.player?.currentItem,
       let loadedTimeRange = playerItem.loadedTimeRanges.first?.timeRangeValue {
        // Duration of the first buffered range, in seconds.
        return CMTimeGetSeconds(loadedTimeRange.duration)
    }
    return 0
}

iOS/AVFoundation: How to eliminate (occasional) blank frames between different videos within an AVComposition during playback

The app I'm working on loops a video a specified number of times by adding the same AVAssetTrack (created from the original video URL) multiple times to the same AVComposition at successive intervals. The app similarly inserts a new video clip into an existing composition by removing a time range from the composition's AVMutableCompositionTrack (for AVMediaTypeVideo) and inserting the new clip's AVAssetTrack into the previously removed time range.
However, occasionally and somewhat rarely, after inserting a new clip as described above into a time range within a repeat of the original looping video, there are resulting blank frames which only appear at the video loop’s transition points (within the composition), but only during playback - the video exports correctly without gaps.
This leads me to believe the issue is with the AVPlayer or AVPlayerItem and how the frames are currently buffered for playback, rather than how I'm inserting/ looping the clips or choosing the correct CMTime stamps to do so. The app is doing a bunch of things at once (loop visualization in the UI via an NSTimer, audio playback via Amazing Audio Engine) - could my issue be a result of competition for resources?
One more note: I understand that discrepancies between audio and video in an asset can cause glitches (i.e. the underlying audio is a little bit longer than the video length), but as I'm not adding an audioEncodingTarget to the GPUImageWriter that I'm using to record and save the video, the videos have no audio components.
Any thoughts or directions you can point me in would be greatly appreciated! Many thanks in advance.
Update: the flashes coincide with the "Had to drop a video frame" error logged by the GPUImage library, which according to the creator has to do with the phone not being able to process video fast enough. Could multi-threading solve this?
Update 2: The flashes actually don't always correspond to the "Had to drop a video frame" error. I have also disabled all of the AVRecorder/Amazing Audio Engine code and the issue still persists, so it is not a problem of resource competition between those engines. I have been logging properties of the AVPlayerItem and notice that isPlaybackLikelyToKeepUp is always NO and isPlaybackBufferFull is always YES.
So the problem is solved - it's sort of frustrating how brutally simple the fix is. I just used a time range one frame shorter when adding the videos to the composition, rather than the AVAssetTrack's full time range. No more flashes. Hopefully the users won't miss that 30th of a second :)
CMTime shortened_duration = CMTimeSubtract(originalVideoAssetTrack.timeRange.duration, CMTimeMake(1, 30));
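The same idea in Swift, as a sketch (the helper and its names are illustrative, not from the original code): trim one frame off the source track's range before each insertion, and advance the insertion point by the shortened duration.

import AVFoundation

// Illustrative helper: append a source track to a composition track using a
// range one frame (1/30 s) shorter than the track's full duration, which is
// the fix described above for blank frames at loop transition points.
func append(_ sourceTrack: AVAssetTrack,
            to compositionTrack: AVMutableCompositionTrack,
            at time: CMTime) throws -> CMTime {
    let shortenedDuration = CMTimeSubtract(sourceTrack.timeRange.duration,
                                           CMTimeMake(value: 1, timescale: 30))
    let range = CMTimeRange(start: sourceTrack.timeRange.start,
                            duration: shortenedDuration)
    try compositionTrack.insertTimeRange(range, of: sourceTrack, at: time)
    // Return the next insertion point so repeated calls stay contiguous.
    return CMTimeAdd(time, shortenedDuration)
}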

AVMutableComposition play blank with AVPlayer but export fine with AVAssetExportSession

I create an AVMutableComposition containing videos imported from the library. I pass an AVMutableVideoComposition containing an array of AVMutableVideoCompositionInstruction in order to transform (scale, rotate) and fade (opacity) between video tracks.
With most videos no problem occurs. I can play the composition using AVPlayer or export it using AVAssetExportSession without any problem. But for some videos, when the composition is passed to an AVPlayer, I get a black screen, as if the timing of the different AVMutableVideoCompositionInstructions were not set correctly. However, when I export the exact same composition containing the exact same instructions and timing (actually the same composition) with an AVAssetExportSession, the movie exports perfectly. If I play the imported video alone as a simple AVAsset using the AVPlayer, the video plays fine.
I tried raising the timescale of my movie to 60000 instead of 600, hoping the issue was floating-point precision being lost in the duration of the imported video, but that is not the case.
It happens under iOS 8, but may also happen under iOS 7; I don't know.
PS: I didn't post code because there are far too many lines.
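One hedged guess, since the symptom looks like mis-timed instructions: AVPlayer tends to render black when the video composition's instruction time ranges leave gaps or overlaps, while export can sometimes still succeed. A small hypothetical checker (not from the question's code) that walks the instructions and reports the first discontinuity may help rule that out:

import AVFoundation

// Hypothetical sanity check: returns the time of the first gap or overlap
// in the instruction list, or nil if the instructions tile `duration` exactly.
func firstInstructionDiscontinuity(in videoComposition: AVVideoComposition,
                                   duration: CMTime) -> CMTime? {
    var cursor = CMTime.zero
    for instruction in videoComposition.instructions {
        // Each instruction must start exactly where the previous one ended.
        if instruction.timeRange.start != cursor { return cursor }
        cursor = instruction.timeRange.end
    }
    // The final instruction must also end exactly at the asset's duration.
    return cursor == duration ? nil : cursor
}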

Playing an AVMutableComposition with AVPlayer audio gets out of sync

I have an AVMutableComposition with two audio tracks and one video track. I'm using the composition to string together about 40 different video clips from .mov files, putting the video content of each clip in the video track of my composition and the audio in the first audio track. The second audio track I use for music.
I also have a synchronized layer for titles graphics.
When I play this composition using an AVPlayer, the audio slowly gets out of sync. It takes about 4 minutes to start becoming noticeable. If I only string together a handful of longer clips the problem is not as apparent; it is when there are many shorter clips (~40 in my test) that it gets really bad.
Pausing and playing doesn't re-sync the audio; seeking, however, does. In other words, if I let the video play to the end, the lip sync gets noticeably off towards the end even if I pause and play throughout; but if I seek to a time towards the end, the audio gets back in sync.
My hacky solution for now is to seek to currentTime + 1 frame every minute or so. This creates an unpleasant jump in the video caused by lag in the seek operation, so it's not a good solution.
Exporting with an ExportSession doesn't present this problem, audio remains in sync in the output movie.
I'm wondering if the new masterClock property in the AVPlayer is the answer to this, and if it is, how is it used?
I had the same issue and fixed it, among many other audio and video problems, by specifying timescales in the following manner:
CMTime(seconds: my_seconds, preferredTimescale: CMTimeScale(600))
Before, my timescale was CMTimeScale(NSEC_PER_SEC). That caused jitter when composing clips at different frame rates, plus the audio drift that Eddy mentions here.
Despite looking like a magic number, 600 is a common multiple of 24, 30, 60, and 120 - the usual frame rates for different purposes. Using a common multiple avoids dragging rounding problems around when composing multiple clips.
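Concretely, the change amounts to something like this (the seconds value is illustrative):

import Foundation
import CoreMedia

// Before: a nanosecond timescale, which (per the above) produced jitter
// and audio drift when composing clips at different frame rates.
let before = CMTime(seconds: 2.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))

// After: 600 divides evenly by 24, 30, 60, and 120, so frame boundaries
// at those rates land on whole ticks and edit points stay exact.
let after = CMTime(seconds: 2.5, preferredTimescale: 600)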

Merge audio with video perfectly

I've been trying to merge audio with video using AVMutableComposition and AVExportSession. Everything works perfectly except that the audio and video sources don't have the same duration.
So the exported movie is a bit laggy. Is there anyway to resize or redefine the rate of the video so that its duration becomes exactly equal to the audio's duration? For example, if the audio lasts 10 seconds and the video lasts 9 seconds, I'd like to play the video back at 9/10 speed, so they both end at the same time.
Solved
Use something like this:
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:audioAsset.duration];
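For reference, the same fix in Swift inside a minimal merge (the function and variable names are illustrative):

import AVFoundation

// Minimal merge sketch: put the video and audio into one composition,
// then retime the video track so both tracks end at the same instant.
func merge(video videoAsset: AVAsset, audio audioAsset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid),
          let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid),
          let srcVideo = videoAsset.tracks(withMediaType: .video).first,
          let srcAudio = audioAsset.tracks(withMediaType: .audio).first else { return composition }

    try videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration),
                                   of: srcVideo, at: .zero)
    try audioTrack.insertTimeRange(CMTimeRange(start: .zero, duration: audioAsset.duration),
                                   of: srcAudio, at: .zero)

    // Stretch (or squeeze) the video - 9 s to 10 s in the question's example -
    // so it finishes exactly when the audio does.
    videoTrack.scaleTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration),
                              toDuration: audioAsset.duration)
    return composition
}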
