Merge audio with video perfectly - iOS

I've been trying to merge audio with video using AVMutableComposition and AVAssetExportSession. Everything works perfectly except that the audio and video sources don't have the same duration.
So the exported movie is a bit laggy. Is there any way to resize or redefine the rate of the video so that its duration becomes exactly equal to the audio's duration? For example, if the audio lasts 10 seconds and the video lasts 9 seconds, I'd like to play the video back at 9/10 speed so they both end at the same time.

Solved
Use something like this:
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) toDuration:audioAsset.duration];
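
In context, a fuller sketch might look like the following (my own reconstruction, assuming videoAsset, audioAsset and outputURL are already set up; error handling elided):

AVMutableComposition *composition = [AVMutableComposition composition];

// Insert the full video track, then the full audio track.
AVMutableCompositionTrack *compositionVideoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                                atTime:kCMTimeZero
                                 error:nil];

AVMutableCompositionTrack *compositionAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                               ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                                atTime:kCMTimeZero
                                 error:nil];

// Stretch the 9 s of video to the 10 s audio duration, i.e. play it at 9/10 speed.
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                           toDuration:audioAsset.duration];

AVAssetExportSession *exporter =
    [AVAssetExportSession exportSessionWithAsset:composition
                                      presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{ /* check exporter.status */ }];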

Related

Performance issues with AVMutableComposition - scaleTimeRange

I am using scaleTimeRange:toDuration: to produce a fast-motion effect of up to 10x the original video speed. But I noticed that videos start to stutter when played through an AVPlayer at 10x.
I also noticed that the same composition plays smoothly in QuickTime on OS X.
Another question states that the reason for this is a hardware limitation, but I want to know if there is a way around it, so that the fast-motion effect plays smoothly over the length of the entire video.
Video Specs
Format: H.264, 1280x544
FPS: 25
Data Size: 26 MB
Data Rate: 1.17 Mbit/s
I have a feeling that playing your videos at 10x using scaleTimeRange:toDuration: simply has the effect of multiplying your data rate by 10, bringing it up to roughly 12 Mbit/s, which OS X machines can handle but iOS devices cannot.
In other words, you're creating videos that need to play back at 250 frames per second, which is pushing AVPlayer too hard.
If I didn't know about your other question, I would have said that the solution is to export your AVComposition using AVAssetExportSession, which should result in your high-FPS video being downsampled to an easier-to-handle 30 fps, and then to play that with AVPlayer.
If AVAssetExportSession isn't working, you could try applying the speed-up effect yourself, by reading the frames from the source video using AVAssetReader and writing every tenth frame to the output file using AVAssetWriter (don't forget to set the correct presentation timestamps), as in the sketch below.
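
A rough sketch of that reader/writer approach (my own, not from the answer), assuming sourceURL and outputURL are yours and the source is the 25 fps clip described above; it keeps every tenth frame and retimes it, with error handling elided:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];
// Decoded pixel buffers come back in presentation order, which keeps the retiming simple.
AVAssetReaderTrackOutput *readerOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
        outputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                              @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) }];
[reader addOutput:readerOutput];

AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:nil];
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
        outputSettings:@{ AVVideoCodecKey : AVVideoCodecH264,
                          AVVideoWidthKey : @1280,
                          AVVideoHeightKey : @544 }];
[writer addInput:writerInput];

[reader startReading];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

int64_t frameIndex = 0, framesWritten = 0;
CMSampleBufferRef sample = NULL;
while ((sample = [readerOutput copyNextSampleBuffer])) {
    if (frameIndex++ % 10 == 0) {
        // Crude backpressure; real code should use requestMediaDataWhenReadyOnQueue:usingBlock:.
        while (!writerInput.readyForMoreMediaData) { usleep(1000); }
        // Retime the kept frame so the output still plays at 25 fps.
        CMSampleTimingInfo timing = { CMTimeMake(1, 25),
                                      CMTimeMake(framesWritten++, 25),
                                      kCMTimeInvalid };
        CMSampleBufferRef retimed = NULL;
        CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, 1, &timing, &retimed);
        [writerInput appendSampleBuffer:retimed];
        CFRelease(retimed);
    }
    CFRelease(sample);
}
[writerInput markAsFinished];
[writer finishWritingWithCompletionHandler:^{ /* check writer.status */ }];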

AVPlayer does not play back AVComposition with more than 2 clips

I am attempting to stitch together video assets using AVComposition based on the code here:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
On OS X it works perfectly; however, on iOS, when playing back via AVPlayer, it only works with 1 or 2 input clips. If I attempt to add a third, nothing is played back on the AVPlayerLayer. Weirdly, if I observe the AVPlayer playback time using addPeriodicTimeObserverForInterval:, the video appears to be playing for the correct duration, but nothing shows on the layer.
Does anyone have any insight into why this would be?
It turns out I was creating CMTime objects with differing timescale values, which was causing rounding errors and creating gaps in my tracks. If a track had a gap, it would simply fail to play. Ensuring that all my CMTime objects had the same timescale made everything work perfectly.
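
A sketch of what that fix can look like (my reconstruction, not the poster's code): convert every clip's duration to one shared timescale before inserting it, so consecutive insertions leave no rounding gaps.

const int32_t kTimescale = 600;
CMTime cursor = kCMTimeZero;
for (AVAsset *clip in clips) {
    AVAssetTrack *srcTrack = [[clip tracksWithMediaType:AVMediaTypeVideo] firstObject];
    // Rescale to the shared timescale before computing insertion points.
    CMTime duration = CMTimeConvertScale(clip.duration, kTimescale,
                                         kCMTimeRoundingMethod_Default);
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration)
                                   ofTrack:srcTrack
                                    atTime:cursor
                                     error:nil];
    cursor = CMTimeAdd(cursor, duration);
}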

Seeking or scrubbing a video stream in reverse

I've created a scrubber in my app that allows the user to scrub forwards/backwards through a video via [AVPlayer seekToTime:toleranceBefore:toleranceAfter:].
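
Roughly the call being described (a reconstruction; player and scrubberValue, a float in 0..1, are assumed names); the zero tolerances are what force a frame-accurate, and therefore keyframe-bound, seek:

Float64 durationSeconds = CMTimeGetSeconds(player.currentItem.duration);
CMTime target = CMTimeMakeWithSeconds(scrubberValue * durationSeconds, 600);
[player seekToTime:target toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];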
The video that is being scrubbed is captured via an AVCaptureSession that uses AVCaptureMovieFileOutput. I've ffprobed the resulting .MOV and the results are as expected (e.g., on my iPhone 5s I'm recording at 120fps at approx 23000 kb/s with approx 1 keyframe per second).
Since there is only approximately 1 keyframe per second, it is difficult to scrub backward through the video with any precision and without lag (since the player has to go back to the closest keyframe and then compute the frame at my current scrubbing position).
So I'm wondering if there is a better strategy for smooth scrubbing? There are apps out there that do this really well (e.g., I've examined the Coach's Eye app, which records video precisely the same way I do, and yet its scrubbing performance is quite good).
I'd be very appreciative of any suggestions.

Synchronize multiple AVPlayers

I am trying to find a solution for a problem I have. I have 5 UIViews, all at the same position. Each UIView holds an AVPlayer with a different video. To be more precise, they are all the same video, but encoded with different playback speeds:
Video 1x speed
Video 4x speed
Video 8x speed
Video 16x speed
Video 24x speed
By default video 1 is visible and playing, but I should be able to switch between the videos without the switch being visible to the user; therefore, I need to keep them synchronized. So if I am watching video 1 and switch to video 2, then video 2 should play exactly at the position where video 1 stopped.
The idea is that it should look as if the video is speeding up after an action, e.g. a flick gesture.
I hope I described my issue well enough, and I am very thankful for any suggestion.
In the end I am using an observer which takes a snapshot of currentTime every 5 seconds and calls seekToTime: on all the other AVPlayers. This works fine to keep them synchronized; I just needed to adapt the CMTime for each player with a different speed. As an example, here is the 4x video:
CMTime videoPosition = player1.currentTime; // the 1x player's current playback position
Float64 videoPositionInSeconds = (Float64)videoPosition.value / videoPosition.timescale; // convert the CMTime to seconds
[player2 seekToTime:CMTimeMakeWithSeconds(videoPositionInSeconds / 4.0, player1.currentItem.asset.duration.timescale) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
Hope this helps.
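
For completeness, a sketch of the 5-second snapshot observer (my assumed setup, not from the answer):

// Keep the returned token to remove the observer later.
id timeObserverToken =
    [player1 addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(5.0, 600)
                                          queue:dispatch_get_main_queue()
                                     usingBlock:^(CMTime time) {
        Float64 seconds = CMTimeGetSeconds(time);
        // Re-sync the 4x player; repeat with /8.0, /16.0 and /24.0 for the others.
        [player2 seekToTime:CMTimeMakeWithSeconds(seconds / 4.0, 600)
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero];
    }];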

Playing an AVMutableComposition with AVPlayer audio gets out of sync

I have an AVMutableComposition with 2 audio tracks and one video track. I'm using the composition to string together about 40 different video clips from .mov files, putting the video content of each clip in the video track of my composition and the audio in the first audio track. The second audio track I use for music.
I also have a synchronized layer for titles graphics.
When I play this composition using an AVPlayer, the audio slowly gets out of sync. It takes about 4 minutes to become noticeable. It seems that if I only string together a handful of longer clips the problem is not as apparent; it is when there are many shorter clips (~40 in my test) that it gets really bad.
Pausing and playing doesn't re-sync the audio, but seeking does. In other words, if I let the video play to the end, the lip sync gets noticeably off towards the end even if I pause and play throughout; however, if I seek to a time towards the end, the audio gets back in sync.
My hacky solution for now is to seek to currentTime + 1 frame every minute or so. This creates an unpleasant jump in the video caused by a lag in the seek operation, so it's not a good solution.
Exporting with an ExportSession doesn't present this problem, audio remains in sync in the output movie.
I'm wondering if the new masterClock property in the AVPlayer is the answer to this, and if it is, how is it used?
I had the same issue and fixed it, among many other audio and video things, by specifying timescales in the following manner:
CMTime(seconds: my_seconds, preferredTimescale: CMTimeScale(600))
Before, my timescale was CMTimeScale(NSEC_PER_SEC). That caused jitter when composing clips at different frame rates, plus the audio out-of-sync problem that Eddy mentions here.
In spite of looking like a magic number, 600 is a common multiple of 24, 30, 60 and 120. These are usual frame rates for different purposes. The common multiple avoids dragging around rounding problems when composing multiple clips.
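
A quick worked example of why that matters (my own arithmetic, using the C CMTime API):

CMTime frame24 = CMTimeMake(25, 600); // 1/24 s exactly, since 600 / 24 = 25
CMTime frame30 = CMTimeMake(20, 600); // 1/30 s exactly, since 600 / 30 = 20
CMTime frame60 = CMTimeMake(10, 600); // 1/60 s exactly, since 600 / 60 = 10
// At a timescale of NSEC_PER_SEC, 1/24 s needs 41,666,666.67 ticks, so every
// 24 fps frame boundary must round, and the error compounds across ~40 clips.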
