Merging Videos Together with AVMutableComposition Causes No Audio - iOS

I have an NSArray containing a list of video NSURLs, and I want to merge them together into one long compilation. The problem is that when I use the code below, the videos merge but there is no audio.
- (IBAction)buildVideo {
    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    int i = 0;
    for (id object in movieArray) {
        AVAsset *asset = [AVAsset assetWithURL:object];
        if (i == 0) {
            [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                           ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                            atTime:kCMTimeZero error:nil];
        } else {
            [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                           ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                            atTime:[mixComposition duration] error:nil];
        }
        i = i + 1;
    }
    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"mergeVideo-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

The reason you're not getting audio is that you're not adding the audio track. You need to create an additional AVMutableCompositionTrack with a media type of AVMediaTypeAudio:
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
And insert the time range for the source audio and video tracks into both composition tracks:
CMTime insertTime = kCMTimeZero;
for (id object in movieArray) {
    AVAsset *asset = [AVAsset assetWithURL:object];
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    [videoTrack insertTimeRange:timeRange
                        ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:insertTime
                          error:nil];
    [audioTrack insertTimeRange:timeRange
                        ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:insertTime
                          error:nil];
    insertTime = CMTimeAdd(insertTime, asset.duration);
}
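One caveat: objectAtIndex:0 throws if a source asset has no audio track (a silent clip, for instance), and passing nil for the error hides failed inserts. A minimal defensive sketch of the same loop, assuming the movieArray and composition tracks above:
CMTime insertTime = kCMTimeZero;
for (NSURL *url in movieArray) {
    AVAsset *asset = [AVAsset assetWithURL:url];
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    NSError *error = nil;
    AVAssetTrack *srcVideo = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    if (srcVideo) {
        [videoTrack insertTimeRange:timeRange ofTrack:srcVideo atTime:insertTime error:&error];
    }
    // Only touch the audio composition track when the asset actually has audio.
    AVAssetTrack *srcAudio = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
    if (srcAudio) {
        [audioTrack insertTimeRange:timeRange ofTrack:srcAudio atTime:insertTime error:&error];
    }
    if (error) {
        NSLog(@"Insert failed: %@", error); // surface problems instead of swallowing them
    }
    insertTime = CMTimeAdd(insertTime, asset.duration);
}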

Related

How do I add a watermark (text or image) to a video?

I'm doing some video manipulation in my app. Currently this code takes a user-generated video, adds a sound to it, and then exports it at the same quality. This works well, but I can't figure out how to add a watermark to the video. I'm happy to do it with a UIImage or a text layer; I just want it to show the name of my app on the video without losing quality. Does anyone know how I can augment this code so that it adds a watermark?
I am working in Objective-C, not Swift.
AVMutableComposition *mixComposition = [AVMutableComposition composition];
NSURL *audioPath = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"mp3"];
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioPath options:nil];
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:self.videoUrl options:nil];
AVAssetTrack *assetVideoTrack = [videoAsset tracksWithMediaType:AVMediaTypeVideo].lastObject;
// add video
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                               preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:assetVideoTrack.preferredTransform];
// add the video's own audio
AVMutableCompositionTrack *videoSoundTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                         preferredTrackID:kCMPersistentTrackID_Invalid];
[videoSoundTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                         ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                          atTime:kCMTimeZero error:nil];
// add the extra sound
AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                     atTime:self.avPlayer.currentTime error:nil];
CGSize sizeOfVideo = [compositionVideoTrack naturalSize];
AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetPassthrough];
NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *savePath = [docsDir stringByAppendingPathComponent:@"video.mov"];
NSURL *saveUrl = [NSURL fileURLWithPath:savePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:savePath]) {
    [[NSFileManager defaultManager] removeItemAtPath:savePath error:nil];
}
_assetExport.outputFileType = @"com.apple.quicktime-movie";
_assetExport.outputURL = saveUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    dispatch_async(dispatch_get_main_queue(), ^{
        [MBProgressHUD hideHUDForView:self.view animated:YES];
    });
    switch (_assetExport.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"FAIL %@", _assetExport.error);
            break;
        case AVAssetExportSessionStatusCompleted:
            dispatch_async(dispatch_get_main_queue(), ^{
                // work with the video
            });
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"CANCELLED");
            break;
        default:
            break;
    }
    NSLog(@"Export status %ld -- %@", (long)_assetExport.status, _assetExport.outputURL);
}];
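No accepted answer appears here, but the usual technique is AVVideoCompositionCoreAnimationTool: composite a CATextLayer (or a CALayer holding your UIImage) over the video and attach the resulting video composition to the exporter. A rough sketch under that assumption; the watermark string is a placeholder, and note that a videoComposition is ignored by AVAssetExportPresetPassthrough, so the export would need a re-encoding preset such as AVAssetExportPresetHighestQuality:
// Assumes mixComposition, compositionVideoTrack, and _assetExport from the code above.
CGSize videoSize = compositionVideoTrack.naturalSize;

// Text layer that gets composited over every frame.
CATextLayer *watermark = [CATextLayer layer];
watermark.string = @"MyAppName"; // placeholder app name
watermark.fontSize = 36;
watermark.foregroundColor = [UIColor whiteColor].CGColor;
watermark.alignmentMode = kCAAlignmentRight;
watermark.frame = CGRectMake(0, 10, videoSize.width - 20, 44);

CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = videoLayer.frame;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:watermark];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = videoSize;
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration);
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
instruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[instruction];

// Attach to the exporter; remember to switch off the passthrough preset.
_assetExport.videoComposition = videoComposition;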

AVMutableComposition - Rotate Portrait Videos To Fit Within A Landscape (16:9) Video

I am using AVMutableComposition to combine clips into one landscape 16:9 video. The issue is that any videos taken in portrait mode appear rotated within the output video. I believe I need to use an instruction layer, but I am unsure how to add it to the following code.
- (void)combineTheAssets {
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    [mixComposition setNaturalSize:CGSizeMake(1280, 720)];
    AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime insertTime = kCMTimeZero; // declared outside the loop so clips are appended, not stacked at time zero
    for (NSMutableDictionary *dict in self.arraySelectedAssets) {
        AVAsset *asset = [dict objectForKey:@"avasset"];
        CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
        [track insertTimeRange:timeRange
                       ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                        atTime:insertTime
                         error:nil];
        [audioTrack insertTimeRange:timeRange
                            ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:insertTime
                              error:nil];
        insertTime = CMTimeAdd(insertTime, asset.duration);
    }
    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"mergeVideo-%d.mov", arc4random() % 1000]];
    self.combinedVideoURL = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Progress timer
    self.timerExporter = [NSTimer scheduledTimerWithTimeInterval:0.01f
                                                          target:self
                                                        selector:@selector(exporterProgress)
                                                        userInfo:nil
                                                         repeats:YES];
    // 5 - Create exporter
    self.exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                     presetName:AVAssetExportPresetHighestQuality];
    self.exporter.outputURL = self.combinedVideoURL;
    self.exporter.outputFileType = AVFileTypeQuickTimeMovie;
    self.exporter.shouldOptimizeForNetworkUse = YES;
    [self.exporter exportAsynchronouslyWithCompletionHandler:^{
        [self.timerExporter invalidate];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.labelProgressText.text = [NSString stringWithFormat:@"%@ (100%%)", NSLocalizedString(@"Combining The Videos", nil)];
            [self applyTheFilter];
        });
    }];
}
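There was no answer posted for this one, but the usual fix is an AVMutableVideoComposition whose layer instruction applies each source track's preferredTransform plus a scale to fit the rotated bounds into the 1280x720 render. A rough sketch, assuming assetTrack and startTime are recorded per clip inside the loop above (both hypothetical names); a centering translation is omitted for brevity:
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(1280, 720);
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration);
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];

// Per clip: combine the source track's preferredTransform with a scale that
// fits its rotated bounds into 1280x720, applied at the clip's start time.
CGAffineTransform transform = assetTrack.preferredTransform; // assetTrack: the clip's AVAssetTrack, captured in the loop
CGSize rotatedSize = CGRectApplyAffineTransform((CGRect){CGPointZero, assetTrack.naturalSize}, transform).size;
CGFloat scale = MIN(1280.0 / rotatedSize.width, 720.0 / rotatedSize.height);
transform = CGAffineTransformConcat(transform, CGAffineTransformMakeScale(scale, scale));
[layerInstruction setTransform:transform atTime:startTime]; // startTime: the clip's insert time, captured in the loop

instruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[instruction];
self.exporter.videoComposition = videoComposition;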

I am uploading a video to a server after merging it with another video, but the recorded video is rotated clockwise

Merging code:
AVAsset *firstAsset = [AVAsset assetWithURL:urlIntroVideo];
AVAsset *secondAsset = [AVAsset assetWithURL:recordedVideoUrl];
if (firstAsset != nil && secondAsset != nil) {
    //[[AppDelegate Getdelegate] showIndicator];
    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // 2 - Video track
    /********************************************************************************
     --------------->>        VIDEO MERGING TRACK        <<--------------------
     ********************************************************************************/
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    NSArray *videoTracks = [NSArray arrayWithArray:[firstAsset tracksWithMediaType:AVMediaTypeVideo]];
    NSLog(@"Video track count, 1st asset => %lu", (unsigned long)[videoTracks count]);
    NSArray *audioTracks = [NSArray arrayWithArray:[firstAsset tracksWithMediaType:AVMediaTypeAudio]];
    NSLog(@"Audio track count => %lu", (unsigned long)[audioTracks count]);
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                        ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:firstAsset.duration error:nil];
    NSString *bundleDirectory = [[NSBundle mainBundle] bundlePath];
    NSString *portraitIntro = [bundleDirectory stringByAppendingPathComponent:@"silent08s.mp3"];
    NSURL *portraitIntroUrl = [NSURL fileURLWithPath:portraitIntro];
    AVAsset *audioAsset = [AVAsset assetWithURL:portraitIntroUrl];
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    NSArray *audioTracksSilent = [NSArray arrayWithArray:[audioAsset tracksWithMediaType:AVMediaTypeAudio]];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                        ofTrack:([audioTracksSilent count] > 0) ? [audioTracksSilent objectAtIndex:0] : nil
                         atTime:kCMTimeZero
                          error:nil];
    NSArray *audioTracks2 = [NSArray arrayWithArray:[secondAsset tracksWithMediaType:AVMediaTypeAudio]];
    NSLog(@"Audio track count => %lu", (unsigned long)[audioTracks2 count]);
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                        ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:audioAsset.duration
                          error:nil];
    NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"final_merged_video-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [[AppDelegate Getdelegate] hideIndicator];
            [self exportDidFinish:exporter];
        });
    }];
}
Please help me out on this.
You probably lose the rotation info on your AVAsset when you merge both videos, and/or your server is unable to process it.
What you need to do is get the rotation info (you can use this, for example: https://gist.github.com/lukabernardi/5020724), and then rotate the video accordingly BEFORE sending it to the server.
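If the server respects QuickTime rotation metadata, a minimal sketch (assuming the variables from the merging code above) is to copy the recorded track's preferredTransform onto the composition track before exporting; baking the rotation into the pixels would instead require an AVMutableVideoComposition as in the previous question:
// Carry the recorded clip's orientation metadata over to the merged track.
// Note: a single transform applies to the whole composition track, so this
// only helps when both clips share the same orientation.
AVAssetTrack *recordedVideoTrack = [[secondAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
if (recordedVideoTrack) {
    firstTrack.preferredTransform = recordedVideoTrack.preferredTransform;
}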

Audio Mixing in iOS using AVFoundation doesn't work

I am trying to stitch a bunch of videos together and then add some music over the video in iOS. The audio is added using AVMutableAudioMix, but when the video is finally exported the audio mix is missing. Here is what the code looks like:
- (void)mergeVideos:(NSArray *)videos {
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime currentTime = kCMTimeZero;
    for (AVAsset *asset in videos) {
        // 2 - Video track
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                            ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:currentTime error:nil];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                            ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:currentTime error:nil];
        currentTime = CMTimeAdd(currentTime, asset.duration);
    }
    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"mergeVideo-%d.mp4", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Create exporter
    NSString *quality = AVAssetExportPresetHighestQuality;
    // load the audio
    AVMutableAudioMix *audioMix = nil;
    NSURL *audioURL = [self loadAudioFile]; // gives a URL for a .caf file from the bundle
    if (audioURL) {
        AVURLAsset *audioAsset = [AVURLAsset assetWithURL:audioURL];
        AVAssetTrack *aTrack = (AVAssetTrack *)[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        AVMutableAudioMixInputParameters *trackMix =
            [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:aTrack];
        [trackMix setVolumeRampFromStartVolume:0 toEndVolume:1 timeRange:
            CMTimeRangeMake(CMTimeMakeWithSeconds(0, 1), CMTimeMakeWithSeconds(3, 1))];
        [trackMix setVolume:1.0 atTime:kCMTimeZero];
        // Assign to the outer variable; redeclaring audioMix here would shadow
        // it and leave the exporter's audioMix nil.
        audioMix = [AVMutableAudioMix audioMix];
        audioMix.inputParameters = [NSArray arrayWithObject:trackMix];
    }
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:quality];
    if (audioMix) {
        exporter.audioMix = audioMix;
    }
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}
There is no error while executing; it's just that the music contained in the .caf file doesn't get mixed into the exported file. Any idea what is going on?
You haven't inserted the audio from the .caf file into any AVMutableCompositionTrack, so the audio mix associated with aTrack is not going to adjust the volume for the .caf file. If you'd like the audio from the .caf file to be included in the video with the associated audio mix, create another AVMutableCompositionTrack to hold the audio from the .caf file:
AVMutableCompositionTrack *audioTrackCAF = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
(with the time ranges set to your liking):
[audioTrackCAF insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                       ofTrack:aTrack atTime:kCMTimeZero error:nil];
Additionally, it is helpful to pass an NSError (instead of nil) to insertTimeRange:ofTrack:atTime:error: to make sure you had valid time ranges and your media was actually inserted.
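For example, a quick sketch of the checked form:
NSError *insertError = nil;
BOOL inserted = [audioTrackCAF insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                       ofTrack:aTrack
                                        atTime:kCMTimeZero
                                         error:&insertError];
if (!inserted) {
    NSLog(@"CAF insert failed: %@", insertError); // e.g. an invalid or out-of-range time range
}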

How to merge Audio and video using AVMutableCompositionTrack

In my application I need to merge audio and video, and then I need to play the merged file in the media player. How can I merge the audio and video in iOS? Is there any source code for this? Please suggest some ideas.
Thanks in advance
Use this:
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                     atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                atTime:kCMTimeZero error:nil];
AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
NSString *videoName = @"export.mov";
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}
_assetExport.outputFileType = @"com.apple.quicktime-movie";
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    // your completion code here
}];
You can merge videos by creating a mutable composition.
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVURLAsset *video1 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path1] options:nil];
NSArray *pathComponents = [NSArray arrayWithObjects:
                           [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                           @"MyAudio.m4a", nil];
NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
AVAsset *audioAsset = [AVAsset assetWithURL:outputFileURL];
// Create mutable composition track of audio type
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *composedTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
[composedTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
                       ofTrack:[[video1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                        atTime:kCMTimeZero error:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                  presetName:AVAssetExportPresetHighestQuality];
// (set exporter.outputURL and exporter.outputFileType before exporting)
[exporter exportAsynchronouslyWithCompletionHandler:^{
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Failed to export video");
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export cancelled");
            break;
        default:
            break;
    }
}];
For video merging, visit this tutorial: http://iosbucket.blogspot.in/2015/04/mp4-conversion-and-video-merging-in-ios.html (you can also find a sample project for merging videos there).
For merging audio and video files, visit this tutorial: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios
It's a bit late to answer, but this may help someone in the future. This version repeats the audio if the video duration is longer than the audio.
+ (void)mergeVideoWithAudio:(NSURL *)videoUrl audioUrl:(NSURL *)audioUrl success:(void (^)(NSURL *url))success failure:(void (^)(NSError *error))failure {
    AVMutableComposition *mixComposition = [AVMutableComposition new];
    NSMutableArray<AVMutableCompositionTrack *> *mutableCompositionVideoTrack = [NSMutableArray new];
    NSMutableArray<AVMutableCompositionTrack *> *mutableCompositionAudioTrack = [NSMutableArray new];
    AVMutableVideoCompositionInstruction *totalVideoCompositionInstruction = [AVMutableVideoCompositionInstruction new];
    AVAsset *aVideoAsset = [AVAsset assetWithURL:videoUrl];
    AVAsset *aAudioAsset = [AVAsset assetWithURL:audioUrl];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    if (videoTrack && audioTrack) {
        [mutableCompositionVideoTrack addObject:videoTrack];
        [mutableCompositionAudioTrack addObject:audioTrack];
        AVAssetTrack *aVideoAssetTrack = [aVideoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *aAudioAssetTrack = [aAudioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        if (aVideoAssetTrack && aAudioAssetTrack) {
            [mutableCompositionVideoTrack.firstObject insertTimeRange:CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration) ofTrack:aVideoAssetTrack atTime:kCMTimeZero error:nil];
            CMTime videoDuration = aVideoAsset.duration;
            if (CMTimeCompare(videoDuration, aAudioAsset.duration) == -1) {
                [mutableCompositionAudioTrack.firstObject insertTimeRange:CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration) ofTrack:aAudioAssetTrack atTime:kCMTimeZero error:nil];
            } else if (CMTimeCompare(videoDuration, aAudioAsset.duration) == 1) {
                CMTime currentDuration = kCMTimeZero;
                while (CMTimeCompare(currentDuration, videoDuration) == -1) {
                    // repeats the audio until the video ends
                    CMTime restTime = CMTimeSubtract(videoDuration, currentDuration);
                    CMTime maxTime = CMTimeMinimum(aAudioAsset.duration, restTime);
                    [mutableCompositionAudioTrack.firstObject insertTimeRange:CMTimeRangeMake(kCMTimeZero, maxTime) ofTrack:aAudioAssetTrack atTime:currentDuration error:nil];
                    currentDuration = CMTimeAdd(currentDuration, aAudioAsset.duration);
                }
            }
            videoTrack.preferredTransform = aVideoAssetTrack.preferredTransform;
            totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration);
        }
    }
    NSString *outputPath = [NSHomeDirectory() stringByAppendingPathComponent:@"tmp/screenCapture.mp4"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
    }
    NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.shouldOptimizeForNetworkUse = YES;
    // try to export the file and handle the status cases
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status) {
            case AVAssetExportSessionStatusFailed:
                failure(exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                failure(exportSession.error);
                break;
            default:
                success(outputURL);
                break;
        }
    }];
}
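A call site might look like this (the class name VideoMerger is hypothetical; substitute whatever class hosts the method):
[VideoMerger mergeVideoWithAudio:videoURL
                        audioUrl:audioURL
                         success:^(NSURL *url) {
                             NSLog(@"Merged file written to %@", url);
                         } failure:^(NSError *error) {
                             NSLog(@"Merge failed: %@", error);
                         }];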
