Finding the nearest zero-crossing in an AVURLAsset - iOS

I'm trying to find a way to fade in a locally stored audio track without an audible glitch. I'm using AVPlayer and referencing MP3s from the iPod library via AVURLAsset. The following method works most of the time, but not always, so I'm thinking I need to scan through the audio data for the nearest zero-crossing and start my fade from there. Any pointers would be much appreciated.
float duration = 0.5;
AVAsset *asset = [self.av_Player.currentItem asset];
NSArray *keys = [NSArray arrayWithObject:@"tracks"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^(void) {
    NSError *error = nil;
    NSTimeInterval now = [self currentPlaybackTime];
    CMTime mainFadeIn = CMTimeMakeWithSeconds(now, 6000);
    CMTime mainFadeDuration = CMTimeMakeWithSeconds(duration, 6000);
    CMTimeRange timerange = CMTimeRangeMake(mainFadeIn, mainFadeDuration);
    AVKeyValueStatus trackStatus = [asset statusOfValueForKey:@"tracks" error:&error];
    switch (trackStatus) {
        case AVKeyValueStatusLoaded:
            if (self.av_Player) {
                NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeAudio];
                AVMutableAudioMixInputParameters *volumeMixInput = [AVMutableAudioMixInputParameters audioMixInputParameters];
                [volumeMixInput setVolumeRampFromStartVolume:0.0 toEndVolume:tovolume timeRange:timerange];
                [volumeMixInput setTrackID:[[tracks objectAtIndex:0] trackID]];
                AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
                [audioMix setInputParameters:[NSArray arrayWithObject:volumeMixInput]];
                [self.av_Player.currentItem setAudioMix:audioMix];
            }
            break;
        default:
            break;
    }
}];

No idea why I was getting glitches with the above method, but I assume it might have something to do with using blocks. In any case, I used the code below instead and it now works smoothly for fade-ins and fade-outs - hope this helps others.
NSTimeInterval now = [self currentPlaybackTime];
AVPlayerItem *playerItem = self.av_Player.currentItem;
AVAsset *asset = playerItem.asset;
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVAssetTrack *track in audioTracks) {
    AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
    [audioInputParams setVolume:fromvolume atTime:CMTimeMakeWithSeconds(now - 0.1, 6000)];
    [audioInputParams setVolume:fromvolume atTime:CMTimeMakeWithSeconds(now, 6000)];
    [audioInputParams setVolume:tovolume atTime:CMTimeMakeWithSeconds(now + duration, 6000)];
    [audioInputParams setTrackID:[track trackID]];
    [allAudioParams addObject:audioInputParams];
}
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
[audioMix setInputParameters:allAudioParams];
[playerItem setAudioMix:audioMix];
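If you do still want to locate the nearest zero-crossing yourself, one approach is to read the track's raw PCM with AVAssetReader and walk the samples for a sign change near the fade point. The sketch below is only that - a sketch: the 16-bit mono output settings, the use of the track's naturalTimeScale as the sample rate, and the helper name nearestZeroCrossingInAsset:after: are all my assumptions, and in practice you would read only a short time range around the fade point rather than the rest of the track.
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: returns the time (in seconds) of the first zero-crossing
// at or after `startSeconds` in the asset's first audio track, or -1 on failure.
- (NSTimeInterval)nearestZeroCrossingInAsset:(AVAsset *)asset after:(NSTimeInterval)startSeconds
{
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (!track) return -1;

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    if (!reader) return -1;

    // Ask for uncompressed 16-bit mono PCM so the sample values are easy to inspect.
    NSDictionary *settings = @{ AVFormatIDKey : @(kAudioFormatLinearPCM),
                                AVLinearPCMBitDepthKey : @16,
                                AVLinearPCMIsFloatKey : @NO,
                                AVLinearPCMIsBigEndianKey : @NO,
                                AVLinearPCMIsNonInterleaved : @NO,
                                AVNumberOfChannelsKey : @1 };
    AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:settings];
    [reader addOutput:output];

    // Only decode from the fade point onward.
    reader.timeRange = CMTimeRangeMake(CMTimeMakeWithSeconds(startSeconds, 6000), kCMTimePositiveInfinity);
    if (![reader startReading]) return -1;

    // Assumption: an audio track's naturalTimeScale is its sample rate.
    double sampleRate = track.naturalTimeScale > 0 ? (double)track.naturalTimeScale : 44100.0;
    long long samplesSeen = 0;
    SInt16 previous = 0;
    BOOL havePrevious = NO;

    CMSampleBufferRef sampleBuffer = NULL;
    while ((sampleBuffer = [output copyNextSampleBuffer])) {
        CMBlockBufferRef block = CMSampleBufferGetDataBuffer(sampleBuffer);
        size_t length = 0;
        char *data = NULL;
        if (CMBlockBufferGetDataPointer(block, 0, NULL, &length, &data) == kCMBlockBufferNoErr) {
            SInt16 *samples = (SInt16 *)data;
            size_t count = length / sizeof(SInt16);
            for (size_t i = 0; i < count; i++, samplesSeen++) {
                SInt16 current = samples[i];
                // A zero-crossing: the signal is exactly zero, or it changes sign.
                if (current == 0 || (havePrevious && ((previous < 0) != (current < 0)))) {
                    CFRelease(sampleBuffer);
                    [reader cancelReading];
                    return startSeconds + (double)samplesSeen / sampleRate;
                }
                previous = current;
                havePrevious = YES;
            }
        }
        CFRelease(sampleBuffer);
    }
    return -1;
}
You could then start the volume ramp at the returned time instead of at `now`, which should make the ramp start from silence (or near it).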

Related

How to set AVPlayer Sound Level meter in iOS?

I'm using AVPlayer in my app with HTTP Live Streaming. Now I want to implement a level meter for that audio stream.
I found several examples using AVAudioPlayer, but I cannot find a way to get the required information from AVPlayer.
NSURL *url = [NSURL URLWithString:@"http://www.stephaniequinn.com/Music/Allegro%20from%20Duet%20in%20C%20Major.mp3"];
self.playerItem = [AVPlayerItem playerItemWithURL:url];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
self.player = [AVPlayer playerWithURL:url];
[self.player play];
Please try this one
if ([mPlayer respondsToSelector:@selector(setVolume:)]) {
    mPlayer.volume = 0.0;
} else {
    NSArray *audioTracks = mPlayerItem.asset.tracks;
    // Mute all the audio tracks
    NSMutableArray *allAudioParams = [NSMutableArray array];
    for (AVAssetTrack *track in audioTracks) {
        AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:0.0 atTime:kCMTimeZero];
        [audioInputParams setTrackID:[track trackID]];
        [allAudioParams addObject:audioInputParams];
    }
    AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
    [audioZeroMix setInputParameters:allAudioParams];
    [mPlayerItem setAudioMix:audioZeroMix]; // Mute the player item
}
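If what you actually need is a live level meter rather than a volume control, AVPlayer does not expose metering the way AVAudioPlayer does. One option is an MTAudioProcessingTap attached to the audio mix, which hands you the raw sample buffers so you can compute an RMS level yourself. The following is a minimal sketch, not a drop-in solution: it assumes the same mPlayerItem as above, a file-based asset, and 32-bit float samples in the tap, and taps are generally reported not to fire for HTTP Live Streaming items.
#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>
#import <math.h>

static void tapInit(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    *tapStorageOut = clientInfo;
}
static void tapFinalize(MTAudioProcessingTapRef tap) {}
static void tapPrepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *format) {}
static void tapUnprepare(MTAudioProcessingTapRef tap) {}

static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    if (MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut) != noErr) {
        return;
    }
    // Assumes the tap delivers 32-bit float samples (the usual processing format).
    for (UInt32 b = 0; b < bufferListInOut->mNumberBuffers; b++) {
        float *samples = (float *)bufferListInOut->mBuffers[b].mData;
        UInt32 count = bufferListInOut->mBuffers[b].mDataByteSize / sizeof(float);
        float sum = 0.0f;
        for (UInt32 i = 0; i < count; i++) {
            sum += samples[i] * samples[i];
        }
        float rms = (count > 0) ? sqrtf(sum / count) : 0.0f;
        (void)rms; // forward `rms` to your metering UI here (hypothetical hook)
    }
}

// Attaching the tap to the player item's first audio track:
AVAssetTrack *audioTrack = [[mPlayerItem.asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVMutableAudioMixInputParameters *params = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
MTAudioProcessingTapCallbacks callbacks = {
    .version = kMTAudioProcessingTapCallbacksVersion_0,
    .clientInfo = NULL,
    .init = tapInit,
    .finalize = tapFinalize,
    .prepare = tapPrepare,
    .unprepare = tapUnprepare,
    .process = tapProcess
};
MTAudioProcessingTapRef tap = NULL;
if (MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr) {
    params.audioTapProcessor = tap;
    CFRelease(tap);
    AVMutableAudioMix *meterMix = [AVMutableAudioMix audioMix];
    meterMix.inputParameters = @[params];
    [mPlayerItem setAudioMix:meterMix];
}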

iOS - AVAssetExportSession can only export a maximum of 8 tracks after playing with AVPlayer

I'm trying to loop some fragments of a recorded video and merge them into one video.
I've successfully merged and exported a composition with up to 16 tracks. But when I try to play the composition using AVPlayer before merging, I can only export a maximum of 8 tracks.
First, I create the AVComposition and AVVideoComposition:
+(void)previewUserClipDanceWithAudio:(NSURL*)videoURL audioURL:(NSURL*)audioFile loop:(NSArray*)loopTime slowMotion:(NSArray*)slowFactor showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, AVVideoComposition* videoComposition, AVComposition* composition))completion{
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    NSMutableArray *arrayInstruction = [[NSMutableArray alloc] init];
    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioFile options:nil];
    //NSLog(@"audio File %@",audioFile);
    CMTime duration = kCMTimeZero;
    AVAsset *currentAsset = [AVAsset assetWithURL:videoURL];
    BOOL isCurrentAssetPortrait = YES;
    for (NSInteger i = 0; i < [loopTime count]; i++) {
        //handle looptime array
        NSInteger loopDur = [[loopTime objectAtIndex:i] intValue];
        NSInteger value = labs(loopDur);
        //NSLog(@"loopInfo %d value %d",loopInfo,value);
        //handle slowmotion array
        double slowInfo = [[slowFactor objectAtIndex:i] doubleValue];
        double videoScaleFactor = fabs(slowInfo);
        AVMutableCompositionTrack *currentTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *audioTrack;
        audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
        if (i == 0) {
            [currentTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];
            [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];
        } else {
            [currentTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];
            if (videoScaleFactor == 1) {
                [audioTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];
            }
            //slow motion here
            if (videoScaleFactor != 1) {
                [currentTrack scaleTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10))
                                  toDuration:CMTimeMake(value*videoScaleFactor, 10)];
                NSLog(@"slowmo %f", value*videoScaleFactor);
            }
        }
        AVMutableVideoCompositionLayerInstruction *currentAssetLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentTrack];
        AVAssetTrack *currentAssetTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        BOOL isCurrentAssetPortrait = YES;
        //CGFloat assetScaleToFitRatio;
        //assetScaleToFitRatio = [self getScaleToFitRatioCurrentTrack:currentTrack];
        if (isCurrentAssetPortrait) {
            //NSLog(@"portrait");
            if (slowInfo < 0) {
                CGRect screenRect = [[UIScreen mainScreen] bounds];
                CGFloat ratio = screenRect.size.height / screenRect.size.width;
                // we have to adjust the ratio for 16:9 screens
                if (ratio == 1.775) ratio = 1.77777777777778;
                CGFloat complimentSize = (currentAssetTrack.naturalSize.height * ratio);
                CGFloat tx = (currentAssetTrack.naturalSize.width - complimentSize) / 2;
                // invert translation because of portrait
                tx *= -1;
                // t1: rotate and position video since it may have been cropped to screen ratio
                CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
                // t2/t3: mirror video vertically
                CGAffineTransform t2 = CGAffineTransformTranslate(t1, currentAssetTrack.naturalSize.width, 0);
                CGAffineTransform t3 = CGAffineTransformScale(t2, -1, 1);
                [currentAssetLayerInstruction setTransform:t3 atTime:duration];
            } else if (loopDur < 0) {
                CGRect screenRect = [[UIScreen mainScreen] bounds];
                CGFloat ratio = screenRect.size.height / screenRect.size.width;
                // we have to adjust the ratio for 16:9 screens
                if (ratio == 1.775) ratio = 1.77777777777778;
                CGFloat complimentSize = (currentAssetTrack.naturalSize.height * ratio);
                CGFloat tx = (currentAssetTrack.naturalSize.width - complimentSize) / 2;
                // invert translation because of portrait
                tx *= -1;
                // t1: rotate and position video since it may have been cropped to screen ratio
                CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
                // t2/t3: mirror video horizontally
                CGAffineTransform t2 = CGAffineTransformTranslate(t1, 0, currentAssetTrack.naturalSize.height);
                CGAffineTransform t3 = CGAffineTransformScale(t2, 1, -1);
                [currentAssetLayerInstruction setTransform:t3 atTime:duration];
            } else {
                [currentAssetLayerInstruction setTransform:currentAssetTrack.preferredTransform atTime:duration];
            }
        } else {
            // CGFloat translateAxisX = (currentTrack.naturalSize.width > MAX_WIDTH) ? (0.0) : 0.0; // if use <, 640 video will be moved left by 10px. (float)(MAX_WIDTH - currentTrack.naturalSize.width)/(float)4.0
            // CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio, assetScaleToFitRatio);
            // [currentAssetLayerInstruction setTransform:
            //     CGAffineTransformConcat(CGAffineTransformConcat(currentAssetTrack.preferredTransform, FirstAssetScaleFactor), CGAffineTransformMakeTranslation(translateAxisX, 0)) atTime:duration];
        }
        if (i == 0) {
            duration = CMTimeAdd(duration, currentAsset.duration);
        } else {
            if (videoScaleFactor != 1) {
                duration = CMTimeAdd(duration, CMTimeMake(value*videoScaleFactor, 10));
            } else {
                duration = CMTimeAdd(duration, CMTimeMake(value, 10));
            }
        }
        [currentAssetLayerInstruction setOpacity:0.0 atTime:duration];
        [arrayInstruction addObject:currentAssetLayerInstruction];
    }
    AVMutableCompositionTrack *AudioBGTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [AudioBGTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:CMTimeSubtract(duration, audioAsset.duration) error:nil];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, duration);
    videoCompositionInstruction.layerInstructions = arrayInstruction;
    CGSize naturalSize;
    if (isCurrentAssetPortrait) {
        naturalSize = CGSizeMake(MAX_HEIGHT, MAX_WIDTH); //currentAssetTrack.naturalSize.height, currentAssetTrack.naturalSize.width
    } else {
        naturalSize = CGSizeMake(MAX_WIDTH, MAX_HEIGHT); //currentAssetTrack.naturalSize
    }
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = [NSArray arrayWithObject:videoCompositionInstruction];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderSize = CGSizeMake(naturalSize.width, naturalSize.height);
    NSLog(@"prepared");
    AVVideoComposition *composition = [videoComposition copy];
    AVComposition *mixedComposition = [mixComposition copy];
    completion(YES, composition, mixedComposition);
}
Then I set up the AVPlayer:
-(void)playVideoWithComposition:(AVVideoComposition*)videoComposition inMutableComposition:(AVComposition*)composition{
    MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
    hud.label.text = myLanguage(@"kMergeClip");
    savedComposition = [composition copy];
    savedVideoComposition = [videoComposition copy];
    playerItem = [AVPlayerItem playerItemWithAsset:composition];
    playerItem.videoComposition = videoComposition;
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(repeatVideo:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
    if (!player) {
        player = [AVPlayer playerWithPlayerItem:playerItem];
        layer = [AVPlayerLayer playerLayerWithPlayer:player];
        layer.frame = [UIScreen mainScreen].bounds;
        [self.ibPlayerView.layer insertSublayer:layer atIndex:0];
        NSLog(@"create new player");
    }
    if (player.currentItem != playerItem) {
        [player replaceCurrentItemWithPlayerItem:playerItem];
    }
    player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    //[player seekToTime:kCMTimeZero];
    [playerItem addObserver:self
                 forKeyPath:@"status"
                    options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                    context:@"AVPlayerStatus"];
}
When the user has previewed all the videos they want and hits Save, I use this method to export:
+(void)mergeUserCLip:(AVVideoComposition*)videoComposition withAsset:(AVComposition*)mixComposition showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, NSURL *fileURL))completion{
    MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:viewController.view animated:YES];
    hud.mode = MBProgressHUDModeDeterminateHorizontalBar;
    hud.label.text = myLanguage(@"kMergeClip");
    //Name merged clip using beat name
    //NSString *beatName = [[[NSString stringWithFormat:@"%@",audioFile] lastPathComponent] stringByDeletingPathExtension];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *tmpDir = [[documentsDirectory stringByDeletingLastPathComponent] stringByAppendingPathComponent:@"tmp"];
    NSString *myPathDocs = [tmpDir stringByAppendingPathComponent:[NSString stringWithFormat:@"merge-beat.mp4"]];
    //Not removed here; will be removed when previewPlayVC is called
    [[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];
    // 1 - set up the overlay
    CALayer *overlayLayer = [CALayer layer];
    UIImage *overlayImage = [UIImage imageNamed:@"watermark.png"];
    [overlayLayer setContents:(id)[overlayImage CGImage]];
    overlayLayer.frame = CGRectMake(720-221, 1280-109, 181, 69);
    [overlayLayer setMasksToBounds:YES];
    // aLayer = [CALayer layer];
    // [aLayer addSublayer:labelLogo.layer];
    // aLayer.frame = CGRectMake(MAX_WIDTH - labelLogo.width - 10.0, MAX_HEIGHT-50.0, 20.0, 20.0);
    // aLayer.opacity = 1;
    // 2 - set up the parent layer
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, MAX_HEIGHT, MAX_WIDTH);
    videoLayer.frame = CGRectMake(0, 0, MAX_HEIGHT, MAX_WIDTH);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];
    // 3 - apply magic (mutableCopy, since a plain copy returns an immutable AVVideoComposition)
    AVMutableVideoComposition *mutableVideoComposition = [videoComposition mutableCopy];
    mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
                                             videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    myLog(@"Path: %@", myPathDocs);
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.videoComposition = mutableVideoComposition;
    exporter.shouldOptimizeForNetworkUse = NO;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        //NSLog(@"exporting");
        switch (exporter.status) {
            case AVAssetExportSessionStatusCompleted: {
                NSURL *url = [NSURL fileURLWithPath:myPathDocs];
                hud.progress = 1.0f;
                dispatch_async(dispatch_get_main_queue(), ^{
                    [MBProgressHUD hideHUDForView:viewController.view animated:YES];
                });
                [self checkTmpSize];
                if (completion) {
                    completion(YES, url);
                }
            }
                break;
            case AVAssetExportSessionStatusExporting:
                myLog(@"Exporting!");
                break;
            case AVAssetExportSessionStatusWaiting:
                myLog(@"Waiting");
                break;
            default:
                break;
        }
    }];
}
If I select options that loop fewer than 8 times, the above code works fine.
If I select options that loop more than 8 times, the export session freezes with exporter.progress stuck at 0.000000.
If I remove this line:
playerItem.videoComposition = videoComposition;
then I cannot preview the mixed video, but I am able to export normally (up to 16 tracks).
Or, if I remove this line from the export code:
exporter.videoComposition = mutableVideoComposition;
then it's possible to preview the mixed video and export normally, but WITHOUT the video composition.
So I guess there's something wrong with AVVideoComposition and/or the way I implement it.
I would appreciate any suggestion. Many thanks.
I suspect the reason for this is that using AVPlayer to preview the video somehow hinders AVAssetExportSession, as described in the posts below:
iOS 5: Error merging 3 videos with AVAssetExportSession
AVPlayerItem fails with AVStatusFailed and error code “Cannot Decode”
I ran into this issue while attempting to concatenate N videos while playing up to 3 videos in AVPlayer instances inside a UICollectionView. As discussed in the Stack Overflow question you linked, iOS can only handle so many instances of AVPlayer: each instance uses up a "render pipeline". I discovered that each instance of AVMutableCompositionTrack also uses up one of these render pipelines.
Therefore, if you use too many AVPlayer instances or create an AVMutableComposition with too many AVMutableCompositionTrack tracks, you can run out of resources to decode H.264 and you will receive the "Cannot Decode" error. I was able to get around the issue by using only two instances of AVMutableCompositionTrack. This way I could "overlap" segments of video while also applying transitions (which requires two video tracks to "play" concurrently); see the sketch after this answer.
In short: minimize your usage of AVMutableCompositionTrack as well as AVPlayer. You can check out Apple's AVCustomEdit sample code for an example of this. Specifically, look at the buildTransitionComposition method inside the APLSimpleEditor class.
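A minimal sketch of that idea - alternating segments across just two reusable video tracks instead of adding a new track per segment. The helper name and the segments parameter are illustrative, not from the original code:
// Hypothetical helper: builds a composition from an array of assets using only two video tracks.
- (AVMutableComposition *)compositionFromSegments:(NSArray<AVAsset *> *)segments
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrackA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *videoTrackB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    NSUInteger index = 0;
    for (AVAsset *segment in segments) {
        AVAssetTrack *sourceTrack = [[segment tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (!sourceTrack) continue;
        // Alternate between the two tracks rather than creating one track per segment.
        AVMutableCompositionTrack *target = (index % 2 == 0) ? videoTrackA : videoTrackB;
        [target insertTimeRange:CMTimeRangeMake(kCMTimeZero, segment.duration)
                        ofTrack:sourceTrack
                         atTime:cursor
                          error:nil];
        cursor = CMTimeAdd(cursor, segment.duration);
        index++;
    }
    return composition;
}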
Try this: clear the player item before exporting, so the player releases its render pipeline before the export session needs it:
[self.player replaceCurrentItemWithPlayerItem:nil];

Separate Audio from Video

float sliderValue = 0.5;
NSURL *audio_url = [[NSBundle mainBundle] URLForResource:@"video_fileName" withExtension:@"mp4"];
AVURLAsset *audio_Asset = [[AVURLAsset alloc] initWithURL:audio_url options:nil];
AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
NSString *formattedNumber = [NSString stringWithFormat:@"%.01f", sliderValue];
NSLog(@"formattedNumber %@", formattedNumber);
NSLog(@"formattedNumber %.01f", [formattedNumber floatValue]);
[audioInputParams setVolume:[formattedNumber floatValue] atTime:kCMTimeZero];
[audioInputParams setTrackID:[[[audio_Asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] trackID]];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObject:audioInputParams];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:audio_Asset presetName:AVAssetExportPresetAppleM4A];
exportSession.audioMix = audioMix;
exportSession.outputURL = [NSURL fileURLWithPath:audioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
    else {
        NSLog(@"AudioLocation : %@", audioPath);
    }
}];
Issue: the asset's audio track array comes back empty, so the app crashes on:
[[audio_Asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
with: [__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array
The crash describes the issue: you are trying to access an array beyond its bounds. Change:
[audioInputParams setTrackID:[[[audio_Asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] trackID]];
to:
NSArray *tracks = [audio_Asset tracksWithMediaType:AVMediaTypeAudio];
if ([tracks count]) {
    [audioInputParams setTrackID:[[tracks firstObject] trackID]];
}
Your main issue may be that loading an AVURLAsset does not immediately load its tracks. You can wrap your code (after creating audio_Asset) in the asynchronous loading method to get a better guarantee that the tracks are available:
NSString *tracksKey = @"tracks";
[audio_Asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
    // rest of the code here
}];
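Putting the two suggestions together, a hedged sketch of that completion handler might check the load status before touching the track array (error handling kept minimal; the rest of the export code is unchanged from the snippet above):
NSString *tracksKey = @"tracks";
[audio_Asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [audio_Asset statusOfValueForKey:tracksKey error:&error];
    if (status != AVKeyValueStatusLoaded) {
        NSLog(@"tracks failed to load: %@", error);
        return;
    }
    NSArray *tracks = [audio_Asset tracksWithMediaType:AVMediaTypeAudio];
    if ([tracks count] == 0) {
        NSLog(@"no audio track in asset");
        return;
    }
    [audioInputParams setTrackID:[[tracks firstObject] trackID]];
    // ... build the AVMutableAudioMix and run the export session here,
    // exactly as in the original snippet above.
}];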

Can I mute/unmute a UIWebView in some way?

I have a UIWebView which generates sound, but I want to remove it. How can I mute the audio output in a UIWebView?
I've found a way to mute/unmute a single-player UIWebView. Any time the UIWebView starts playing video, an AVPlayerItemBecameCurrentNotification notification is fired. So handle it and keep track of the current AVPlayerItem to control playback.
-(void)viewDidLoad
{
    …
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemBecameCurrent:)
                                                 name:@"AVPlayerItemBecameCurrentNotification"
                                               object:nil];
}
-(void)playerItemBecameCurrent:(NSNotification*)notification
{
    if ([notification.object isKindOfClass:[AVPlayerItem class]]) {
        self.currentItem = (AVPlayerItem*)notification.object;
    }
}
-(void)muteSound:(BOOL)mute
{
    AVAsset *asset = self.currentItem.asset;
    NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
    NSMutableArray *allAudioParams = [NSMutableArray array];
    for (AVAssetTrack *track in audioTracks)
    {
        AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:(mute ? 0.0 : 1.0) atTime:kCMTimeZero];
        [audioInputParams setTrackID:[track trackID]];
        [allAudioParams addObject:audioInputParams];
    }
    AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
    [audioZeroMix setInputParameters:allAudioParams];
    self.currentItem.audioMix = audioZeroMix;
}
This is the easiest way:
-(void)stopAudio
{
    [webView loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@""]]];
}

Stitching two videos

I need to create one video composed of two stitched videos, like the following image:
Actually, I'm playing the two videos simultaneously with two instances of AVPlayer, but I need to create the final video from those two videos: Final video = [1 | 2]
Do you have any ideas on how to do this? This is the code I use to play the two videos:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"video_1" withExtension:@"mp4"];
    self.mPlayer = [AVPlayer playerWithURL:url];
    self.mPlayer2 = [AVPlayer playerWithURL:url];
    [self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
    [self.mPlayer2 addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
}

- (void)observeValueForKeyPath:(NSString*)path ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
    if ([object isKindOfClass:[AVPlayer class]]) {
        if ([path isEqualToString:@"status"]) {
            AVPlayer *player = (AVPlayer *)object;
            switch (player.status) {
                case AVPlayerStatusFailed:
                    NSLog(@"player status failed");
                    break;
                case AVPlayerStatusReadyToPlay:
                    NSLog(@"player status is ready to play");
                    [self.mPlaybackView2 setPlayer:self.mPlayer2];
                    [self.mPlaybackView setPlayer:self.mPlayer];
                    [self.mPlayer play];
                    [self.mPlayer2 play];
                    break;
                case AVPlayerStatusUnknown:
                    NSLog(@"player status is unknown");
                    break;
            }
        }
    }
}
Source: https://abdulazeem.wordpress.com/2012/04/02/
I found the solution. Here it is:
- (void)stitchAudio:(NSURL *)file1 audio2:(NSURL *)file2 {
    AVAsset *video1Asset = [AVAsset assetWithURL:file1];
    AVAsset *video2Asset = [AVAsset assetWithURL:file2];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1Asset.duration)
                        ofTrack:[[video1Asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
    AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2Asset.duration)
                         ofTrack:[[video2Asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                          atTime:kCMTimeZero error:nil];
    // Create an AVMutableVideoCompositionInstruction. It will hold the array of
    // AVMutableVideoCompositionLayerInstruction objects. Its time range should cover
    // the duration of the longer asset.
    AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, video1Asset.duration);
    // Create 2 AVMutableVideoCompositionLayerInstruction objects, one per AVMutableCompositionTrack.
    // The first uses an affine transform to scale the track and leave it on the left side of the frame.
    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    CGAffineTransform Scale = CGAffineTransformMakeScale(0.68f, 0.68f);
    CGAffineTransform Move = CGAffineTransformMakeTranslation(0, 0);
    [FirstlayerInstruction setTransform:CGAffineTransformConcat(Scale, Move) atTime:kCMTimeZero];
    // The second layer instruction scales the second track and translates it to the right side of the frame.
    AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
    CGAffineTransform SecondScale = CGAffineTransformMakeScale(0.68f, 0.68f);
    CGAffineTransform SecondMove = CGAffineTransformMakeTranslation(318, 0);
    [SecondlayerInstruction setTransform:CGAffineTransformConcat(SecondScale, SecondMove) atTime:kCMTimeZero];
    // Add the two layer instructions to the instruction as an array.
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, SecondlayerInstruction, nil];
    // Create the AVMutableVideoComposition. You can add multiple AVMutableVideoCompositionInstruction
    // objects to it (for effects such as fades and transitions), but make sure their time ranges
    // don't overlap. Only one instruction is needed here.
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);
    MainCompositionInst.renderSize = CGSizeMake(640, 480);
    // Finally, add the newly created composition to an AVPlayerItem and play it with AVPlayer.
    AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
    newPlayerItem.videoComposition = MainCompositionInst;
    self.mPlayer = [AVPlayer playerWithPlayerItem:newPlayerItem];
    [self.mPlaybackView setPlayer:self.mPlayer];
    [self.mPlayer play];
    // [self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
    // Create the export session with the composition and set the preset to the highest quality.
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    // Set the desired output URL for the file created by the export process.
    exporter.outputURL = [self newUniqueAudioFileURL];
    exporter.videoComposition = MainCompositionInst;
    // Set the output file type to be a QuickTime movie.
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    // Asynchronously export the composition to a video file and save it to the camera roll once export completes.
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (exporter.status == AVAssetExportSessionStatusCompleted) {
                ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
                self.mPlayer = [AVPlayer playerWithURL:exporter.outputURL];
                //self.mPlayer2 = [AVPlayer playerWithURL:url];
                //[self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
                if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
                    [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
                }
            }
        });
    }];
}
Source: https://abdulazeem.wordpress.com/2012/04/02/
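Note that the composition above only inserts the two video tracks, so the exported movie will have no sound. If you also want audio from one of the clips, a hedged sketch (added to the same mixComposition before exporting; it simply reuses the first clip's audio for the whole stitched video) would be:
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
NSArray *sourceAudioTracks = [video1Asset tracksWithMediaType:AVMediaTypeAudio];
if ([sourceAudioTracks count] > 0) {
    // Use the first clip's audio track across the full duration.
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1Asset.duration)
                        ofTrack:[sourceAudioTracks firstObject]
                         atTime:kCMTimeZero
                          error:nil];
}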
