I have an application which plays some audio and also records a video+audio while that sound is playing. I would like to figure out a way to process the video so that the audio that was picked up by the microphone is removed from the resulting video.
For example, if I'm playing audioA, and then recording videoB with audioB (from the microphone), I want to somehow cancel out audioA from the resulting audioB, so that audioB is only the ambient noise and not the noise from the device speakers.
Any idea if there's a way to do this?
Bonus points if it can be done without any offline processing.
This doesn't handle the playback/cancellation part; you still have to deal with that yourself. But here is code to mix the selected audio into the recorded video.
- (void)mixAudio:(AVAsset*)audioAsset startTime:(CMTime)startTime withVideo:(NSURL*)inputUrl affineTransform:(CGAffineTransform)affineTransform toUrl:(NSURL*)outputUrl outputFileType:(NSString*)outputFileType withMaxDuration:(CMTime)maxDuration withCompletionBlock:(void(^)(NSError *))completionBlock {
NSError * error = nil;
AVMutableComposition * composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack * videoTrackComposition = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack * audioTrackComposition = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVURLAsset * fileAsset = [AVURLAsset URLAssetWithURL:inputUrl options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
NSArray * videoTracks = [fileAsset tracksWithMediaType:AVMediaTypeVideo];
CMTime duration = ((AVAssetTrack*)[videoTracks objectAtIndex:0]).timeRange.duration;
if (CMTIME_COMPARE_INLINE(duration, >, maxDuration)) {
duration = maxDuration;
}
for (AVAssetTrack * track in [audioAsset tracksWithMediaType:AVMediaTypeAudio]) {
[audioTrackComposition insertTimeRange:CMTimeRangeMake(startTime, duration) ofTrack:track atTime:kCMTimeZero error:&error];
if (error != nil) {
completionBlock(error);
return;
}
}
for (AVAssetTrack * track in videoTracks) {
[videoTrackComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:track atTime:kCMTimeZero error:&error];
if (error != nil) {
completionBlock(error);
return;
}
}
videoTrackComposition.preferredTransform = affineTransform;
AVAssetExportSession * exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
exportSession.outputFileType = outputFileType;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputURL = outputUrl;
[exportSession exportAsynchronouslyWithCompletionHandler:^ {
NSError * error = nil;
if (exportSession.error != nil) {
NSMutableDictionary * userInfo = [NSMutableDictionary dictionaryWithDictionary:exportSession.error.userInfo];
NSString * subLocalizedDescription = [userInfo objectForKey:NSLocalizedDescriptionKey];
[userInfo removeObjectForKey:NSLocalizedDescriptionKey];
[userInfo setObject:#"Failed to mix audio and video" forKey:NSLocalizedDescriptionKey];
[userInfo setObject:exportSession.outputFileType forKey:#"OutputFileType"];
[userInfo setObject:exportSession.outputURL forKey:#"OutputUrl"];
[userInfo setObject:subLocalizedDescription forKey:#"CauseLocalizedDescription"];
[userInfo setObject:[AVAssetExportSession allExportPresets] forKey:#"AllExportSessions"];
error = [NSError errorWithDomain:#"Error" code:500 userInfo:userInfo];
}
completionBlock(error);
}];
}
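For reference, a hypothetical call site for the method above; the audio asset, file URLs, start time and maximum duration below are placeholders, not values from the original question:
// Hypothetical usage of mixAudio:...; all paths and times are placeholders.
AVURLAsset *playedAudio = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:@"/path/to/audioA.m4a"] options:nil];
NSURL *recordedVideoUrl = [NSURL fileURLWithPath:@"/path/to/videoB.mov"];
NSURL *mixedOutputUrl = [NSURL fileURLWithPath:@"/path/to/mixed.mov"];
[self mixAudio:playedAudio
     startTime:kCMTimeZero
     withVideo:recordedVideoUrl
affineTransform:CGAffineTransformIdentity
         toUrl:mixedOutputUrl
outputFileType:AVFileTypeQuickTimeMovie
withMaxDuration:CMTimeMakeWithSeconds(30, 600)
withCompletionBlock:^(NSError *error) {
    if (error != nil) {
        NSLog(@"Mixing failed: %@", error);
    } else {
        NSLog(@"Mixed movie written to %@", mixedOutputUrl);
    }
}];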
I'm using SCRecorder and want to save the video after recording with a different playback speed chosen by the user, e.g. 2x or 3x. With AVPlayer this can be achieved using the following code:
//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero
error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
//handle error
return;
}
//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = asset.duration;
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
//export
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetLowQuality];
However, I can't figure out how to achieve the same thing using the SCRecorder library. Please guide me.
Thanks in advance.
I finally found the answer myself:
- (void)SlowMotion:(NSURL *)URl
{
AVURLAsset* videoAsset = [AVURLAsset URLAssetWithURL:URl options:nil]; //self.inputAsset;
AVAsset *currentAsset = [AVAsset assetWithURL:URl];
AVAssetTrack *vdoTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero
error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
//handle error
return;
}
NSError *audioInsertError =nil;
BOOL audioInsertResult =[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero
error:&audioInsertError];
if (!audioInsertResult || nil != audioInsertError) {
//handle error
return;
}
CMTime duration =kCMTimeZero;
duration=CMTimeAdd(duration, currentAsset.duration);
//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionVideoTrack setPreferredTransform:vdoTrack.preferredTransform];
NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"slowMotion.mov"]];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
[[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
NSURL *_filePath = [NSURL fileURLWithPath:outputFilePath];
//export
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetLowQuality];
assetExport.outputURL=_filePath;
assetExport.outputFileType = AVFileTypeQuickTimeMovie;
assetExport.shouldOptimizeForNetworkUse = YES;
[assetExport exportAsynchronouslyWithCompletionHandler:^
{
switch ([assetExport status]) {
case AVAssetExportSessionStatusFailed:
{
NSLog(#"Export session faiied with error: %#", [assetExport error]);
dispatch_async(dispatch_get_main_queue(), ^{
// completion(nil);
});
}
break;
case AVAssetExportSessionStatusCompleted:
{
NSLog(#"Successful");
NSURL *outputURL = assetExport.outputURL;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
[self writeExportedVideoToAssetsLibrary:outputURL];
}
dispatch_async(dispatch_get_main_queue(), ^{
// completion(_filePath);
});
}
break;
default:
break;
}
}];
}
- (void)writeExportedVideoToAssetsLibrary :(NSURL *)url {
NSURL *exportURL = url;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
[library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error){
dispatch_async(dispatch_get_main_queue(), ^{
if (error) {
UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[error localizedDescription]
message:[error localizedRecoverySuggestion]
delegate:nil
cancelButtonTitle:#"OK"
otherButtonTitles:nil];
[alertView show];
}
if(!error)
{
// [activityView setHidden:YES];
UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:#"Sucess"
message:#"video added to gallery successfully"
delegate:nil
cancelButtonTitle:#"OK"
otherButtonTitles:nil];
[alertView show];
}
#if !TARGET_IPHONE_SIMULATOR
[[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
#endif
});
}];
} else {
NSLog(#"Video could not be exported to assets library.");
}
}
SCVideoConfiguration has a timeScale property.
You can use SCAssetExportSession, which is the SCRecorder counterpart of AVAssetExportSession and takes an SCVideoConfiguration as input (see the SCRecorder docs).
/* The time scale of the video
A value more than 1 will make the buffers last longer, it creates
a slow motion effect. A value less than 1 will make the buffers be
shorter, it creates a timelapse effect.
Only used in SCRecorder.
*/
@property (assign, nonatomic) CGFloat timeScale;
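For illustration, here is a minimal sketch of how the two pieces could fit together. It assumes the SCRecorder API as shown in that library's README (the recorder's videoConfiguration.timeScale, SCRecordSession's assetRepresentingSegments, SCAssetExportSession's outputUrl and presets); verify the exact names against the SCRecorder version you're using:
// Minimal sketch, assuming SCRecorder's documented API; names not verified against your version.
SCRecorder *recorder = [SCRecorder recorder];
// A timeScale above 1 makes the recorded buffers last longer, i.e. slow motion.
recorder.videoConfiguration.timeScale = 2.0;

// ...record as usual, then export the record session:
SCRecordSession *recordSession = recorder.session;
SCAssetExportSession *exportSession =
    [[SCAssetExportSession alloc] initWithAsset:recordSession.assetRepresentingSegments];
exportSession.videoConfiguration.preset = SCPresetHighestQuality;
exportSession.audioConfiguration.preset = SCPresetHighestQuality;
exportSession.outputUrl = recordSession.outputUrl;
exportSession.outputFileType = AVFileTypeMPEG4;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.error == nil) {
        NSLog(@"Slow-motion export finished: %@", exportSession.outputUrl);
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];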
I'm getting AVAssetExportSessionStatusFailed when trying to concatenate video sequences, with this message:
Export Failed with error messsage: Error
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped"
UserInfo=0x170675cc0 {NSLocalizedDescription=Operation Stopped,
NSLocalizedFailureReason=The video could not be composed.}, Operation
Stopped
Here is my code:
self.finalComposition = [AVMutableComposition composition];
self.finalCompositionTrack = [_finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
self.finalCompositionAudioTrack = [_finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime currentTime = kCMTimeZero;
AVURLAsset *asset = nil;
NSMutableArray *audioTracks = [[NSMutableArray alloc] init];
NSMutableArray *videos = [[NSMutableArray alloc] init];
for (int videoCounter = 0; videoCounter < _videoArray.count ; videoCounter++)
{
id object = [_videoArray objectAtIndex:videoCounter];
if ([object isKindOfClass:[MVideoRecord class]])
{
MVideoRecord *video = object;
NSURL *url = [NSURL fileURLWithPath:video.pathToVideo];
NSFileManager *fileManager = [NSFileManager defaultManager];
if (![fileManager fileExistsAtPath:video.pathToVideo])
{
[self showError:#"Invalid video"];
}
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
asset = [AVURLAsset URLAssetWithURL:url options:options];
NSError *error = nil;
if (!self.videoCompostion)
{
self.videoCompostion = [AVMutableVideoComposition videoComposition];
self.videoCompostion.frameDuration = CMTimeMake(1, 30);
self.videoCompostion.renderSize = CGSizeMake(640, 360);
self.videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
self.videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime);
self.videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:self.finalCompositionTrack];
}
for (AVAssetTrack *track in asset.tracks)
{
CGSize size = track.naturalSize;
if(track.naturalTimeScale == 600)
{
CGAffineTransform transform = [track preferredTransform];
int orientation = [self orientationForTrack: asset];
if (orientation < 2)
{
float x = 640/size.width;
float y = 360/size.height;
CGAffineTransform videoScale = CGAffineTransformMakeScale(x, y);
[_videoCompositionLayerInstruction setTransform:CGAffineTransformConcat(transform, videoScale) atTime:currentTime]; }
else
{
float s = 480/size.height;
CGAffineTransform new = CGAffineTransformConcat(transform, CGAffineTransformMakeScale(s,s));
[_videoCompositionLayerInstruction setTransform:new atTime:currentTime];
}
if (![_finalCompositionTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(video.videoStart.doubleValue, 600), CMTimeMakeWithSeconds(video.videoEnd.doubleValue, 600)) ofTrack:track atTime:currentTime error:&error])
{
[self showError:error.localizedFailureReason];
}
}
else if (track.naturalTimeScale == 44100)
{
CMTime start = kCMTimeZero;
CMTime duration = CMTimeMakeWithSeconds(video.videoEnd.doubleValue, 600);
NSError *error;
[_finalCompositionAudioTrack insertTimeRange:CMTimeRangeMake(start, duration)
ofTrack:[[track.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:currentTime error:&error];
NSLog(#"%#", error);
}
}
currentTime = CMTimeAdd(currentTime, CMTimeMake(video.videoEnd.doubleValue*600, 600));
}
}
//apply the translation to video composition
_videoCompositionInstruction.layerInstructions = [NSArray arrayWithObject: _videoCompositionLayerInstruction];
_videoCompostion.instructions = [NSArray arrayWithObject:_videoCompositionInstruction];
//get filepath of last object...
MVideoRecord *lastRecord = [_videoArray objectAtIndex:_videoArray.count - 2];
NSString *finalExportURLString = [lastRecord.pathToVideo stringByReplacingOccurrencesOfString:@".MOV" withString:@"_finalExport.mp4"];
//testing fix for video missing audio after final export
//string = [exportURL.absoluteString stringByReplacingOccurrencesOfString:@".MOV" withString:@"_finalExport.MOV"];
// File Management
NSFileManager *fileManager = [NSFileManager defaultManager];
self.finalExportURL = [NSURL fileURLWithPath:finalExportURLString];
self.finalExportSession = [[AVAssetExportSession alloc] initWithAsset:_finalComposition presetName:TEST_EXPORT_SESSION_QUALITY];
if ([fileManager fileExistsAtPath:self.finalExportURL.path])
{
NSError *fileError = nil;
if (![fileManager removeItemAtPath:finalExportURLString error:&fileError])
{
DCLog(#"Error removing old path: %#", fileError.localizedDescription);
}
}
_finalExportSession.outputURL = self.finalExportURL;
_finalExportSession.outputFileType = @"public.mpeg-4";
_finalExportSession.videoComposition = self.videoCompostion;
[self.finalExportSession exportAsynchronouslyWithCompletionHandler:^{
switch (_finalExportSession.status)
{ case AVAssetExportSessionStatusFailed:
{
DCLog(#"Export Failed with error messsage: %#, %#", _finalExportSession.error, _finalExportSession.error.localizedDescription);
break;
}
case AVAssetExportSessionStatusCompleted:
{
DCLog(#"Export Success");
break;
}
};
}];
What am I doing wrong?
The weirdest part is that if I change:
[_finalCompositionAudioTrack insertTimeRange:CMTimeRangeMake(start, duration)
ofTrack:[[track.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:currentTime error:&error];
to:
[_finalCompositionAudioTrack insertTimeRange:CMTimeRangeMake(start, duration)
ofTrack:[[track.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&error];
it works, but of course the audio then comes out wrong: the audio for the first video plays over the second one.
I fixed my problem by creating a separate AVMutableCompositionTrack for each audio track. I moved the line below inside the loop and it worked.
compositionAudioTrack = [_finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
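Roughly, the corrected structure inside the loop over clips looks like this (a simplified sketch reusing the variable names from the question; error handling omitted):
// Inside the loop, for the clip currently being processed:
// create a fresh composition audio track per clip and insert its audio at currentTime.
AVMutableCompositionTrack *compositionAudioTrack =
    [_finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                   preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *audioError = nil;
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(video.videoEnd.doubleValue, 600))
                               ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                atTime:currentTime
                                 error:&audioError];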
I'm trying to show a video from a composition of both video and audio. However, I have a problem: the player status never reaches AVPlayerStatusReadyToPlay.
If I add the video asset or the audio asset directly to the player item, it works, so I know there is no problem with the assets.
This is my code:
- (void) loadPlayer {
NSURL *videoURL = **;
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSURL *audioURL = **;
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
NSArray *keys = [NSArray arrayWithObject:@"duration"];
[videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
NSError *error = nil;
AVKeyValueStatus durationStatus = [videoAsset statusOfValueForKey:@"duration" error:&error];
switch (durationStatus) {
case AVKeyValueStatusLoaded:;
_videoDuration = videoAsset.duration;
if (_audioDuration.flags == kCMTimeFlags_Valid) {
[self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
}
break;
}
}];
[audioAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
NSError *error = nil;
AVKeyValueStatus durationStatus = [audioAsset statusOfValueForKey:@"duration" error:&error];
switch (durationStatus) {
case AVKeyValueStatusLoaded:;
_audioDuration = audioAsset.duration;
if (_videoDuration.flags == kCMTimeFlags_Valid) {
[self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
}
break;
}
}];
}
- (void) loadPlayWithVideoAsset:(AVURLAsset *)videoAsset withDuration:(CMTime)videoDuration andAudioAsset:(AVURLAsset *)audioAsset withDuration:(CMTime)audioDuration {
AVMutableComposition *composition = [AVMutableComposition composition];
//Video
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
NSError *videoError = nil;
if (![compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,videoDuration)
ofTrack:videoTrack
atTime:kCMTimeZero
error:&videoError]) {
NSLog(#"videoError: %#",videoError);
}
//Audio
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
NSError *audioError = nil;
if (![compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,audioDuration)
ofTrack:audioTrack
atTime:kCMTimeZero
error:&audioError]) {
NSLog(#"audioError: %#",audioError);
}
NSInteger compare = CMTimeCompare(videoDuration, audioDuration);
if (compare == 1) {
//The video is larger
CMTime timeDiff = CMTimeSubtract(videoDuration, audioDuration);
[compositionAudioTrack insertEmptyTimeRange:CMTimeRangeMake(audioDuration, timeDiff)];
}
else {
CMTime timeDiff = CMTimeSubtract(audioDuration, videoDuration);
[compositionVideoTrack insertEmptyTimeRange:CMTimeRangeMake(videoDuration, timeDiff)];
}
AVPlayerItem * playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.mPlayer = [AVPlayer playerWithPlayerItem:playerItem];
self.mPlaybackView = [[AVPlayerPlaybackView alloc] initWithFrame:CGRectZero];
[self.view addSubview:self.mPlaybackView];
[self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerPlaybackViewControllerStatusObservationContext];
}
- (void)observeValueForKeyPath:(NSString*) path ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
if (self.mPlayer.status == AVPlayerStatusReadyToPlay) {
[self.mPlaybackView setPlayer:self.mPlayer];
isReadyToPlay = YES;
_playVideoBtn.hidden = NO;
}
}
- (void) playVideo {
if (YES || isReadyToPlay) {
[self.mPlayer play];
}
}
In my experience, AVPlayer works with AVMutableComposition only if the video resource is bundled with the app. If the video resource is on the network, AVPlayer won't play the AVMutableComposition, even though AVPlayerItem and AVPlayer report a status of "ready to play".
I created a video using an array of images.
It creates the video successfully, and then I add audio to that video file.
I create an AVMutableComposition object, add the video and audio by creating AVAssetTracks, and finally export to a single video file with the help of AVAssetExportSession.
Suppose the first video (without audio) is vdo.mp4 and the final one (after adding audio) is final.mp4; my final.mp4 is lower in size and resolution than vdo.mp4.
Here is my code which combines both files:
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
NSError * error = nil;
AVMutableComposition * composition = [AVMutableComposition composition];
NSURL *url = [NSURL fileURLWithPath:filePath];
AVURLAsset * videoAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetTrack * videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID: kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,videoAsset.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:&error];
CMTime audioStartTime = kCMTimeZero;
for ( NSInteger i = 0;i< [mArrAudioFileNames count];i++ )
{
NSString *audioFileName = nil;
NSString *docsDir = nil;
if ( [mArrAudioFileNames objectAtIndex:i] != [NSNull null]) {
audioFileName = [mArrAudioFileNames objectAtIndex:i];
docsDir = [[self dataFolderPathForAudio] stringByAppendingPathComponent:audioFileName];
}else{
//audioFileName = #" ";
docsDir = [[NSBundle mainBundle] pathForResource:#"sample" ofType:#"mp3"];
}
// NSString *docsDir = [[self dataFolderPathForAudio] stringByAppendingPathComponent:audioFileName];
AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:docsDir] options:nil];
AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID: kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];
Float64 duration = CMTimeGetSeconds(urlAsset.duration);
audioStartTime = CMTimeAdd(audioStartTime, CMTimeMake((int) ((duration * kRecordingFPS) + 0.5), kRecordingFPS));
}
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
//assetExport.videoComposition = compositionVideoTrack;
assetExport.outputFileType = AVFileTypeQuickTimeMovie;// @"com.apple.quicktime-movie";
assetExport.outputURL = [NSURL fileURLWithPath:outFilePath];
[assetExport exportAsynchronouslyWithCompletionHandler:
^(void ) {
switch (assetExport.status)
{
case AVAssetExportSessionStatusCompleted:
// export complete
NSLog(#"Export Complete");
[self performSelectorOnMainThread:#selector(creatingVideoDone:)
withObject:outFilePath waitUntilDone:NO];
[assetExport release];
break;
case AVAssetExportSessionStatusFailed:
NSLog(#"Export Failed");
NSLog(#"ExportSessionError: %#", [assetExport.error localizedDescription]);
// Set delegate to move to view
if ( mDelegate!= nil && [mDelegate respondsToSelector:#selector(errorAlert:)])
{
[self performSelectorOnMainThread:#selector(errorOccured:)
withObject:[assetExport.error
localizedDescription]
waitUntilDone:NO];
}
[assetExport release];
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export Failed");
NSLog(#"ExportSessionError: %#", [assetExport.error localizedDescription]);
// Set delegate to move to view
if ( mDelegate!= nil && [mDelegate respondsToSelector:#selector(errorAlert:)])
{
[self performSelectorOnMainThread:#selector(errorOccured:)
withObject:[assetExport.error
localizedDescription]
waitUntilDone:NO];
}
[assetExport release];
break;
}
}];
Any help is appreciated.
Thanks.
I have a weird problem. In my app I am combining multiple audio and video files using the code below. The resulting video works fine once I download it from the device to a computer and play it with QuickTime, but whenever I try to play the newly composed video using either UIWebView or AVPlayer, I can only see the first of the merged video files.
Furthermore, when I try to play it with MPMoviePlayerController, it hangs on "Loading".
I can hear the audio for the whole composition. To make it clear, I have two arrays:
1- audioPieces with paths to audio files [song1, song2, song3];
2- moviePieces with paths to video files [movie1,movie2,movie3];
After merging those files I can see only movie1, but I can hear song1 + song2 + song3.
P.S. The songs and movies have slightly different lengths (less than 0.2 s difference).
Any help will be appreciated.
Thank you in advance,
Janusz
-(void)putFilesTogether{
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack =[[AVMutableCompositionTrack alloc]init];
AVMutableCompositionTrack *audioCompositionTrack =[[AVMutableCompositionTrack alloc]init];
NSLog(#" movie %# audio %# ", moviePieces, audioPieces);
NSError * error;
for(int i=0;i<moviePieces.count;i++)
{
NSFileManager * fm = [NSFileManager defaultManager];
NSString * movieFilePath;
NSString * audioFilePath;
movieFilePath = [moviePieces objectAtIndex:i];
audioFilePath = [audioPieces objectAtIndex:i];
if(![fm fileExistsAtPath:movieFilePath]){
NSLog(#"Movie doesn't exist %# ",movieFilePath);
}
else{
NSLog(#"Movie exist %# ",movieFilePath);
}
if(![fm fileExistsAtPath:audioFilePath]){
NSLog(#"Audio doesn't exist %# ",audioFilePath);
}
else{
NSLog(#"Audio exists %# ",audioFilePath);
}
NSURL *videoUrl = [NSURL fileURLWithPath:movieFilePath];
NSURL *audioUrl = [NSURL fileURLWithPath:audioFilePath];
AVURLAsset *videoasset = [[AVURLAsset alloc]initWithURL:videoUrl options:nil];
AVAssetTrack *videoAssetTrack= [[videoasset tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVURLAsset *audioasset = [[AVURLAsset alloc]initWithURL:audioUrl options:nil];
AVAssetTrack *audioAssetTrack= [[audioasset tracksWithMediaType:AVMediaTypeAudio] lastObject];
videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime tempTime = mixComposition.duration;
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioasset.duration) ofTrack:audioAssetTrack atTime:tempTime error:&error];
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoasset.duration) ofTrack:videoAssetTrack atTime:tempTime error:&error];
if(error)
{
NSLog(#"Ups. Something went wrong! %#", [error debugDescription]);
}
}
NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0];
NSString *caldate = [now description];
float ran = arc4random()%1000;
NSString * pathToSave = [NSString stringWithFormat:@"Output%@%f.mp4",caldate,ran];
pathToSave =[DOCUMENTS_FOLDER stringByAppendingPathComponent:pathToSave];
NSURL *movieUrl = [NSURL fileURLWithPath:pathToSave];
AVAssetExportSession *exporter =[[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
exporter.outputFileType=AVFileTypeQuickTimeMovie;
exporter.outputURL=movieUrl;
exporter.shouldOptimizeForNetworkUse=YES;
CMTimeValue val = mixComposition.duration.value;
CMTime start=CMTimeMake(0, 600);
CMTime duration=CMTimeMake(val, 600);
CMTimeRange range=CMTimeRangeMake(start, duration);
exporter.timeRange=range;
[exporter exportAsynchronouslyWithCompletionHandler:^{
switch ([exporter status]) {
case AVAssetExportSessionStatusFailed:{
NSLog(#"Export failed: %# %#", [[exporter error] localizedDescription],[[exporter error]debugDescription]);
NSString * message = #"Movie wasn't created. Try again later.";
[self performSelectorOnMainThread:#selector(dismissMe:) withObject:message waitUntilDone:NO];
break;}
case AVAssetExportSessionStatusCancelled:{ NSLog(#"Export canceled");
NSString * message1 = #"Movie wasn't created. Try again later.";
[self performSelectorOnMainThread:#selector(dismissMe:) withObject:message1 waitUntilDone:NO];
break;}
case AVAssetExportSessionStatusCompleted:
{
NSString * message = #"Movie was successfully created.";
CMTime duration = mixComposition.duration;
[self saveData:duration ofPath:pathToSave];
[self cleanFiles];
[self performSelectorOnMainThread:@selector(dismissMe:) withObject:message waitUntilDone:NO];
}
}}];
}
The problem lies in these lines:
videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
They need to be moved outside the for loop body, as sketched below.
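A minimal sketch of that change, reusing the names from the question (asset loading and error handling elided):
// Create the two composition tracks once, before the loop...
AVMutableCompositionTrack *videoCompositionTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                preferredTrackID:kCMPersistentTrackID_Invalid];

// ...and only append each clip's media inside the loop, at the composition's running duration.
for (int i = 0; i < moviePieces.count; i++) {
    // load videoasset/audioasset and their tracks exactly as in the question
    CMTime tempTime = mixComposition.duration;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoasset.duration)
                                   ofTrack:videoAssetTrack atTime:tempTime error:&error];
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioasset.duration)
                                   ofTrack:audioAssetTrack atTime:tempTime error:&error];
}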