AVPlayer from AVMutableComposition with audio and video won't play - iOS

I'm trying to play a video built from a composition of both a video asset and an audio asset. The problem is that the player status never reaches AVPlayerStatusReadyToPlay.
If I add either the video asset or the audio asset directly to the player item, it plays fine, so I know there is nothing wrong with the assets themselves.
This is my code:
- (void)loadPlayer {
    NSURL *videoURL = **;
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    NSURL *audioURL = **;
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
    NSArray *keys = [NSArray arrayWithObject:@"duration"];
    [videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        NSError *error = nil;
        AVKeyValueStatus durationStatus = [videoAsset statusOfValueForKey:@"duration" error:&error];
        switch (durationStatus) {
            case AVKeyValueStatusLoaded:
                _videoDuration = videoAsset.duration;
                if (_audioDuration.flags == kCMTimeFlags_Valid) {
                    [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                }
                break;
        }
    }];
    [audioAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        NSError *error = nil;
        AVKeyValueStatus durationStatus = [audioAsset statusOfValueForKey:@"duration" error:&error];
        switch (durationStatus) {
            case AVKeyValueStatusLoaded:
                _audioDuration = audioAsset.duration;
                if (_videoDuration.flags == kCMTimeFlags_Valid) {
                    [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                }
                break;
        }
    }];
}
- (void)loadPlayWithVideoAsset:(AVURLAsset *)videoAsset withDuration:(CMTime)videoDuration andAudioAsset:(AVURLAsset *)audioAsset withDuration:(CMTime)audioDuration {
    AVMutableComposition *composition = [AVMutableComposition composition];
    // Video
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    NSError *videoError = nil;
    if (![compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                                        ofTrack:videoTrack
                                         atTime:kCMTimeZero
                                          error:&videoError]) {
        NSLog(@"videoError: %@", videoError);
    }
    // Audio
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
    NSError *audioError = nil;
    if (![compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                                        ofTrack:audioTrack
                                         atTime:kCMTimeZero
                                          error:&audioError]) {
        NSLog(@"audioError: %@", audioError);
    }
    // Pad the shorter track with empty time so both tracks end together.
    NSInteger compare = CMTimeCompare(videoDuration, audioDuration);
    if (compare == 1) {
        // The video is longer
        CMTime timeDiff = CMTimeSubtract(videoDuration, audioDuration);
        [compositionAudioTrack insertEmptyTimeRange:CMTimeRangeMake(audioDuration, timeDiff)];
    } else {
        CMTime timeDiff = CMTimeSubtract(audioDuration, videoDuration);
        [compositionVideoTrack insertEmptyTimeRange:CMTimeRangeMake(videoDuration, timeDiff)];
    }
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    self.mPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    self.mPlaybackView = [[AVPlayerPlaybackView alloc] initWithFrame:CGRectZero];
    [self.view addSubview:self.mPlaybackView];
    [self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerPlaybackViewControllerStatusObservationContext];
}
- (void)observeValueForKeyPath:(NSString *)path ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.mPlayer.status == AVPlayerStatusReadyToPlay) {
        [self.mPlaybackView setPlayer:self.mPlayer];
        isReadyToPlay = YES;
        _playVideoBtn.hidden = NO;
    }
}
- (void)playVideo {
    if (YES || isReadyToPlay) {
        [self.mPlayer play];
    }
}

From my experience, AVPlayer works with AVMutableComposition only if the resources are bundled with the app. If the video resource lives on the network, AVPlayer won't play the AVMutableComposition, even though the AVPlayerItem and AVPlayer report their status as ready to play.
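Independent of where the assets live, one thing worth checking in the question's code: only the @"duration" key is loaded, so [videoAsset tracksWithMediaType:AVMediaTypeVideo] may return an empty array inside the completion handler; lastObject is then nil, insertTimeRange:ofTrack:atTime:error: fails, and the item never becomes ready. A minimal sketch that loads the tracks key as well:
// Load the tracks too, so track objects exist when the composition is built.
NSArray *keys = @[@"duration", @"tracks"];
[videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus tracksStatus = [videoAsset statusOfValueForKey:@"tracks" error:&error];
    AVKeyValueStatus durationStatus = [videoAsset statusOfValueForKey:@"duration" error:&error];
    if (tracksStatus == AVKeyValueStatusLoaded && durationStatus == AVKeyValueStatusLoaded) {
        // Safe to call tracksWithMediaType: and read duration from here on.
    } else {
        NSLog(@"Key loading failed: %@", error);
    }
}];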

Related

Memory Increases When Merging and Playing Audio

I'm trying to merge multiple pieces of audio into one simultaneous sound and then play it. I can merge and play them, but my app's memory usage keeps growing over time.
Looking online, it seems there are known ARC/memory pitfalls around AVPlayer.
I've added all the relevant code below for setting up the files, merging them, and playing them.
Setting Up Sound Files
- (void)setUpSoundFiles {
    AVURLAsset *songAsset = nil;
    AVAssetTrack *sourceAudioTrack = nil;
    for (int i = 0; i < numberOfSoundsToMerge; i++) {
        NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:[@(i) stringValue] ofType:@"mp3"]];
        songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
        sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [songTracks insertObject:sourceAudioTrack atIndex:i];
        [songAssets insertObject:[AVURLAsset URLAssetWithURL:url options:nil] atIndex:i];
        //[sourceAudioTrack.asset cancelLoading];
    }
}
Merging Sound Files
- (void)mergeAudio:(AVMutableComposition *)composition {
    for (int i = 0; i < numberOfSoundsToMerge; i++) {
        AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        NSError *error = nil;
        BOOL ok = NO;
        CMTime startTime = CMTimeMakeWithSeconds(0, 1);
        CMTime trackDuration = ((AVURLAsset *)[songAssets objectAtIndex:i]).duration;
        CMTimeRange tRange = CMTimeRangeMake(startTime, trackDuration);
        // Set volume
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
        [trackMix setVolume:0.8f atTime:startTime];
        [audioMixParams addObject:trackMix];
        // Insert audio into track
        ok = [track insertTimeRange:tRange ofTrack:(AVAssetTrack *)[songTracks objectAtIndex:i] atTime:CMTimeMake(0, 1) error:&error];
        [((AVAssetTrack *)[songTracks objectAtIndex:i]).asset cancelLoading];
    }
}
Playing Sound Files
- (void)playSounds {
    AVMutableComposition *composition = [AVMutableComposition composition];
    audioMixParams = [[NSMutableArray alloc] initWithObjects:nil];
    [self mergeAudio:composition];
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithArray:audioMixParams];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
    [playerItem setAudioMix:audioMix];
    [player play];
}
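One pattern that often tames this kind of growth (a sketch, not a verified fix for this exact code): drain autoreleased AVFoundation objects per iteration with @autoreleasepool, and store the same asset the track came from instead of creating a second AVURLAsset per file. Keeping the player in a strong property also ensures the previous player is released before a new one is created.
- (void)setUpSoundFiles {
    for (int i = 0; i < numberOfSoundsToMerge; i++) {
        @autoreleasepool {
            NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:[@(i) stringValue] ofType:@"mp3"]];
            AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
            AVAssetTrack *sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
            [songTracks insertObject:sourceAudioTrack atIndex:i];
            // Reuse the asset the track came from; a second AVURLAsset
            // per file just doubles the number of objects kept alive.
            [songAssets insertObject:songAsset atIndex:i];
        }
    }
}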

iOS AVAssetExportSession failed Code=-11820 only iPhone 5(c)

I want to export a video file from a composition with two videos (each with audio) and one audio track. It works fine on the iPhone 5s and later, but it fails on an iPhone 5c (iOS 9.2.1). The error is returned here:
[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    if (AVAssetExportSessionStatusCompleted == _assetExport.status) {
        [self performSelectorOnMainThread:@selector(videoIsDone) withObject:nil waitUntilDone:YES];
    } else {
        NSLog(@"Export error: %@", _assetExport.error);
        [self performSelectorOnMainThread:@selector(videoHasFailed) withObject:nil waitUntilDone:YES];
    }
}];
The log it printed:
Export error: Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo={NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}
As stated, it works fine on my iPhone 5s, 6, and 6s; only my iPhone 5c returns this error. Hopefully someone has experience with this.
The full code for creating the tracks and composition:
- (void)generateVideoWithInputPath:(NSString *)inputVideo andAudioFileName:(NSString *)audioFileName andVolume:(float)volume {
    NSString *introVideoPath = [[NSBundle mainBundle] pathForResource:@"IntroVideo" ofType:@"mp4"];
    NSURL *introVideoUrl = [NSURL fileURLWithPath:introVideoPath];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:inputVideo];
    self.outputAssetURL = NULL;
    self.outputFilePath = finalVideoPath;
    NSURL *outputFileUrl = [NSURL fileURLWithPath:self.outputFilePath];
    unlink([self.outputFilePath UTF8String]); // remove existing result
    // Create composition
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    // Create asset for introVideo
    AVURLAsset *introVideoAsset = [[AVURLAsset alloc] initWithURL:introVideoUrl options:nil];
    // Create time ranges
    CMTime introStartTime = kCMTimeZero;
    CMTime introEndTime = introVideoAsset.duration;
    CMTimeRange introVideo_timeRange = CMTimeRangeMake(introStartTime, introEndTime);
    // Add video track of introVideo to composition
    NSArray *introVideoAssetTracks = [introVideoAsset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *introVideoAssetTrack = ([introVideoAssetTracks count] > 0 ? [introVideoAssetTracks objectAtIndex:0] : nil);
    AVMutableCompositionTrack *b_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionVideoTrack insertTimeRange:introVideo_timeRange ofTrack:introVideoAssetTrack atTime:introStartTime error:nil];
    // Add audio track of introVideo to composition
    NSArray *audioAssetTracksIntro = [introVideoAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *audioAssetTrackIntro = ([audioAssetTracksIntro count] > 0 ? [audioAssetTracksIntro objectAtIndex:0] : nil);
    AVMutableCompositionTrack *a_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionAudioTrack insertTimeRange:introVideo_timeRange ofTrack:audioAssetTrackIntro atTime:introStartTime error:nil];
    // Create asset for inputVideo
    CMTime nextClipStartTime = introEndTime;
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    // Create time ranges
    CMTime videoStartTime = kCMTimeZero;
    CMTime videoEndTime = videoAsset.duration;
    if (CMTIME_IS_INVALID(videoEndTime)) {
        NSLog(@"videoEndTime is invalid");
    }
    CMTimeRange mainVideo_timeRange = CMTimeRangeMake(videoStartTime, videoEndTime);
    // Add video track of inputVideo to composition
    NSArray *videoAssetTracks2 = [videoAsset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
    // CMTime audioDurationFix = CMTimeAdd(videoAsset.duration, CMTimeMakeWithSeconds(-1.0f, 1));
    // CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    // CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioDurationFix);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:mainVideo_timeRange ofTrack:videoAssetTrack2 atTime:nextClipStartTime error:nil];
    // Add audio track of inputVideo to composition
    NSArray *audioAssetTracks2 = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *audioAssetTrack2 = ([audioAssetTracks2 count] > 0 ? [audioAssetTracks2 objectAtIndex:0] : nil);
    //AVMutableCompositionTrack *a_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionAudioTrack insertTimeRange:mainVideo_timeRange ofTrack:audioAssetTrack2 atTime:nextClipStartTime error:nil];
    AVMutableAudioMix *audioMix = NULL;
    if (audioFileName) {
        NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:audioFileName];
        // Create asset for audio (song)
        AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
        // Add audio of song to composition
        NSArray *audioAssetTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
        AVAssetTrack *audioAssetTrack = ([audioAssetTracks count] > 0 ? [audioAssetTracks objectAtIndex:0] : nil);
        AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [b_compositionAudioTrack insertTimeRange:mainVideo_timeRange ofTrack:audioAssetTrack atTime:nextClipStartTime error:nil];
        // Set volume of song
        NSArray *tracksToDuck = [mixComposition tracksWithMediaType:AVMediaTypeAudio];
        NSMutableArray *trackMixArray = [NSMutableArray array];
        // for (int i = 0; i < [tracksToDuck count]; i++) {
        AVAssetTrack *leTrack = [tracksToDuck objectAtIndex:0];
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:leTrack];
        [trackMix setVolume:1 atTime:kCMTimeZero];
        [trackMixArray addObject:trackMix];
        AVAssetTrack *leTrack2 = [tracksToDuck objectAtIndex:1];
        AVMutableAudioMixInputParameters *trackMix2 = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:leTrack2];
        [trackMix2 setVolume:volume atTime:kCMTimeZero];
        [trackMixArray addObject:trackMix2];
        // }
        audioMix = [AVMutableAudioMix audioMix];
        audioMix.inputParameters = trackMixArray;
    }
    // Export composition to video file
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; //@"com.apple.quicktime-movie"
    _assetExport.outputURL = outputFileUrl;
    _assetExport.videoComposition = [self getVideoComposition:videoAsset intro:introVideoAsset composition:mixComposition];
    // Set song volume audio
    if (audioMix != NULL) {
        _assetExport.audioMix = audioMix;
    }
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        if (AVAssetExportSessionStatusCompleted == _assetExport.status) {
            [self performSelectorOnMainThread:@selector(videoIsDone) withObject:nil waitUntilDone:YES];
        } else {
            NSLog(@"Export error: %@", _assetExport.error);
            [self performSelectorOnMainThread:@selector(videoHasFailed) withObject:nil waitUntilDone:YES];
        }
    }];
}
- (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset intro:(AVAsset *)intro composition:(AVMutableComposition *)composition {
    AVMutableCompositionTrack *compositionIntroTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    NSArray *introTracksArray = [intro tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *introTrack = nil;
    if (introTracksArray.count > 0) {
        introTrack = [introTracksArray objectAtIndex:0];
        [compositionIntroTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, intro.duration) ofTrack:introTrack atTime:kCMTimeZero error:nil];
    }
    NSArray *videoTracksArray = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *videoTrack = nil;
    if (videoTracksArray.count > 0) {
        videoTrack = [videoTracksArray objectAtIndex:0];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:intro.duration error:nil];
    }
    AVMutableVideoCompositionLayerInstruction *firstLayerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionIntroTrack];
    AVMutableVideoCompositionLayerInstruction *secondLayerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
    CGSize videoSize;
    if (videoTrack && introTrack) {
        CGSize trackDimensions = [videoTrack naturalSize];
        videoSize = CGSizeMake(0, 0);
        // turn around for portrait
        if (trackDimensions.height > trackDimensions.width) {
            videoSize = CGSizeMake(trackDimensions.width, trackDimensions.height);
        } else {
            videoSize = CGSizeMake(trackDimensions.height, trackDimensions.width);
        }
        CGAffineTransform transform = videoTrack.preferredTransform;
        CGAffineTransform scale = CGAffineTransformMakeScale(videoSize.width / introTrack.naturalSize.width, videoSize.height / introTrack.naturalSize.height);
        [firstLayerInst setTransform:scale atTime:kCMTimeZero];
        [secondLayerInst setTransform:transform atTime:kCMTimeZero];
    } else {
        videoSize = [[FilteringClass sharedFilteringClass] getVideoSize];
    }
    CMTime totalTime = CMTimeAdd(asset.duration, intro.duration);
    NSLog(@"Total video time: %lld", totalTime.value);
    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime);
    inst.layerInstructions = [NSArray arrayWithObjects:firstLayerInst, secondLayerInst, nil];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = [NSArray arrayWithObject:inst];
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderScale = 1.0;
    return videoComposition;
}
In my opinion you are hitting the decoder limit in AVFoundation. In iOS 5 the decoder limit was 4, and in iOS 6 it is 16. Try exporting a smaller video: if that works, the problem is with your video file, which may push the composition past the decoder limit.
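If the decoder limit is the culprit, one way to confirm it before exporting (a sketch; mixComposition and the preset are taken from the question's code) is AVAssetExportSession's compatibility check, available since iOS 6:
// Pre-flight the export instead of waiting for error -11820.
[AVAssetExportSession determineCompatibilityOfExportPreset:AVAssetExportPresetHighestQuality
                                                 withAsset:mixComposition
                                            outputFileType:AVFileTypeQuickTimeMovie
                                         completionHandler:^(BOOL compatible) {
    if (!compatible) {
        // Older hardware such as the iPhone 5c may not handle this preset
        // for a composition this complex; try a cheaper preset or fewer tracks.
        NSLog(@"Preset not compatible with this composition on this device");
    }
}];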

iOS: Remove a particular sound from a video

I have an application that plays some audio and also records video with audio while that sound is playing. I would like to figure out a way to process the video so that the played-back audio picked up by the microphone is removed from the resulting video.
For example, if I'm playing audioA, and then recording videoB with audioB (from the microphone), I want to somehow cancel out audioA from the resulting audioB, so that audioB is only the ambient noise and not the noise from the device speakers.
Any idea if there's a way to do this?
Bonus points if it can be done without any offline processing.
You will still have to solve the playback/cancellation part, but here is code to mix the selected audio into the recorded video.
- (void)mixAudio:(AVAsset *)audioAsset startTime:(CMTime)startTime withVideo:(NSURL *)inputUrl affineTransform:(CGAffineTransform)affineTransform toUrl:(NSURL *)outputUrl outputFileType:(NSString *)outputFileType withMaxDuration:(CMTime)maxDuration withCompletionBlock:(void (^)(NSError *))completionBlock {
    NSError *error = nil;
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoTrackComposition = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrackComposition = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVURLAsset *fileAsset = [AVURLAsset URLAssetWithURL:inputUrl options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
    NSArray *videoTracks = [fileAsset tracksWithMediaType:AVMediaTypeVideo];
    CMTime duration = ((AVAssetTrack *)[videoTracks objectAtIndex:0]).timeRange.duration;
    if (CMTIME_COMPARE_INLINE(duration, >, maxDuration)) {
        duration = maxDuration;
    }
    for (AVAssetTrack *track in [audioAsset tracksWithMediaType:AVMediaTypeAudio]) {
        [audioTrackComposition insertTimeRange:CMTimeRangeMake(startTime, duration) ofTrack:track atTime:kCMTimeZero error:&error];
        if (error != nil) {
            completionBlock(error);
            return;
        }
    }
    for (AVAssetTrack *track in videoTracks) {
        [videoTrackComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:track atTime:kCMTimeZero error:&error];
        if (error != nil) {
            completionBlock(error);
            return;
        }
    }
    videoTrackComposition.preferredTransform = affineTransform;
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
    exportSession.outputFileType = outputFileType;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputURL = outputUrl;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSError *exportError = nil;
        if (exportSession.error != nil) {
            NSMutableDictionary *userInfo = [NSMutableDictionary dictionaryWithDictionary:exportSession.error.userInfo];
            NSString *subLocalizedDescription = [userInfo objectForKey:NSLocalizedDescriptionKey];
            [userInfo removeObjectForKey:NSLocalizedDescriptionKey];
            [userInfo setObject:@"Failed to mix audio and video" forKey:NSLocalizedDescriptionKey];
            [userInfo setObject:exportSession.outputFileType forKey:@"OutputFileType"];
            [userInfo setObject:exportSession.outputURL forKey:@"OutputUrl"];
            [userInfo setObject:subLocalizedDescription forKey:@"CauseLocalizedDescription"];
            [userInfo setObject:[AVAssetExportSession allExportPresets] forKey:@"AllExportSessions"];
            exportError = [NSError errorWithDomain:@"Error" code:500 userInfo:userInfo];
        }
        completionBlock(exportError);
    }];
}
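For context, a hypothetical call site might look like this (musicPath, recordedVideoPath, outputPath, and the 30-second cap are all assumptions):
AVURLAsset *musicAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:musicPath] options:nil];
[self mixAudio:musicAsset
     startTime:kCMTimeZero
     withVideo:[NSURL fileURLWithPath:recordedVideoPath]
affineTransform:CGAffineTransformIdentity
         toUrl:[NSURL fileURLWithPath:outputPath]
outputFileType:AVFileTypeQuickTimeMovie
withMaxDuration:CMTimeMakeWithSeconds(30, 600)
withCompletionBlock:^(NSError *error) {
    if (error != nil) {
        NSLog(@"Mix failed: %@", error);
    }
}];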

How to use removeTimeRange method of AVMutableComposition?

I have composed multiple videos, and I want to remove the last 0.5 second of each track in the composition. I believe removeTimeRange: can be used in this situation.
The documentation for this method reads:
Removes a specified timeRange from all tracks of the composition.
But I am not able to figure out what range to pass to achieve this. My composition code is the following:
AVAsset *asset0 = [self currentAsset:0];
AVAsset *asset1 = [self currentAsset:1];
AVAsset *asset2 = [self currentAsset:2];
AVAsset *asset3 = [self currentAsset:3];
AVAsset *asset4 = [self currentAsset:4];
NSArray *assets = @[asset0, asset1, asset2, asset3, asset4];
AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
NSMutableArray *instructions = [NSMutableArray new];
CGSize size = CGSizeZero;
CMTime time = kCMTimeZero;
for (AVAsset *asset in assets)
{
    AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
    NSError *error;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:assetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"Error - %@", error.debugDescription);
    }
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:audioAssetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"Error - %@", error.debugDescription);
    }
    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
    videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
    [instructions addObject:videoCompositionInstruction];
    time = CMTimeAdd(time, assetTrack.timeRange.duration);
    if (CGSizeEqualToSize(size, CGSizeZero)) {
        size = assetTrack.naturalSize;
    }
}
AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = instructions;
mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
mutableVideoComposition.renderSize = size;
pi = [AVPlayerItem playerItemWithAsset:mutableComposition];
pi.videoComposition = mutableVideoComposition;
player = [AVPlayer playerWithPlayerItem:[[CameraEngine engine] pi]];
player.volume = 0.75;
playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.bounds;
[self.layer addSublayer:playerLayer];
[playerLayer setNeedsDisplay];
[player play];
I want to remove 0.5 seconds of video from each of the 5 tracks after composition, because I get blank frames in between when the track changes.
The tracks themselves are fine (no black frames at the end).
I have tried removing frames directly from the AVMutableCompositionTrack, but the blank frames come back after composition.
So how do I produce this time range?
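A sketch of how the ranges could be produced, reusing the assets array and insertion order from the code above (the 600 timescale is an arbitrary choice). The order of removal matters: removeTimeRange: shifts all later media earlier, so deleting from the last segment back to the first keeps the earlier boundary times valid. The video composition instructions will also need to be rebuilt afterwards, since their time ranges no longer match.
// Trim the last 0.5 s of each inserted asset from the composition.
CMTime half = CMTimeMakeWithSeconds(0.5, 600);
CMTime cursor = kCMTimeZero;
NSMutableArray<NSValue *> *segmentEnds = [NSMutableArray array];
for (AVAsset *asset in assets) {
    AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    cursor = CMTimeAdd(cursor, assetTrack.timeRange.duration);
    [segmentEnds addObject:[NSValue valueWithCMTime:cursor]];
}
// Iterate from the last segment to the first.
for (NSValue *value in [[segmentEnds reverseObjectEnumerator] allObjects]) {
    CMTime end = [value CMTimeValue];
    [mutableComposition removeTimeRange:CMTimeRangeMake(CMTimeSubtract(end, half), half)];
}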

AVMutableCompositionTrack setVolume not working

I'm having a problem setting the audio volume when mixing a recorded video with an audio file from my app's resources.
Here is my code:
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
NSString *resourcePath = [[NSBundle mainBundle] pathForResource:@"give-it-away" ofType:@"mp3"];
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:resourcePath] options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil]];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[videoAsset.tracks objectAtIndex:0] atTime:kCMTimeZero error:nil];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioTime) ofTrack:[audioAsset.tracks objectAtIndex:0] atTime:kCMTimeZero error:&videoError];
AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
[audioInputParams setVolume:0.3 atTime:kCMTimeZero];
[audioInputParams setTrackID:audioTrack.trackID];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObject:audioInputParams];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
exportSession.outputURL = [NSURL fileURLWithPath:finalVideoWithAudioPath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.audioMix = audioMix;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
        {
            [self performSelectorOnMainThread:@selector(doPostExportFailed) withObject:nil waitUntilDone:NO];
            break;
        }
        case AVAssetExportSessionStatusCompleted:
        {
            [self performSelectorOnMainThread:@selector(doPostExportSuccess) withObject:nil waitUntilDone:YES];
            break;
        }
    };
}];
The export completes successfully, but the audio volume does not change.
What am I doing wrong?
Thanks
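A likely cause, consistent with the working code in the update below: AVAssetExportPresetPassthrough copies the source samples without re-encoding, so an audioMix has nothing to apply to. Switching to a re-encoding preset, as the December 2018 code does, is the usual fix:
// Passthrough skips re-encoding, so audioMix volume changes are ignored;
// a re-encoding preset lets the mix actually apply.
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                       presetName:AVAssetExportPresetHighestQuality];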
UPDATED (December 2018)
This code is working for me on iOS 12.1.2:
- (void)combineAudio:(NSString *)audioPath forRecord:(VideoRecord *)record isResource:(BOOL)isResource isSilent:(BOOL)isSilent keepCurrentAudio:(BOOL)keepCurrentAudio withCompletionHandler:(void (^)(AVAssetExportSession *exportSession, NSString *exportPath))handler {
    NSString *resourcePath = audioPath;
    if (isResource) {
        resourcePath = [[NSBundle mainBundle] pathForResource:resourcePath ofType:@"mp3"];
    }
    NSURL *url = [NSURL fileURLWithPath:resourcePath];
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:[NSString stringWithFormat:@"file://%@", record.videoPath]] options:nil];
    AVMutableComposition *composition = [self getComposition];
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioOriginalTrack = nil;
    if (keepCurrentAudio && [videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) {
        compositionAudioOriginalTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioOriginalTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                atTime:kCMTimeZero error:nil];
    }
    if (isResource) {
        CMTime videoDuration = videoAsset.duration;
        if (CMTimeCompare(videoDuration, audioAsset.duration) == -1) {
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                           ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                            atTime:kCMTimeZero error:nil];
        } else if (CMTimeCompare(videoDuration, audioAsset.duration) == 1) {
            // Loop the audio until it covers the whole video.
            CMTime currentTime = kCMTimeZero;
            while (YES) {
                CMTime audioDuration = audioAsset.duration;
                CMTime totalDuration = CMTimeAdd(currentTime, audioDuration);
                if (CMTimeCompare(totalDuration, videoDuration) == 1) {
                    // Clamp the final chunk to the time remaining.
                    audioDuration = CMTimeSubtract(videoDuration, currentTime);
                }
                [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                                               ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                atTime:currentTime error:nil];
                currentTime = CMTimeAdd(currentTime, audioDuration);
                if (CMTimeCompare(currentTime, videoDuration) == 1 || CMTimeCompare(currentTime, videoDuration) == 0) {
                    break;
                }
            }
        }
    } else {
        NSArray<AVAssetTrack *> *aTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
        if (aTracks.count > 0) {
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                           ofTrack:[aTracks objectAtIndex:0]
                                            atTime:kCMTimeZero error:nil];
        }
    }
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    NSArray *tracks = [videoAsset tracksWithMediaType:AVMediaTypeVideo];
    if (tracks.count == 0) {
        CLSNSLog(@"%@ - combineAudio - video tracks zero", NSStringFromClass([self class]));
        // TODO - Handle this error.
        return;
    }
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[tracks objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:[composition copy]
                                                                          presetName:AVAssetExportPresetHighestQuality];
    NSString *exportPath = [record.videoPath stringByReplacingOccurrencesOfString:@".mp4" withString:@"_audio_added.mp4"];
    if ([record.videoPath containsString:@".MOV"]) {
        exportPath = [record.videoPath stringByReplacingOccurrencesOfString:@".MOV" withString:@"_audio_added.mp4"];
    }
    if ([record.videoPath containsString:@".mov"]) {
        exportPath = [record.videoPath stringByReplacingOccurrencesOfString:@".mov" withString:@"_audio_added.mp4"];
    }
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    float volume = .5f;
    if (keepCurrentAudio) {
        volume = .6f;
    }
    AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:compositionAudioTrack];
    [audioInputParams setVolumeRampFromStartVolume:(isSilent ? .0f : volume) toEndVolume:(isSilent ? .0f : volume) timeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)];
    [audioInputParams setTrackID:compositionAudioTrack.trackID];
    NSArray *inputParams = [NSArray arrayWithObject:audioInputParams];
    AVMutableAudioMixInputParameters *audioOriginalInputParams = nil;
    if (keepCurrentAudio) {
        audioOriginalInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:compositionAudioOriginalTrack];
        [audioOriginalInputParams setVolumeRampFromStartVolume:(isSilent ? .0f : .06f) toEndVolume:(isSilent ? .0f : .06f) timeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)];
        [audioOriginalInputParams setTrackID:compositionAudioOriginalTrack.trackID];
        inputParams = [NSArray arrayWithObjects:audioInputParams, audioOriginalInputParams, nil];
    }
    audioMix.inputParameters = inputParams;
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.audioMix = audioMix;
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        handler(_assetExport, exportPath);
    }];
}
I call this function like this (sound is the name of the resource file):
[[FBVideoEditor shared] combineAudio:sound forRecord:self.record isResource:YES isSilent:NO keepCurrentAudio:NO withCompletionHandler:^(AVAssetExportSession *exportSession, NSString *exportPath) {
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusCancelled:
            CLSNSLog(@"AVAssetExportSessionStatusCancelled");
            break;
        case AVAssetExportSessionStatusExporting:
            CLSNSLog(@"AVAssetExportSessionStatusExporting");
            break;
        case AVAssetExportSessionStatusUnknown:
            CLSNSLog(@"AVAssetExportSessionStatusUnknown");
            break;
        case AVAssetExportSessionStatusWaiting:
            CLSNSLog(@"AVAssetExportSessionStatusWaiting");
            break;
        case AVAssetExportSessionStatusFailed:
        {
            CLSNSLog(@"Export failed with error message: %@", exportSession.error.userInfo);
            break;
        }
        case AVAssetExportSessionStatusCompleted:
        {
            // Success
            break;
        }
    };
}];
The VideoRecord class holds all the data for my videos, but in this case only the video path is used, so you can replace it with a plain NSString.
Hope it helps.
