AVErrorInvalidVideoComposition when exporting a video - iOS

Before posting my question, I'll explain why I'm composing a video with a single track. In short, I need a video with a constant frame rate, and I haven't found any way to achieve that other than exporting the video after composing it with AVMutableComposition.
I'm very confused by this, because sometimes my video exports successfully and sometimes it doesn't. When the export fails, I get error code -11841 (AVErrorInvalidVideoComposition). To me this error is very generic and doesn't describe why the video composition is invalid.
I've checked all the attributes, but I can't find the reason why my AVMutableVideoComposition is incorrect.
Here is the main code I wrote to create the video composition and export it.
- (void)saveVideoAtURL:(NSURL *)url withCompletionBlock:(CompletionBlock)completionBlock
{
    NSError *error;
    NSArray *videoTracks;
    NSArray *audioTracks;
    AVURLAsset *asset;
    AVAssetTrack *videoAssetTrack;
    AVAssetTrack *audioAssetTrack;
    AVMutableComposition *composition;
    AVMutableVideoComposition *videoComposition;
    AVMutableCompositionTrack *videoCompositionTrack;
    AVMutableCompositionTrack *audioCompositionTrack;
    AVMutableVideoCompositionInstruction *videoCompositionInstruction;
    AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction;

    self.completionBlock = completionBlock;

    NSDictionary *options = @{
        AVURLAssetPreferPreciseDurationAndTimingKey: @YES
    };

    asset = [[AVURLAsset alloc] initWithURL:url options:options];

    composition = [AVMutableComposition new];
    videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

    videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    videoAssetTrack = videoTracks[0];
    CGSize renderSize = videoAssetTrack.naturalSize;

    audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
    audioAssetTrack = audioTracks[0];

    [videoCompositionTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:&error];
    if (error)
    {
        DLog(@"Error: %@", [error localizedDescription]);
        self.completionBlock(NO, error);
        return;
    }

    [audioCompositionTrack insertTimeRange:timeRange ofTrack:audioAssetTrack atTime:kCMTimeZero error:&error];
    if (error)
    {
        DLog(@"Error: %@", [error localizedDescription]);
        self.completionBlock(NO, error);
        return;
    }

    // This is a very important instruction; without it, the video will be blank.
    videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];

    // Instructions for the video composition
    videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

    // Set up the video composition
    videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderSize = renderSize;
    videoComposition.renderScale = 1.0;
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
    videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
    videoComposition.instructions = @[videoCompositionInstruction];

    // Get a new URL to export the video to
    NSURL *outputURL = [self generateOutputURL];

    // Create an exporter and save the video locally
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:[composition copy] presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.videoComposition = videoComposition;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch (exporter.status) {
            case AVAssetExportSessionStatusFailed:
            case AVAssetExportSessionStatusCancelled:
                DLog(@"Failed %@", exporter.error);
                self.completionBlock(NO, exporter.error);
                break;
            case AVAssetExportSessionStatusExporting:
                DLog(@"Exporting");
                break;
            case AVAssetExportSessionStatusCompleted:
                [self exportDidFinish:exporter];
                break;
            default:
                break;
        }
    }];
}
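A common cause of -11841 in code shaped like this (it is also the fix reported in an answer further down this page) is building the layer instruction from the asset's track rather than from the composition's track. A hedged variant of that line, not confirmed for this exact case:
// Sketch: the export session renders the composition, so reference its track.
videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];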

Related

Rotating Video with PBJVision and AVFoundation

I'm currently developing an iOS app using PBJVision and am trying to save landscape videos in the correct orientation.
I use the writeVideoAtPathToSavedPhotosAlbum:completionBlock: method to save the video to the device:
[self.library writeVideoAtPathToSavedPhotosAlbum:videoPathURL
                                  completionBlock:^(NSURL *assetURL, NSError *error) {
    [self encodeVideoOrientation:assetURL]; // method below
    if (error.code == 0) {
        [self.library assetForURL:assetURL
                      resultBlock:^(ALAsset *asset) {
                          // assign the video to the album
                      }
                     failureBlock:^(NSError *error) {
                          NSLog(@"Failed to save video to library");
                      }];
    } else {
    }
}];
and then I pass the saved video's assetURL to a method I wrote called encodeVideoOrientation, with the intention of overwriting the file with the rotated video. Note that I apply the same rotation (90 degrees) to all videos for testing, yet no transformation is applied to the video, regardless of the orientation.
- (void)encodeVideoOrientation:(NSURL *)anOutputFileURL
{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:anOutputFileURL options:nil];
    AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    CGAffineTransform rotationTransform;
    CGAffineTransform transformToApply = videoAsset.preferredTransform;
    CGSize renderSize;
    CGFloat newWidth;
    CGFloat newHeight;
    float currentVideoRotation;

    CGSize size = [sourceVideoTrack naturalSize];

    // Below, I rotate the video regardless of orientation, to check whether portrait videos become rotated (they do not).
    if (self.currentOrientationByAccelerometer == LANDSCAPE_RIGHT) {
        currentVideoRotation = M_PI; // for testing, video should appear upside down
        newWidth = size.height;
        newHeight = size.width;
        NSLog(@"Should be landscape right");
    } else if (self.currentOrientationByAccelerometer == LANDSCAPE_LEFT) {
        currentVideoRotation = M_PI; // for testing, should appear upside down
        newWidth = size.height;
        newHeight = size.width;
        NSLog(@"Should be landscape left");
    } else { // default to portrait
        currentVideoRotation = M_PI;
        newWidth = size.width;
        newHeight = size.height;
        NSLog(@"Should be portrait");
    }
    renderSize = CGSizeMake(newWidth, newHeight);
    rotationTransform = CGAffineTransformMakeRotation(currentVideoRotation);

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
    [compositionVideoTrack setPreferredTransform:rotationTransform]; // Rotations here?
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
    [layerInstruction setTransform:transformToApply atTime:kCMTimeZero]; // Rotations here?
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderScale = 1.0;
    videoComposition.renderSize = renderSize;
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    NSString *videoName = @"export.mov";
    NSString *anOutputFileURLPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:anOutputFileURLPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:anOutputFileURLPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:anOutputFileURLPath error:nil];
    }
    assetExport.outputFileType = AVFileTypeMPEG4;
    assetExport.outputURL = exportUrl;
    assetExport.shouldOptimizeForNetworkUse = YES;
    assetExport.videoComposition = videoComposition;
    [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        switch (assetExport.status)
        {
            case AVAssetExportSessionStatusCompleted:
                // export complete
                NSLog(@"Export Complete");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                // export error (see assetExport.error)
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                // export cancelled
                break;
        }
    }];
}
The export completes successfully, but no transformation seems to be applied to the video regardless of the CGAffineTransform.
I referenced this SO post in writing my code:
iOS AVFoundation: Setting Orientation of Video
And this appears to be an open issue according to this post on the project's github:
https://github.com/piemonte/PBJVision/issues/84
How do I save a rotated video captured using PBJVision?
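A hedged sketch of the usual fix, not from the original thread: when a videoComposition is attached to the export session, the composition track's preferredTransform is ignored, so the rotation has to go into the layer instruction itself (the code above puts the unmodified preferredTransform there via transformToApply). Assuming size is the source track's naturalSize, as in the question's code, a 90-degree rotation would look roughly like:
// Sketch: rotate 90 degrees and shift the result back into the render rect.
// The render size must be swapped to (size.height, size.width), as the code above does.
CGAffineTransform rotate = CGAffineTransformMakeRotation(M_PI_2);
CGAffineTransform shift = CGAffineTransformMakeTranslation(size.height, 0);
[layerInstruction setTransform:CGAffineTransformConcat(rotate, shift) atTime:kCMTimeZero];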

Video cracked while merging videos using AVMutableComposition

I'm merging the videos using AVMutableComposition with the code below,
- (void)MergeAndSave_internal
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderScale = 1.0;

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

    NSLog(@"%@", videoPathArray);
    float time = 0;
    CMTime startTime = kCMTimeZero;

    for (int i = 0; i < videoPathArray.count; i++) {
        AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[videoPathArray objectAtIndex:i]] options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
        NSError *error = nil;
        BOOL ok = NO;

        AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        CGSize temp = CGSizeApplyAffineTransform(sourceVideoTrack.naturalSize, sourceVideoTrack.preferredTransform);
        CGSize size = CGSizeMake(fabsf(temp.width), fabsf(temp.height));
        CGAffineTransform transform = sourceVideoTrack.preferredTransform;

        videoComposition.renderSize = sourceVideoTrack.naturalSize;
        if (size.width > size.height) {
            [layerInstruction setTransform:transform atTime:CMTimeMakeWithSeconds(time, 30)];
        } else {
            float s = size.width / size.height;
            CGAffineTransform newe = CGAffineTransformConcat(transform, CGAffineTransformMakeScale(s, s));
            float x = (size.height - size.width * s) / 2;
            CGAffineTransform newer = CGAffineTransformConcat(newe, CGAffineTransformMakeTranslation(x, 0));
            [layerInstruction setTransform:newer atTime:CMTimeMakeWithSeconds(time, 30)];
        }

        if (i == 0) {
            [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];
        }

        ok = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]) ofTrack:sourceVideoTrack atTime:startTime error:&error];
        ok = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]) ofTrack:sourceAudioTrack atTime:startTime error:nil];
        if (!ok) {
            [radialView4 setHidden:YES];
            NSLog(@"Export failed: %@", [[self.exportSession error] localizedDescription]);
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Something Went Wrong :(" delegate:nil cancelButtonTitle:@"Ok" otherButtonTitles:nil, nil];
            [alert show];
            [radialView4 setHidden:YES];
            break;
        }

        startTime = CMTimeAdd(startTime, [sourceAsset duration]);
    }

    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    instruction.timeRange = compositionVideoTrack.timeRange;
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"RampMergedVideo.mov"]];
    unlink([myPathDocs UTF8String]);
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@", [exporter error]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                case AVAssetExportSessionStatusCompleted: {
                    NSLog(@"Export successful");
                }
                default:
                    break;
            }
            if (exporter.status != AVAssetExportSessionStatusCompleted) {
                NSLog(@"Retry export");
            }
        });
    }];
}
But the video looks cracked when it is saved and played back in QuickTime Player. I think the problem is in the CGAffineTransform. Can anyone please advise?
Here's the cracked frame from the middle of the video:
You haven't set the videoComposition on the AVAssetExportSession. Try adding exporter.videoComposition = videoComposition;. I haven't tried this, but it should work.

Merging audio and video files in iOS

I am trying to merge an audio and a video file; below is my code:
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:_BAckgroungMusicFileURL options:nil];
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:_AppDel._RecordedVideoPath] options:nil];

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 24);
videoComposition.renderScale = 1.0;

AVMutableCompositionTrack *compositionCommentaryTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                atTime:kCMTimeZero error:nil];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI / 2);
CGAffineTransform rotateTranslate = CGAffineTransformTranslate(rotationTransform, 320, 0);
[compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];
[layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                       presetName:AVAssetExportPresetPassthrough];
NSDate *_TodayDate = [NSDate dateWithTimeIntervalSinceNow:0];
_CalenderDate = [_TodayDate description];
_CombinedVideoPath = [NSString stringWithFormat:@"%@/%@.mov", DOCUMENTS_FOLDER, _CalenderDate];
NSURL *exportUrl = [NSURL fileURLWithPath:_CombinedVideoPath];
_assetExport.outputFileType = @"com.apple.quicktime-movie";
NSLog(@"file type %@", _assetExport.outputFileType);
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;

[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    switch (_assetExport.status)
    {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"AVAssetExportSessionStatusCompleted");
            [self SaveVideo];
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export Failed");
            NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
            [self mergingCompleted];
            break;
        case AVAssetExportSessionStatusCancelled:
            [self mergingCompleted];
            NSLog(@"Export Session Status: %d", _assetExport.status);
            NSLog(@"Export Failed");
            NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
            break;
    }
}];
Here the video to be merged is a recorded video. My problem is that when I merge this audio and video, the resulting file contains the recorded video and the new audio, but not the audio of the recorded video itself. How can I keep the video's own audio and merge it with the other audio track?
Any help will be appreciated.
Thanks in advance
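A minimal sketch of one approach, not from the original thread: the code above inserts only the background-music track into the composition, so the recording's own audio never makes it in. Adding a second mutable audio track for the video asset's audio should preserve both (assuming the recorded video actually contains an audio track):
// Sketch: keep the recording's own audio alongside the background music.
NSArray *recordedAudioTracks = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
if (recordedAudioTracks.count > 0) {
    AVMutableCompositionTrack *originalAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                             preferredTrackID:kCMPersistentTrackID_Invalid];
    [originalAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                ofTrack:recordedAudioTracks[0]
                                 atTime:kCMTimeZero
                                  error:nil];
}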

AVAssetExportSession rotation errors when video is from camera roll

I am trying to convert a .mov video to .mp4 while correcting the orientation at the same time.
The code I use below works great when a video is recorded using UIImagePickerController; however, if the video is selected from the camera roll, I get this error and I don't see why:
Export failed: Operation Stopped : Error
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped"
UserInfo=0x1815ca50 {NSLocalizedDescription=Operation Stopped,
NSLocalizedFailureReason=The video could not be composed.}
I have tried first saving the video to another file, but it made no difference.
Here is the code I am using to convert the video:
- (void)convertVideoToLowQuailtyAndFixRotationWithInputURL:(NSURL *)inputURL handler:(void (^)(NSURL *outURL))handler
{
    if ([[inputURL pathExtension] isEqualToString:@"MOV"])
    {
        NSURL *outputURL = [inputURL URLByDeletingPathExtension];
        outputURL = [outputURL URLByAppendingPathExtension:@"mp4"];
        AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
        AVAssetTrack *sourceVideoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                                       ofTrack:sourceVideoTrack
                                        atTime:kCMTimeZero error:nil];
        [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];
        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                                       ofTrack:sourceAudioTrack
                                        atTime:kCMTimeZero error:nil];

        AVMutableVideoComposition *videoComposition = [self getVideoComposition:avAsset];

        NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
        if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality])
        {
            AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
            exportSession.outputURL = outputURL;
            exportSession.outputFileType = AVFileTypeMPEG4;
            exportSession.shouldOptimizeForNetworkUse = YES;
            exportSession.videoComposition = videoComposition;
            [exportSession exportAsynchronouslyWithCompletionHandler:^{
                switch ([exportSession status])
                {
                    case AVAssetExportSessionStatusFailed:
                        NSLog(@"Export failed: %@ : %@", [[exportSession error] localizedDescription], [exportSession error]);
                        handler(nil);
                        break;
                    case AVAssetExportSessionStatusCancelled:
                        NSLog(@"Export canceled");
                        handler(nil);
                        break;
                    default:
                        handler(outputURL);
                        break;
                }
            }];
        }
    } else {
        handler(inputURL);
    }
}

- (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    CGSize videoSize = videoTrack.naturalSize;
    BOOL isPortrait_ = [self isVideoPortrait:asset];
    if (isPortrait_) {
        // NSLog(@"video is portrait");
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    composition.naturalSize = videoSize;
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMakeWithSeconds(1 / videoTrack.nominalFrameRate, 600);

    AVMutableCompositionTrack *compositionVideoTrack;
    compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionLayerInstruction *layerInst;
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    inst.layerInstructions = [NSArray arrayWithObject:layerInst];
    videoComposition.instructions = [NSArray arrayWithObject:inst];
    return videoComposition;
}
AVFoundation error constant -11841 means that you have an invalid video composition. See this link if you'd like more info on the error constants:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVFoundation_ErrorConstants/Reference/reference.html
While no major errors pop out at me immediately, I can suggest the following ways to narrow down the source of your problem.
First, instead of passing nil for the error parameter in these calls:
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:nil];
create an NSError object and pass a reference to it, like so:
NSError *error = nil;
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:&error];
Examine the error to make sure your video and audio tracks get inserted into the composition track correctly. The error should be nil if all goes well.
if (error)
    NSLog(@"Insertion error: %@", error);
You may also want to check your AVAsset's composable, exportable, and hasProtectedContent properties. If these are not YES, YES, and NO, respectively, you may have a problem creating your new video file.
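A quick sanity check along those lines might look like this (a sketch; composable, exportable, and hasProtectedContent are standard AVAsset properties):
// Sketch: verify the source asset can actually be composed and exported.
if (!avAsset.composable || !avAsset.exportable || avAsset.hasProtectedContent) {
    NSLog(@"Asset not usable: composable=%d exportable=%d protected=%d",
          avAsset.composable, avAsset.exportable, avAsset.hasProtectedContent);
}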
Occasionally I've seen an issue where creating a time range for an audio track doesn't play well with the 600 timescale when used in a composition with a video track. You may want to create a new CMTime for the duration (avAsset.duration) in
CMTimeRangeMake(kCMTimeZero, avAsset.duration)
only for inserting the audio track. In the new CMTime, use a timescale of 44100 (or whatever the sample rate of the audio track is). The same goes for your videoComposition.frameDuration: depending on the nominalFrameRate of your video track, your time may not be represented correctly with a 600 timescale.
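As a hedged illustration of that rescaling (CMTimeConvertScale is the standard Core Media call for it):
// Sketch: re-express the asset duration on the audio sample-rate timescale
// before building the time range used to insert the audio track.
CMTime audioDuration = CMTimeConvertScale(avAsset.duration, 44100, kCMTimeRoundingMethod_Default);
CMTimeRange audioRange = CMTimeRangeMake(kCMTimeZero, audioDuration);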
Finally, there's a helpful tool provided by Apple to debug video compositions:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
It gives a visual representation of your composition, and you can see where things don't look the way they should.
You should definitely use the method isValidForAsset:timeRange:validationDelegate: of AVVideoComposition; it will diagnose any issue with your video composition.
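For example (a sketch; the validation delegate may be nil if you only need the boolean, or an object conforming to AVVideoCompositionValidationHandling to get callbacks pinpointing the offending instruction):
// Sketch: validate the composition before export instead of waiting for -11841.
BOOL valid = [videoComposition isValidForAsset:composition
                                     timeRange:CMTimeRangeMake(kCMTimeZero, composition.duration)
                            validationDelegate:nil];
if (!valid) {
    NSLog(@"Video composition failed validation");
}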
I had the same problem, and the solution for me was to create the layerInstruction with the AVMutableCompositionTrack instead of the original track:
layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
Try commenting out the line below and running your project:
exportSession.videoComposition = videoComposition;

iOS AVExportSession fails with trimmed video only

I've created a method that trims and exports videos based on a given time range. It also rotates the video to landscape.
For some reason, though, the AVAssetExportSession fails when attempting to process a video that was previously trimmed using UIVideoEditorController.
Has anyone encountered this issue before?
I get this error:
AVAssetExportSessionStatusFailed: Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
For this method:
- (void)trimVideoWithRange:(CMTimeRange)range fromInputURL:(NSURL *)inputURL withCompletionHandler:(void (^)(BOOL success, NSURL *outputURL))handler
{
    AVAsset *asset = [AVURLAsset assetWithURL:inputURL];
    AVAssetTrack *videoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    NSLog(@"%@, %@, %@", asset, videoTrack, audioTrack);

    NSError *error;

    // Create a composition
    AVMutableComposition *composition = [AVMutableComposition composition];

    AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    error = nil;
    [videoCompositionTrack insertTimeRange:videoTrack.timeRange ofTrack:videoTrack atTime:CMTimeMakeWithSeconds(0, NSEC_PER_SEC) error:&error];
    NSLog(@"videoCompositionTrack timeRange: %lld, %lld", videoCompositionTrack.timeRange.start.value, videoCompositionTrack.timeRange.duration.value);
    if (error)
        NSLog(@"videoCompositionTrack error: %@", error);

    AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    error = nil;
    [audioCompositionTrack insertTimeRange:audioTrack.timeRange ofTrack:audioTrack atTime:CMTimeMakeWithSeconds(0, NSEC_PER_SEC) error:&error];
    NSLog(@"audioCompositionTrack timeRange: %lld, %lld", audioCompositionTrack.timeRange.start.value, audioCompositionTrack.timeRange.duration.value);
    if (error)
        NSLog(@"audioCompositionTrack error: %@", error);

    // Rotate video if needed
    CGAffineTransform rotationTransform = videoTrack.preferredTransform;

    // Create video composition
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderScale = 1.0;
    videoComposition.renderSize = videoTrack.naturalSize;
    videoComposition.frameDuration = CMTimeMake(1, 30);

    // Apply the transform, which may have been changed
    AVMutableVideoCompositionLayerInstruction *instruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [instruction setTransform:rotationTransform atTime:kCMTimeZero];

    // Set the time range and layer instructions for the video composition
    AVMutableVideoCompositionInstruction *videoTrackInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoTrackInstruction.layerInstructions = [NSArray arrayWithObject:instruction];
    videoTrackInstruction.timeRange = range;
    videoComposition.instructions = @[videoTrackInstruction];

    // Check that we can proceed with our desired output preset
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
    if (![compatiblePresets containsObject:AVAssetExportPreset960x540])
    {
        // Nope.
        if (handler)
            handler(NO, nil);
        return;
    }

    // Create export session with composition
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPreset960x540];

    // Configure export session
    exportSession.outputURL = [NSURL fileURLWithPath:pathToTemporaryOutput];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.videoComposition = videoComposition;
    exportSession.shouldOptimizeForNetworkUse = YES;

    // Export async
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status])
        {
            case AVAssetExportSessionStatusCompleted:
            {
                dispatch_async(dispatch_get_main_queue(), ^{
                    // Everything OK. Execute completion block with URL of rendered video
                    if (handler)
                        handler(exportSession.status == AVAssetExportSessionStatusCompleted, [NSURL fileURLWithPath:pathToTemporaryOutput]);
                });
            }
                break;
            case AVAssetExportSessionStatusFailed:
            {
                NSError *exportError = exportSession.error;
                NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError.description);
                dispatch_async(dispatch_get_main_queue(), ^{
                    // No go. Execute handler with fail.
                    if (handler)
                        handler(NO, nil);
                });
            }
                break;
        }
    }];
}
This works for me. Here exportSession is an AVAssetExportSession.
NSURL *videoFileUrl = [NSURL fileURLWithPath:self.originalVideoPath];
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
    self.exportSession = [[AVAssetExportSession alloc]
                          initWithAsset:anAsset presetName:AVAssetExportPresetPassthrough];
    // Implementation continues.
    NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];
    self.exportSession.outputURL = furl;
    // Provide outputFileType according to the video format extension
    self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
    CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    self.exportSession.timeRange = range;

    self.btnTrim.hidden = YES;
    self.myActivityIndicator.hidden = NO;
    [self.myActivityIndicator startAnimating];
    [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([self.exportSession status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[self.exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default:
                NSLog(@"Trimming Completed");
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.myActivityIndicator stopAnimating];
                    self.myActivityIndicator.hidden = YES;
                });
                break;
        }
    }];
}
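A plausible reason this works where the composition-based approach fails: it trims by setting timeRange on the export session with AVAssetExportPresetPassthrough, so the media is copied rather than re-rendered, and no AVVideoComposition (the usual source of -11841) is involved at all. The trade-off is that, as far as I know, a passthrough export cannot apply a videoComposition, so this route gives up the rotation step.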
