Issue in iOS 8 while merging two videos

My application merges two videos.
I am using the following code to merge two videos using AVVideoComposition:
- (void)buildSequenceComposition:(AVMutableComposition *)mixComposition andVideoComposition:(AVMutableVideoComposition *)videoComposition withAudioMix:(AVMutableAudioMix *)audioMix
{
CMTime nextClipStartTime = kCMTimeZero;
NSInteger i;
// No transitions: place clips into one video track and one audio track in composition.
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
NSMutableArray*arrLayerInstruction = [NSMutableArray array];
for (i = 0; i < [_clips count]; i++ )
{
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVURLAsset *asset = [[_clips objectAtIndex:i] objectForKey:@"videoURL"];
CMTimeRange timeRangeInAsset;
timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [asset duration]);
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
NSError*err = nil;
[compositionVideoTrack insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:&err];
if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0)
{
AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTrack insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]];
[exportAudioMixInputParameters setVolume:[[[_clips objectAtIndex:i] objectForKey:@"videoSoundLevel"] floatValue] atTime:nextClipStartTime];
exportAudioMixInputParameters.trackID = compositionAudioTrack.trackID;
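// Note: assigning a fresh one-element array on every pass through the loop
// discards the parameters set for earlier clips; to honor every clip's volume,
// accumulate the parameters in a mutable array and assign audioMix.inputParameters
// once after the loop.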
audioMix.inputParameters=[NSArray arrayWithObject:exportAudioMixInputParameters];
}
//FIXING ORIENTATION//
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
BOOL isFirstAssetPortrait_ = NO;
CGAffineTransform firstTransform = clipVideoTrack.preferredTransform;
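// The (a, b, c, d) entries checked below are the 2x2 rotation part of the affine
// matrix: (0, 1, -1, 0) encodes a 90-degree rotation (portrait), (0, -1, 1, 0)
// encodes -90 degrees (portrait upside down), (1, 0, 0, 1) is the identity
// (landscape), and (-1, 0, 0, -1) is a 180-degree rotation.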
if(firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0)
{
FirstAssetOrientation_= UIImageOrientationRight;
isFirstAssetPortrait_ = YES;
}
if(firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0)
{
FirstAssetOrientation_ = UIImageOrientationLeft;
isFirstAssetPortrait_ = YES;
}
if(firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0)
{
FirstAssetOrientation_ = UIImageOrientationUp;
}
if(firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0)
{
FirstAssetOrientation_ = UIImageOrientationDown;
}
CGFloat tHeight = [clipVideoTrack naturalSize].height;
CGFloat tWidth = [clipVideoTrack naturalSize].width;
if(isFirstAssetPortrait_)
{
tHeight = [clipVideoTrack naturalSize].height;
tWidth = [clipVideoTrack naturalSize].width;
CGFloat temp = tHeight;
tHeight = tWidth;
tWidth = temp;
}
CGFloat FirstAssetScaleToFitRatioWidth = [mixComposition naturalSize].width/tWidth;
CGFloat FirstAssetScaleToFitRatioHeight = [mixComposition naturalSize].height/tHeight;
CGFloat FirstAssetScaleToFitRatio = FirstAssetScaleToFitRatioWidth>FirstAssetScaleToFitRatioHeight?FirstAssetScaleToFitRatioHeight:FirstAssetScaleToFitRatioWidth;
CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
CGSize naturalSize = CGSizeApplyAffineTransform(CGSizeMake(tWidth, tHeight), FirstAssetScaleFactor);
CGAffineTransform transform = CGAffineTransformIdentity;
CGSize translateSize = CGSizeMake(0, 0);
if (FirstAssetScaleToFitRatioWidth<FirstAssetScaleToFitRatioHeight)
{
transform = CGAffineTransformMakeTranslation(0, ([mixComposition naturalSize].height-naturalSize.height)/2);
translateSize.height = ([mixComposition naturalSize].height-naturalSize.height)/2;
}
else if (FirstAssetScaleToFitRatioWidth==FirstAssetScaleToFitRatioHeight)
{
}
else
{
transform = CGAffineTransformMakeTranslation(([mixComposition naturalSize].width-naturalSize.width)/2, 0);
translateSize.width = ([mixComposition naturalSize].width-naturalSize.width)/2;
}
[FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(clipVideoTrack.preferredTransform, FirstAssetScaleFactor),transform) atTime:kCMTimeZero];
[FirstlayerInstruction setOpacity:0.0 atTime:CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration)];
[FirstlayerInstruction setOpacity:1.0 atTime:nextClipStartTime];
[arrLayerInstruction addObject:FirstlayerInstruction];
nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
}
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, nextClipStartTime);
MainInstruction.layerInstructions = arrLayerInstruction;
videoComposition.instructions = [NSArray arrayWithObject:MainInstruction];
}
This works fine on iOS 7, but exporting the video with the AVVideoComposition on iOS 8 gives me the following error:
Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" {NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}
It works fine on iOS 7 and earlier versions, but not on iOS 8.
I have also tried Apple's AVSampleEditor sample code, and it gives me the same error when exporting video on iOS 8.
Kindly help me solve the problem. Thanks.

Check this demo code. It's working for me.
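If the demo alone doesn't resolve it, it helps to log the exporter's full error object, since its userInfo can carry more detail than the two-line summary above. A minimal sketch, assuming mixComposition, videoComposition, and audioMix come from the method in the question and outputURL is a writable file URL:
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.audioMix = audioMix;
exporter.outputURL = outputURL; // assumed file URL
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
if (exporter.status == AVAssetExportSessionStatusFailed) {
NSLog(@"Export failed: %@", exporter.error);
}
}];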

Related

AVMutableComposition - Portrait video gets landscaped [duplicate]

I have used the code below to add an image overlay to a video and then export the newly generated video to the documents directory. But strangely, the video gets rotated by 90 degrees.
- (void)buildTransitionComposition:(AVMutableComposition *)composition andVideoComposition:(AVMutableVideoComposition *)videoComposition
{
CMTime nextClipStartTime = kCMTimeZero;
NSInteger i;
// Make transitionDuration no greater than half the shortest clip duration.
CMTime transitionDuration = self.transitionDuration;
for (i = 0; i < [_clips count]; i++ ) {
NSValue *clipTimeRange = [_clipTimeRanges objectAtIndex:i];
if (clipTimeRange) {
CMTime halfClipDuration = [clipTimeRange CMTimeRangeValue].duration;
halfClipDuration.timescale *= 2; // You can halve a rational by doubling its denominator.
transitionDuration = CMTimeMinimum(transitionDuration, halfClipDuration);
}
}
// Add two video tracks and two audio tracks.
AVMutableCompositionTrack *compositionVideoTracks[2];
AVMutableCompositionTrack *compositionAudioTracks[2];
compositionVideoTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
compositionVideoTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
compositionAudioTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
compositionAudioTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTimeRange *passThroughTimeRanges = alloca(sizeof(CMTimeRange) * [_clips count]);
CMTimeRange *transitionTimeRanges = alloca(sizeof(CMTimeRange) * [_clips count]);
// Place clips into alternating video & audio tracks in composition, overlapped by transitionDuration.
for (i = 0; i < [_clips count]; i++ ) {
NSInteger alternatingIndex = i % 2; // alternating targets: 0, 1, 0, 1, ...
AVURLAsset *asset = [_clips objectAtIndex:i];
NSValue *clipTimeRange = [_clipTimeRanges objectAtIndex:i];
CMTimeRange timeRangeInAsset;
if (clipTimeRange)
timeRangeInAsset = [clipTimeRange CMTimeRangeValue];
else
timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [asset duration]);
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTracks[alternatingIndex] insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];
/*
CGAffineTransform t = clipVideoTrack.preferredTransform;
NSLog(#"Transform1 : %#",t);
*/
AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTracks[alternatingIndex] insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];
// Remember the time range in which this clip should pass through.
// Every clip after the first begins with a transition.
// Every clip before the last ends with a transition.
// Exclude those transitions from the pass through time ranges.
passThroughTimeRanges[i] = CMTimeRangeMake(nextClipStartTime, timeRangeInAsset.duration);
if (i > 0) {
passThroughTimeRanges[i].start = CMTimeAdd(passThroughTimeRanges[i].start, transitionDuration);
passThroughTimeRanges[i].duration = CMTimeSubtract(passThroughTimeRanges[i].duration, transitionDuration);
}
if (i+1 < [_clips count]) {
passThroughTimeRanges[i].duration = CMTimeSubtract(passThroughTimeRanges[i].duration, transitionDuration);
}
// The end of this clip will overlap the start of the next by transitionDuration.
// (Note: this arithmetic falls apart if timeRangeInAsset.duration < 2 * transitionDuration.)
nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
nextClipStartTime = CMTimeSubtract(nextClipStartTime, transitionDuration);
// Remember the time range for the transition to the next item.
transitionTimeRanges[i] = CMTimeRangeMake(nextClipStartTime, transitionDuration);
}
// Set up the video composition if we are to perform crossfade or push transitions between clips.
NSMutableArray *instructions = [NSMutableArray array];
// Cycle between "pass through A", "transition from A to B", "pass through B", "transition from B to A".
for (i = 0; i < [_clips count]; i++ ) {
NSInteger alternatingIndex = i % 2; // alternating targets
// Pass through clip i.
AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
passThroughInstruction.timeRange = passThroughTimeRanges[i];
AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTracks[alternatingIndex]];
/*
CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
CGAffineTransform rotateTranslate = CGAffineTransformTranslate(rotationTransform,320,0);
[passThroughLayer setTransform:rotateTranslate atTime:kCMTimeZero];
*/
passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
[instructions addObject:passThroughInstruction];
if (i+1 < [_clips count]) {
// Add transition from clip i to clip i+1.
AVMutableVideoCompositionInstruction *transitionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transitionInstruction.timeRange = transitionTimeRanges[i];
AVMutableVideoCompositionLayerInstruction *fromLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTracks[alternatingIndex]];
AVMutableVideoCompositionLayerInstruction *toLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTracks[1-alternatingIndex]];
if (self.transitionType == SimpleEditorTransitionTypeCrossFade) {
// Fade out the fromLayer by setting a ramp from 1.0 to 0.0.
[fromLayer setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:transitionTimeRanges[i]];
}
else if (self.transitionType == SimpleEditorTransitionTypePush) {
// Set a transform ramp on fromLayer from identity to all the way left of the screen.
[fromLayer setTransformRampFromStartTransform:CGAffineTransformIdentity toEndTransform:CGAffineTransformMakeTranslation(-composition.naturalSize.width, 0.0) timeRange:transitionTimeRanges[i]];
// Set a transform ramp on toLayer from all the way right of the screen to identity.
[toLayer setTransformRampFromStartTransform:CGAffineTransformMakeTranslation(+composition.naturalSize.width, 0.0) toEndTransform:CGAffineTransformIdentity timeRange:transitionTimeRanges[i]];
}
transitionInstruction.layerInstructions = [NSArray arrayWithObjects:fromLayer, toLayer, nil];
[instructions addObject:transitionInstruction];
}
}
videoComposition.instructions = instructions;
}
Please help, as I am not able to export portrait video in the proper orientation. Any help is appreciated.
Thank you.
By default, when you export video using AVAssetExportSession, the video will be rotated from its original orientation. You have to apply its transform to set the exact orientation. Please try the code below to do the same.
- (AVMutableVideoCompositionLayerInstruction *)layerInstructionAfterFixingOrientationForAsset:(AVAsset *)inAsset
forTrack:(AVMutableCompositionTrack *)inTrack
atTime:(CMTime)inTime
{
//FIXING ORIENTATION//
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:inTrack];
AVAssetTrack *videoAssetTrack = [[inAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if(videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {videoAssetOrientation_= UIImageOrientationRight; isVideoAssetPortrait_ = YES;}
if(videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {videoAssetOrientation_ = UIImageOrientationLeft; isVideoAssetPortrait_ = YES;}
if(videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {videoAssetOrientation_ = UIImageOrientationUp;}
if(videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {videoAssetOrientation_ = UIImageOrientationDown;}
CGFloat FirstAssetScaleToFitRatio = 320.0 / videoAssetTrack.naturalSize.width;
if(isVideoAssetPortrait_) {
FirstAssetScaleToFitRatio = 320.0/videoAssetTrack.naturalSize.height;
CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
[videolayerInstruction setTransform:CGAffineTransformConcat(videoAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
}else{
CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
[videolayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(videoAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
}
[videolayerInstruction setOpacity:0.0 atTime:inTime];
return videolayerInstruction;
}
I hope this will help you.
AVAssetTrack *assetTrack = [[inAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *mutableTrack = [mergeComposition mutableTrackCompatibleWithTrack:assetTrack];
AVMutableVideoCompositionLayerInstruction *assetInstruction = [self layerInstructionAfterFixingOrientationForAsset:inAsset forTrack:mutableTrack atTime:videoTotalDuration];
Above is the code to call the mentioned method, where inAsset is your video asset and videoTotalDuration is your video's total duration as a CMTime. mergeComposition is an object of the AVMutableComposition class.
Hope this will help.
EDIT: This is not a callback method or event; you have to call it explicitly with the required parameters, as mentioned above.
Here's a slightly easier way if you simply want to maintain original rotation.
// Grab the source track from AVURLAsset for example.
AVAssetTrack *assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo].lastObject;
// Grab the composition video track from AVMutableComposition you already made.
AVMutableCompositionTrack *compositionVideoTrack = [composition tracksWithMediaType:AVMediaTypeVideo].lastObject;
// Apply the original transform.
if (assetVideoTrack && compositionVideoTrack) {
[compositionVideoTrack setPreferredTransform:assetVideoTrack.preferredTransform];
}
// Export...
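A hedged sketch of that elided export step (the output path and preset here are assumptions; no AVMutableVideoComposition is needed, since the composition track itself now carries the transform):
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
session.outputURL = [NSURL fileURLWithPath:outputPath]; // assumed path
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^{
// inspect session.status / session.error here
}];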
Use the method below to set the correct orientation according to the video asset's orientation in an AVMutableVideoComposition:
-(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset
{
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
CGSize videoSize = videoTrack.naturalSize;
BOOL isPortrait_ = [self isVideoPortrait:asset];
if(isPortrait_) {
NSLog(#"video is portrait ");
videoSize = CGSizeMake(videoSize.height, videoSize.width);
}
composition.naturalSize = videoSize;
videoComposition.renderSize = videoSize;
// videoComposition.renderSize = videoTrack.naturalSize; //
videoComposition.frameDuration = CMTimeMakeWithSeconds( 1 / videoTrack.nominalFrameRate, 600);
AVMutableCompositionTrack *compositionVideoTrack;
compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *layerInst;
layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
inst.layerInstructions = [NSArray arrayWithObject:layerInst];
videoComposition.instructions = [NSArray arrayWithObject:inst];
return videoComposition;
}
-(BOOL) isVideoPortrait:(AVAsset *)asset
{
BOOL isPortrait = FALSE;
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if([tracks count] > 0) {
AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
CGAffineTransform t = videoTrack.preferredTransform;
// Portrait
if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)
{
isPortrait = YES;
}
// PortraitUpsideDown
if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
isPortrait = YES;
}
// LandscapeRight
if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0)
{
isPortrait = NO;
}
// LandscapeLeft
if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0)
{
isPortrait = NO;
}
}
return isPortrait;
}
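For reference, a usage sketch (a hypothetical caller; asset is the AVAsset you intend to export):
AVMutableVideoComposition *videoComposition = [self getVideoComposition:asset];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
// set outputURL and outputFileType, then call exportAsynchronouslyWithCompletionHandler: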
In Swift, dizy's answer works for me:
var assetVideoTrack = (sourceAsset.tracksWithMediaType(AVMediaTypeVideo)).last as! AVAssetTrack
var compositionVideoTrack = (composition.tracksWithMediaType(AVMediaTypeVideo)).last as! AVMutableCompositionTrack
if (assetVideoTrack.playable && compositionVideoTrack.playable) {
compositionVideoTrack.preferredTransform = assetVideoTrack.preferredTransform
}
This Swift 4 code works perfectly for merging videos with the correct orientation:
let mainComposition = AVMutableComposition()
let compositionVideoTrack = mainComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
let soundtrackTrack = mainComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
var insertTime = kCMTimeZero
for videoAsset in arrayVideos {
try! compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
let assetVideoTrack = videoAsset.tracks(withMediaType: .video).last!
let compositionVideoTrack = mainComposition.tracks(withMediaType: .video).last!
compositionVideoTrack.preferredTransform = assetVideoTrack.preferredTransform
try! soundtrackTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: insertTime)
insertTime = CMTimeAdd(insertTime, videoAsset.duration)
}
let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge1uupo.MOV")
let fileManager = FileManager()
//fileManager.removeItemIfExisted(outputFileURL)
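// Note: removeItemIfExisted above is a custom helper, not a Foundation API; with
// plain Foundation you could write: try? FileManager.default.removeItem(at: outputFileURL)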
let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
exporter?.outputURL = outputFileURL
exporter?.outputFileType = AVFileType.mov
exporter?.shouldOptimizeForNetworkUse = true
exporter?.exportAsynchronously {
DispatchQueue.main.async {
// Do what you want at the end
}
}
Swift 2:
do {
let paths = NSSearchPathForDirectoriesInDomains(
NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)
let documentsDirectory: AnyObject = paths[0]
//this will be changed to accommodate dynamic videos
let dataPath = documentsDirectory.stringByAppendingPathComponent(videoFileName+".MOV")
let videoAsset = AVURLAsset(URL: NSURL(fileURLWithPath: dataPath), options: nil)
let imgGenerator = AVAssetImageGenerator(asset: videoAsset)
imgGenerator.appliesPreferredTrackTransform = true
let cgImage = try imgGenerator.copyCGImageAtTime(CMTimeMake(0, 1), actualTime: nil)
let uiImage = UIImage(CGImage: cgImage)
videoThumb.image = uiImage
} catch let err as NSError {
print("Error generating thumbnail: \(err)")
}

Portrait Video plays in black screen

Update
Don't use the simulator to test video transforms. But I have a codec issue, asked in another question here. Any help is appreciated.
I am using AVURLAsset to create my videos, and they work fine as long as the videos picked from the gallery are in landscape mode. But when I use a portrait video, it either plays in a black screen (the audio plays) or the frames are twisted (see image).
Update:
I tried using CGAffineTransform, still no luck.
Here's the code:
-(void) createVideo{
AVMutableComposition* mixComposition = [AVMutableComposition composition];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
_videoAsset = [[AVURLAsset alloc]initWithURL:video_url options:options];
CMTime startTimeV=CMTimeMakeWithSeconds(videoStartTime.floatValue, 1);
CMTime endTimeV=CMTimeMakeWithSeconds(videoEndTime.floatValue, 1);
CMTimeRange video_timeRange = CMTimeRangeFromTimeToTime(startTimeV, endTimeV); // CMTimeRangeMake expects a duration, not an end time
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration);
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:a_compositionVideoTrack];
AVAssetTrack *videoAssetTrack = [[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize trackDimensions = {
.width = 0.0,
.height = 0.0,
};
trackDimensions = [videoAssetTrack naturalSize];
int width = trackDimensions.width;
int height = trackDimensions.height;
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
// CGAffineTransform transformToApply=CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(90.0));
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:self.videoAsset.duration];
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
}
And the export:
-(void)export{
NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString* fileName=[NSString stringWithFormat:@"myvideo%lld.mp4",[@(floor([[NSDate date] timeIntervalSince1970])) longLongValue]+1];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"%@",fileName]];
NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
[[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
_assetExport.outputFileType = AVFileTypeMPEG4;
_assetExport.outputURL = outputFileUrl;
_assetExport.videoComposition = mainCompositionInst;
ShareViewController *newViewController = [self.storyboard instantiateViewControllerWithIdentifier:@"vidShare"];
[_assetExport exportAsynchronouslyWithCompletionHandler:
^(void ) {
dispatch_async(dispatch_get_main_queue(), ^{
newViewController.videoFilePath=outputFileUrl;
[self.navigationController pushViewController:newViewController animated:YES];
});
}
];
}
Landscape = OK
Portrait = FAIL
I've had this issue before. Setting the videolayerInstruction's transform from videoAssetTrack.preferredTransform works in most cases. But sometimes preferredTransform may lack the tx (ty) values (check CGAffineTransform in the Apple API Reference if you don't know what tx (ty) is), or tx (ty) may have an inaccurate value, which results in wrong video positioning (such as playing in a black screen).
So the point is: you should use preferredTransform only to determine the original video orientation, and then build the transform yourself.
Here is the code for getting the orientation of the track:
- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if([tracks count] > 0) {
AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
CGAffineTransform t = videoTrack.preferredTransform;
// Portrait
if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
orientation = UIInterfaceOrientationPortrait;
}
// PortraitUpsideDown
if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
orientation = UIInterfaceOrientationPortraitUpsideDown;
}
// LandscapeRight
if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
orientation = UIInterfaceOrientationLandscapeRight;
}
// LandscapeLeft
if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
orientation = UIInterfaceOrientationLandscapeLeft;
}
}
return orientation;
}
And the code for getting the transform to apply to the videolayerInstruction:
- (CGAffineTransform)transformBasedOnAsset:(AVAsset *)asset {
UIInterfaceOrientation orientation = [self orientationForTrack:asset];
AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
CGSize naturalSize = assetTrack.naturalSize;
CGAffineTransform finalTransform = CGAffineTransformIdentity;
switch (orientation) {
case UIInterfaceOrientationLandscapeLeft:
finalTransform = CGAffineTransformMake(-1, 0, 0, -1, naturalSize.width, naturalSize.height);
break;
case UIInterfaceOrientationLandscapeRight:
finalTransform = CGAffineTransformMake(1, 0, 0, 1, 0, 0);
break;
case UIInterfaceOrientationPortrait:
finalTransform = CGAffineTransformMake(0, 1, -1, 0, naturalSize.height, 0);
break;
case UIInterfaceOrientationPortraitUpsideDown:
finalTransform = CGAffineTransformMake(0, -1, 1, 0, 0, naturalSize.width);
break;
default:
break;
}
return finalTransform;
}
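Wiring it up might look like this (a sketch; videolayerInstruction and self.videoAsset follow the question's naming, and it assumes both helper methods live on the same class):
CGAffineTransform fixTransform = [self transformBasedOnAsset:self.videoAsset];
[videolayerInstruction setTransform:fixTransform atTime:kCMTimeZero];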
Hope it works for you.

ios ELCImagePicker: Connection to assetsd was interrupted or assetsd died

I am using ELCImagePicker to select multiple videos from the library, and I get this "Connection to assetsd was interrupted or assetsd died" error when I try to export multiple recorded videos selected from the library. But it works fine if I select only downloaded videos using ELCImagePicker, or if I use UIImagePicker to select these recorded videos from the library. Is there any solution to this type of problem?
My Code:
-(void)elcImagePickerController:(ELCImagePickerController *)picker didFinishPickingMediaWithInfo:(NSArray *)info
{
[self dismissViewControllerAnimated:YES completion:nil];
for (NSDictionary *dict in info) {
if ([[dict objectForKey:UIImagePickerControllerMediaType] isEqualToString:ALAssetTypeVideo]){
if ([dict objectForKey:UIImagePickerControllerOriginalImage]){
videoUrl=[dict objectForKey:UIImagePickerControllerReferenceURL];
[self InsertVideoAsset];
}
}
}
[self GetMargedVideo];
}
Sometimes my merged composition plays only the audio, not the video, but sometimes both audio and video work fine. Is there any problem with the code below? Please help me...
-(void)GetMargedVideo{
LastTime=kCMTimeZero;
TotalTime=kCMTimeZero;
mixComposition=nil; // AVMutableComposition
mainCompositionInst=nil; // AVMutableVideoComposition
mixComposition=[AVMutableComposition composition];
mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
renderWidth=1280;
renderHeight=1280;
[Objects removeAllObjects];
//LayerInstruction used to get video layer Instructions
AVMutableVideoCompositionLayerInstruction *firstlayerInstruction;
self.stokeimage.hidden=YES;
for(int i=0; i<[VideoInfo count];i++)
{
self.stokeimage.hidden=NO;
TargetVideo=i;
VideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
VideoProperty *vp =[VideoInfo objectAtIndex:i];
STime=vp.startTime;
ETime=vp.endTime;
TimeDiff=CMTimeSubtract(ETime, STime);
LastTime=TotalTime;
TotalTime=CMTimeAdd(TotalTime, TimeDiff);
vp.appearTime=LastTime;
TargetTime=LastTime;
avasset=[AVAsset assetWithURL:vp.Url];
//Insert Video and Audio to the Composition "mixComposition"
[VideoTrack insertTimeRange:CMTimeRangeMake(STime, TimeDiff)
ofTrack:[[avasset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:LastTime error:nil];
if([[avasset tracksWithMediaType:AVMediaTypeAudio] count])
{
if(!GetMusic)
{
[AudioTrack insertTimeRange:CMTimeRangeMake(STime, TimeDiff)
ofTrack:[[avasset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:LastTime error:nil];
}
}
// Add instructions
if(vp.GetInstuction)
{
// GET INSTRUCTION: if Video already have instructions
firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:VideoTrack];
[firstlayerInstruction setTransform:vp.LayerInstruction atTime:LastTime];
[firstlayerInstruction setOpacity:0 atTime:TotalTime];
[Objects addObject:firstlayerInstruction];
}
else
{
// GET INSTRUCTION: When a Video add first time to the composition
AVAssetTrack *assetTrack = [[avasset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:VideoTrack];
CGAffineTransform videoTransform = assetTrack.preferredTransform;
CGSize naturalSize = assetTrack.naturalSize;
BOOL bLandscape = NO;
CGSize renderSize = CGSizeMake(self.videoplayer.frame.size.width * [[UIScreen mainScreen] scale], self.videoplayer.frame.size.width * [[UIScreen mainScreen] scale]);
renderSize =CGSizeMake(renderWidth, renderHeight);
if(self.videoplayer.frame.size.width > self.videoplayer.frame.size.height && bIsVideoPortrait)
{
bLandscape = YES;
renderSize = CGSizeMake(renderSize.height, renderSize.width);
naturalSize = CGSizeMake(naturalSize.height, naturalSize.width);
}
else if(self.videoplayer.frame.size.height > self.videoplayer.frame.size.width && !bIsVideoPortrait)
{
bLandscape = YES;
renderSize = CGSizeMake(renderSize.height, renderSize.width);
naturalSize = CGSizeMake(naturalSize.height, naturalSize.width);
}
//Orientation Check
CGAffineTransform firstTransform = assetTrack.preferredTransform;
BOOL PotraitVideo=NO;
if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
PotraitVideo=YES;
// NSLog(#"Potratit Video");
}
if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
PotraitVideo=YES;
// NSLog(#"Potratit Video");
}
//Orientation Check Finish
if(bIsVideoPortrait)
naturalSize = CGSizeMake(naturalSize.height, naturalSize.width);
scaleValue = 1;
translationPoint = CGPointMake(self.videoplayer.frame.origin.x, self.videoplayer.frame.origin.y);
CGFloat pointX = translationPoint.x * naturalSize.width / self.videoplayer.frame.size.width;
CGFloat pointY = translationPoint.y * naturalSize.height / self.videoplayer.frame.size.height;
pointY=0;
pointX=0;
CGAffineTransform new = CGAffineTransformConcat(videoTransform, CGAffineTransformMakeScale(scaleValue, scaleValue));
CGAffineTransform newer = CGAffineTransformConcat(new, CGAffineTransformMakeTranslation(pointX, pointY));
CGFloat rotateTranslateX = 0;
CGFloat rotateTranslateY = 0;
if(rotationValue - 0.0f > 0.01f && rotationValue - 180.f < 0.01)
rotateTranslateX = MIN((naturalSize.width * rotationValue) / 90.0f, naturalSize.width);
if(rotationValue - 90.0f > 0.01f && rotationValue < 360.0f)
rotateTranslateY = MIN((naturalSize.height * rotationValue) / 180.0f, naturalSize.height);
CGAffineTransform rotationT = CGAffineTransformConcat(newer, CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rotationValue)));
CGAffineTransform rotateTranslate = CGAffineTransformConcat(rotationT, CGAffineTransformMakeTranslation(rotateTranslateX, rotateTranslateY));
CGSize temp = CGSizeApplyAffineTransform(assetTrack.naturalSize, videoTransform);
CGSize size = CGSizeMake(fabs(temp.width), fabs(temp.height));
if(bLandscape)
{
size = CGSizeMake(size.height, size.width);
}
float s1 = renderSize.width/size.width;
float s2 = renderSize.height/size.height;
float s = MIN(s1, s2);
CGAffineTransform new2 = CGAffineTransformConcat(rotateTranslate, CGAffineTransformMakeScale(s,s));
float x = (renderSize.width - size.width*s)/2;
float y = (renderSize.height - size.height*s)/2;
newer2 = CGAffineTransformIdentity;
if(bLandscape)
newer2 = CGAffineTransformConcat(new2, CGAffineTransformMakeTranslation(x, y));
else
newer2 = CGAffineTransformConcat(new2, CGAffineTransformMakeTranslation(x, y));
//Store layerInstruction to an array "Objects"
[layerInstruction setTransform:newer2 atTime:LastTime];
[layerInstruction setOpacity:0.0 atTime: TotalTime];
[Objects addObject:layerInstruction];
vp.GetInstuction=YES;
vp.LayerInstruction=newer2;
vp.Portrait=PotraitVideo;
[VideoInfo replaceObjectAtIndex:i withObject:vp];
}
}
if(GetMusic)
{
OriginalAsset=mixComposition;
AudioTrack=nil;
AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
[AudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, TotalTime)
ofTrack:[[MusicAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
}
//Apply all the instruction to the the Videocomposition "mainCompositionInst"
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, TotalTime);
mainInstruction.layerInstructions = Objects;
mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
[self PlayVideo];
}

How to record video with overlay view

Hi, I am trying to record video with an overlay.
I have written:
-(void)addOvelayViewToVideo:(NSURL *)videoURL
to add an overlay view to the recorded video, but it is not working.
I wrote the code to record video in viewDidLoad using AVCaptureSession.
//In ViewDidLoad
//CONFIGURE DISPLAY OUTPUT
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
self.previewLayer.frame = self.view.frame;
[self.view.layer addSublayer:self.previewLayer];
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
isSuccess = YES; // assume success unless the error reports otherwise
if(error.code != noErr)
{
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if(value)
{
isSuccess = [value boolValue];
}
}
if(isSuccess)
{
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
if([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[self addOvelayViewToVideo:outputFileURL];
}
else{
NSLog(#"could not saved to photos album.");
}
}
}
-(void)addOvelayViewToVideo:(NSURL *)videoURL
{
AVAsset *asset = [AVAsset assetWithURL:videoURL];
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *compositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
compositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrack];
AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = assetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
[videoLayerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero];
[videoLayerInstruction setOpacity:0.0 atTime:asset.duration];
compositionInstruction.layerInstructions = [NSArray arrayWithObject:videoLayerInstruction];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
CGSize naturalSize = CGSizeMake(assetTrack.naturalSize.height, assetTrack.naturalSize.width);
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
videoComposition.renderSize = CGSizeMake(renderWidth, renderHeight);
videoComposition.instructions = [NSArray arrayWithObject:compositionInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
CALayer *overlayLayer = [CALayer layer];
UIImage *overlayImage = [UIImage imageNamed:@"sampleHUD"];
[overlayLayer setContents:(id)[overlayImage CGImage]];
overlayLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
[overlayLayer setMasksToBounds:YES];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
videoLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
NSLog(#"renderSize:%f,%f", videoComposition.renderSize.width, videoComposition.renderSize.height);
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = videoURL;
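// Note: this writes to the same URL the asset was read from; AVAssetExportSession
// fails when a file already exists at outputURL, so export to a fresh file URL
// (or delete the old file first).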
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
//save the video in photos album
});
}];
}
I am still unable to figure out what is going wrong here and need guidance on this.
Can I add an overlay while recording video?
Any help will be appreciated.

Video orientation issues when exporting with AVMutableVideoComposition

Here is the function I used to export video:
- (void) videoOutput
{
//1 - Early exit if there's no video file selected
if (!self.videoAsset) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alert show];
return;
}
// 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
// 3 - Video track
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration)
ofTrack:[[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration);
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
isVideoAssetPortrait_ = NO;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
isVideoAssetPortrait_ = NO;
}
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:self.videoAsset.duration];
// 3.3 - Add instructions
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
mainCompositionInst.renderSize = naturalSize;
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
// 4 - Get path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:#"FinalVideo-%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
// 5 - Create exporter
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
The problem is that the first time I use this function to export a portrait video, the videoTransform (videoAssetTrack.preferredTransform) values are:
videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0
and the variable isVideoAssetPortrait_ equals YES. Everything is right. However, after the export completes and the video is saved to the Camera Roll, I use this function to reload the resulting video. This time, the videoTransform has changed:
videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0
and isVideoAssetPortrait_ equals NO. It means that after one export, the videoTransform has changed its values (orientation from portrait -> landscape).
I googled many questions about video orientation in AV Foundation, but haven't found a solution yet.
Thank you for reading my long explanation. If you have any questions, please let me know.
There is no such concept as portrait/landscape for a video track. A track just has its dimensions and a transformation that is applied in order to present it properly. By default, a "portrait" video is encoded as it is produced by the camera (say, landscape), and a 90-degree rotation is used to present it properly.
When you export it, its orientation is not changed; it is just re-encoded physically rotated, so no rotation is needed to present it properly.
This is why you get the identity matrix the second time. It does not mean the video was changed from portrait to landscape: this time its natural size is also swapped, and everything should be okay based on your code.
Please specify what's wrong in the latter case.
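In other words, re-derive the orientation from the current preferredTransform every time instead of caching it. A minimal sketch of that check, reusing the question's videoAssetTrack:
CGAffineTransform t = videoAssetTrack.preferredTransform;
BOOL rotated90 = (t.a == 0 && t.d == 0); // a 90- or 270-degree rotation is encoded
CGSize renderSize = rotated90 ? CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width) : videoAssetTrack.naturalSize;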
