Update
Don't use the simulator to test video transforms. But I still have a codec issue, which I asked about in another question here. Any help is appreciated.
I am using AVURLAsset to create my videos, and they work fine as long as the videos picked from the gallery are in landscape mode. But when I use a portrait video, it either plays as a black screen (the audio plays) or the frames are twisted (see image).
Update:
I tried using a CGAffineTransform, but still no luck.
Here's the code:
-(void) createVideo{
AVMutableComposition* mixComposition = [AVMutableComposition composition];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey:@YES};
_videoAsset = [[AVURLAsset alloc]initWithURL:video_url options:options];
CMTime startTimeV=CMTimeMakeWithSeconds(videoStartTime.floatValue, 1);
CMTime endTimeV=CMTimeMakeWithSeconds(videoEndTime.floatValue, 1);
CMTimeRange video_timeRange = CMTimeRangeMake(startTimeV,endTimeV);
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration);
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:a_compositionVideoTrack];
AVAssetTrack *videoAssetTrack = [[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize trackDimensions = {
.width = 0.0,
.height = 0.0,
};
trackDimensions = [videoAssetTrack naturalSize];
int width = trackDimensions.width;
int height = trackDimensions.height;
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
// CGAffineTransform transformToApply=CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(90.0));
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:self.videoAsset.duration];
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
And the export:
-(void)export{
NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString* fileName=[NSString stringWithFormat:@"myvideo%lld.mp4",[@(floor([[NSDate date] timeIntervalSince1970])) longLongValue]+1];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"%@",fileName]];
NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
[[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
_assetExport.outputFileType = AVFileTypeMPEG4;
_assetExport.outputURL = outputFileUrl;
_assetExport.videoComposition = mainCompositionInst;
ShareViewController *newViewController = [self.storyboard instantiateViewControllerWithIdentifier:#"vidShare"];
[_assetExport exportAsynchronouslyWithCompletionHandler:
^(void ) {
dispatch_async(dispatch_get_main_queue(), ^{
newViewController.videoFilePath=outputFileUrl;
[self.navigationController pushViewController:newViewController animated:YES];
});
}
];
}
Landscape = OK
Portrait = FAIL
I've had this issue before. Setting the videolayerInstruction's transform from videoAssetTrack.preferredTransform works in most cases. But sometimes preferredTransform lacks the tx (or ty) value (check CGAffineTransform in the Apple API Reference if you don't know what tx and ty are), or tx/ty has an inaccurate value, which results in wrong video positioning (such as playing as a black screen).
So the point is: use preferredTransform to determine the original video orientation, then build the transform yourself.
Here is the code for getting the orientation of the track:
- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if([tracks count] > 0) {
AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
CGAffineTransform t = videoTrack.preferredTransform;
// Portrait
if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
orientation = UIInterfaceOrientationPortrait;
}
// PortraitUpsideDown
if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
orientation = UIInterfaceOrientationPortraitUpsideDown;
}
// LandscapeRight
if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
orientation = UIInterfaceOrientationLandscapeRight;
}
// LandscapeLeft
if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
orientation = UIInterfaceOrientationLandscapeLeft;
}
}
return orientation;
}
And the code for getting the transform to apply to the videolayerInstruction:
- (CGAffineTransform)transformBasedOnAsset:(AVAsset *)asset {
UIInterfaceOrientation orientation = [AVUtilities orientationForTrack:asset];
AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
CGSize naturalSize = assetTrack.naturalSize;
CGAffineTransform finalTransform = CGAffineTransformIdentity;
switch (orientation) {
case UIInterfaceOrientationLandscapeLeft:
finalTransform = CGAffineTransformMake(-1, 0, 0, -1, naturalSize.width, naturalSize.height);
break;
case UIInterfaceOrientationLandscapeRight:
finalTransform = CGAffineTransformMake(1, 0, 0, 1, 0, 0);
break;
case UIInterfaceOrientationPortrait:
finalTransform = CGAffineTransformMake(0, 1, -1, 0, naturalSize.height, 0);
break;
case UIInterfaceOrientationPortraitUpsideDown:
finalTransform = CGAffineTransformMake(0, -1, 1, 0, 0, naturalSize.width);
break;
default:
break;
}
return finalTransform;
}
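For reference, here is a minimal sketch of how these two helpers might plug into your composition code (assuming they are exposed on an AVUtilities class, as the call inside transformBasedOnAsset: suggests; asset, compositionVideoTrack and mainCompositionInst stand in for the corresponding objects in your createVideo method):
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
// Apply our own transform instead of the raw preferredTransform.
[layerInstruction setTransform:[AVUtilities transformBasedOnAsset:asset] atTime:kCMTimeZero];
// Match the render size to the presented (rotated) dimensions.
if (UIInterfaceOrientationIsPortrait([AVUtilities orientationForTrack:asset])) {
mainCompositionInst.renderSize = CGSizeMake(videoTrack.naturalSize.height, videoTrack.naturalSize.width);
} else {
mainCompositionInst.renderSize = videoTrack.naturalSize;
}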
Hope it works for you.
Related
I am working on a video-making app.
I need to record a video in the first view and then display it in a second view. For recording the video I followed this tutorial.
I made some changes in the didFinishRecordingToOutputFileAtURL method as per my needs.
Here is my updated method.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
NSLog(#"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
else {
NSLog(#"didFinishRecordingToOutputFileAtURL error:%#",error);
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCESSFULLY -----
NSLog(#"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAsset *asset = [AVAsset assetWithURL:outputFileURL];
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:CMTimeMake(0, 1) error:nil];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:#"%#%d.mov",NSBundle.mainBundle.infoDictionary[#"CFBundleExecutable"],++videoCounter]];
[[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
if ([[[NSUserDefaults standardUserDefaults] stringForKey:@"orientation"] isEqualToString:@"landscape"]) {
videoAssetOrientation_ = UIImageOrientationUp;
}
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
if (![self.ratioLabel.text isEqualToString:@"16:9"]) {
renderWidth = naturalSize.width;
renderHeight = naturalSize.width;
NSLog(@"Video:: width=%f height=%f",naturalSize.width,naturalSize.height);
}
else {
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
NSLog(@"Video:: width=%f height=%f",naturalSize.width,naturalSize.height);
}
if (![self.ratioLabel.text isEqualToString:@"16:9"])
{
CGAffineTransform t1 = CGAffineTransformMakeTranslation(videoAssetTrack.naturalSize.height, -(videoAssetTrack.naturalSize.width - videoAssetTrack.naturalSize.height) /2);
CGAffineTransform t2 = CGAffineTransformRotate(t1, M_PI_2);
[layerInstruction setTransform:t2 atTime:kCMTimeZero];
}
else
{
CGAffineTransform t2 = CGAffineTransformMakeRotation( M_PI_2);
[layerInstruction setTransform:t2 atTime:kCMTimeZero];
}
AVCaptureDevicePosition position = [[VideoInputDevice device] position];
if (position == AVCaptureDevicePositionFront)
{
/* For front camera only */
CGAffineTransform t = CGAffineTransformMakeScale(-1.0f, 1.0f);
t = CGAffineTransformTranslate(t, -videoAssetTrack.naturalSize.width, 0);
t = CGAffineTransformRotate(t, (DEGREES_TO_RADIANS(90.0)));
t = CGAffineTransformTranslate(t, 0.0f, -videoAssetTrack.naturalSize.width);
[layerInstruction setTransform:t atTime:kCMTimeZero];
/* For front camera only */
}
[layerInstruction setOpacity:0.0 atTime:asset.duration];
instruction.layerInstructions = [NSArray arrayWithObjects:layerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:instruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
AVAssetExportSession *exporter;
exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
exporter.videoComposition = mainCompositionInst;
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
self.doneButton.userInteractionEnabled = YES;
if(videoAddr==nil)
{
videoAddr = [[NSMutableArray alloc] init];
}
[videoAddr addObject:exporter.outputURL];
[[PreviewLayer connection] setEnabled:YES];
AVAsset *asset = [AVAsset assetWithURL:exporter.outputURL];
NSLog(#"remaining seconds before:%f",lastSecond);
double assetDuration = CMTimeGetSeconds(asset.duration);
if (assetDuration>3.0)
assetDuration = 3.0;
lastSecond = lastSecond- assetDuration;
NSLog(#"remaining seconds after:%f",lastSecond);
self.secondsLabel.text = [NSString stringWithFormat:#"%0.1fs",lastSecond];
self.secondsLabel.hidden = NO;
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:videoAddr];
[[NSUserDefaults standardUserDefaults] setObject:data forKey:#"videoAddr"];
[[NSUserDefaults standardUserDefaults] synchronize];
videoURL = outputFileURL;
flagAutorotate = NO;
self.cancelButton.hidden = self.doneButton.hidden = NO;
imgCancel.hidden = imgDone.hidden = NO;
if ([[NSUserDefaults standardUserDefaults] boolForKey:@"Vibration"])
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
[[UIApplication sharedApplication] endIgnoringInteractionEvents];
});
}];
}
else {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:[NSString stringWithFormat:@"Video can not be saved\nPlease free some storage space"] delegate:self cancelButtonTitle:nil otherButtonTitles:nil, nil];
[alert show];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(2.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[alert dismissWithClickedButtonIndex:0 animated:YES];
});
}
}
}
But here is the issue: the video is not being recorded exactly as shown in the preview.
See these two screenshots.
Video recording preview
Video Playing View
The reason is that your iPad's screen aspect ratio is not the same as the camera's aspect ratio.
You can modify the camera preview size by setting the videoGravity property of AVCaptureVideoPreviewLayer,
which controls how the content is displayed relative to the layer's bounds:
layer.videoGravity = AVLayerVideoGravityResizeAspect;
But in that case the preview won't be fullscreen.
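The other gravity option goes the opposite way: it fills the layer but crops what the preview shows, which is likely why a fullscreen preview and the recorded file don't match (assuming layer is your AVCaptureVideoPreviewLayer):
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;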
If you want the exported video to have the same aspect ratio as the fullscreen preview, you will have to crop it. The cropping process is explained here:
Exporting AVCaptureSession video in a size that matches the preview layer
Video capture with 1:1 aspect ratio in iOS
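Very roughly (this is only a sketch of the idea, not the full code from those answers), cropping comes down to shrinking the composition's renderSize and translating the track so the unwanted part falls outside the frame. Assuming a landscape source track, a square 1:1 target, and layerInstruction being the AVMutableVideoCompositionLayerInstruction for that track:
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGSize naturalSize = videoTrack.naturalSize; // e.g. 1920 x 1080 for a landscape recording
CGFloat side = MIN(naturalSize.width, naturalSize.height);
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(side, side); // square output
videoComposition.frameDuration = CMTimeMake(1, 30);
// Shift the track left so its centre lands inside the square render area.
[layerInstruction setTransform:CGAffineTransformMakeTranslation(-(naturalSize.width - side) / 2.0, 0) atTime:kCMTimeZero];
The instruction and layer-instruction wiring stays the same as in your existing code; only the renderSize and the transform change.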
I am struggling to change the orientation of my video. It is recorded in portrait but is saved in landscape. Changing the transform only rotates the video within a landscape frame. In this example, with M_PI_2, the video disappears since it rotates off-screen or ends up flat; if I change it to M_PI_2/2 or something similar, it appears but is crooked. I know AVFoundation does this by default. How do I change this? I got a lot of this code from this tutorial: https://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos but using the AVMutableVideoCompositionLayerInstruction is not working.
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime insertTime = kCMTimeZero;
for(AVAsset *videoAsset in self.videoArray){
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:insertTime error:nil];
// Updating the insertTime for the next insert
insertTime = CMTimeAdd(insertTime, videoAsset.duration);
}
CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
videoTrack.preferredTransform = rotationTransform;
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = videoTrack.timeRange;
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoTrack.asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
//CGAffineTransform rotationTransform = videoAssetTrack.preferredTransform;
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:videoTrack.timeRange.duration];
Is there a way to set an anchor point or to make my own transform?
Check https://developer.apple.com/library/content/qa/qa1744/_index.html, which is the official explanation of setting the orientation of video with AVFoundation. Simply put, you need to use an AVMutableVideoCompositionLayerInstruction object to modify the transform applied to a given track in the video composition.
Now let's talk about the transform you should apply. Many people suggest using AVAssetTrack's preferredTransform. In most cases it works, and you should use it to get the video orientation of your asset track. But sometimes preferredTransform lacks the tx (or ty) value (check CGAffineTransform in the Apple API Reference if you don't know what tx and ty are), or tx/ty has an inaccurate value, which results in wrong video positioning.
So the main idea is: use preferredTransform to determine the original video orientation, then build the transform yourself.
Here is the code for getting the orientation of the track:
- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if([tracks count] > 0) {
AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
CGAffineTransform t = videoTrack.preferredTransform;
// Portrait
if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
orientation = UIInterfaceOrientationPortrait;
}
// PortraitUpsideDown
if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
orientation = UIInterfaceOrientationPortraitUpsideDown;
}
// LandscapeRight
if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
orientation = UIInterfaceOrientationLandscapeRight;
}
// LandscapeLeft
if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
orientation = UIInterfaceOrientationLandscapeLeft;
}
}
return orientation;
}
And the code for getting the transform to apply:
- (CGAffineTransform)transformBasedOnAsset:(AVAsset *)asset {
UIInterfaceOrientation orientation = [AVUtilities orientationForTrack:asset];
AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
CGSize naturalSize = assetTrack.naturalSize;
CGAffineTransform finalTransform = CGAffineTransformIdentity;
switch (orientation) {
case UIInterfaceOrientationLandscapeLeft:
finalTransform = CGAffineTransformMake(-1, 0, 0, -1, naturalSize.width, naturalSize.height);
break;
case UIInterfaceOrientationLandscapeRight:
finalTransform = CGAffineTransformMake(1, 0, 0, 1, 0, 0);
break;
case UIInterfaceOrientationPortrait:
finalTransform = CGAffineTransformMake(0, 1, -1, 0, naturalSize.height, 0);
break;
case UIInterfaceOrientationPortraitUpsideDown:
finalTransform = CGAffineTransformMake(0, -1, 1, 0, 0, naturalSize.width);
break;
default:
break;
}
return finalTransform;
}
If you get confused, I suggest taking videos in different orientations first, then printing out each video's preferredTransform using NSStringFromCGAffineTransform() or something similar to check the details.
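For example, a quick way to dump the values while debugging (only standard UIKit/AVFoundation calls, with asset standing in for the video you are inspecting):
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
// Prints something like "[0, 1, -1, 0, 1080, 0]" for a typical portrait recording.
NSLog(@"preferredTransform = %@", NSStringFromCGAffineTransform(videoTrack.preferredTransform));
NSLog(@"naturalSize = %@", NSStringFromCGSize(videoTrack.naturalSize));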
My application merges two videos.
I am using the following code to merge two videos using AVVideoComposition:
- (void)buildSequenceComposition:(AVMutableComposition *)mixComposition andVideoComposition:(AVMutableVideoComposition *)videoComposition withAudioMix:(AVMutableAudioMix *)audioMix
{
CMTime nextClipStartTime = kCMTimeZero;
NSInteger i;
// No transitions: place clips into one video track and one audio track in composition.
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
NSMutableArray*arrLayerInstruction = [NSMutableArray array];
for (i = 0; i < [_clips count]; i++ )
{
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVURLAsset *asset = [[_clips objectAtIndex:i] objectForKey:@"videoURL"];
CMTimeRange timeRangeInAsset;
timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [asset duration]);
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
NSError*err = nil;
[compositionVideoTrack insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:&err];
if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0)
{
AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[compositionAudioTrack insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]];
[exportAudioMixInputParameters setVolume:[[[_clips objectAtIndex:i] objectForKey:@"videoSoundLevel"] floatValue] atTime:nextClipStartTime];
exportAudioMixInputParameters.trackID = compositionAudioTrack.trackID;
audioMix.inputParameters=[NSArray arrayWithObject:exportAudioMixInputParameters];
}
//FIXING ORIENTATION//
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
BOOL isFirstAssetPortrait_ = NO;
CGAffineTransform firstTransform = clipVideoTrack.preferredTransform;
if(firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0)
{
FirstAssetOrientation_= UIImageOrientationRight;
isFirstAssetPortrait_ = YES;
}
if(firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0)
{
FirstAssetOrientation_ = UIImageOrientationLeft;
isFirstAssetPortrait_ = YES;
}
if(firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0)
{
FirstAssetOrientation_ = UIImageOrientationUp;
}
if(firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0)
{
FirstAssetOrientation_ = UIImageOrientationDown;
}
CGFloat tHeight = [clipVideoTrack naturalSize].height;
CGFloat tWidth = [clipVideoTrack naturalSize].width;
if(isFirstAssetPortrait_)
{
tHeight = [clipVideoTrack naturalSize].height;
tWidth = [clipVideoTrack naturalSize].width;
CGFloat temp = tHeight;
tHeight = tWidth;
tWidth = temp;
}
CGFloat FirstAssetScaleToFitRatioWidth = [mixComposition naturalSize].width/tWidth;
CGFloat FirstAssetScaleToFitRatioHeight = [mixComposition naturalSize].height/tHeight;
CGFloat FirstAssetScaleToFitRatio = FirstAssetScaleToFitRatioWidth>FirstAssetScaleToFitRatioHeight?FirstAssetScaleToFitRatioHeight:FirstAssetScaleToFitRatioWidth;
CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
CGSize naturalSize = CGSizeApplyAffineTransform(CGSizeMake(tWidth, tHeight), FirstAssetScaleFactor);
CGAffineTransform transform = CGAffineTransformIdentity;
CGSize translateSize = CGSizeMake(0, 0);
if (FirstAssetScaleToFitRatioWidth<FirstAssetScaleToFitRatioHeight)
{
transform = CGAffineTransformMakeTranslation(0, ([mixComposition naturalSize].height-naturalSize.height)/2);
translateSize.height = ([mixComposition naturalSize].height-naturalSize.height)/2;
}
else if (FirstAssetScaleToFitRatioWidth==FirstAssetScaleToFitRatioHeight)
{
}
else
{
transform = CGAffineTransformMakeTranslation(([mixComposition naturalSize].width-naturalSize.width)/2, 0);
translateSize.width = ([mixComposition naturalSize].width-naturalSize.width)/2;
}
[FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(clipVideoTrack.preferredTransform, FirstAssetScaleFactor),transform) atTime:kCMTimeZero];
[FirstlayerInstruction setOpacity:0.0 atTime:CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration)];
[FirstlayerInstruction setOpacity:1.0 atTime:nextClipStartTime];
[arrLayerInstruction addObject:FirstlayerInstruction];
nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
}
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, nextClipStartTime);
MainInstruction.layerInstructions = arrLayerInstruction;
videoComposition.instructions = [NSArray arrayWithObject:MainInstruction];
}
Although it works fine on iOS 7, exporting the video with an AVVideoComposition on iOS 8 gives me the following error:
Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" {NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}
It works fine on iOS 7 and earlier versions, but not on iOS 8.
I have also tried Apple's AVSampleEditor sample code, and it gives me the same error when exporting the video on iOS 8.
Kindly help me to solve the problem. Thanks.
Check this demo code.
It is working for me.
Hi, I am trying to record video with an overlay.
I have written:
-(void)addOvelayViewToVideo:(NSURL *)videoURL
to add an overlay view to the recorded video, but it is not working.
I wrote the code to record the video in viewDidLoad using AVCaptureSession.
//In ViewDidLoad
//CONFIGURE DISPLAY OUTPUT
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
self.previewLayer.frame = self.view.frame;
[self.view.layer addSublayer:self.previewLayer];
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
if(error.code != noErr)
{
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if(value)
{
isSuccess = [value boolValue];
}
}
if(isSuccess)
{
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
if([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[self addOverviewToVideo:outputFileURL];
}
else{
NSLog(#"could not saved to photos album.");
}
}
}
-(void)addOvelayViewToVideo:(NSURL *)videoURL
{
AVAsset *asset = [AVAsset assetWithURL:videoURL];
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *compositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
compositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrack];
AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = assetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
[videoLayerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero];
[videoLayerInstruction setOpacity:0.0 atTime:asset.duration];
compositionInstruction.layerInstructions = [NSArray arrayWithObject:videoLayerInstruction];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
CGSize naturalSize = CGSizeMake(assetTrack.naturalSize.height, assetTrack.naturalSize.width);
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
videoComposition.renderSize = CGSizeMake(renderWidth, renderHeight);
videoComposition.instructions = [NSArray arrayWithObject:compositionInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
CALayer *overlayLayer = [CALayer layer];
UIImage *overlayImage = [UIImage imageNamed:@"sampleHUD"];
[overlayLayer setContents:(id)[overlayImage CGImage]];
overlayLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
[overlayLayer setMasksToBounds:YES];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
videoLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
NSLog(#"renderSize:%f,%f", videoComposition.renderSize.width, videoComposition.renderSize.height);
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = videoURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
//save the video in photos album
});
}];
}
I am still unable to figure out what is going wrong here and need some guidance.
Can I add an overlay while recording video?
Any help will be appreciated.
Here is the function I used to export video:
- (void) videoOutput
{
//1 - Early exit if there's no video file selected
if (!self.videoAsset) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alert show];
return;
}
// 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
// 3 - Video track
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration)
ofTrack:[[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration);
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
isVideoAssetPortrait_ = NO;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
isVideoAssetPortrait_ = NO;
}
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:self.videoAsset.duration];
// 3.3 - Add instructions
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
mainCompositionInst.renderSize = naturalSize;
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
// 4 - Get path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:#"FinalVideo-%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
// 5 - Create exporter
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
The problem is that the first time I use this function to export a portrait video, the values of videoTransform (videoAssetTrack.preferredTransform) are:
videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0
And the variable isVideoAssetPortrait_ equals YES. Everything is right. However, after the export completed and the video was saved to the Camera Roll, I used this function to reload the resulting video. This time, the videoTransform had changed:
videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0
And isVideoAssetPortrait_ equals NO. It means that after one export, the videoTransform has changed its values (the orientation went from portrait -> landscape).
I have googled many questions about video orientation in AVFoundation, but I haven't found a solution yet.
Thank you for reading my long explanation. If you have any question, please let me know.
There is no such concept as portrait/landscape for video tracks. A track just has its dimensions and a transformation that is applied in order to present it properly. By default a "portrait" video is encoded as it comes off the camera (say, landscape), and a 90-degree rotation is used to present it properly.
When you export it, its orientation is not changed; it is simply re-encoded physically rotated, so no rotation is needed to present it properly.
This is why you get the identity matrix the second time (it does not mean the video was changed from portrait to landscape), but this time its natural size is also swapped, so everything should be okay based on your code.
Please specify what's wrong in the latter case.
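As a minimal sketch of that idea, you can derive the presented dimensions by running naturalSize through preferredTransform instead of branching on hard-coded matrix values; it gives the same answer for both the original and the re-exported file (asset is a placeholder for whichever AVAsset you are inspecting, and mainCompositionInst is the video composition from your code):
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
// Apply the track's own transform to its natural size; the absolute values are the
// dimensions the video is presented with, whether or not it has been re-encoded.
CGSize transformedSize = CGSizeApplyAffineTransform(videoTrack.naturalSize, videoTrack.preferredTransform);
CGSize presentedSize = CGSizeMake(fabs(transformedSize.width), fabs(transformedSize.height));
mainCompositionInst.renderSize = presentedSize;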