AVAssetTrack preferredTransform always returns landscape - iOS

I have an app that records video.
To handle phone rotation I have the following code:
// called on phone rotation
AVCaptureConnection *previewLayerConnection = [[self previewLayer] connection];
if ([previewLayerConnection isVideoOrientationSupported]) {
    [previewLayerConnection setVideoOrientation:[self getVideoOrientation]];
}
and the getVideoOrientation method:
- (AVCaptureVideoOrientation)getVideoOrientation {
    UIInterfaceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    AVCaptureVideoOrientation newOrientation = AVCaptureVideoOrientationPortrait;
    switch (deviceOrientation) {
        case UIInterfaceOrientationPortrait:
            NSLog(@"UIInterfaceOrientationPortrait");
            newOrientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIInterfaceOrientationLandscapeLeft:
            NSLog(@"UIInterfaceOrientationLandscapeRight");
            newOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIInterfaceOrientationLandscapeRight:
            NSLog(@"UIInterfaceOrientationLandscapeLeft");
            newOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        default:
            NSLog(@"default");
            newOrientation = AVCaptureVideoOrientationPortrait;
            break;
    }
    return newOrientation;
}
This part of the app works properly (I see the video as I should in any device orientation).
But when I try to make a thumbnail (or play the video) I have problems.
As suggested in other questions, I do the following for each track:
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGAffineTransform txf = [videoTrack preferredTransform];
CGFloat videoAngleInDegree = RadiansToDegrees(atan2(txf.b, txf.a));
if (txf.a == 0 && txf.b == 1.0 && txf.c == -1.0 && txf.d == 0) {
    thumbOrientation = UIImageOrientationLeft;
}
if (txf.a == 0 && txf.b == -1.0 && txf.c == 1.0 && txf.d == 0) {
    thumbOrientation = UIImageOrientationRight;
}
if (txf.a == 1.0 && txf.b == 0 && txf.c == 0 && txf.d == 1.0) {
    thumbOrientation = UIImageOrientationUp;
}
if (txf.a == -1.0 && txf.b == 0 && txf.c == 0 && txf.d == -1.0) {
    thumbOrientation = UIImageOrientationDown;
}
UIImage *image = [UIImage imageWithCGImage:im scale:1.0 orientation:thumbOrientation];
I have two sample files: one recorded landscape right, the other landscape left. I expected them to produce different orientations in the code above, but unexpectedly they produce the same one (and videoAngleInDegree is identical for both).
Are there any workarounds?
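As an aside: if the CGImage (im) above comes from AVAssetImageGenerator, the generator can be asked to apply the track transform itself, which sidesteps the manual matrix checks entirely. A minimal sketch, assuming the thumbnail is generated that way:
// Let the generator bake preferredTransform into the CGImage it returns.
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
NSError *error = nil;
CMTime actualTime;
CGImageRef im = [generator copyCGImageAtTime:kCMTimeZero actualTime:&actualTime error:&error];
UIImage *thumbnail = [UIImage imageWithCGImage:im scale:1.0 orientation:UIImageOrientationUp];
CGImageRelease(im);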

Are you sure it's working properly? The getVideoOrientation there looks wrong: AVCapture and UIDevice have opposite meanings for landscape left and right.
Here is a correct implementation from this question; check out the discussion there and see if it helps.
- (void)deviceOrientationDidChange {
    UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    AVCaptureVideoOrientation newOrientation;
    if (deviceOrientation == UIDeviceOrientationPortrait) {
        NSLog(@"deviceOrientationDidChange - Portrait");
        newOrientation = AVCaptureVideoOrientationPortrait;
    }
    else if (deviceOrientation == UIDeviceOrientationPortraitUpsideDown) {
        NSLog(@"deviceOrientationDidChange - UpsideDown");
        newOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
    }
    // AVCapture and UIDevice have opposite meanings for landscape left and right
    // (AVCapture orientation matches UIInterfaceOrientation)
    else if (deviceOrientation == UIDeviceOrientationLandscapeLeft) {
        NSLog(@"deviceOrientationDidChange - LandscapeLeft");
        newOrientation = AVCaptureVideoOrientationLandscapeRight;
    }
    else if (deviceOrientation == UIDeviceOrientationLandscapeRight) {
        NSLog(@"deviceOrientationDidChange - LandscapeRight");
        newOrientation = AVCaptureVideoOrientationLandscapeLeft;
    }
    else if (deviceOrientation == UIDeviceOrientationUnknown) {
        NSLog(@"deviceOrientationDidChange - Unknown");
        newOrientation = AVCaptureVideoOrientationPortrait;
    }
    else {
        NSLog(@"deviceOrientationDidChange - Face Up or Down");
        newOrientation = AVCaptureVideoOrientationPortrait;
    }
    [self setOrientation:newOrientation];
}
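For completeness, this handler is normally wired up through UIDeviceOrientationDidChangeNotification; a sketch of the registration (not shown in the original answer):
// Start orientation notifications and observe them (typically in viewDidLoad or init).
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(deviceOrientationDidChange)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];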


Portrait Video plays in black screen

Update
Don't use the simulator to test video transforms. But I still have a codec issue, asked in another question here. Any help is appreciated.
I am using AVURLAsset to create my videos, and they work fine as long as the videos picked from the gallery are in landscape mode. But when I use a portrait video, it either plays in a black screen (the audio plays) or the frames are twisted (see image).
Update:
I tried using the CGAffineTransform approach, still no luck.
Here's the code:
- (void)createVideo {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
    _videoAsset = [[AVURLAsset alloc] initWithURL:video_url options:options];
    CMTime startTimeV = CMTimeMakeWithSeconds(videoStartTime.floatValue, 1);
    CMTime endTimeV = CMTimeMakeWithSeconds(videoEndTime.floatValue, 1);
    CMTimeRange video_timeRange = CMTimeRangeMake(startTimeV, endTimeV);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration);
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:a_compositionVideoTrack];
    AVAssetTrack *videoAssetTrack = [[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize trackDimensions = {
        .width = 0.0,
        .height = 0.0,
    };
    trackDimensions = [videoAssetTrack naturalSize];
    int width = trackDimensions.width;
    int height = trackDimensions.height;
    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_ = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ = UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    // CGAffineTransform transformToApply = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(90.0));
    [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
    [videolayerInstruction setOpacity:0.0 atTime:self.videoAsset.duration];
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    CGSize naturalSize;
    if (isVideoAssetPortrait_) {
        naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
    } else {
        naturalSize = videoAssetTrack.naturalSize;
    }
    float renderWidth, renderHeight;
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);
}
And the export:
- (void)export {
    NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docsDir = [dirPaths objectAtIndex:0];
    NSString *fileName = [NSString stringWithFormat:@"myvideo%lld.mp4", [@(floor([[NSDate date] timeIntervalSince1970])) longLongValue] + 1];
    NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"%@", fileName]];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = AVFileTypeMPEG4;
    _assetExport.outputURL = outputFileUrl;
    _assetExport.videoComposition = mainCompositionInst;
    ShareViewController *newViewController = [self.storyboard instantiateViewControllerWithIdentifier:@"vidShare"];
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        dispatch_async(dispatch_get_main_queue(), ^{
            newViewController.videoFilePath = outputFileUrl;
            [self.navigationController pushViewController:newViewController animated:YES];
        });
    }];
}
Landscape = OK
Portrait = FAIL
I've had this issue before. Setting the videolayerInstruction's transform from videoAssetTrack.preferredTransform works in most cases. But sometimes preferredTransform may lack the tx (or ty) value (check CGAffineTransform in the Apple API Reference if you don't know what tx and ty are), or tx (ty) may hold an inaccurate value, which results in wrong video positioning (such as playing in a black screen).
So the point is: use preferredTransform to determine the original video orientation, then build the transform yourself.
Here is the code for getting the orientation of the track:
- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
    UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;
        // Portrait
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortrait;
        }
        // PortraitUpsideDown
        if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            orientation = UIInterfaceOrientationLandscapeRight;
        }
        // LandscapeLeft
        if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            orientation = UIInterfaceOrientationLandscapeLeft;
        }
    }
    return orientation;
}
And the code for building the transform to apply to the videolayerInstruction:
- (CGAffineTransform)transformBasedOnAsset:(AVAsset *)asset {
    UIInterfaceOrientation orientation = [AVUtilities orientationForTrack:asset];
    AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    CGSize naturalSize = assetTrack.naturalSize;
    CGAffineTransform finalTransform = CGAffineTransformIdentity; // default, so every path returns a valid value
    switch (orientation) {
        case UIInterfaceOrientationLandscapeLeft:
            finalTransform = CGAffineTransformMake(-1, 0, 0, -1, naturalSize.width, naturalSize.height);
            break;
        case UIInterfaceOrientationLandscapeRight:
            finalTransform = CGAffineTransformMake(1, 0, 0, 1, 0, 0);
            break;
        case UIInterfaceOrientationPortrait:
            finalTransform = CGAffineTransformMake(0, 1, -1, 0, naturalSize.height, 0);
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            finalTransform = CGAffineTransformMake(0, -1, 1, 0, 0, naturalSize.width);
            break;
        default:
            break;
    }
    return finalTransform;
}
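Wired into the composition from the question, usage would look roughly like this (assuming both methods live in the AVUtilities helper referenced above):
// Replace the raw preferredTransform with the rebuilt one.
CGAffineTransform fixedTransform = [AVUtilities transformBasedOnAsset:self.videoAsset];
[videolayerInstruction setTransform:fixedTransform atTime:kCMTimeZero];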
Hope it works for you.

AVFoundation Rotating Screen - Obj C

I am struggling to change the orientation of my video. It is recorded in portrait but is then saved in landscape. Changing the transform only makes the video rotate within a landscape frame. In this example, with M_PI_2 it disappears, since it rotates off the screen or lies flat; but if I change it to M_PI_2/2 or similar it appears, though crooked. I know AVFoundation does this by default. How do I change it? I got a lot of this code from this tutorial: https://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos but using the AVMutableVideoCompositionLayerInstruction is not working.
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime insertTime = kCMTimeZero;
for (AVAsset *videoAsset in self.videoArray) {
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:insertTime error:nil];
    // Updating the insertTime for the next insert
    insertTime = CMTimeAdd(insertTime, videoAsset.duration);
}
CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
videoTrack.preferredTransform = rotationTransform;
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = videoTrack.timeRange;
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoTrack.asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
    videoAssetOrientation_ = UIImageOrientationRight;
    isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
    videoAssetOrientation_ = UIImageOrientationLeft;
    isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
    videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
    videoAssetOrientation_ = UIImageOrientationDown;
}
// CGAffineTransform rotationTransform = videoAssetTrack.preferredTransform;
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:videoTrack.timeRange.duration];
Is there a way to set an anchor point or to make my own transform?
Check https://developer.apple.com/library/content/qa/qa1744/_index.html; that's the official explanation of setting the orientation of video with AVFoundation. Simply put, you need to use an AVMutableVideoCompositionLayerInstruction object to modify the transform applied to a given track in the video composition.
Then let's talk about the transform you should apply. Many people suggest using AVAssetTrack's preferredTransform. In most cases it works, and you should use it to get the video orientation of your asset track. But sometimes preferredTransform may lack the tx (or ty) value (check CGAffineTransform in the Apple API Reference if you don't know what tx and ty are), or tx (ty) may hold an inaccurate value, which results in wrong video positioning.
So the main idea is: use preferredTransform to determine the original video orientation, then build the transform yourself.
Here is the code for getting the orientation of the track:
- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
    UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;
        // Portrait
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortrait;
        }
        // PortraitUpsideDown
        if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            orientation = UIInterfaceOrientationLandscapeRight;
        }
        // LandscapeLeft
        if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            orientation = UIInterfaceOrientationLandscapeLeft;
        }
    }
    return orientation;
}
And the code for building the transform to apply:
- (CGAffineTransform)transformBasedOnAsset:(AVAsset *)asset {
    UIInterfaceOrientation orientation = [AVUtilities orientationForTrack:asset];
    AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    CGSize naturalSize = assetTrack.naturalSize;
    CGAffineTransform finalTransform = CGAffineTransformIdentity; // default, so every path returns a valid value
    switch (orientation) {
        case UIInterfaceOrientationLandscapeLeft:
            finalTransform = CGAffineTransformMake(-1, 0, 0, -1, naturalSize.width, naturalSize.height);
            break;
        case UIInterfaceOrientationLandscapeRight:
            finalTransform = CGAffineTransformMake(1, 0, 0, 1, 0, 0);
            break;
        case UIInterfaceOrientationPortrait:
            finalTransform = CGAffineTransformMake(0, 1, -1, 0, naturalSize.height, 0);
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            finalTransform = CGAffineTransformMake(0, -1, 1, 0, 0, naturalSize.width);
            break;
        default:
            break;
    }
    return finalTransform;
}
If you get confused, I suggest taking videos with different orientations first, then printing out the preferredTransform of each video using NSStringFromCGAffineTransform() or similar to check the details.
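For example, a quick check of what a given recording actually carries:
// Log the raw transform and size of the first video track.
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSLog(@"preferredTransform: %@", NSStringFromCGAffineTransform(track.preferredTransform));
NSLog(@"naturalSize: %@", NSStringFromCGSize(track.naturalSize));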

Saved video preferred transform is showing landscape for portrait videos

I am using the GPUImage movie framework to save a portrait video. The video saved to the album is in portrait mode with a resolution of 720×1280, but when I check the preferred transform of that video in code, it shows landscape. The relevant code follows:
filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetMedium cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
videoCamera.shouldSmoothlyScaleOutput = YES;
videoCamera.frameRate = 30.0;
movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720, 1280)];
videoCamera.audioEncodingTarget = movieWriter;
[filter removeAllTargets];
filter = [[GPUImageRGBFilter alloc] init];
[filter forceProcessingAtSize:CGSizeMake(720, 1280)];
[videoCamera addTarget:filter];
[filter addTarget:filterView];
[filter addTarget:movieWriter];
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];
Then, when the recording is done, I stop the movie writer and save the video to the camera roll. Later, when I take that video from the camera roll to edit it, I first check its orientation, and there I find that it is landscape. The code for that part is below:
AVURLAsset *assetClip = VD.assetForVideoMemory;
AVAssetTrack *clipVideoTrack = [[assetClip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize size = [clipVideoTrack naturalSize];
CGAffineTransform rotation;
CGAffineTransform txf = [clipVideoTrack preferredTransform];
CGAffineTransform translateToCenter;
CGAffineTransform mixedTransform;
float scaleFX = 0, scaleFY = 0;
CGSize naturalSize;
CGAffineTransform scaleXY;
// changing the height and width of the video
if (txf.a == 0 && txf.b == 1.0 && txf.c == -1.0 && txf.d == 0) // portrait
{
    naturalSize = CGSizeMake(clipVideoTrack.naturalSize.height, clipVideoTrack.naturalSize.width);
}
if (txf.a == 0 && txf.b == -1.0 && txf.c == 1.0 && txf.d == 0) // portrait upside down
{
    naturalSize = CGSizeMake(clipVideoTrack.naturalSize.height, clipVideoTrack.naturalSize.width);
}
if (txf.a == 1.0 && txf.b == 0 && txf.c == 0 && txf.d == 1.0) // landscape left
{
    naturalSize = CGSizeMake(clipVideoTrack.naturalSize.width, clipVideoTrack.naturalSize.height);
}
if (txf.a == -1.0 && txf.b == 0 && txf.c == 0 && txf.d == -1.0) // landscape right
{
    naturalSize = CGSizeMake(clipVideoTrack.naturalSize.width, clipVideoTrack.naturalSize.height);
}
Now, the above code always reports that the video is landscape left, even for portrait recordings. Where is the problem? Why is it not reporting portrait mode?
I found that if the video is written using the AVFileTypeAppleM4V file type, the preferred transform that AVAssetTrack later reads is incorrect. However, when writing with the AVFileTypeQuickTimeMovie file type, AVAssetTrack reports the same orientation that was specified during the write. The iPhone Camera app also seems to write with that latter type.
My guess is that you are writing using AVFileTypeAppleM4V.
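If that is the case, GPUImageMovieWriter has an extended initializer that lets you pass the file type explicitly; a sketch, assuming the GPUImage version in use exposes it (check the header of your copy):
// Write a QuickTime movie instead of the suspected M4V default.
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                                       size:CGSizeMake(720, 1280)
                                                   fileType:AVFileTypeQuickTimeMovie
                                             outputSettings:nil];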

Rotation With Animation in GPUImage

I am using GPUImage.
I want to rotate an image (by 90, 180, 270, or 360 degrees), with animation, using a GPUImageFilter.
Please help.
Thanks in advance.
Just check what I did for rotation. (int tag is defined in the .h file.)
- (IBAction)btnRotate_Clicked:(id)sender
{
    [self hideFilterView];
    [self hideFontView];
    [self hideEnhanceView];
    if (tag == 0)
    {
        staticPictureOriginalOrientation = UIImageOrientationRight;
        tag = 1;
    }
    else if (tag == 1)
    {
        staticPictureOriginalOrientation = UIImageOrientationDown;
        tag = 2;
    }
    else if (tag == 2)
    {
        staticPictureOriginalOrientation = UIImageOrientationLeft;
        tag = 3;
    }
    else
    {
        staticPictureOriginalOrientation = UIImageOrientationUp;
        tag = 0;
    }
    UIImageOrientation orientation = staticPictureOriginalOrientation;
    switch (orientation) {
        case UIImageOrientationUp:
            [self.MimageView setFillMode:kGPUImageFillModePreserveAspectRatioAndFill];
            break;
        case UIImageOrientationLeft:
            [self.MimageView setFillMode:kGPUImageFillModePreserveAspectRatio];
            break;
        case UIImageOrientationRight:
            [self.MimageView setFillMode:kGPUImageFillModePreserveAspectRatio];
            break;
        case UIImageOrientationDown:
            [self.MimageView setFillMode:kGPUImageFillModePreserveAspectRatioAndFill];
            break;
        default:
            break;
    }
    [self prepareStaticFilter];
}
- (void)prepareStaticFilter
{
    isImageProcessed = TRUE;
    [staticPicture addTarget:filter];
    if (hasBlur)
    {
        [filter addTarget:blurFilter];
        [blurFilter addTarget:self.MimageView];
    }
    else
    {
        [filter addTarget:self.MimageView];
    }
    GPUImageRotationMode imageViewRotationMode = kGPUImageNoRotation;
    switch (staticPictureOriginalOrientation)
    {
        case UIImageOrientationLeft:
            imageViewRotationMode = kGPUImageRotateLeft;
            break;
        case UIImageOrientationRight:
            imageViewRotationMode = kGPUImageRotateRight;
            break;
        case UIImageOrientationDown:
            imageViewRotationMode = kGPUImageRotate180;
            break;
        default:
            imageViewRotationMode = kGPUImageNoRotation;
            break;
    }
    [filter setInputRotation:imageViewRotationMode atIndex:0];
    [staticPicture processImage];
}
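Note that the code above snaps to the new orientation instantly. Since GPUImageView is a UIView subclass, one way to get the animation the question asks for is to animate the view's transform first, then commit the rotation to the filter chain; a sketch, not from the original answer:
// Hypothetical animated 90-degree rotation, baked in afterwards via setInputRotation:.
[UIView animateWithDuration:0.3 animations:^{
    self.MimageView.transform = CGAffineTransformRotate(self.MimageView.transform, M_PI_2);
} completion:^(BOOL finished) {
    self.MimageView.transform = CGAffineTransformIdentity; // reset the view...
    [filter setInputRotation:kGPUImageRotateRight atIndex:0]; // ...and rotate the texture instead
    [staticPicture processImage];
}];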

Cropping AVAsset video with AVFoundation

I am using AVCaptureMovieFileOutput to record some video. I display the preview layer using AVLayerVideoGravityResizeAspectFill, which zooms in slightly. The problem I have is that the final video is larger, containing extra image data that didn't fit on the screen during the preview.
This is the preview and resulting video
Is there a way I can specify a CGRect that I want to cut from the video using AVAssetExportSession?
EDIT ----
When I apply a CGAffineTransformScale to the AVAssetTrack it zooms into the video, and with the AVMutableVideoComposition renderSize set to view.bounds it crops off the ends. Great, there's just one problem left: the width of the video does not stretch to the correct width; it just gets filled with black.
EDIT 2 ----
The suggested question/answer is incomplete.
Some of my code:
In my - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error method I have this to crop and resize the video.
- (void)flipAndSave:(NSURL *)videoURL withCompletionBlock:(void (^)(NSURL *returnURL))completionBlock
{
    AVURLAsset *firstAsset = [AVURLAsset assetWithURL:videoURL];
    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // 2 - Video track
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                         preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    // 2.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(CMTimeMakeWithSeconds(0, 600), firstAsset.duration);
    // 2.2 - Create an AVMutableVideoCompositionLayerInstruction for the first track
    AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVAssetTrack *firstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation firstAssetOrientation_ = UIImageOrientationUp;
    BOOL isFirstAssetPortrait_ = NO;
    CGAffineTransform firstTransform = firstAssetTrack.preferredTransform;
    if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ = UIImageOrientationRight;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ = UIImageOrientationLeft;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
        firstAssetOrientation_ = UIImageOrientationUp;
    }
    if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
        firstAssetOrientation_ = UIImageOrientationDown;
    }
    // [firstlayerInstruction setTransform:firstAssetTrack.preferredTransform atTime:kCMTimeZero];
    // [firstlayerInstruction setCropRectangle:self.view.bounds atTime:kCMTimeZero];
    CGFloat scale = [self getScaleFromAsset:firstAssetTrack];
    firstTransform = CGAffineTransformScale(firstTransform, scale, scale);
    [firstlayerInstruction setTransform:firstTransform atTime:kCMTimeZero];
    // 2.4 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction, nil];
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);
    // CGSize videoSize = firstAssetTrack.naturalSize;
    CGSize videoSize = self.view.bounds.size;
    BOOL isPortrait_ = [self isVideoPortrait:firstAsset];
    if (isPortrait_) {
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    NSLog(@"%@", NSStringFromCGSize(videoSize));
    mainCompositionInst.renderSize = videoSize;
    // 3 - Audio track
    AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                         preferredTrackID:kCMPersistentTrackID_Invalid];
    [AudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    // 4 - Get path
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"cutoutput.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath])
    {
        [manager removeItemAtPath:outputPath error:nil];
    }
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status])
        {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@ : %@", [[exporter error] localizedDescription], [exporter error]);
                completionBlock(nil);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                completionBlock(nil);
                break;
            default: {
                NSURL *outputURL = exporter.outputURL;
                dispatch_async(dispatch_get_main_queue(), ^{
                    completionBlock(outputURL);
                });
                break;
            }
        }
    }];
}
Here is my interpretation of your question: you are capturing video on a device with a screen ratio of 4:3, so your AVCaptureVideoPreviewLayer is 4:3, but the video input device captures video in 16:9, so the resulting video is 'larger' than seen in the preview.
If you are simply looking to crop the extra pixels not caught by the preview, check out http://www.netwalk.be/article/record-square-video-ios. That article shows how to crop the video into a square, and you'll only need a few modifications to crop to 4:3 instead. I've tested this; here are the changes I made:
Once you have the AVAssetTrack for the video you will need to calculate a new height.
// we convert the captured height (e.g. 1080) to a 4:3 screen ratio and get the new height
CGFloat newHeight = clipVideoTrack.naturalSize.height / 3 * 4;
Then modify these two lines, using newHeight.
videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.height, newHeight);
CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -(clipVideoTrack.naturalSize.width - newHeight)/2 );
So what we've done here is set the renderSize to a 4:3 ratio; the exact dimensions are based on the input device. We then use a CGAffineTransform to translate the video position so that what we saw in the AVCaptureVideoPreviewLayer is what is rendered to our file.
Edit: if you want to put it all together and crop a video based on the device's screen ratio (3:2, 4:3, 16:9), taking the video orientation into account, we need to add a few things.
First here is the modified sample code with a few critical alterations:
// output file
NSString *docFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString *outputPath = [docFolder stringByAppendingPathComponent:@"output2.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath])
    [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
// input file
AVAsset *asset = [AVAsset assetWithURL:outputFileURL];
AVMutableComposition *composition = [AVMutableComposition composition];
[composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
// input clip
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// crop clip to screen ratio
UIInterfaceOrientation orientation = [self orientationForTrack:asset];
BOOL isPortrait = (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationPortraitUpsideDown) ? YES : NO;
CGFloat complimentSize = [self getComplimentSize:videoTrack.naturalSize.height];
CGSize videoSize;
if (isPortrait) {
    videoSize = CGSizeMake(videoTrack.naturalSize.height, complimentSize);
} else {
    videoSize = CGSizeMake(complimentSize, videoTrack.naturalSize.height);
}
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = videoSize;
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
// rotate and position video
AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
CGFloat tx = (videoTrack.naturalSize.width - complimentSize) / 2;
if (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationLandscapeRight) {
    // invert translation
    tx *= -1;
}
// t1: rotate and position video since it may have been cropped to screen ratio
CGAffineTransform t1 = CGAffineTransformTranslate(videoTrack.preferredTransform, tx, 0);
// t2/t3: mirror video horizontally
CGAffineTransform t2 = CGAffineTransformTranslate(t1, isPortrait ? 0 : videoTrack.naturalSize.width, isPortrait ? videoTrack.naturalSize.height : 0);
CGAffineTransform t3 = CGAffineTransformScale(t2, isPortrait ? 1 : -1, isPortrait ? -1 : 1);
[transformer setTransform:t3 atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject:instruction];
// export
exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = [NSURL fileURLWithPath:outputPath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    NSLog(@"Exporting done!");
    // added export to library for testing
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]]) {
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
                                        NSLog(@"Saved to album");
                                        if (error) {
                                        }
                                    }];
    }
}];
What we added here is a call to get the new render size of the video based on cropping its dimensions to the screen ratio. Once we crop the size down, we need to translate the position to recenter the video, so we grab its orientation to move it in the proper direction. This fixes the off-center issue we saw with UIInterfaceOrientationLandscapeLeft. Finally, CGAffineTransforms t2 and t3 mirror the video horizontally.
And here are the two new methods that make this happen:
- (CGFloat)getComplimentSize:(CGFloat)size {
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGFloat ratio = screenRect.size.height / screenRect.size.width;
    // we have to adjust the ratio for 16:9 screens
    if (ratio == 1.775) ratio = 1.77777777777778;
    return size * ratio;
}
- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
    UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;
        // Portrait
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortrait;
        }
        // PortraitUpsideDown
        if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            orientation = UIInterfaceOrientationLandscapeRight;
        }
        // LandscapeLeft
        if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            orientation = UIInterfaceOrientationLandscapeLeft;
        }
    }
    return orientation;
}
These are pretty straightforward. The only thing to note is that in the getComplimentSize: method we have to manually adjust the ratio for 16:9, since the iPhone 5 class resolution is mathematically shy of true 16:9: its 568×320-point screen gives 568 / 320 = 1.775, just under 16 / 9 ≈ 1.7778, which is why the code checks for the 1.775 value.
AVCaptureVideoDataOutput is a concrete sub-class of AVCaptureOutput you use to process uncompressed frames from the video being captured, or to access compressed frames.
An instance of AVCaptureVideoDataOutput produces video frames you can process using other media APIs. You can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
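A minimal delegate sketch (assuming a class that adopts AVCaptureVideoDataOutputSampleBufferDelegate):
// Called for each captured frame; grab the pixel buffer and process it.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process pixelBuffer here (e.g. hand it to Core Image or a custom renderer).
}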
Configuring a Session
You use a preset on the session to specify the image quality and resolution you want. A preset is a constant that identifies one of a number of possible configurations; in some cases the actual configuration is device-specific:
https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html
For the actual values these presets represent on various devices, see "Saving to a Movie File" and "Capturing Still Images."
If you want to set a size-specific configuration, you should check whether it is supported before setting it:
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
}
else {
    // Handle the failure.
}
