Cropping a specified CGRect from a video in iOS

I want to crop a specified region of CGRect from my video file.
I followed this tutorial.
I'm using the renderSize property of AVMutableVideoComposition to get to a specified size. Here is the code I use to crop a video to a specified size.
- (void) cropVideoAtPath:(NSString *) path
{
//load our movie Asset
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
//create an avassetrack with our asset
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//create a video composition and preset some settings
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
//here we shrink the render height by deltaHeight points, clipping that much off the video
float deltaHeight = 100;
videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.width, clipVideoTrack.naturalSize.height - deltaHeight);
//create a video instruction
AVMutableVideoCompositionInstruction *instruction =
[AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
// CGAffineTransform t1 = CGAffineTransformMakeTranslation(0, 1000);
// CGAffineTransform finalTransform = t1;
// [transformer setTransform:finalTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
//Create an Export Path to store the cropped video
NSString * documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *exportPathLoc = [documentsPath stringByAppendingFormat:@"/CroppedVideo.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPathLoc];
//Remove any previous videos at that path
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];
//Export
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = exportUrl;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(#"Saved");
UISaveVideoAtPathToSavedPhotosAlbum([[exporter outputURL] path],nil,nil,nil);
});
}];
}
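(As an aside, the completion handler above assumes the export succeeded; a more defensive version, sketched here, would check the exporter's status before saving:)
if (exporter.status == AVAssetExportSessionStatusCompleted) {
    UISaveVideoAtPathToSavedPhotosAlbum([[exporter outputURL] path], nil, nil, nil);
} else {
    NSLog(@"Export failed: %@", exporter.error);
}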
My original video looks like this one.
My video after applying renderSize to the AVMutableVideoComposition looks like this one.
As we can see, the bottom portion is successfully clipped away from the original video.
Let's say I have a video of size (1000, 1000) and I want only the center (500, 500) region of that video. So my CGRect would be (250, 250, 500, 500).
In my case I want only the region where the image is present, so I want to crop out the top and bottom bars.
So I applied a CGAffineTransform to the AVMutableVideoCompositionLayerInstruction like this:
CGAffineTransform t1 = CGAffineTransformMakeTranslation(0, deltaHeight);
CGAffineTransform finalTransform = t1;
[transformer setTransform:finalTransform atTime:kCMTimeZero];
which resulted in the image below.
So how can I apply this (x, y) offset when cropping the video, and what am I doing wrong here? I would appreciate any help.
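For reference, the pattern the related answer below uses for cropping an arbitrary CGRect is to set renderSize to the rect's size and translate the layer instruction by the negated origin. A minimal sketch against the code above (the crop-rect values are the ones from the example):
CGRect cropRect = CGRectMake(250, 250, 500, 500); // the center 500x500 of a 1000x1000 video
videoComposition.renderSize = cropRect.size;
// Move the wanted region to the render frame's origin; everything outside
// the render size is discarded on export.
[transformer setTransform:CGAffineTransformMakeTranslation(-cropRect.origin.x, -cropRect.origin.y) atTime:kCMTimeZero];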

Related

Is it possible to fully crop a video with AVAssetWriter?

I am using AVAssetWriter to create a video file; I want to crop it and came upon the keys for AVVideoCleanApertureKey. This lets me set a viewport for the video, which looks like it is cropping the video. But in reality it doesn't: the full frame is still present in the video file, and whether the "crop" is used seems to be up to the video player.
So, is there a way to make AVAssetWriter discard the "outside" data around the viewport? Or do I need to change my approach completely?
These images show the same video file, but only QuickTime cares about the viewport.
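For reference, the clean-aperture approach that turned out to be display-only is set through the writer input's video settings, roughly like this (a sketch; the 1920x1080 frame and the 500x500 viewport are illustrative values, not the original poster's):
NSDictionary *videoSettings = @{
    AVVideoCodecKey : AVVideoCodecH264,
    AVVideoWidthKey : @1920,
    AVVideoHeightKey : @1080,
    AVVideoCompressionPropertiesKey : @{
        // The clean aperture only tags a display viewport; the full frame
        // is still encoded, which is exactly the problem described above.
        AVVideoCleanApertureKey : @{
            AVVideoCleanApertureWidthKey : @500,
            AVVideoCleanApertureHeightKey : @500,
            AVVideoCleanApertureHorizontalOffsetKey : @0,
            AVVideoCleanApertureVerticalOffsetKey : @0
        }
    }
};
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];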
I believe the direct answer to the question is NO. So in the end I switched over to AVMutableVideoComposition and AVAssetExportSession, as other answers around the web suggested.
AVAsset *video = [AVAsset assetWithURL:outputURL];
AVAssetTrack *assetVideoTrack = [[video tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:assetVideoTrack];
CGAffineTransform transform = CGAffineTransformMakeTranslation(-self.rect.origin.x, -self.rect.origin.y);
[layerInstruction setTransform:transform atTime:kCMTimeZero];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.layerInstructions = @[layerInstruction];
instruction.timeRange = assetVideoTrack.timeRange;
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// https://stackoverflow.com/questions/22883525/avassetexportsession-giving-me-a-green-border-on-right-and-bottom-of-output-vide
videoComposition.renderSize = CGSizeMake(floor(self.rect.size.width / 16) * 16,
floor(self.rect.size.height / 16) * 16);
videoComposition.renderScale = 1.0;
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.instructions = #[instruction];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPreset1280x720];
exportSession.shouldOptimizeForNetworkUse = NO;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.videoComposition = videoComposition;
exportSession.outputURL = outputURL2;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
NSLog(#"done processing video!");
}];

Merging Clips with Different Resolutions

I have a set of video clips that I would like to merge together and then put a watermark on it.
I am able to do both individually; however, problems arise when performing them together.
All clips that will be merged are either 1920x1080 or 960x540.
For some reason, AVAssetExportSession does not display them well together.
Here are the 2 bugs based on 3 different scenarios:
This image is a result of:
Merging Clips together
As you can see, there is nothing wrong here; the output video produces the desired effect.
However, when I then try to add a watermark, it creates the following issue:
This image is a result of:
Merging Clips together
Putting a watermark on it
BUG 1: Some clips in the video get resized for whatever reason while other clips do not.
This image is a result of:
Merging Clips together
Resizing clips that are 960x540 to 1920x1080
Putting a watermark on it
BUG 2: Now the clips that need to be resized get resized; however, the old unresized clip is still there.
Merging/Resizing Code:
-(void) mergeClips{
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *mutableVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *mutableAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
// loop through the list of videos and add them to the track
CMTime currentTime = kCMTimeZero;
NSMutableArray* instructionArray = [[NSMutableArray alloc] init];
if (_clipsArray){
for (int i = 0; i < (int)[_clipsArray count]; i++){
NSURL* url = [_clipsArray objectAtIndex:i];
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CGSize size = videoTrack.naturalSize;
CGFloat widthScale = 1920.0f/size.width;
CGFloat heightScale = 1080.0f/size.height;
// lines that performs resizing
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableVideoTrack];
CGAffineTransform scale = CGAffineTransformMakeScale(widthScale,heightScale);
CGAffineTransform move = CGAffineTransformMakeTranslation(0,0);
[layerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:currentTime];
[instructionArray addObject:layerInstruction];
[mutableVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
ofTrack:videoTrack
atTime:currentTime error:nil];
[mutableAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
ofTrack:audioTrack
atTime:currentTime error:nil];
currentTime = CMTimeAdd(currentTime, asset.duration);
}
}
AVMutableVideoCompositionInstruction * mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, currentTime);
mainInstruction.layerInstructions = instructionArray;
// 4 - Get path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *lastPostedDayPath = [documentsDirectory stringByAppendingPathComponent:@"lastPostedDay"];
//Check if folder exists, if not create folder
if (![[NSFileManager defaultManager] fileExistsAtPath:lastPostedDayPath]){
[[NSFileManager defaultManager] createDirectoryAtPath:lastPostedDayPath withIntermediateDirectories:NO attributes:nil error:nil];
}
NSString *fileName = [NSString stringWithFormat:@"%li_%li_%li.mov", (long)_month, (long)_day, (long)_year];
NSString *finalDayPath = [lastPostedDayPath stringByAppendingPathComponent:fileName];
NSURL *url = [NSURL fileURLWithPath:finalDayPath];
BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:finalDayPath];
if (fileExists){
NSLog(#"file exists");
[[NSFileManager defaultManager] removeItemAtURL:url error:nil];
}
AVMutableVideoComposition *mainComposition = [AVMutableVideoComposition videoComposition];
mainComposition.instructions = [NSArray arrayWithObject:mainInstruction];
mainComposition.frameDuration = CMTimeMake(1, 30);
mainComposition.renderSize = CGSizeMake(1920.0f, 1080.0f);
// 5 - Create exporter
_exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
_exportSession.outputURL=url;
_exportSession.outputFileType = AVFileTypeQuickTimeMovie;
_exportSession.shouldOptimizeForNetworkUse = YES;
_exportSession.videoComposition = mainComposition;
[_exportSession exportAsynchronouslyWithCompletionHandler:^{
[merge_timer invalidate];
merge_timer = nil;
switch (_exportSession.status) {
case AVAssetExportSessionStatusFailed:
NSLog(#"Export failed -> Reason: %#, User Info: %#",
_exportSession.error.localizedDescription,
_exportSession.error.userInfo.description);
[self showSavingFailedDialog];
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export cancelled");
[self showSavingFailedDialog];
break;
case AVAssetExportSessionStatusCompleted:
NSLog(#"Export finished");
[self addWatermarkToExportSession:_exportSession];
break;
default:
break;
}
}];
});
}
Once that finishes, I run the result through a different export session that simply adds a watermark.
Is there something I am doing wrong in my code or process?
Is there an easier way to achieve this?
Thank you for your time!
I was able to solve my issue.
For some reason, AVAssetExportSession does not actually create a 'flat' video file of the merged clips, so when adding the watermark it still recognized the lower-resolution clips and their locations, which caused them to resize.
What I did to solve this was to first use AVAssetWriter to merge my clips and create one 'flat' file. I could then add the watermark without the resizing issue.
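A minimal sketch of such a flattening pass, for anyone who needs a starting point (video track only; the H.264 output settings, method name, and queue label are assumptions, not the poster's actual code):
#import <AVFoundation/AVFoundation.h>
// Re-encode every frame of `asset` into a single flat file at `outputURL`,
// so a later watermark pass sees one uniform video track.
- (void)flattenAsset:(AVAsset *)asset toURL:(NSURL *)outputURL completion:(void (^)(void))completion
{
    NSError *error = nil;
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    // Decode to raw pixel buffers so the writer re-encodes (flattens) each frame.
    AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) }];
    [reader addOutput:output];
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:&error];
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:@{ AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : @1920, AVVideoHeightKey : @1080 }];
    [writer addInput:input];
    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    dispatch_queue_t queue = dispatch_queue_create("flatten.video", NULL);
    [input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while (input.isReadyForMoreMediaData) {
            CMSampleBufferRef buffer = [output copyNextSampleBuffer];
            if (buffer == NULL) { // reader exhausted: finish the file
                [input markAsFinished];
                [writer finishWritingWithCompletionHandler:completion];
                break;
            }
            [input appendSampleBuffer:buffer];
            CFRelease(buffer);
        }
    }];
}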
Hope this helps anyone who may come across this problem in the future!
I also encountered the same problem. You can set the opacity to zero once a video ends, like this:
[layerInstruction setOpacity:0.0 atTime:duration];
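In the merge loop above, that would mean hiding each clip's layer instruction at the end of that clip's time range, roughly like this (a sketch reusing the loop's currentTime and asset):
// Hide this clip once it ends so it doesn't show through the next clip.
[layerInstruction setOpacity:0.0 atTime:CMTimeAdd(currentTime, asset.duration)];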

CGAffineTransform scale not scaling from center - with export

I guess what makes my question/problem different from other postings is that I am not scaling a view, but rather an asset layer instruction (AVMutableVideoCompositionLayerInstruction). So setting anchor points, view.center, and CGRect scaling all do not work.
Beyond moving the asset with CGAffineTransformMakeTranslation so that it looks centered (which is highly inaccurate), I cannot figure out how to make it scale from the center. Is there a property I'm missing? The docs and guides aren't very helpful, but maybe I missed something.
Code is below. Thank you all in advance!!! :)
Also, for those looking for a way to export an AVAsset with CGAffineTransforms, the code below shows all the steps to get there; of course you need to fill out details like the CMTimeRanges, but I hope this helps someone figure this confusing thing out.
-(void) goAssetsExport {
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *firstTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1,30);
videoComposition.renderScale = 1.0;
videoComposition.renderSize = CGSizeMake(self.view.bounds.size.width, self.view.bounds.size.height);
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:[NSString stringWithFormat:@"%@", [preloadEffectsArray objectAtIndex:i]] withExtension:@"mov"];
AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
AVAssetTrack *firstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[firstTrack insertTimeRange:CMTimeRangeMake(firstTrackRangeMin, duration) ofTrack:firstAssetTrack atTime:firstTrackRangeMin error:nil];
AVMutableVideoCompositionInstruction *transitionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableVideoCompositionLayerInstruction *fromLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
//**This is where problem might be?**//
CGAffineTransform Scaler = CGAffineTransformMakeScale(scaleNumber,scaleNumber);
CGAffineTransform Mover = CGAffineTransformMakeTranslation(scaleNumber * -100, scaleNumber * -150);
[fromLayer setTransform:CGAffineTransformConcat(Scaler,Mover) atTime:firstTrackRangeMin];
transitionInstruction.timeRange = CMTimeRangeMake(firstTrackRangeMin, duration);
transitionInstruction.layerInstructions = [NSArray arrayWithObject:fromLayer];
videoComposition.instructions = [NSArray arrayWithObject:transitionInstruction];
[self exportVideo:composition withInstructionComposition:videoComposition];
}
CGAffineTransforms work in the Quartz coordinate system; that's the point you should start from:
CGAffineTransform translate = CGAffineTransformMakeTranslation(x, y);
CGAffineTransform scale = CGAffineTransformMakeScale(q,z);
CGAffineTransform finalTransform = CGAffineTransformConcat(scale, CGAffineTransformConcat(translate, assetTrackForVideo1.preferredTransform));
[layerInstructionForVideo setTransform:finalTransform atTime:kCMTimeZero];
[layerInstructions addObject:layerInstructionForVideo];
That kind of transform works for me.
EDIT: my mistake, I forgot to edit all the code!
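Concretely, to scale a track about its center, pair the scale with a translation that re-centers the scaled frame. A minimal sketch using the names from the snippet above, assuming the renderSize matches the track's naturalSize and a hypothetical uniform scale factor s:
CGSize size = assetTrackForVideo1.naturalSize;
CGFloat s = 0.5; // hypothetical scale factor
CGAffineTransform scale = CGAffineTransformMakeScale(s, s);
// Shift the scaled content so its center lands back on the frame's center.
CGAffineTransform recenter = CGAffineTransformMakeTranslation(size.width * (1.0 - s) / 2.0, size.height * (1.0 - s) / 2.0);
[layerInstructionForVideo setTransform:CGAffineTransformConcat(scale, recenter) atTime:kCMTimeZero];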

AVAssetExportSession ignoring videoComposition rotation & stripping metadata

I am attempting to rotate video prior to upload on my iOS device because other platforms (such as android) do not properly interpret the rotation information in iOS-recorded videos and, as a result, play them improperly rotated.
I have looked at the following stack posts but have not had success applying any of them to my case:
iOS rotate every frame of video
Rotating Video w/ AVMutableVideoCompositionLayerInstruction
AVMutableVideoComposition rotated video captured in portrait mode
iOS AVFoundation: Setting Orientation of Video
I copied the Apple AVSimpleEditor sample project, but unfortunately all that ever happens is that, upon creating an AVAssetExportSession and calling exportAsynchronouslyWithCompletionHandler, no rotation is performed, and what's worse, the rotation metadata is stripped out of the resulting file.
Here is the code that runs the export:
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[_mutableComposition copy] presetName:AVAssetExportPresetPassthrough];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileType3GPP;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.videoComposition = _mutableVideoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void)
{
NSLog(#"Status is %d %#", exportSession.status, exportSession.error);
handler(exportSession);
[exportSession release];
}];
The values _mutableComposition and _mutableVideoComposition are initialized by this method here:
- (void) getVideoComposition:(AVAsset*)asset
{
AVMutableComposition *mutableComposition = nil;
AVMutableVideoComposition *mutableVideoComposition = nil;
AVMutableVideoCompositionInstruction *instruction = nil;
AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
CGAffineTransform t1;
CGAffineTransform t2;
AVAssetTrack *assetVideoTrack = nil;
AVAssetTrack *assetAudioTrack = nil;
// Check if the asset contains video and audio tracks
if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
}
if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
}
CMTime insertionPoint = kCMTimeZero;
NSError *error = nil;
// Step 1
// Create a composition with the given asset and insert audio and video tracks into it from the asset
// Check whether a composition has already been created, i.e, some other tool has already been applied
// Create a new composition
mutableComposition = [AVMutableComposition composition];
// Insert the video and audio tracks from AVAsset
if (assetVideoTrack != nil) {
AVMutableCompositionTrack *compositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
}
if (assetAudioTrack != nil) {
AVMutableCompositionTrack *compositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
}
// Step 2
// Translate the composition to compensate the movement caused by rotation (since rotation would cause it to move out of frame)
t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0.0);
// Rotate transformation
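// Note: degreesToRadians is a helper macro from Apple's AVSimpleEditor sample;
// a typical definition (an assumption, it is not shown in this post) is:
//   #define degreesToRadians(x) (M_PI * (x) / 180.0)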
t2 = CGAffineTransformRotate(t1, degreesToRadians(90.0));
// Step 3
// Set the appropriate render sizes and rotational transforms
// Create a new video composition
mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height,assetVideoTrack.naturalSize.width);
mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
// The rotate transform is set on a layer instruction
instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mutableComposition duration]);
layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(mutableComposition.tracks)[0]];
[layerInstruction setTransform:t2 atTime:kCMTimeZero];
// Step 4
// Add the transform instructions to the video composition
instruction.layerInstructions = @[layerInstruction];
mutableVideoComposition.instructions = @[instruction];
TT_RELEASE_SAFELY(_mutableComposition);
_mutableComposition = [mutableComposition retain];
TT_RELEASE_SAFELY(_mutableVideoComposition);
_mutableVideoComposition = [mutableVideoComposition retain];
}
I pulled this method from AVSERotateCommand from here. Can anyone suggest why this method would not successfully rotate my video by the necessary 90 degrees?
Because you are using AVAssetExportPresetPassthrough, the AVAssetExportSession will ignore the videoComposition. Use any other preset.
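For example, a one-line change to the export code above (the medium-quality preset is just an illustrative choice):
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[_mutableComposition copy] presetName:AVAssetExportPresetMediumQuality];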

Using AVFoundation to crop a video [duplicate]

I'm trying to use AVFoundation to crop videos I'm recording. So let's say I create an AVCaptureVideoPreviewLayer and set the frame to be 300x300.
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.delegate = self;
captureVideoPreviewLayer.frame = CGRectMake(0,0, 300, 300);
[previewView.layer addSublayer:captureVideoPreviewLayer];
The user sees the video cropped. I'd like to save the video exactly the way the user is viewing it. Using AVCaptureMovieFileOutput, the video obviously gets saved without cropping. I was considering using an AVCaptureVideoDataOutput to intercept the frames and crop them myself, but I was wondering if there is a more efficient way to do this, perhaps with AVAssetExportSession and an AVVideoComposition.
Any guidance would be appreciated.
Something like this. 99% of this code just sets things up to do a custom CGAffineTransform and then save out the result.
I'm assuming that you want the cropped video to take up the full size/width of the output, so that e.g. a scale affine transform is the correct solution (you zoom in on the video, giving the effect of having cropped and resized).
AVAsset* asset = // your input
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoComposition* videoComposition = [[AVMutableVideoComposition videoComposition] retain];
videoComposition.renderSize = CGSizeMake(320, 240);
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) );
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGAffineTransform finalTransform = // setup a transform that grows the video, effectively causing a crop
[transformer setTransform:finalTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = url3;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void){}];
iOS 7 added a layer-instruction method specifically for cropping: -[AVMutableVideoCompositionLayerInstruction setCropRectangle:atTime:].
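For example (a sketch; the 300x300 rectangle mirrors the preview layer above, and the composition's renderSize should match it):
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
// Crop to the top-left 300x300 of the track for the whole duration.
[videolayerInstruction setCropRectangle:CGRectMake(0, 0, 300, 300) atTime:kCMTimeZero];
videoComposition.renderSize = CGSizeMake(300, 300);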
