AVAsset preferred transform for a custom AVVideoComposition - ios

I have created a custom AVVideoComposition class and used it like this:
AVAsset *asset = ...
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.customVideoCompositorClass = [MyCustomCompositor class];
MyCustomInstruction *instruction = // custom instruction holding CIFilter that is applied to every video frame
videoComposition.instructions = @[instruction];
Then the export session is used like this:
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
session.outputURL = ...
session.outputFileType = AVFileTypeQuickTimeMovie;
session.videoComposition = videoComposition;
[session exportAsynchronouslyWithCompletionHandler:^{
...
}];
According to the documentation, a track's preferredTransform is ignored when an AVVideoComposition is in use. And with a custom AVVideoComposition instruction I can't use AVMutableVideoCompositionLayerInstruction's setTransform:atTime:.
How do I get the video with the correct orientation?
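One possible approach (a sketch, not from the thread; the transform property on MyCustomInstruction is hypothetical) is to read the track's preferredTransform yourself, store it on the custom instruction, and apply it to each frame inside the compositor:
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
instruction.transform = videoTrack.preferredTransform; // hypothetical property on MyCustomInstruction
// For a 90°/270° rotation, also swap videoComposition.renderSize's width and height.
// Then, inside MyCustomCompositor's startVideoCompositionRequest:, apply it per frame:
// CVPixelBufferRef buffer = [request sourceFrameByTrackID:trackID];
// CIImage *frame = [CIImage imageWithCVPixelBuffer:buffer];
// CIImage *oriented = [frame imageByApplyingTransform:instruction.transform];
// (a rotation may also need a translation to keep the frame inside the render rect)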

Related

Is it possible to fully crop a video with AVAssetWriter?

I am using AVAssetWriter to create a video file, and I want to crop it. I came upon the keys for AVVideoCleanApertureKey. This lets me set a viewport for the video, which looks like it is cropping the video. But in reality it doesn't: the full frame is still present in the video file, and whether the "crop" is honored seems to be up to the video player.
So, is there a way to discard the "outside" data around the viewport with AVAssetWriter? Or do I need to change my approach completely?
These images show the same video file, but only QuickTime cares about the viewport.
I believe the direct answer to the question is NO. So in the end I switched over to AVMutableVideoComposition and AVAssetExportSession as other answers around the web suggested.
AVAsset *video = [AVAsset assetWithURL:outputURL];
AVAssetTrack *assetVideoTrack = [[video tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:assetVideoTrack];
CGAffineTransform transform = CGAffineTransformMakeTranslation(-self.rect.origin.x, -self.rect.origin.y);
[layerInstruction setTransform:transform atTime:kCMTimeZero];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.layerInstructions = @[layerInstruction];
instruction.timeRange = assetVideoTrack.timeRange;
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// https://stackoverflow.com/questions/22883525/avassetexportsession-giving-me-a-green-border-on-right-and-bottom-of-output-vide
videoComposition.renderSize = CGSizeMake(floor(self.rect.size.width / 16) * 16,
floor(self.rect.size.height / 16) * 16);
videoComposition.renderScale = 1.0;
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.instructions = @[instruction];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPreset1280x720];
exportSession.shouldOptimizeForNetworkUse = NO;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.videoComposition = videoComposition;
exportSession.outputURL = outputURL2;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
NSLog(@"done processing video!");
}];

Crash when applying filter to AVVideoComposition

I want to apply a filter to an AVVideoComposition created with the function:
init(asset: AVAsset, applyingCIFiltersWithHandler: (AVAsynchronousCIImageFilteringRequest) -> Void)
Here the asset is an AVComposition. When an AVPlayerItem plays this composition with the videoComposition, the app crashes with the error:
reason: '*** -[AVCoreImageFilterCustomVideoCompositor startVideoCompositionRequest:] Expecting video composition to contain only AVCoreImageFilterVideoCompositionInstruction'
I wonder how to fix the crash.
PS: I have two video tracks in the composition, and each timeRange has its own instruction.
You can't use both AVVideoCompositionLayerInstruction and applyingCIFiltersWithHandler.
Instead, you need to apply the transform directly in the filter.
This can be done by applying it to the source image.
request.sourceImage.transformed(by: transform)
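In Objective-C the same idea looks something like this (a sketch; transform stands for whatever orientation/crop transform you computed, e.g. the track's preferredTransform):
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
// apply the transform to the source image, then run your filter on the result
CIImage *oriented = [request.sourceImage imageByApplyingTransform:transform];
[request finishWithImage:oriented context:nil];
}];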
I guess you are trying to add an AVVideoCompositionLayerInstruction to the AVVideoComposition.
Try the simple approach first and see if you need any changes:
AVURLAsset *asset = [AVURLAsset assetWithURL:videoURL];
CIFilter *filter = [CIFilter filterWithName:@"CIHueAdjust"]; // the filter you want to add: https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {
// set filter input image
[filter setDefaults];
[filter setValue:request.sourceImage forKey:kCIInputImageKey];
// hue
NSNumber *angle = [NSNumber numberWithFloat:0.8];
[filter setValue:angle forKey:kCIInputAngleKey];
CIImage *outputImage = filter.outputImage;
[request finishWithImage:outputImage context:nil];
}];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset1920x1080];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
// export the session async
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted:
NSLog(#"Yeah!");
break;
default:
NSLog(#"Nooo!");
break;
}
}];
AVMutableVideoComposition's videoCompositionWithAsset:applyingCIFiltersWithHandler: is only supported on iOS 9+, and if you only need to export a video with a CIFilter, that's fine.
Configuring additional videoComposition instructions will crash, for example adding a rotation layer instruction for an AVAssetTrack.
I had the same issue in this situation, and a custom AVVideoCompositing may be the better solution. There are some good demos about how to customize the video compositor: demo1 demo2.
I am working on this problem this week.
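For reference, a minimal skeleton of a custom compositor (a sketch only; the actual rendering into the destination buffer is omitted):
@interface MyCompositor : NSObject <AVVideoCompositing>
@end
@implementation MyCompositor
- (NSDictionary *)sourcePixelBufferAttributes {
return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}
- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
// cache the context here if you need its size or pixel buffer pool
}
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
// grab a source frame, render the processed result into a fresh buffer, hand it back
CVPixelBufferRef destination = [request.renderContext newPixelBuffer];
// ... render the filtered/transformed frame into destination here ...
[request finishWithComposedVideoFrame:destination];
CVBufferRelease(destination);
}
@end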

Cropping a specified CGRect from a video in iOS

I want to crop a specified CGRect region from my video file.
I followed this tutorial.
I'm using the renderSize property of AVMutableVideoComposition to get to a specified size. Here is the code I use to crop a video to a specified size.
- (void) cropVideoAtPath:(NSString *) path
{
//load our movie Asset
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
//create an avassetrack with our asset
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//create a video composition and preset some settings
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
//here we are reducing the render size by deltaHeight (cropping the bottom)
float deltaHeight = 100;
videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.width, clipVideoTrack.naturalSize.height-deltaHeight);
//create a video instruction
AVMutableVideoCompositionInstruction *instruction =
[AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
// CGAffineTransform t1 = CGAffineTransformMakeTranslation(0, 1000);
// CGAffineTransform finalTransform = t1;
// [transformer setTransform:finalTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
//Create an Export Path to store the cropped video
NSString * documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *exportPathLoc = [documentsPath stringByAppendingFormat:@"/CroppedVideo.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPathLoc];
//Remove any previous videos at that path
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];
//Export
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ;
exporter.videoComposition = videoComposition;
exporter.outputURL = exportUrl;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(#"Saved");
UISaveVideoAtPathToSavedPhotosAlbum([[exporter outputURL] path],nil,nil,nil);
});
}];
}
My original video looks like this one.
My video after applying renderSize to the AVMutableVideoComposition looks like this one.
As we can see, the bottom portion is successfully clipped away from the original video.
Let's say I have a video of size (1000, 1000) and I want only the center (500, 500) region of that video. So my CGRect would be (250, 250, 500, 500).
In my case I want only the region where the image is present, so I want to crop out the top bar and bottom bar.
So I applied a CGAffineTransform to the AVMutableVideoCompositionLayerInstruction like this:
CGAffineTransform t1 = CGAffineTransformMakeTranslation(0, deltaHeight);
CGAffineTransform finalTransform = t1;
[transformer setTransform:finalTransform atTime:kCMTimeZero];
which resulted in the image below.
So how can I apply this (x, y) value when cropping the video, and what am I doing wrong here? I would appreciate any help.
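A sketch of one way to apply the (x, y) origin, combining renderSize with a negative translation as in the AVAssetWriter answer above (cropRect is assumed to be in the video's pixel coordinate space):
CGRect cropRect = CGRectMake(250, 250, 500, 500); // the region to keep
videoComposition.renderSize = cropRect.size;
CGAffineTransform shift = CGAffineTransformMakeTranslation(-cropRect.origin.x, -cropRect.origin.y);
[transformer setTransform:shift atTime:kCMTimeZero];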

Using AVFoundation to crop a video [duplicate]

I'm trying to use AVFoundation to crop videos I'm recording. So let's say I create an AVCaptureVideoPreviewLayer and set the frame to be 300x300.
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.delegate = self;
captureVideoPreviewLayer.frame = CGRectMake(0,0, 300, 300);
[previewView.layer addSublayer:captureVideoPreviewLayer];
The user sees the video cropped. I'd like to save the video exactly the way the user is viewing it. Using AVCaptureMovieFileOutput, the video obviously gets saved without cropping. I was considering using an AVCaptureVideoDataOutput to intercept the frames and crop them myself, but I was wondering if there is a more efficient way to do this, perhaps with AVAssetExportSession and an AVVideoComposition.
Any guidance would be appreciated.
Something like this. 99% of this code just sets it up to do a custom CGAffineTransform and then saves out the result.
I'm assuming that you want the cropped video to take up the full size/width of the output, so that e.g. a scaling transform is the correct solution (you zoom in on the video, giving the effect of having cropped and resized).
AVAsset* asset = // your input
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320, 240);
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) );
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGAffineTransform finalTransform = // setup a transform that grows the video, effectively causing a crop
[transformer setTransform:finalTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = url3; // your output URL
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void){}];
iOS 7 added a layer instruction specifically for cropping: AVMutableVideoCompositionLayerInstruction's setCropRectangle:atTime:.
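Used on the transformer above, that would look something like this (the rectangle is illustrative):
[transformer setCropRectangle:CGRectMake(0, 0, 300, 300) atTime:kCMTimeZero];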

iOS AVFoundation - Show a time display over a video and export

I want to show a display overlay over a video and export that video including this display. I had a look into the AVFoundation framework (AVComposition, AVAsset, etc.), but I still do not have an idea of how to achieve this. There is a class called AVSynchronizedLayer which lets you animate things synchronously with the video, but I do not want to animate; I just want to overlay the time display onto every single frame of the video. Any advice?
Regards
Something like this...
(NB: culled from a much larger project, so I may have included some unnecessary pieces by accident).
You'll need to grab the CALayer of your clock / animation, and set it to the var myClockLayer (used a third of the way down by the animation tool).
This also assumes your incoming video has just two tracks - audio and video. If you have more, you'll need to set the track id in "asTrackID:2" more carefully.
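For reference, a minimal sketch of what myClockLayer might look like (a static CATextLayer; keeping its text up to date is a separate concern):
CATextLayer *myClockLayer = [CATextLayer layer];
myClockLayer.frame = CGRectMake(10, 10, 150, 30);
myClockLayer.string = @"00:00:00";
myClockLayer.fontSize = 24.0;
myClockLayer.foregroundColor = [UIColor whiteColor].CGColor;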
AVURLAsset* url = [AVURLAsset URLAssetWithURL:incomingVideo options:nil];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[url tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
NSError *error = nil;
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:clipVideoTrack atTime:kCMTimeZero error:&error];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320, 240);
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithAdditionalLayer:myClockLayer asTrackID:2];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) );
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = url3; // your output URL
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void){}];
I think you can use AVCaptureVideoDataOutput to process each frame and use AVAssetWriter to record the processed frames. You can refer to this answer:
https://stackoverflow.com/a/4944594/379941 .
Use AVAssetWriterInputPixelBufferAdaptor's appendPixelBuffer:withPresentationTime: method to export.
And I strongly suggest using OpenCV to process frames. This is a nice tutorial:
http://aptogo.co.uk/2011/09/opencv-framework-for-ios/ .
The OpenCV library is great.
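A sketch of the writer side of that suggestion (videoSettings, processedBuffer, and timestamp are placeholders):
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
// in captureOutput:didOutputSampleBuffer:fromConnection:, after processing a frame:
if (writerInput.readyForMoreMediaData) {
[adaptor appendPixelBuffer:processedBuffer withPresentationTime:timestamp];
}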
