Crash when applying a filter to an AVVideoComposition - iOS

I want to apply a filter to an AVVideoComposition using this initializer:
init(asset: AVAsset, applyingCIFiltersWithHandler: (AVAsynchronousCIImageFilteringRequest) -> Void)
Here, the asset is an AVComposition. When an AVPlayerItem plays this composition with the videoComposition, the app crashes with:
reason: '*** -[AVCoreImageFilterCustomVideoCompositor startVideoCompositionRequest:] Expecting video composition to contain only AVCoreImageFilterVideoCompositionInstruction'
I wonder how to fix the crash.
PS: I have two video tracks in the composition, and each time range has its own instruction.

You can't use both AVVideoCompositionLayerInstruction and applyingCIFiltersWithHandler: the handler-based initializer builds its own instructions, and as the error message says, the compositor expects the composition to contain only AVCoreImageFilterVideoCompositionInstruction.
Instead, you need to apply the transform directly in the filter handler, by applying it to the source image:
request.sourceImage.transformed(by: transform)
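A minimal Swift sketch of that idea, assuming a single video track whose preferredTransform carries the orientation (the CIHueAdjust filter is just an example, and you may need to correct the frame's origin after transforming for your particular composition):

```swift
import AVFoundation
import CoreImage

// Builds a filtering video composition that bakes the track's
// preferredTransform into the source image before filtering.
func makeFilteredComposition(for asset: AVAsset) -> AVVideoComposition {
    let transform = asset.tracks(withMediaType: .video).first?.preferredTransform ?? .identity
    return AVVideoComposition(asset: asset) { request in
        // Apply the orientation transform to the source frame first...
        let oriented = request.sourceImage.transformed(by: transform)
        // ...then run the filter on the oriented image.
        let filter = CIFilter(name: "CIHueAdjust")!
        filter.setValue(oriented, forKey: kCIInputImageKey)
        filter.setValue(0.8, forKey: kCIInputAngleKey)
        request.finish(with: filter.outputImage ?? oriented, context: nil)
    }
}
```

Because the handler supplies the whole frame pipeline, any geometry you would otherwise express as a layer instruction has to happen here instead.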

I guess you are trying to add an AVVideoCompositionLayerInstruction to the AVVideoComposition.
Try the simple approach first and see whether you need any changes:
AVURLAsset *asset = [AVURLAsset assetWithURL:videoURL];
CIFilter *filter = [CIFilter filterWithName:@"CIHueAdjust"]; // the filter you want to add: https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {
    // set the filter's input image
    [filter setDefaults];
    [filter setValue:request.sourceImage forKey:kCIInputImageKey];
    // hue
    NSNumber *angle = [NSNumber numberWithFloat:0.8];
    [filter setValue:angle forKey:kCIInputAngleKey];
    CIImage *outputImage = filter.outputImage;
    [request finishWithImage:outputImage context:nil];
}];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset1920x1080];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
// export the session asynchronously
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Yeah!");
            break;
        default:
            NSLog(@"Nooo!");
            break;
    }
}];

videoCompositionWithAsset:applyingCIFiltersWithHandler: is only supported on iOS 9+. If all you need is to export a video with a CIFilter applied, it works fine.
Configuring additional videoComposition instructions will crash, for example adding any kind of rotation layer instruction to an AVAssetTrack.
I had the same issue in this situation, and a custom AVVideoCompositing class is probably the better solution. There are some good demos on how to customize the video compositor: demo1, demo2.
I am working on this problem this week.

Related

Green layer appear each time exporting video using AVExportSession

I am exporting a video with some modifications.
After I get the exported video, I have the option to apply more modifications like watermarking, cropping, and rotation. When I apply these transformations and export again, the video quality is degraded and a strange green layer starts appearing. Every time I re-export the video, the green layer gets darker. I am using the code below for exporting.
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
export.outputURL = [NSURL fileURLWithPath:exportPath];
if (videoComposition) {
    export.videoComposition = videoComposition;
}
if (audioComposition) {
    export.audioMix = audioComposition;
}
export.outputFileType = AVFileTypeQuickTimeMovie;
export.shouldOptimizeForNetworkUse = YES;
[export exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            completionHandler([export.outputURL relativePath], nil);
        } else {
            completionHandler(nil, export.error.localizedDescription);
        }
    });
}];

AVAssetExportSession exporting too slow

I am trying to export an AVMutableComposition using AVAssetExportSession.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = mainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    switch (exporter.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Video merged successfully");
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Failed: %@", exporter.error.description);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Canceled: %@", exporter.error);
            break;
        case AVAssetExportSessionStatusExporting:
            NSLog(@"Exporting!");
            break;
        case AVAssetExportSessionStatusWaiting:
            NSLog(@"Waiting");
            break;
        default:
            break;
    }
}];
But exporting even a 1-minute video takes around 30 seconds, which is too long considering the iPad's built-in camera app takes less than 2 seconds.
Also, if I remove the videoComposition from the exporter, the time drops to 7 seconds, which is still bad considering the video is only 1 minute long.
So, how can I reduce the export time to a minimum?
Also, does AVAssetExportSession generally take this long, or is it just my case?
Update:
Merge Code:
AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableVideoCompositionLayerInstruction *videoTrackLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
NSMutableArray *instructions = [NSMutableArray new];
CGSize size = CGSizeZero;
CMTime time = kCMTimeZero;
for (AVURLAsset *asset in assets) {
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
    NSError *error;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:assetTrack
                                    atTime:time
                                     error:&error];
    [videoTrackLayerInstruction setTransform:assetTrack.preferredTransform atTime:time];
    if (error) {
        NSLog(@"asset url :: %@", assetTrack.asset);
        NSLog(@"Error1 - %@", error.debugDescription);
    }
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAssetTrack.timeRange.duration)
                                   ofTrack:audioAssetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"Error2 - %@", error.debugDescription);
    }
    time = CMTimeAdd(time, assetTrack.timeRange.duration);
    if (CGSizeEqualToSize(size, CGSizeZero)) {
        size = assetTrack.naturalSize;
    }
}
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
mainInstruction.layerInstructions = [NSArray arrayWithObject:videoTrackLayerInstruction];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = size;
I think the issue here is AVAssetExportPresetHighestQuality: it forces conversion or upsampling, which slows things way down. Try AVAssetExportPresetPassthrough instead; it took my export times from ~35+ seconds down to less than a second. (Note that the passthrough preset copies samples without re-encoding, so it cannot apply a videoComposition.)
I also disabled optimizing for network use, as all of our videos are only used within the app and never streamed or passed over a network.
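A minimal Swift sketch of a pass-through export (outputURL here is a placeholder for wherever you write the file):

```swift
import AVFoundation

// Exports without re-encoding; fast, but incompatible with a videoComposition.
func exportPassthrough(asset: AVAsset, to outputURL: URL,
                       completion: @escaping (AVAssetExportSession.Status) -> Void) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetPassthrough) else {
        completion(.failed)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov            // passthrough keeps the source codec
    session.shouldOptimizeForNetworkUse = false
    session.exportAsynchronously {
        completion(session.status)           // no re-encode, so this completes quickly
    }
}
```

If you do need a videoComposition (transforms, filters), you are back to a re-encoding preset, and the trade-off is export time.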
I built an app that merges together different video fragments, and I can safely say it is your case. My video files are ~10 MB each, so maybe they are a little smaller than yours, but it takes less than a second to merge them all, even with 10 or 20 segments.
As for what is actually happening: I checked my configuration against yours, and the differences are the following:
I use export.outputFileType = AVFileTypeMPEG4
I have network optimization disabled; if you are not planning to stream the video from your device, you should disable it too
Other than that it should be the same. I can't really compare further without seeing how you actually create the composition, but there are some things to check:
If you are using AVURLAssetPreferPreciseDurationAndTimingKey when creating the AVURLAsset and you don't have enough keyframes, it can take quite some time to seek through and find them, which slows things down
Consider whether you really need the highest quality for the video
Consider the resolution of the video, and possibly lower it
I should be able to help you more if you provide more information, but maybe some of this will work. Give it a try and then report back.
Hope it helps a little!
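To illustrate the first point, the precise-timing option is chosen when the asset is created; leaving it false avoids the extra keyframe scanning (videoURL is a placeholder):

```swift
import AVFoundation

let videoURL = URL(fileURLWithPath: "video.mov")  // placeholder path

// Fast, approximate timing: fine for playback and simple exports.
let quickAsset = AVURLAsset(url: videoURL,
                            options: [AVURLAssetPreferPreciseDurationAndTimingKey: false])

// Precise timing: needed for frame-accurate editing, but can be slow
// on files with sparse keyframes.
let preciseAsset = AVURLAsset(url: videoURL,
                              options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
```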
Edit 1:
I forgot to mention that if you are out of options, you should try the FFmpeg library, as it is very high-performance, though due to licensing it might not be suitable for you.
Keep in mind that the asset you're trying to export may not be stored locally; in that case the content is downloaded first, and only then are your assets exported.
If you don't want to download any content:
let videoRequestOptions: PHVideoRequestOptions = PHVideoRequestOptions()
videoRequestOptions.isNetworkAccessAllowed = false
You'll also receive a couple of useful info values within the requestExportSession completion handler:
https://developer.apple.com/documentation/photokit/phimagemanager/image_result_info_keys
Otherwise, if you want to download your asset from iCloud as fast as possible, you can play with the following parameters:
let videoRequestOptions: PHVideoRequestOptions = PHVideoRequestOptions()
// automatic, highQualityFormat, mediumQualityFormat, fastFormat
videoRequestOptions.deliveryMode = .fastFormat
videoRequestOptions.isNetworkAccessAllowed = true
Another important property is the export preset; there's a bunch of available presets:
let lowQualityPreset1 = AVAssetExportPresetLowQuality
let lowQualityPreset2 = AVAssetExportPreset640x480
let lowQualityPreset3 = AVAssetExportPreset960x540
let lowQualityPreset4 = AVAssetExportPreset1280x720
let manager = PHImageManager()
manager.requestExportSession(forVideo: asset,
                             options: videoRequestOptions,
                             exportPreset: lowQualityPreset1) { (session, info) in
    session?.outputURL = outputUrl
    session?.outputFileType = .mp4
    session?.shouldOptimizeForNetworkUse = true
    session?.exportAsynchronously {
    }
}

How can I set the artwork for an M4A file using AVFoundation on iOS?

I have an M4A file and I'd like to set its artwork from a UIImage. Here's what I've got so far:
AVURLAsset *asset = [AVURLAsset assetWithURL:outputURL];
AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
item.key = AVMetadataCommonKeyArtwork;
item.keySpace = AVMetadataKeySpaceiTunes;
UIImage *image = [UIImage imageWithContentsOfFile:path];
NSData *imageData = UIImagePNGRepresentation(image);
item.value = imageData;
NSArray *metadata = [NSArray arrayWithObject:item];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.metadata = metadata;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // ...
}];
I've tried a few different things, including:
AVMetadataKeySpaceCommon
JPEG images
Exporting to a different URL than the one of the existing file
Metadata item value as a dictionary with a data key instead of just the data
I'm trying to extrapolate based on examples of how to get metadata from a file, because I can't find any documentation about this anywhere. Any help is appreciated.

AVAsset preferred transform for a custom AVVideoComposition

I have created a custom AVVideoComposition class and used it like this:
AVAsset *asset = ...
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.customVideoCompositorClass = [MyCustomCompositor class];
MyCustomInstruction *instruction = // custom instruction holding the CIFilter that is applied to every video frame
videoComposition.instructions = @[instruction];
Then I used an export session like this:
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
session.outputURL = ...
session.outputFileType = AVFileTypeQuickTimeMovie;
session.videoComposition = videoComposition;
[session exportAsynchronouslyWithCompletionHandler:^{
    ...
}];
According to the documentation, a track's preferredTransform is not applied when an AVVideoComposition is used. And with a custom AVVideoComposition instruction, I can't set an AVMutableVideoCompositionLayerInstruction with setTransform:atTime: either.
How do I get the video with the correct orientation?

How to set metadata for AVFileTypeMPEG4 file via AVAssetExportSession?

I'm trying to use AVAssetExportSession to set the metadata of an AVFileTypeMPEG4 file, but it doesn't work. If I change the file type to AVFileTypeQuickTimeMovie, it works. But I need an MP4 file, and I can't find any documentation saying that metadata cannot be set on an AVFileTypeMPEG4 file. Has anyone set metadata successfully?
Here is the code that I used:
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *metaItem = [AVMutableMetadataItem metadataItem];
metaItem.key = AVMetadataCommonKeySource;
metaItem.keySpace = AVMetadataKeySpaceCommon;
metaItem.value = @"Test metadata";
[metadata addObject:metaItem];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
exportSession.metadata = metadata;
exportSession.audioMix = audioMix;
exportSession.videoComposition = videoComposition;
exportSession.outputFileType = AVFileTypeMPEG4; //AVFileTypeQuickTimeMovie;
NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"testMetadata.mp4"];
exportSession.outputURL = [NSURL fileURLWithPath:outputFilePath];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // todo
        } else {
            // todo
        }
    });
}];
Try it with:
metaItem.key = AVMetadataiTunesMetadataKeyDescription;
metaItem.keySpace = AVMetadataKeySpaceiTunes;
I tried the other keyspaces, but only iTunes worked for me.
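The same iTunes-keyspace approach can also be written in Swift using the identifier-based API, which sets the key and keyspace together (the description string is just an example, and the helper name is my own):

```swift
import AVFoundation

// Builds an iTunes-keyspace "description" metadata item;
// the identifier implies both key and keySpace.
func iTunesDescriptionItem(_ text: String) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = .iTunesMetadataDescription
    item.value = text as NSString
    return item
}

// usage: exportSession.metadata = [iTunesDescriptionItem("Test metadata")]
```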
Apple filters metadata depending on the output type. They don't consider iTunes metadata valid for MPEG-4 output, so they strip it.
Some options:
Use AVFileTypeQuickTimeMovie: MOV is closely related to MP4 and is often compatible; whether this works depends on what you are trying to do.
Try other file types (some people report success with the MPV type).
Use a library to write custom data/atoms (mp4v2 works, for example). It's a lot of work, but it's the only real way to achieve this.
