AVMutableVideoComposition & AVVideoCompositionCoreAnimationTool color glitch - iOS

I'm trying to add an image to a video using the following code:
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:[self assetURL] options:nil];
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize = CGSizeApplyAffineTransform(videoTrack.naturalSize, videoTrack.preferredTransform);
mainCompositionInst.renderSize = naturalSize;
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
CGFloat ratio = naturalSize.width / rectVideo.size.width;
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
videoLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
[parentLayer addSublayer:videoLayer];
for(DraggableImageView *iv in imageViews){
CALayer *overlayLayer = [CALayer layer];
[overlayLayer setContents:(id)[[self normalizedImage:iv.image] CGImage]];
overlayLayer.frame = CGRectMake(iv.frame.origin.x * ratio, (rectVideo.size.height - iv.frame.origin.y - iv.frame.size.height) * ratio, iv.frame.size.width * ratio, iv.frame.size.height * ratio);
[overlayLayer setMasksToBounds:YES];
overlayLayer.opacity = 0;
overlayLayer.masksToBounds = YES;
CGFloat duration = CMTimeGetSeconds(iv.endTime) - CMTimeGetSeconds(iv.startTime);
CGFloat start = CMTimeGetSeconds(iv.startTime);
if(duration > CMTimeGetSeconds(videoAsset.duration)){
duration = CMTimeGetSeconds(videoAsset.duration);
start = 0;
}
CGFloat fadeInSeconds = iv.fadeInSeconds;
CGFloat fade = fadeInSeconds / duration;
CAKeyframeAnimation *anim = [[CAKeyframeAnimation alloc] init];
anim.keyPath = @"opacity";
anim.values = @[@0, @1, @1, @0];
NSMutableArray *keyTimes = [NSMutableArray new];
[keyTimes addObject:@0];
[keyTimes addObject:@(fade)];
[keyTimes addObject:@(1.0 - fade)];
[keyTimes addObject:@1];
anim.keyTimes = keyTimes;
anim.duration = duration;
anim.beginTime = AVCoreAnimationBeginTimeAtZero + start;
[overlayLayer addAnimation:anim forKey:@"anim"];
[parentLayer addSublayer:overlayLayer];
}
mainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
NSURL *exportURL = [self exportUrl];
exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = exportURL;
exportSession.videoComposition = mainCompositionInst;
exportSession.outputFileType = _recordSession.fileType;
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
exportProgressTimer = [NSTimer scheduledTimerWithTimeInterval:.1 target:self selector:@selector(exportDidProgress:) userInfo:nil repeats:YES];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted:
[self savingComplete:exportSession.error];
break;
case AVAssetExportSessionStatusFailed:
[self savingComplete:exportSession.error];
break;
case AVAssetExportSessionStatusCancelled:
break;
default:
break;
}
}];
This code works great on an iPhone SE, but on an iPad Air 2 it produces this color glitch on the right edge of the video (zoomed screenshot):
The glitch is not static; its size and vertical position change over time.
I don't think this is a problem with the video size (the well-known green-line artifact on the edge when the video dimensions are not divisible by 4 or 16); this video is 1080x1920.
Any ideas?
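One thing that may be worth checking (an assumption, not a confirmed fix): 1080 is not divisible by 16 (1080 / 16 = 67.5), so even at 1080x1920 the encoder may pad the right edge on some hardware. Also, `CGSizeApplyAffineTransform` can return negative components for rotated tracks, so the render size should be passed through `fabs`. A sketch of both, with `ALAlignedRenderSize` being a hypothetical helper and 16 an assumed block size:

```objc
// Hypothetical helper: take the absolute value of the transformed size, then
// round each dimension down to a multiple (16 is an assumption based on
// common encoder block sizes), so the encoder does not pad the frame edge.
static CGSize ALAlignedRenderSize(CGSize size, CGFloat multiple) {
    CGFloat w = floor(fabs(size.width) / multiple) * multiple;
    CGFloat h = floor(fabs(size.height) / multiple) * multiple;
    return CGSizeMake(w, h);
}

// Usage (hypothetical):
// mainCompositionInst.renderSize = ALAlignedRenderSize(naturalSize, 16.0);
```

Note this clamps 1080 down to 1072, so it trades a few pixels of width for a clean edge; it is only a diagnostic to confirm whether alignment is the cause.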

Related

Issue with add watermark on video

I am trying to add an image to a video. Everything works fine except one thing: the image is distorted.
Here is the code :
//Capture the image
UIGraphicsBeginImageContextWithOptions(self.captureView.bounds.size, false, UIScreen.main.scale)
self.captureView.layer.render(in: UIGraphicsGetCurrentContext()!)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
let watermarkVideo = WatermakVideo()
//video file
let videoFile = Bundle.main.path(forResource: "videoTrim", ofType: "mp4")
let videoURL = URL(fileURLWithPath: videoFile!)
let imageFrame = captureView.frame
watermarkVideo.createWatermark(image, frame: imageFrame, video: videoURL)
Here is the WatermakVideo class:
https://www.dropbox.com/s/0d6i7ap9qu4klp5/WatermakVideo.zip
I would be grateful if you could help me fix this issue.
Copy the code below into your file. I had the same issue and solved it two weeks ago:
-(void)forStackOverflow:(NSURL*)url{
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:url options:nil];
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
//If you need audio as well add the Asset Track for audio here
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
CGSize sizeOfVideo=compositionVideoTrack.naturalSize;
CGFloat scaleWidth = sizeOfVideo.height/self.view.frame.size.width;
CGFloat scaleHeight = sizeOfVideo.width/self.view.frame.size.height;
// add image
UIImage *myImage=[UIImage imageNamed:@"YOUR IMAGE PATH"];
CALayer *layerCa = [CALayer layer];
layerCa.contents = (id)myImage.CGImage;
layerCa.frame = CGRectMake(5*scaleWidth, 0, self.birdSize.width*scaleWidth, self.birdSize.height*scaleWidth);
layerCa.opacity = 1.0;
// add Text on image
CATextLayer *textOfvideo=[[CATextLayer alloc] init];
textOfvideo.alignmentMode = kCAAlignmentLeft;
[textOfvideo setFont:(__bridge CFTypeRef)([UIFont fontWithName:@"Arial" size:64.00])];//fontUsed is the name of font
[textOfvideo setFrame:CGRectMake(layerCa.frame.size.width/6, layerCa.frame.size.height/8*7-layerCa.frame.size.height/3, layerCa.frame.size.width/1.5, layerCa.frame.size.height/3)];
[textOfvideo setAlignmentMode:kCAAlignmentCenter];
[textOfvideo setForegroundColor:[[UIColor redColor] CGColor]];
UILabel*label = [[UILabel alloc]init];
[label setText:self.questionString];
label.textAlignment = NSTextAlignmentCenter;
label.numberOfLines = 4;
label.adjustsFontSizeToFitWidth = YES;
[label setFont:[UIFont fontWithName:@"Arial" size:64.00]];
//[label.layer setBackgroundColor:[[UIColor blackColor] CGColor]];
[label.layer setFrame:CGRectMake(0, 0, textOfvideo.frame.size.width, textOfvideo.frame.size.height)];
[textOfvideo addSublayer:label.layer];
[layerCa addSublayer:textOfvideo];
CALayer *parentLayer=[CALayer layer];
CALayer *videoLayer=[CALayer layer];
parentLayer.frame=CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
videoLayer.frame=CGRectMake(0, 0, sizeOfVideo.height,sizeOfVideo.width);
[parentLayer addSublayer:videoLayer];
//[parentLayer addSublayer:optionalLayer];
[parentLayer addSublayer:layerCa];
[parentLayer setBackgroundColor:[UIColor blueColor].CGColor];
AVMutableVideoComposition *videoComposition=[AVMutableVideoComposition videoComposition] ;
videoComposition.frameDuration=CMTimeMake(1, 30);
videoComposition.animationTool=[AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoCompositionLayerInstruction* layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[layerInstruction setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
CGSize naturalSize = videoTrack.naturalSize;
// The source track is portrait, so swap width and height for the render size.
videoComposition.renderSize = CGSizeMake(naturalSize.height, naturalSize.width);
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition=videoComposition;
exportSession.outputURL = [NSURL fileURLWithPath:destinationPath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status)
{
case AVAssetExportSessionStatusCompleted:
NSLog(@"Export OK");
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(destinationPath)) {
UISaveVideoAtPathToSavedPhotosAlbum(destinationPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
}
break;
case AVAssetExportSessionStatusFailed:
NSLog(@"AVAssetExportSessionStatusFailed: %@", exportSession.error);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(@"Export Cancelled");
break;
}
self.currentUrl = exportSession.outputURL;
dispatch_async(dispatch_get_main_queue(), ^{
});
}];
}

Adding an image as a watermark on a video results in an inverted video

I am adding an image watermark to a video using the following code, but the resulting video's frames are rotated by 180 degrees, and I have tried every solution I could find to stop it. I just want the watermarked video to have the same orientation as the source video. Please suggest a solution.
-(void)watermarkVideoAtURL:(NSURL *)url fb:(BOOL)fb withCompletionHandler:(void(^)(bool success, NSURL *assetsURL, NSError *error))completionHandler {
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:url options:nil];
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject] preferredTransform]];
CGSize sizeOfVideo = [videoAsset naturalSize];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
videoLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
// Image of watermark
UIImage *myImage = [self imageByApplyingAlpha:watermarkOpacityFactor toImage:[UIImage imageNamed:@"iconbig"]];
CALayer *layerCa = [CALayer layer];
layerCa.contents = (id)myImage.CGImage;
layerCa.frame = CGRectMake(10, sizeOfVideo.height - 50, 50, 50);
layerCa.opacity = 1.0;
CALayer *layerCa2 = [CALayer layer];
layerCa2.contents = (id)myImage.CGImage;
layerCa2.frame = CGRectMake(sizeOfVideo.width - 60, 10, 50, 50);
layerCa2.opacity = 1.0;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:layerCa];
[parentLayer addSublayer:layerCa2];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = sizeOfVideo;
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject:instruction];
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition = videoComposition;
exportSession.outputURL = [NSURL fileURLWithPath:destinationPath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted: {
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:exportSession.outputURL completionBlock:^(NSURL *assetURL, NSError *error) {
if (!error) {
completionHandler(YES, assetURL, nil);
} else {
completionHandler(NO, nil, error);
}
}];
}
break;
case AVAssetExportSessionStatusFailed: {
completionHandler(NO, nil, exportSession.error);
}
break;
case AVAssetExportSessionStatusCancelled: {
completionHandler(NO, nil, exportSession.error);
}
break;
default:
break;
}
}];
}
Try setting the AVAssetTrack's preferredTransform on the layer instruction. From the documentation for setTransform:atTime::
Sets a fixed transform to apply from the specified time until the next time at which a transform is set. [...] Before the first specified time for which a transform is set, the affine transform is held constant at the value of CGAffineTransformIdentity; after the last time for which a transform is set, the affine transform is held constant at that last value.
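As an untested sketch against the question's variable names, the suggestion amounts to:

```objc
AVAssetTrack *assetVideoTrack =
    [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];

// Apply the source track's transform in the layer instruction so frames are
// rendered in their display orientation rather than their stored orientation.
[layerInstruction setTransform:assetVideoTrack.preferredTransform atTime:kCMTimeZero];

// Match the render size to the transformed dimensions; fabs guards against
// negative components produced by rotation transforms.
CGSize transformed = CGSizeApplyAffineTransform(assetVideoTrack.naturalSize,
                                                assetVideoTrack.preferredTransform);
videoComposition.renderSize = CGSizeMake(fabs(transformed.width),
                                         fabs(transformed.height));
```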

Cannot add a watermark to videos whose track's preferredTransform is not the identity matrix using AVFoundation?

I want to add a watermark image to the bottom-left corner of downloaded videos. The code below works when the video track's preferredTransform property is CGAffineTransformIdentity (meaning no real transformation), but it fails for videos that do have a transformation. How can I fix it?
A sample video (with a transformation) url is here
By the way, the error code that AVFoundation reports is -11841.
+(void) addWatermarkWithInputFile:(NSString *)inputFile outputFile:(NSString *)outputFile completion:(void (^)(BOOL))completion{
[[NSFileManager defaultManager] removeItemAtPath:outputFile error:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:inputFile] options:nil];
AVMutableComposition* composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
AVAssetTrack *track = [videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
{
AVMutableCompositionTrack * composedTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
composedTrack.preferredTransform = track.preferredTransform;
[composedTrack insertTimeRange:timeRange
ofTrack:[videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject
atTime:kCMTimeZero
error:nil];
}
{
AVMutableCompositionTrack * composedTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
composedTrack.preferredTransform = track.preferredTransform;
[composedTrack insertTimeRange:timeRange
ofTrack:[videoAsset tracksWithMediaType:AVMediaTypeAudio].firstObject
atTime:kCMTimeZero
error:nil];
}
AVAssetTrack *clipVideoTrack = [videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
CGSize videoSize = clipVideoTrack.naturalSize;
AVMutableVideoComposition* videoComp = [AVMutableVideoComposition videoComposition];
videoComp.renderSize = videoSize;
videoComp.frameDuration = CMTimeMake(1, 30);
{
UIImage *myImage = [UIImage imageNamed:@"video-watermark"];
CALayer *aLayer = [CALayer layer];
aLayer.contents = (id)myImage.CGImage;
aLayer.frame = CGRectMake(10, 10, myImage.size.width * myImage.scale, myImage.size.height * myImage.scale);
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:aLayer];
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
/// instruction
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = timeRange;
AVMutableVideoCompositionLayerInstruction* layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
instruction.layerInstructions = @[layerInstruction];
videoComp.instructions = @[instruction];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = [NSURL fileURLWithPath:outputFile];
exporter.outputFileType = AVFileTypeMPEG4;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = videoComp;
exporter.timeRange = timeRange;
[exporter exportAsynchronouslyWithCompletionHandler:^{
completion(exporter.status == AVAssetExportSessionStatusCompleted);
}];
}
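For what it's worth, -11841 is AVErrorInvalidVideoComposition. A common cause (an assumption here, not confirmed for this sample) is that the layer instruction never receives the track's transform while the render size is left at the untransformed naturalSize. A sketch of what the instruction block might set instead, using the question's variable names:

```objc
// Untested sketch: apply the transform in the layer instruction and derive
// the render size from the transformed natural size, so the composition's
// geometry is self-consistent for rotated tracks.
CGAffineTransform t = clipVideoTrack.preferredTransform;
[layerInstruction setTransform:t atTime:kCMTimeZero];

CGSize transformed = CGSizeApplyAffineTransform(clipVideoTrack.naturalSize, t);
videoComp.renderSize = CGSizeMake(fabs(transformed.width), fabs(transformed.height));
```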

Crash when using AVAssetExportSession

I use AVAssetExportSession to save a video with a watermark, following the linked example.
But my app crashes at this line:
[exporter exportAsynchronouslyWithCompletionHandler:^{ }];
My code:
- (void)mixVideoAsset{
AVAsset *videoAsset = [AVAsset assetWithURL:_urlVideo];
// 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
// 3 - Video track
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// - Audio
AVMutableCompositionTrack *audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioTrack.timeRange.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil];
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
isVideoAssetPortrait_ = YES;
}
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];
// 3.3 - Add instructions
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
[self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize];
// 5 - Create exporter
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
NSString *exportPath = [PATH_VIDEO_FOLDER stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov",[VideoHelper getVideoDateByDate:[NSDate date]]]];
CMTime start = CMTimeMakeWithSeconds(mySAVideoRangeSlider.leftPosition, videoAsset.duration.timescale);
CMTime duration = CMTimeMakeWithSeconds(VIDEO_TIME_TRIM_SECONDS, videoAsset.duration.timescale);
CMTimeRange range = CMTimeRangeMake(start, duration);
exporter.timeRange = range;
exporter.outputURL = [NSURL fileURLWithPath:exportPath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
// exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(@"Export Complete %ld %@", (long)exporter.status, exporter.error);
[VideoHelper saveThumbnailByVideoURL:[NSURL fileURLWithPath:exportPath]];
});
}];
}
applyVideoEffectsToComposition
- (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition size:(CGSize)size
{
// - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:videoLayer];
// - set up the overlay
CALayer *overlayLayer = [CALayer layer];
overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
CAKeyframeAnimation *animation = [self animationForGifWithURL:urlEffect];
[overlayLayer addAnimation:animation forKey:@"contents"];
[parentLayer addSublayer:overlayLayer];
// - apply magic
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
animationForGifWithURL
- (CAKeyframeAnimation *)animationForGifWithURL:(NSURL *)url {
CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
NSMutableArray * frames = [NSMutableArray new];
NSMutableArray *delayTimes = [NSMutableArray new];
CGFloat totalTime = 0.0;
CGImageSourceRef gifSource = CGImageSourceCreateWithURL((CFURLRef)url, NULL);
// get frame count
size_t frameCount = CGImageSourceGetCount(gifSource);
for (size_t i = 0; i < frameCount; ++i) {
// get each frame
CGImageRef frame = CGImageSourceCreateImageAtIndex(gifSource, i, NULL);
[frames addObject:(__bridge id)frame];
CGImageRelease(frame);
// get gif info with each frame
NSDictionary *dict = (NSDictionary*)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
// kCGImagePropertyGIFDelayTime and kCGImagePropertyGIFUnclampedDelayTime in kCGImagePropertyGIFDictionary have the same value
NSDictionary *gifDict = [dict valueForKey:(NSString*)kCGImagePropertyGIFDictionary];
[delayTimes addObject:[gifDict valueForKey:(NSString*)kCGImagePropertyGIFDelayTime]];
totalTime = totalTime + [[gifDict valueForKey:(NSString*)kCGImagePropertyGIFDelayTime] floatValue];
CFRelease((__bridge CFTypeRef)(dict));
}
if (gifSource) {
CFRelease(gifSource);
}
NSMutableArray *times = [NSMutableArray arrayWithCapacity:3];
CGFloat currentTime = 0;
NSInteger count = delayTimes.count;
for (int i = 0; i < count; ++i) {
[times addObject:[NSNumber numberWithFloat:(currentTime / totalTime)]];
currentTime += [[delayTimes objectAtIndex:i] floatValue];
}
NSMutableArray *images = [NSMutableArray arrayWithCapacity:3];
for (int i = 0; i < count; ++i) {
[images addObject:[frames objectAtIndex:i]];
}
animation.keyTimes = times;
animation.values = images;
animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
animation.duration = totalTime;
animation.repeatCount = HUGE_VALF;
animation.beginTime = AVCoreAnimationBeginTimeAtZero;
animation.removedOnCompletion = NO;
return animation;
}
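One likely culprit worth checking (an assumption, since the crash site is reported elsewhere): in animationForGifWithURL:, dict is obtained via CFBridgingRelease, which already transfers the +1 ownership from the Copy call to ARC, and is then released again with CFRelease. That over-release can corrupt memory and crash at an unrelated point later, such as inside the export callback. A sketch of the corrected loop body:

```objc
// CFBridgingRelease already transfers the +1 from CGImageSourceCopyPropertiesAtIndex
// to ARC, so no further CFRelease of dict is needed.
NSDictionary *dict = (NSDictionary *)CFBridgingRelease(
    CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
NSDictionary *gifDict = dict[(NSString *)kCGImagePropertyGIFDictionary];
NSNumber *delay = gifDict[(NSString *)kCGImagePropertyGIFDelayTime];
[delayTimes addObject:delay];
totalTime += delay.floatValue;
// (the original CFRelease((__bridge CFTypeRef)(dict)) line should be removed)
```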

How to set a background image on AVMutableVideoCompositionInstruction

Setting the backgroundColor works fine, but I am not able to set a background image.
AVMutableVideoCompositionInstruction * mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,assetWithMaxTime.duration);
mainInstruction.enablePostProcessing = NO;
mainInstruction.backgroundColor =[UIColor yellowColor].CGColor;
After trying several things, I finally found a way to make it work. You need to use a blank video/audio track, add the background image as an overlay on this blank video layer, export it, then combine the original asset (video) with the exported asset and export the final video.
Add overlay
- (void)addOverlayImage:(UIImage *)overlayImage ToVideo:(AVMutableVideoComposition *)composition inSize:(CGSize)size {
// 1 - set up the overlay
CALayer *overlayLayer = [CALayer layer];
[overlayLayer setContents:(id)[overlayImage CGImage]];
overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
[overlayLayer setMasksToBounds:YES];
// 2 - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];
// 3 - apply magic
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
- (void)getBackgroundVideoAssetWithcompletion:(void (^)(AVAsset *bgAsset))completionBlock {
NSString *path = [[NSBundle mainBundle] pathForResource:@"blank_video" ofType:@"mp4"];
NSURL *trackUrl = [NSURL fileURLWithPath:path];
AVAsset *asset = [AVAsset assetWithURL:trackUrl];
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, [asset duration]);
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:range ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
CGAffineTransform videoTransform = track.preferredTransform;
CGSize naturalSize = CGSizeApplyAffineTransform(track.naturalSize, videoTransform);
naturalSize = CGSizeMake(fabs(naturalSize.width), fabs(naturalSize.height));
AVMutableVideoComposition *composition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
UIImage *img = [self imageWithImage:[UIImage imageNamed:@"white_image"] convertToSize:naturalSize];
[self addOverlayImage:img ToVideo:composition inSize:naturalSize];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = range;
composition.instructions = #[instruction];
AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
_assetExport.videoComposition = composition;
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"exported-%d.mov", arc4random() % 100000]];
unlink([exportPath UTF8String]);
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
_assetExport.outputFileType = AVFileTypeQuickTimeMovie;
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:^{
switch (_assetExport.status) {
case AVAssetExportSessionStatusFailed:
break;
case AVAssetExportSessionStatusExporting:
break;
case AVAssetExportSessionStatusCompleted:{
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(@"Successful!!!");
AVAsset *finalAsset = [AVAsset assetWithURL:_assetExport.outputURL];
completionBlock(finalAsset);
});
}
break;
default:
break;
}
}];
}
Now there is a video asset with an overlay image. The only remaining step is to combine the original video with the exported video asset; the exported asset should be the bottom layer and the original should be the top layer.
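That final combining step could look roughly like this (untested sketch; originalAsset and exportedAsset are hypothetical names for the source video and the asset returned by the completion block above):

```objc
// Stack the original video over the exported background video using two
// composition tracks. Layer instructions listed first render on top.
AVMutableComposition *final = [AVMutableComposition composition];
AVMutableCompositionTrack *topTrack =
    [final addMutableTrackWithMediaType:AVMediaTypeVideo
                       preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *bottomTrack =
    [final addMutableTrackWithMediaType:AVMediaTypeVideo
                       preferredTrackID:kCMPersistentTrackID_Invalid];

CMTimeRange range = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
[topTrack insertTimeRange:range
                  ofTrack:[[originalAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                   atTime:kCMTimeZero error:nil];
[bottomTrack insertTimeRange:range
                     ofTrack:[[exportedAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                      atTime:kCMTimeZero error:nil];

AVMutableVideoCompositionInstruction *inst =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
inst.timeRange = range;
AVMutableVideoCompositionLayerInstruction *topInst =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:topTrack];
AVMutableVideoCompositionLayerInstruction *bottomInst =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:bottomTrack];
inst.layerInstructions = @[topInst, bottomInst]; // first entry is rendered on top
```

From here the composition is exported with another AVAssetExportSession, as in the earlier snippets.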
