Add text to video for a specific time in iOS

I am creating a video-based application where I have to select a video from the local gallery and add text over it using a CATextLayer. For this I am using the code below:
- (void) createWatermark:(UIImage*)image video:(NSURL*)videoURL
{
if (videoURL == nil)
return;
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:videoURL options:nil];
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack* compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack* clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:clipVideoTrack
atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
// create the layer with the watermark image
CALayer* aLayer = [CALayer layer];
aLayer.contents = (id)image.CGImage;
aLayer.frame = CGRectMake(50, 100, image.size.width, image.size.height);
aLayer.opacity = 0.9;
//sorts the layer in proper order
AVAssetTrack* videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize videoSize = [videoTrack naturalSize];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:aLayer];
// create text Layer
CATextLayer* titleLayer = [CATextLayer layer];
titleLayer.backgroundColor = [UIColor clearColor].CGColor;
titleLayer.string = @"Dummy text";
titleLayer.font = CFBridgingRetain(@"Helvetica");
titleLayer.fontSize = 28;
titleLayer.shadowOpacity = 0.5;
titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.frame = CGRectMake(0, 50, videoSize.width, videoSize.height / 6);
[parentLayer addSublayer:titleLayer];
//create the composition and add the instructions to insert the layer:
AVMutableVideoComposition* videoComp = [AVMutableVideoComposition videoComposition];
videoComp.renderSize = videoSize;
videoComp.frameDuration = CMTimeMake(1, 30);
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
/// instruction
AVMutableVideoCompositionInstruction* instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
AVAssetTrack* mixVideoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoCompositionLayerInstruction* layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mixVideoTrack];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComp.instructions = [NSArray arrayWithObject: instruction];
// export video
_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
_assetExport.videoComposition = videoComp;
NSLog(@"created exporter. supportedFileTypes: %@", _assetExport.supportedFileTypes);
NSString* videoName = @"NewWatermarkedVideo.mov";
NSString* exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL* exportUrl = [NSURL fileURLWithPath:exportPath];
if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
[[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
_assetExport.outputFileType = AVFileTypeQuickTimeMovie;
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:
^(void ) {
//Final code here
switch (_assetExport.status)
{
case AVAssetExportSessionStatusUnknown:
NSLog(#"Unknown");
break;
case AVAssetExportSessionStatusWaiting:
NSLog(#"Waiting");
break;
case AVAssetExportSessionStatusExporting:
NSLog(#"Exporting");
break;
case AVAssetExportSessionStatusCompleted:
NSLog(#"Created new water mark image");
_playButton.hidden = NO;
break;
case AVAssetExportSessionStatusFailed:
NSLog(#"Failed- %#", _assetExport.error);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Cancelled");
break;
}
}
];
}
Using above code, I can able to set a text over my video. But now I have to give either FadeIN/Fade Out animation on CATextLayer OR to display moving text on Video. Please advise me. Thanks in Advance!

Try the CAKeyframeAnimation class declared in the Core Animation framework.
For example:
//start from zero opacity
titleLayer.opacity = 0;
CAKeyframeAnimation *fadeInAndOut = [CAKeyframeAnimation animationWithKeyPath:@"opacity"];
fadeInAndOut.duration = 5.0; // the whole fade in/fade out animation lasts 5 seconds
fadeInAndOut.autoreverses = NO;
// from 0.0 * duration to 0.2 * duration the opacity animates from 0.0 to 1.0
// from 0.2 * duration to 0.8 * duration the opacity stays at 1.0
// from 0.8 * duration to 1.0 * duration the opacity animates from 1.0 back to 0.0
fadeInAndOut.keyTimes = @[@(0.0), @(0.2), @(0.8), @(1.0)];
fadeInAndOut.values = @[@(0.0), @(1.0), @(1.0), @(0.0)];
fadeInAndOut.beginTime = 1.0;
fadeInAndOut.removedOnCompletion = NO;
fadeInAndOut.fillMode = kCAFillModeBoth;
[titleLayer addAnimation:fadeInAndOut forKey:nil];
I hope that's clear. You can animate your label's position the same way.
UPDATE
To animate the position, try this:
CAKeyframeAnimation *fadeInAndOut = [CAKeyframeAnimation animationWithKeyPath:@"position"];
fadeInAndOut.duration = 5.0;
fadeInAndOut.autoreverses = NO;
fadeInAndOut.keyTimes = @[@(0.0), @(0.2), @(0.8), @(1.0)];
fadeInAndOut.values = #[[NSValue valueWithCGPoint:CGPointMake(20, 40)],
[NSValue valueWithCGPoint:CGPointMake(200, 40)],
[NSValue valueWithCGPoint:CGPointMake(200, 40)],
[NSValue valueWithCGPoint:CGPointMake(400, 40)]];
fadeInAndOut.beginTime = 1.0;
fadeInAndOut.removedOnCompletion = NO;
fadeInAndOut.fillMode = kCAFillModeBoth;
[titleLayer addAnimation:fadeInAndOut forKey:nil];
I hardcoded the points for explanatory purposes; you can calculate the points and animation duration yourself.
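For instance, a rough sketch in Swift of deriving the key positions from the render size instead of hard-coding them (videoSize and the 5-second duration mirror the snippets above; the exact fractions are placeholders to tune yourself):
import UIKit
import AVFoundation

// Sketch: slide the title from the left edge to the right edge of the render area,
// pausing in the middle, mirroring the hard-coded points above.
func slideAnimation(for videoSize: CGSize, duration: CFTimeInterval = 5.0) -> CAKeyframeAnimation {
    let y = videoSize.height * 0.1                        // keep the text near the top edge
    let slide = CAKeyframeAnimation(keyPath: "position")
    slide.duration = duration
    slide.keyTimes = [0.0, 0.2, 0.8, 1.0]
    slide.values = [
        CGPoint(x: videoSize.width * 0.05, y: y),         // enter from the left
        CGPoint(x: videoSize.width * 0.50, y: y),         // reach the centre
        CGPoint(x: videoSize.width * 0.50, y: y),         // hold the centre
        CGPoint(x: videoSize.width * 0.95, y: y)          // exit to the right
    ].map { NSValue(cgPoint: $0) }
    slide.beginTime = AVCoreAnimationBeginTimeAtZero      // never exactly 0 inside a video export
    slide.isRemovedOnCompletion = false
    slide.fillMode = .both
    return slide
}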

First add parent layer:
let parentLayer = CALayer()
After that add overlayer for adding fade animation:
let overlayLayer = CALayer()
And call extension for adding fade animation:
overlayLayer.addFadeAnimationWithTime(showTime: startTime, endTime: endTime)
And add this to parent layer:
parentLayer.addSublayer(overlayLayer)
Extension for fade animation with startTime and endTime:
extension CALayer {
func addFadeAnimationWithTime(showTime:CGFloat, endTime:CGFloat) {
let fadeInAnimation = CABasicAnimation.init(keyPath: "opacity")
fadeInAnimation.duration = 1
fadeInAnimation.fromValue = NSNumber(value: 0)
fadeInAnimation.toValue = NSNumber(value: 1)
fadeInAnimation.isRemovedOnCompletion = false
fadeInAnimation.beginTime = CFTimeInterval(showTime)
fadeInAnimation.fillMode = CAMediaTimingFillMode.backwards
self.add(fadeInAnimation, forKey: "fadeIn")
if endTime > 0 {
let fadeOutAnimation = CABasicAnimation.init(keyPath: "opacity")
fadeOutAnimation.duration = 1
fadeOutAnimation.fromValue = NSNumber(value: 1)
fadeOutAnimation.toValue = NSNumber(value: 0)
fadeOutAnimation.isRemovedOnCompletion = false
fadeOutAnimation.beginTime = CFTimeInterval(endTime)
fadeOutAnimation.fillMode = CAMediaTimingFillMode.forwards
self.add(fadeOutAnimation, forKey: "fadeOut")
}
}
}
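One caveat when these animations are attached to layers used with AVVideoCompositionCoreAnimationTool: Core Animation treats a beginTime of exactly 0 as "now", so Apple recommends passing AVCoreAnimationBeginTimeAtZero instead. A hypothetical call site (the 2 s / 8 s times are placeholders, not from the original answer):
import AVFoundation

// Hypothetical call site: show the overlay between 2 s and 8 s of the exported video.
// max(..., AVCoreAnimationBeginTimeAtZero) guards against a show time of exactly 0,
// which Core Animation would otherwise replace with CACurrentMediaTime().
let startTime: CGFloat = 2.0
let endTime: CGFloat = 8.0
overlayLayer.addFadeAnimationWithTime(showTime: max(startTime, CGFloat(AVCoreAnimationBeginTimeAtZero)),
                                      endTime: endTime)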

Related

How to show different text over video based on time range using CATextLayer? [duplicate]

I have found a few examples that show how to add a text overlay on a video.
Ray's Tutorial - http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
This SO answer - How can I add a watermark in a captured video on iOS
There were a few others that I referenced as well. My code looks almost identical to that answer and I've been trying to tweak it to get the text overlay to only show up for a second or two at the beginning of the video. Any help on how I could accomplish that?
Here is my code. This works as is to export the video with the overlay being shown for the entire duration.
if(vA) {
videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoCompositionTrack insertTimeRange:videoTimerange ofTrack:[[vA tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];
[videoCompositionTrack setPreferredTransform:[[[vA tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
if(error)
NSLog(#"%#", error);
}
if(aA) {
audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioCompositionTrack insertTimeRange:audioTimerange ofTrack:[[aA tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&error];
if(error)
NSLog(#"%#", error);
}
CGSize videoSize = [videoCompositionTrack naturalSize];
UIImage *myImage = [UIImage imageNamed:@"cover.png"];
CALayer *aLayer = [CALayer layer];
aLayer.contents = (id)myImage.CGImage;
aLayer.frame = CGRectMake(5, 25, 100, 56);
aLayer.opacity = 0.7;
CATextLayer *titleLayer = [CATextLayer layer];
titleLayer.string = titleText;
titleLayer.fontSize = 18;
titleLayer.foregroundColor = titleColor.CGColor;
titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.frame = CGRectMake(20, 10, videoSize.width - 40, 20);
[titleLayer displayIfNeeded];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:aLayer];
if(titleText && [titleText length] > 0) {
[parentLayer addSublayer:titleLayer];
}
AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
videoComp.renderSize = videoSize;
videoComp.frameDuration = CMTimeMake(1, 30);
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[instruction setTimeRange:CMTimeRangeMake(kCMTimeZero, [mixComposition duration])];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComp.instructions = [NSArray arrayWithObject:instruction];
AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
_assetExport.videoComposition = videoComp;
_assetExport.outputFileType = AVFileTypeMPEG4;
_assetExport.outputURL = outputFileUrl;
[_assetExport exportAsynchronouslyWithCompletionHandler:^{
AVAssetExportSessionStatus status = [_assetExport status];
switch (status) {
case AVAssetExportSessionStatusFailed:
NSLog(#"Export Failed");
NSLog(#"Export Error: %#", [_assetExport.error localizedDescription]);
NSLog(#"Export Error Reason: %#", [_assetExport.error localizedFailureReason]);
break;
case AVAssetExportSessionStatusCompleted:
NSLog(#"Export Completed");
[self performSelectorOnMainThread:#selector(updateProgressIndicator:) withObject:[NSNumber numberWithFloat:2] waitUntilDone:YES];
break;
case AVAssetExportSessionStatusUnknown:
NSLog(#"Export Unknown");
break;
case AVAssetExportSessionStatusExporting:
NSLog(#"Export Exporting");
break;
case AVAssetExportSessionStatusWaiting:
NSLog(#"Export Waiting");
break;
}
}];
I figured out what I needed to do. It wasn't anything special really; it just required a better understanding of what was possible.
Basically all I had to do was add a basic opacity animation to the layer with text in it.
// My original code for creating the text layer
CATextLayer *titleLayer = [CATextLayer layer];
.
.
.
[titleLayer displayIfNeeded];
// the code for the opacity animation which then removes the text
CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"opacity"];
[animation setDuration:0];
[animation setFromValue:[NSNumber numberWithFloat:1.0]];
[animation setToValue:[NSNumber numberWithFloat:0.0]];
[animation setBeginTime:1];
[animation setRemovedOnCompletion:NO];
[animation setFillMode:kCAFillModeForwards];
[titleLayer addAnimation:animation forKey:@"animateOpacity"];
We should add two opacity animations to show a video overlay during a particular time range:
the first animation shows the overlay at the start time and the second hides it at the end time. Take care when setting the fill mode of the animations. Sample code for a video overlay during a particular time range:
let textLayer = CATextLayer()
textLayer.opacity = 0.0
.
.
textLayer.frame = CGRect(x: 0.0, y: 0.0, width: 150.0, height:100.0)
textLayer.string = "Overlay Text"
let startVisible = CABasicAnimation.init(keyPath:"opacity")
startVisible.duration = 0.1 // for fade in duration
startVisible.repeatCount = 1
startVisible.fromValue = 0.0
startVisible.toValue = 1.0
startVisible.beginTime = overlay.startTime.seconds // overlay time range start duration
startVisible.isRemovedOnCompletion = false
startVisible.fillMode = kCAFillModeForwards
textLayer.add(startVisible, forKey: "startAnimation")
let endVisible = CABasicAnimation.init(keyPath:"opacity")
endVisible.duration = 0.1
endVisible.repeatCount = 1
endVisible.fromValue = 1.0
endVisible.toValue = 0.0
endVisible.beginTime = overlay.endTime.seconds
endVisible.fillMode = kCAFillModeForwards
endVisible.isRemovedOnCompletion = false
textLayer.add(endVisible, forKey: "endAnimation")
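Note that overlay is not defined in the snippet above; it is assumed to be a small model object carrying the time window (as CMTime) during which the text should be visible. A minimal sketch of such a type, with placeholder times:
import CoreMedia

// Hypothetical model assumed by the snippet above: the window during which the
// overlay should be visible, kept as CMTime and converted to seconds when the
// animations' beginTime is set.
struct TextOverlay {
    let startTime: CMTime
    let endTime: CMTime
}

let overlay = TextOverlay(startTime: CMTime(seconds: 3, preferredTimescale: 600),
                          endTime: CMTime(seconds: 7, preferredTimescale: 600))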

Saving video with overlay of GIF image

I am working on an application in which I record a video. When recording finishes, I put a GIF image on it using a library.
My code for playing the video and adding the GIF image as an overlay:
self.avPlayer = [AVPlayer playerWithURL:self.urlstring];
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
videoLayer.frame = self.preview_view.bounds;
videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.preview_view.layer addSublayer:videoLayer];
NSURL *url = [[NSBundle mainBundle] URLForResource:@"02" withExtension:@"gif"];
self.img_gif.image = [UIImage animatedImageWithAnimatedGIFData:[NSData dataWithContentsOfURL:url]];
But now I want to merge and save the video with this GIF image as an overlay. I googled it but didn't find what I want.
Thank you for your help
This is the best answer for merging a video with a GIF image.
- (void)mixVideoAsset:(AVAsset *)videoAsset {
NSDate * begin = [NSDate date];
// 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
// 3 - Video track
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// - Audio
AVMutableCompositionTrack *audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioTrack.timeRange.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil];
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];
// 3.3 - Add instructions
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
// Watermark Layers
[self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize];
// 4 - Get path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:@"FinalVideo-%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
// NSURL * url = TempVideoURL();
// 5 - Create exporter
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeMPEG4;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
NSDate * endDate = [NSDate date];
NSTimeInterval interval = [endDate timeIntervalSinceDate:begin];
NSLog(#"completed %f senconds",interval);
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
[assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
}
});
}];
}
- (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition size:(CGSize)size
{
// - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:videoLayer];
size.width = 100;
size.height = 100;
// - set up the overlay
CALayer *overlayLayer = [CALayer layer];
overlayLayer.frame = CGRectMake(0, 100, size.width, size.height);
NSURL *fileUrl = [[NSBundle mainBundle] URLForResource:@"jiafei" withExtension:@"gif"];
[self startGifAnimationWithURL:fileUrl inLayer:overlayLayer];
// UIImage * image = [UIImage imageNamed:@"gifImage.gif"];
// [overlayLayer setContents:(id)[image CGImage]];
// [overlayLayer setMasksToBounds:YES];
[parentLayer addSublayer:overlayLayer];
// - apply magic
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
- (void)startGifAnimationWithURL:(NSURL *)url inLayer:(CALayer *)layer {
CAKeyframeAnimation * animation = [self animationForGifWithURL:url];
[layer addAnimation:animation forKey:@"contents"];
}
- (CAKeyframeAnimation *)animationForGifWithURL:(NSURL *)url {
CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
NSMutableArray * frames = [NSMutableArray new];
NSMutableArray *delayTimes = [NSMutableArray new];
CGFloat totalTime = 0.0;
CGFloat gifWidth;
CGFloat gifHeight;
CGImageSourceRef gifSource = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
// get frame count
size_t frameCount = CGImageSourceGetCount(gifSource);
for (size_t i = 0; i < frameCount; ++i) {
// get each frame
CGImageRef frame = CGImageSourceCreateImageAtIndex(gifSource, i, NULL);
[frames addObject:(__bridge id)frame];
CGImageRelease(frame);
// get gif info with each frame
NSDictionary *dict = (NSDictionary*)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
NSLog(#"kCGImagePropertyGIFDictionary %#", [dict valueForKey:(NSString*)kCGImagePropertyGIFDictionary]);
// get gif size
gifWidth = [[dict valueForKey:(NSString*)kCGImagePropertyPixelWidth] floatValue];
gifHeight = [[dict valueForKey:(NSString*)kCGImagePropertyPixelHeight] floatValue];
// in kCGImagePropertyGIFDictionary, kCGImagePropertyGIFDelayTime and kCGImagePropertyGIFUnclampedDelayTime hold the same value
NSDictionary *gifDict = [dict valueForKey:(NSString*)kCGImagePropertyGIFDictionary];
[delayTimes addObject:[gifDict valueForKey:(NSString*)kCGImagePropertyGIFDelayTime]];
totalTime = totalTime + [[gifDict valueForKey:(NSString*)kCGImagePropertyGIFDelayTime] floatValue];
// dict was already transferred to ARC by CFBridgingRelease above, so no extra CFRelease is needed
}
if (gifSource) {
CFRelease(gifSource);
}
NSMutableArray *times = [NSMutableArray arrayWithCapacity:3];
CGFloat currentTime = 0;
NSInteger count = delayTimes.count;
for (int i = 0; i < count; ++i) {
[times addObject:[NSNumber numberWithFloat:(currentTime / totalTime)]];
currentTime += [[delayTimes objectAtIndex:i] floatValue];
}
NSMutableArray *images = [NSMutableArray arrayWithCapacity:3];
for (int i = 0; i < count; ++i) {
[images addObject:[frames objectAtIndex:i]];
}
animation.keyTimes = times;
animation.values = images;
animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
animation.duration = totalTime;
animation.repeatCount = HUGE_VALF;
animation.beginTime = AVCoreAnimationBeginTimeAtZero;
animation.removedOnCompletion = NO;
return animation;
}
Here is the Swift version of @Jitendra Modi's answer, and it worked like a charm.
Swift 5.2:
func animationForGif(with url: URL) -> CAKeyframeAnimation? {
let animation = CAKeyframeAnimation(keyPath: #keyPath(CALayer.contents))
var frames: [CGImage] = []
var delayTimes: [CGFloat] = []
var totalTime: CGFloat = 0.0
// var gifWidth: CGFloat, gifHeight: CGFloat
guard let gifSource = CGImageSourceCreateWithURL(url as CFURL, nil) else {
print("Can not get image source from the gif: \(url)")
return nil
}
// get frame
let frameCount = CGImageSourceGetCount(gifSource)
for i in 0..<frameCount {
guard let frame = CGImageSourceCreateImageAtIndex(gifSource, i, nil) else {
continue
}
guard let dic = CGImageSourceCopyPropertiesAtIndex(gifSource, i, nil) as? [AnyHashable: Any] else { continue }
// gifWidth = dic[kCGImagePropertyPixelWidth] as? CGFloat ?? 0
// gifHeight = dic[kCGImagePropertyPixelHeight] as? CGFloat ?? 0
guard let gifDic: [AnyHashable: Any] = dic[kCGImagePropertyGIFDictionary] as? [AnyHashable: Any] else { continue }
let delayTime = gifDic[kCGImagePropertyGIFDelayTime] as? CGFloat ?? 0
frames.append(frame)
delayTimes.append(delayTime)
totalTime += delayTime
}
if frames.count == 0 {
return nil
}
assert(frames.count == delayTimes.count)
var times: [NSNumber] = []
var currentTime: CGFloat = 0
for i in 0..<delayTimes.count {
times.append(NSNumber(value: Double(currentTime / totalTime)))
currentTime += delayTimes[i]
}
animation.keyTimes = times
animation.values = frames
animation.timingFunction = CAMediaTimingFunction(name: CAMediaTimingFunctionName.linear)
animation.duration = Double(totalTime)
animation.repeatCount = .greatestFiniteMagnitude
animation.beginTime = AVCoreAnimationBeginTimeAtZero
animation.isRemovedOnCompletion = false
return animation
}
And you can use this animation:
let gifLayer = CALayer()
gifLayer.frame = CGRect(x: 0, y: 0, width: 300, height: 300)
if let animation = animationForGif(with: gifUrl) {
gifLayer.add(animation, forKey: "contents")
}
parentLayer.addSublayer(gifLayer)
You can try any of the code below for screen recording. It will merge your video and GIF.
You can download a sample provided by Apple from the link below. https://developer.apple.com/library/mac/samplecode/AVScreenShack/Introduction/Intro.html
https://github.com/alskipp/ASScreenRecorder
http://codethink.no-ip.org/wordpress/archives/673
Hope this helps you.

Use AVFoundation to add an image to a video, but the image layer does not show up in the right way

In my app, I need to edit video recorded with AVFoundation.
Users can add multiple images to the video, and they may drag the images anywhere on the full video screen.
Editing view:
Editing view and Exported View
and here is the code:
AVMutableVideoCompositionInstruction *instruction = nil;
AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
AVAssetTrack *assetVideoTrack = nil;
AVAssetTrack *assetAudioTrack = nil;
// Check if the asset contains video and audio tracks
if ([[self.inputAsset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
assetVideoTrack = [self.inputAsset tracksWithMediaType:AVMediaTypeVideo][0];
}
if ([[self.inputAsset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
assetAudioTrack = [self.inputAsset tracksWithMediaType:AVMediaTypeAudio][0];
}
CMTime insertionPoint = kCMTimeZero;
NSError *error = nil;
// Check if a composition already exists, else create a composition using the input asset
self.composition = [AVMutableComposition composition];
// Insert the video and audio tracks from AVAsset
if (assetVideoTrack != nil) {
AVMutableCompositionTrack *compositionVideoTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.inputAsset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
}
if (assetAudioTrack != nil) {
AVMutableCompositionTrack *compositionAudioTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.inputAsset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
}
CGAffineTransform t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height,0);
// Rotate transformation
CGAffineTransform t2 = CGAffineTransformRotate(t1, ( ( 90 ) / 180.0 * M_PI ));
// Create a new video composition
self.videoComposition = [AVMutableVideoComposition videoComposition];
//self.videoComposition.renderSize = CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
self.videoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height, assetVideoTrack.naturalSize.width);
self.videoComposition.frameDuration = CMTimeMake(1, 30);
// The rotate transform is set on a layer instruction
instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [self.composition duration]);
layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(self.composition.tracks)[0]];
[layerInstruction setTransform:t2 atTime:kCMTimeZero];
instruction.layerInstructions = @[layerInstruction];
self.videoComposition.instructions = @[instruction];
if I change
self.videoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height, assetVideoTrack.naturalSize.width);
to
self.videoComposition.renderSize = CGSizeMake([UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
In addition, I use this code to add the image layers to the video:
- (void)exportWillBegin
{
//CALayer *exportWatermarkLayer = [self copyWatermarkLayer:self.watermarkLayer];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
videoLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
[parentLayer addSublayer:videoLayer];
//exportWatermarkLayer.position = CGPointMake(self.videoComposition.renderSize.width/2, self.videoComposition.renderSize.height/4);
//[parentLayer addSublayer:exportWatermarkLayer];
for (NSInteger i = 0; i < self.stickerViews.count; i++) {
WTStickerView *sticker = [self.stickerViews objectAtIndex:i];
CALayer *layer = sticker.contentView.layer;
// CGAffineTransform transform = CGAffineTransformIdentity;
layer.affineTransform = sticker.transform;
layer.position = sticker.center;
[parentLayer addSublayer:layer];
}
self.videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
Can anyone help me fix this? The image layer is not in the correct position where I dragged it to.

Add GIF watermark on a video in iOS

I need to accomplish this: there is a GIF overlay on a video, and I hope to composite the video and GIF into a new video. I'm using the following code, but the result is only the video without the GIF:
- (void)mixVideoAsset:(AVAsset *)videoAsset {
LLog(#"Begining");
NSDate * begin = [NSDate date];
// 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
// 3 - Video track
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
// - Audio
AVMutableCompositionTrack *audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioTrack.timeRange.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil];
// 3.1 - Create AVMutableVideoCompositionInstruction
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];
// 3.3 - Add instructions
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
// Watermark Layers
[self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize];
// 4 - Get path
// NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
// NSString *documentsDirectory = [paths objectAtIndex:0];
// NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
// [NSString stringWithFormat:@"FinalVideo-%d.mov",arc4random() % 1000]];
// NSURL *url = [NSURL fileURLWithPath:myPathDocs];
NSURL * url = TempVideoURL();
// 5 - Create exporter
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeMPEG4;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
NSDate * endDate = [NSDate date];
NSTimeInterval interval = [endDate timeIntervalSinceDate:begin];
LLog(#"completed %f senconds",interval);
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
[assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
}
});
}];
}
Add GIF Watermark
- (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition size:(CGSize)size
{
// - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:videoLayer];
size.width = 100;
size.height = 100;
// - set up the overlay
CALayer *overlayLayer = [CALayer layer];
overlayLayer.frame = CGRectMake(0, 100, size.width, size.height);
NSURL *fileUrl = [[NSBundle mainBundle] URLForResource:@"jiafei" withExtension:@"gif"];
[BBGifManager startGifAnimationWithURL:fileUrl inLayer:overlayLayer];
// UIImage * image = [UIImage imageNamed:@"gifImage.gif"];
// [overlayLayer setContents:(id)[image CGImage]];
// [overlayLayer setMasksToBounds:YES];
[parentLayer addSublayer:overlayLayer];
// - apply magic
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
Add CALayer Animations
+ (void)startGifAnimationWithURL:(NSURL *)url inLayer:(CALayer *)layer {
CAKeyframeAnimation * animation = [self animationForGifWithURL:url];
[layer addAnimation:animation forKey:@"contents"];
}
Create CAKeyframeAnimation
+ (CAKeyframeAnimation *)animationForGifWithURL:(NSURL *)url {
CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
NSMutableArray * frames = [NSMutableArray new];
NSMutableArray *delayTimes = [NSMutableArray new];
CGFloat totalTime = 0.0;
CGFloat gifWidth;
CGFloat gifHeight;
CGImageSourceRef gifSource = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
// get frame count
size_t frameCount = CGImageSourceGetCount(gifSource);
for (size_t i = 0; i < frameCount; ++i) {
// get each frame
CGImageRef frame = CGImageSourceCreateImageAtIndex(gifSource, i, NULL);
[frames addObject:(__bridge id)frame];
CGImageRelease(frame);
// get gif info with each frame
NSDictionary *dict = (NSDictionary*)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
NSLog(#"kCGImagePropertyGIFDictionary %#", [dict valueForKey:(NSString*)kCGImagePropertyGIFDictionary]);
// get gif size
gifWidth = [[dict valueForKey:(NSString*)kCGImagePropertyPixelWidth] floatValue];
gifHeight = [[dict valueForKey:(NSString*)kCGImagePropertyPixelHeight] floatValue];
// in kCGImagePropertyGIFDictionary, kCGImagePropertyGIFDelayTime and kCGImagePropertyGIFUnclampedDelayTime hold the same value
NSDictionary *gifDict = [dict valueForKey:(NSString*)kCGImagePropertyGIFDictionary];
[delayTimes addObject:[gifDict valueForKey:(NSString*)kCGImagePropertyGIFDelayTime]];
totalTime = totalTime + [[gifDict valueForKey:(NSString*)kCGImagePropertyGIFDelayTime] floatValue];
// dict was already transferred to ARC by CFBridgingRelease above, so no extra CFRelease is needed
}
if (gifSource) {
CFRelease(gifSource);
}
NSMutableArray *times = [NSMutableArray arrayWithCapacity:3];
CGFloat currentTime = 0;
NSInteger count = delayTimes.count;
for (int i = 0; i < count; ++i) {
[times addObject:[NSNumber numberWithFloat:(currentTime / totalTime)]];
currentTime += [[delayTimes objectAtIndex:i] floatValue];
}
NSMutableArray *images = [NSMutableArray arrayWithCapacity:3];
for (int i = 0; i < count; ++i) {
[images addObject:[frames objectAtIndex:i]];
}
animation.keyTimes = times;
animation.values = images;
animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
animation.duration = totalTime;
animation.repeatCount = HUGE_VALF;
return animation;
}
You should adjust your animation settings for Core Animation:
animation.beginTime = AVCoreAnimationBeginTimeAtZero;
animation.removedOnCompletion = NO;
Just in case, here's an example of how to do the same thing in Swift 3: insert animated frames/images into the video (not exactly a GIF, but an array of images). It uses AVAssetExportSession and AVMutableVideoComposition together with AVMutableVideoCompositionInstruction, and CAKeyframeAnimation to animate the frames.
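The example code itself was not reproduced here. Purely as a rough sketch in current Swift (the function name, overlay frame, and per-frame duration below are placeholders, not part of the original answer), the approach might look like this:
import UIKit
import AVFoundation

// Sketch: animate an array of UIImages on an overlay layer, then attach the
// layer tree to the video composition the same way as in the answers above.
func addImageSequence(_ overlayImages: [UIImage],
                      frameDuration: CFTimeInterval,
                      to videoComposition: AVMutableVideoComposition,
                      renderSize: CGSize) {
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: renderSize)
    videoLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)

    let overlayLayer = CALayer()
    overlayLayer.frame = CGRect(x: 0, y: 0, width: 200, height: 200)   // placeholder frame

    let animation = CAKeyframeAnimation(keyPath: "contents")
    animation.values = overlayImages.compactMap { $0.cgImage }
    animation.calculationMode = .discrete                              // hold each frame, no cross-fade
    animation.duration = frameDuration * CFTimeInterval(overlayImages.count)
    animation.repeatCount = .greatestFiniteMagnitude
    animation.beginTime = AVCoreAnimationBeginTimeAtZero               // never exactly 0 inside an export
    animation.isRemovedOnCompletion = false
    overlayLayer.add(animation, forKey: "contents")
    parentLayer.addSublayer(overlayLayer)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
}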

Creating a square video with custom background color and aspect-fitted video on top with AVFoundation

I want to take a video (e.g. 16:9, shot with an iPhone) and fit and center it in a square with a custom background color. My code goes like this:
- (void)videoOutput
{
if (!self.firstAsset) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alert show];
return;
}
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.firstAsset.duration)
ofTrack:[[self.firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.firstAsset.duration);
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[self.firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize videoSize = [[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
NSLog(#"Video Size W:%f, H:%f",videoSize.width,videoSize.height);
float scaleRatio = 600/videoSize.width;
[videolayerInstruction setTransform:CGAffineTransformMakeScale(scaleRatio, scaleRatio) atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:self.firstAsset.duration];
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
float renderWidth, renderHeight;
renderWidth = 600;
renderHeight = 600;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
[self applyVideoEffectsToComposition:mainCompositionInst size:CGSizeMake(600, 600)];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:@"FinalVideo-%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
- (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition size:(CGSize)size
{
UIImage *borderImage = nil;
borderImage = [self imageWithColor:[UIColor greenColor] rectSize:CGRectMake(0, 0, size.width, size.height)];
CALayer *backgroundLayer = [CALayer layer];
[backgroundLayer setContents:(id)[borderImage CGImage]];
backgroundLayer.frame = CGRectMake(0, 0, size.width, size.height);
[backgroundLayer setMasksToBounds:YES];
AVPlayerItem *playerItem2 = [[AVPlayerItem alloc] initWithAsset:secondAsset];
AVPlayer *videoPlayer2 = [AVPlayer playerWithPlayerItem:playerItem2];
AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:videoPlayer2];
CGSize videoSize = [[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
[videoLayer setBackgroundColor:[UIColor whiteColor].CGColor];
videoLayer.frame = CGRectMake(0, (600-337.5)/2, 600, 337.5);
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:backgroundLayer];
[parentLayer addSublayer:videoLayer];
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
The original video and the exported result are the images below. As you can see in the exported video, the frame for the overlaid video is correct, but the video inside it does not maintain its aspect ratio. If I choose to make the video layer's frame square, the aspect ratio remains normal.
I am stuck at this early stage. Ultimately I am trying to build a WYSIWYG editor for square videos and apply scale, translation, and rotation transformations to the videoLayer that will be rendered in the square video. Any help with this specific question and beyond is much appreciated.
This is exactly what I went through, and now I have a solution. If you want to change the video from a rectangle to a square, you need to crop the video and set the AVMutableVideoCompositionInstruction's backgroundColor to the color you want.
Confirm the 'UIImageOrientation' of the video.
Crop the video via videoComposition.renderSize, where renderSize.width equals renderSize.height, and renderSize.width is the larger side of the original video size.
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.backgroundColor = /* CGColorRef */;
Quote from Apple:
/* Indicates the background color of the composition. Solid BGRA colors only are supported; patterns and other color refs that are not supported will be ignored.
If the background color is not specified the video compositor will use a default backgroundColor of opaque black.
If the rendered pixel buffer does not have alpha, the alpha value of the backgroundColor will be ignored. */
@property (nonatomic, retain, nullable) __attribute__((NSObject)) CGColorRef backgroundColor CF_RETURNS_RETAINED;
The wrong way to set the instruction's backgroundColor is:
instruction.backgroundColor = [UIColor blueColor].CGColor;
The correct one is:
CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
const CGFloat myColor[] = {1.0, 1.0, 1.0, 1.0}; //white
CGColorRef ref = CGColorCreate(rgb, myColor);
instruction.backgroundColor = ref;
Export the video.
Done!
By the way, the videoLayer and videoComposition.renderSize must be set from the naturalSize of the original video. You cannot set videoLayer.frame to a customized CGRect.
var transform: CGAffineTransform
if isVideoAssetPortrait == true {
let scale = naturalSize.height / naturalSize.width
transform = CGAffineTransformMakeScale(scale, 1)
transform = CGAffineTransformConcat(videoTrack.preferredTransform, transform)
} else {
let scale = naturalSize.width / naturalSize.height
transform = CGAffineTransformMakeScale(1, scale)
}
videoLayerInstruction.setTransform(transform, atTime: kCMTimeZero)
I was inspired by this: https://www.snip2code.com/Snippet/42212/Crop-video-to-square-with-a-specified-si. However, I do not actually know what it is doing. I copied the code and it did not work; I saw the video was deformed, so I tried applying the transform, and it works. If you still have any problems, please contact me. Here is my sample code, which I was still writing for myself:
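The sample code referred to above was not included in the original. Purely as a hedged reconstruction of the steps described (square render size from the larger side, a solid BGRA background color, and the scale transform from the snippet above; asset, videoTrack, and transform are assumed to exist as in that snippet), it might look roughly like this:
import UIKit
import AVFoundation

// Hedged sketch, not the author's original sample (which was not included).
// Assumes `asset`, `videoTrack` and `transform` are defined as in the snippet above.
let naturalSize = videoTrack.naturalSize
let side = max(naturalSize.width, naturalSize.height)        // larger side of the source

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = CGSize(width: side, height: side)
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

// Solid BGRA background color, per the "correct way" shown above.
let rgb = CGColorSpaceCreateDeviceRGB()
instruction.backgroundColor = CGColor(colorSpace: rgb, components: [1.0, 1.0, 1.0, 1.0])

let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
layerInstruction.setTransform(transform, at: .zero)          // transform from the snippet above
instruction.layerInstructions = [layerInstruction]
videoComposition.instructions = [instruction]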
