AVSynchronizedLayer not synchronizing animation - iOS

I'm having issues making the animation use the AVPlayer time instead of the system time. The synchronized layer does not work properly: animations stay synchronized to the system time instead of the player time. I know the player is playing, and if I pass CACurrentMediaTime() to beginTime, the animation starts right away, as it should when not synchronized.
EDIT
I can see the red square in its final state from the very beginning, which means the animation has already reached its end: it is synchronized to the system time and not to the AVPlayerItem time.
// play the composition live in order to modify it
AVPlayerItem * playerItem = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer * player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer * playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = [UIScreen mainScreen].bounds;
if (!playerItem) {
    NSLog(@"playerItem empty");
}
// dummy time
playerItem.forwardPlaybackEndTime = totalDuration;
playerItem.videoComposition = videoComposition;

CALayer * aLayer = [CALayer layer];
aLayer.frame = CGRectMake(100, 100, 60, 60);
aLayer.backgroundColor = [UIColor redColor].CGColor;
aLayer.opacity = 0.f;

CAKeyframeAnimation * keyframeAnimation2 = [CAKeyframeAnimation animationWithKeyPath:@"opacity"];
keyframeAnimation2.removedOnCompletion = NO;
keyframeAnimation2.beginTime = 0.1; // small non-zero value; 0.0 would mean "begin now"
keyframeAnimation2.duration = 4.0;
keyframeAnimation2.fillMode = kCAFillModeBoth;
keyframeAnimation2.keyTimes = @[@0.0, @1.0];
keyframeAnimation2.values = @[@0.f, @1.f];

NSLog(@"%f current media time", CACurrentMediaTime());
[aLayer addAnimation:keyframeAnimation2 forKey:@"opacity"];
[self.parentLayer addSublayer:aLayer];

AVSynchronizedLayer * synchronizedLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
synchronizedLayer.frame = [UIScreen mainScreen].bounds;
[synchronizedLayer addSublayer:self.parentLayer];
[playerLayer addSublayer:synchronizedLayer];

The solution was that AVSynchronizedLayer doesn't work on the Simulator but works fine on a device.
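A related pitfall worth keeping in mind (a general Core Animation fact, not part of the original answer): a beginTime of exactly 0.0 on a CAAnimation means "begin now" on the system clock, so an animation meant to start at time zero of the player item's timeline should use AVCoreAnimationBeginTimeAtZero, a tiny non-zero constant AVFoundation provides for exactly this case (the 0.1 above serves the same purpose):

// Use AVFoundation's small epsilon constant instead of 0.0
keyframeAnimation2.beginTime = AVCoreAnimationBeginTimeAtZero;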

Related

CAEmitterLayer draining battery

My project targets iOS 12+ iPhones, and I am using a fire emitter that follows the user's finger across the screen (with a pan gesture recognizer). With the following code, I have noticed that the screen becomes warmer the longer the game is played, and the battery drains more rapidly than usual the more the app is used.
Is there any way to have a fire emitter follow a finger on the screen and still have a battery-friendly solution?
//set ref to the layer
cursorEmiter = [CAEmitterLayer layer];
//configure the emitter layer
cursorEmiter.emitterPosition = CGPointMake(50, 50);
cursorEmiter.emitterSize = CGSizeMake(5, 5);

CAEmitterCell *fire = [CAEmitterCell emitterCell];
fire.birthRate = 1000;   // the number of emitted objects created every second; animatable
fire.lifetime = 0.6;     // the lifetime of the cell, in seconds; animatable
fire.lifetimeRange = 0.5;
fire.color = [[UIColor colorWithRed:1.0 green:0.4 blue:0.2 alpha:0.3] CGColor];
fire.contents = (id)[[UIImage imageNamed:@"PopCornSmall.png"] CGImage];
fire.velocity = 10;      // the initial velocity of the cell; animatable
fire.velocityRange = 20;
fire.scaleSpeed = 0.3;
fire.spin = 0.5;
cursorEmiter.renderMode = kCAEmitterLayerAdditive;
[fire setName:@"fire"];
//add the cell to the layer and we're done
cursorEmiter.emitterCells = [NSArray arrayWithObject:fire];
emiterNotAdded = false;
[self.layer addSublayer:cursorEmiter];
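One battery-friendlier pattern (my own suggestion, not from the original post) is to lower the birth rate and stop emission entirely while the finger is idle. Because the cell was named "fire" above, its birthRate can be adjusted later through key-value coding on the layer:

// Sketch: pause emission while the finger is idle;
// existing particles simply die out, which is far cheaper than 1000/sec.
[cursorEmiter setValue:@0 forKeyPath:@"emitterCells.fire.birthRate"];

// Resume on the next pan event; a few hundred particles per second
// still reads as fire at a fraction of the cost.
[cursorEmiter setValue:@300 forKeyPath:@"emitterCells.fire.birthRate"];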

AVFoundation blur background in video

In my application I have a fixed composition render size of 1280 x 720. So if I import a portrait video, I have to show a blurred, aspect-fill background with the aspect-fit frame of the video in the centre, like this:
https://www.youtube.com/watch?v=yCOrqUA0ws4
I managed to play both videos using AVMutableComposition, but I don't know how to blur a particular background track. I did the following in my code:
self.composition = [AVMutableComposition composition];
AVAsset *firstAsset = [AVAsset assetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"ScreenFlow_Blend" ofType:@"mp4"]]];
[self addAsset:firstAsset toComposition:self.composition withTrackID:1];
[self addAsset:firstAsset toComposition:self.composition withTrackID:2];
// [self addAsset:ThirdAsset toComposition:self.composition withTrackID:3];

AVAssetTrack *backVideoTrack = [firstAsset tracksWithMediaType:AVMediaTypeVideo][0];

self.videoComposition = [AVMutableVideoComposition videoComposition];
self.videoComposition.renderSize = CGSizeMake(1280, 720);
self.videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = [backVideoTrack timeRange];

CGFloat scale = 1280 / backVideoTrack.naturalSize.width;
CGAffineTransform t = CGAffineTransformMakeScale(scale, scale);
t = CGAffineTransformTranslate(t, 0, -backVideoTrack.naturalSize.height/2 + self.videoComposition.renderSize.height/2);

AVMutableVideoCompositionLayerInstruction *frontLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
frontLayerInstruction.trackID = 1;
[frontLayerInstruction setTransform:t atTime:kCMTimeZero];

CGFloat scaleSmall = 720 / backVideoTrack.naturalSize.height;
CGAffineTransform translate = CGAffineTransformMakeTranslation(self.videoComposition.renderSize.width/2 - ((backVideoTrack.naturalSize.width/2) * scaleSmall), 0);
CGAffineTransform scaleTransform = CGAffineTransformMakeScale(scaleSmall, scaleSmall);
CGAffineTransform finalTransform = CGAffineTransformConcat(scaleTransform, translate);
CGAffineTransform t1 = CGAffineTransformMakeScale(scaleSmall, scaleSmall);
t1 = CGAffineTransformTranslate(t1, 1280, 0);

AVMutableVideoCompositionLayerInstruction *backLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
backLayerInstruction.trackID = 2;
[backLayerInstruction setTransform:finalTransform atTime:kCMTimeZero];

// AVMutableVideoCompositionLayerInstruction *maskLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
// maskLayerInstruction.trackID = 3;
// [maskLayerInstruction setTransform:t atTime:kCMTimeZero];

instruction.layerInstructions = @[backLayerInstruction, frontLayerInstruction];
self.videoComposition.instructions = @[instruction];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
playerItem.videoComposition = self.videoComposition;
self.player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
// [newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
Using the above code I can achieve this:
https://drive.google.com/open?id=0B2jCvCt5fosyOVNOcGZ1MU1laEU
I know about the customVideoCompositor class for filtering composition frames. I tried it, but with a customVideoCompositor I lose my transformations on the composition layers. Plus, from the customVideoCompositor I don't know how to filter a particular track ID.
If someone has any docs link or suggestion, it would be really appreciated.
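On the custom compositor point, one detail may help (this is an API fact, though I haven't tried it against this exact setup): inside an AVVideoCompositing implementation, the composition request can hand you the frame of one specific track, so a filter can target only the background track ID:

// Sketch: fetching one track's frame inside a custom compositor.
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request
{
    // Track ID 2 is the background track in the code above.
    CVPixelBufferRef backFrame = [request sourceFrameByTrackID:2];
    // ... blur backFrame (e.g. CIGaussianBlur via Core Image), draw the
    // front track's frame on top, then hand back the composed buffer:
    // [request finishWithComposedVideoFrame:outputBuffer];
}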
Before adding the second video layer, which is in the centre of the screen, add this code:
UIVisualEffect *blurEffect;
blurEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleExtraLight];//Change the effect which you want.
UIVisualEffectView *visualEffectView;
visualEffectView = [[UIVisualEffectView alloc] initWithEffect:blurEffect];
visualEffectView.frame = self.view.bounds;
[self.view addSubview:visualEffectView];
In Swift
let blurEffect = UIBlurEffect(style: UIBlurEffectStyle.Light) //Change the style which suites you
let blurEffectView = UIVisualEffectView(effect: blurEffect)
blurEffectView.frame = view.bounds
blurEffectView.autoresizingMask = [.FlexibleWidth, .FlexibleHeight] // for supporting device rotation
view.addSubview(blurEffectView)
A way to achieve that is using two different AVPlayers and a blur overlay view: backgroundLayer -> blur overlay view -> frontLayer. You only need to make sure both players start and stop at the same time.
Another option is using one AVPlayer and a time observer. Extract the current image of the frontLayer on every frame, blur it, and display it in the backgroundLayer. The blur function can be found in the same link I provided above.
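A minimal sketch of the first approach (videoURL and the view setup are my own, for illustration):

// Sketch: two players on the same asset with a blur view sandwiched
// between their layers; videoURL is a placeholder.
AVPlayer *backgroundPlayer = [AVPlayer playerWithURL:videoURL];
AVPlayer *foregroundPlayer = [AVPlayer playerWithURL:videoURL];

AVPlayerLayer *backgroundLayer = [AVPlayerLayer playerLayerWithPlayer:backgroundPlayer];
backgroundLayer.frame = self.view.bounds;
backgroundLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill, cropping the edges
[self.view.layer addSublayer:backgroundLayer];

UIVisualEffectView *blurView = [[UIVisualEffectView alloc]
    initWithEffect:[UIBlurEffect effectWithStyle:UIBlurEffectStyleLight]];
blurView.frame = self.view.bounds;
[self.view addSubview:blurView];

AVPlayerLayer *foregroundLayer = [AVPlayerLayer playerLayerWithPlayer:foregroundPlayer];
foregroundLayer.frame = self.view.bounds;
foregroundLayer.videoGravity = AVLayerVideoGravityResizeAspect; // fit, letterboxed
[self.view.layer addSublayer:foregroundLayer];

// Start both together so they stay in sync.
[backgroundPlayer play];
[foregroundPlayer play];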

AVPlayer frame animation

I am developing an application that includes functionality to play video with per-frame animation.
You can see an example of such functionality.
I have already tried adding a CAKeyframeAnimation to a sublayer of an AVSynchronizedLayer and had some trouble with it.
I have also tried pre-rendering the video with AVAssetExportSession, and that works perfectly, but it's very slow: it needs up to 3 minutes to render such a video.
Are there other approaches to achieve this?
Update:
This is how I implement animation with AVSynchronizedLayer:
let fullScreenAnimationLayer = CALayer()
fullScreenAnimationLayer.frame = videoRect
fullScreenAnimationLayer.geometryFlipped = true
var values: [NSValue] = []
var times: [NSNumber] = []
// fill values array with positions of face center for each frame
values.append(NSValue(CATransform3D: t))
// fill times with the corresponding time for each frame
times.append(NSNumber(double: (Double(j) / fps) / videoDuration)) // where fps = 25 (according to video file fps)
...
let transform = CAKeyframeAnimation(keyPath: "transform")
transform.duration = videoDuration
transform.calculationMode = kCAAnimationDiscrete
transform.beginTime = AVCoreAnimationBeginTimeAtZero
transform.values = values
transform.keyTimes = times
transform.removedOnCompletion = false
transform.fillMode = kCAFillModeForwards
fullScreenAnimationLayer.addAnimation(transform, forKey: "a_transform")
...
if let syncLayer = AVSynchronizedLayer(playerItem: player.currentItem) {
    syncLayer.frame = CGRect(origin: CGPointZero, size: videoView.bounds.size)
    syncLayer.addSublayer(fullScreenAnimationLayer)
    videoView.layer.addSublayer(syncLayer)
}
Here's my suggestion: add the AVPlayer's layer (an AVPlayerLayer) as a sublayer of a UIView's layer, then animate the view itself.
For example:
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:blahURL options:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:urlAsset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = yourFrame;
UIView *videoView = [UIView new];
[videoView.layer addSublayer:playerLayer]; // add to the view's layer, not the view itself
Then apply your animations to videoView.
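As a minimal sketch (the duration and offset here are arbitrary, my own illustration):

// Sketch: slide the whole video view 100 points to the right.
// Note this runs on the system clock, so unlike an AVSynchronizedLayer
// animation it will not pause or seek together with the player.
[UIView animateWithDuration:2.0 animations:^{
    videoView.center = CGPointMake(videoView.center.x + 100.0, videoView.center.y);
}];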

CAAnimation how to raise interpolation or redraw frequency

When animating the position or bounds of a CALayer with a CAKeyframeAnimation (or probably with any other CAAnimation) and exporting a full HD 1920x1080 or HD Ready 1280x720 movie using AVAssetExportSession, the animation in the resulting movie becomes jumpy.
If the resolution is the same as the native iPhone screen size, the animation stays fluid.
CAAction seems to have no impact on the result. Using a path instead of values doesn't change anything either.
Here's a short video showing the issue.
My guess is that I need to interpolate more frequently, or redraw the layer more (or less) often during the animation. Any ideas on how to act on those parameters?
The layer
CALayer * layer = [CALayer layer];
layer.anchorPoint = CGPointZero;
layer.frame = self.creator.animationLayer.bounds;
layer.backgroundColor = [UIColor clearColor].CGColor;
layer.opacity = 0.f;
layer.contentsGravity = kCAGravityResizeAspectFill;
layer.contents = (__bridge id)image;
The Animation
CAKeyframeAnimation * contentRectAnimation =
    [CAKeyframeAnimation animationWithKeyPath:@"position"];
contentRectAnimation.removedOnCompletion = NO;
contentRectAnimation.beginTime = beginTime;
contentRectAnimation.duration = totalDuration;
contentRectAnimation.fillMode = kCAFillModeBoth;
contentRectAnimation.keyTimes = keyTimes;
contentRectAnimation.values = positions;
contentRectAnimation.timingFunctions = timingFunctions;
[self.currentAnimationLayer addAnimation:contentRectAnimation forKey:@"position"];

CAKeyframeAnimation * boundsAnimation =
    [CAKeyframeAnimation animationWithKeyPath:@"bounds"];
boundsAnimation.removedOnCompletion = NO;
boundsAnimation.beginTime = beginTime;
boundsAnimation.duration = totalDuration;
boundsAnimation.fillMode = kCAFillModeBoth;
boundsAnimation.keyTimes = keyTimes;
boundsAnimation.values = bounds;
boundsAnimation.timingFunctions = timingFunctions;
[self.currentAnimationLayer addAnimation:boundsAnimation forKey:@"bounds"];
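One knob that plausibly matters here (an assumption worth testing, not a confirmed fix): when exporting through AVAssetExportSession with an AVMutableVideoComposition and AVVideoCompositionCoreAnimationTool, the composition's frameDuration sets how many frames per second are rendered, and therefore how often the animation is sampled:

// Sketch: render the export at 60 fps instead of the usual 30
// so the keyframe animation is sampled twice as often.
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(1920, 1080);
videoComposition.frameDuration = CMTimeMake(1, 60);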

Animation doesn't begin when using AVSynchronizedLayer

I've set up a project as a test, using AVSynchronizedLayer to move a red line (a CALayer) across the screen as a movie plays.
When doing this I referenced the answer given here and have included that solution, but the animation doesn't start when the video does.
If anyone has any ideas where I'm going wrong that would be really helpful. The code is:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    //create the red line sublayer
    redLine = [CALayer layer];
    redLine.backgroundColor = [UIColor redColor].CGColor;
    redLine.frame = CGRectMake(engravingControl.frame.origin.x,
                               engravingControl.center.y - (engravingControl.frame.size.height/2),
                               3,
                               engravingControl.frame.size.height);

    //create the AVPlayer object
    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"IMG_1306" withExtension:@"mov"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = CGRectMake(0, 0, 400, 300);
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    //add the subviews
    [self.view.layer addSublayer:playerLayer];
    [self.view.layer addSublayer:redLine];

    CABasicAnimation *redLineMove;
    redLineMove = [CABasicAnimation animationWithKeyPath:@"transform.translation.x"];
    redLineMove.duration = 10;
    redLineMove.fromValue = [NSNumber numberWithFloat:engravingControl.frame.origin.x];
    redLineMove.toValue = [NSNumber numberWithFloat:engravingControl.frame.size.width + engravingControl.frame.origin.x];
    redLineMove.beginTime = AVCoreAnimationBeginTimeAtZero;

    //create the sync layer
    AVSynchronizedLayer *syncLayer = [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
    [syncLayer addSublayer:redLine];
    [redLine addAnimation:redLineMove forKey:@"redLineMove"];
    [self.view.layer addSublayer:syncLayer];

    [player play];
}
I discovered a solution to this. If you don't have the following line of code:
animation.removedOnCompletion = NO;
Then the animation never starts (I presume it is removed by CA before the movie starts to play).
Add that in, and it works as you would expect.
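Applied to the code above, the fix is one extra line when configuring the animation:

// The missing line: without it, Core Animation removes the animation
// as soon as it "completes" on the system clock, before the player
// item's timeline ever reaches it.
redLineMove.removedOnCompletion = NO;
redLineMove.beginTime = AVCoreAnimationBeginTimeAtZero;
[redLine addAnimation:redLineMove forKey:@"redLineMove"];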
