Can't seem to get video height for AVPlayerLayer - iOS

I can't seem to get the height of the video from the URL I have. Every time I log playerLayer.videoRect.size, the height returns 0.000. Any help would be appreciated on how to adjust the playerLayer's height based on the video.
I have the playerLayer's height set to half the height of the view just to get the video to show up.
Here is my edited code:
NSURL *videoURL = [NSURL URLWithString:self.videoURL];
self.player = [AVPlayer playerWithURL:videoURL];
AVAssetTrack* track = [[AVAsset assetWithURL:videoURL] tracksWithMediaType:AVMediaTypeVideo].firstObject;
CGSize size = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
CGFloat videoWidth = size.width;
CGFloat videoHeight = size.height;
NSLog(#"%f wide, %f high)", videoWidth, videoHeight);
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.frame = CGRectMake(0, CGRectGetMaxY(self.navigationController.navigationBar.frame), self.view.bounds.size.width, videoHeight);
[self.view.layer addSublayer:self.playerLayer];
[self.player play];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;

You can use AVAssetTrack to get the size of the video.
NSString *urlString = @"http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4";
NSURL *url = [[NSURL alloc] initWithString:urlString];
AVAssetTrack* track = [[AVAsset assetWithURL:url] tracksWithMediaType:AVMediaTypeVideo].firstObject;
CGSize size = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
CGFloat width = size.width; // 640
CGFloat height = size.height; // 360
Note: you should do this on a background queue, because it blocks the calling thread until the size has been loaded.
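For example, a sketch of doing that with AVAsset's asynchronous key loading rather than a raw background thread (using the sample URL above):
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"tracks" error:&error] != AVKeyValueStatusLoaded) {
        return; // handle the error
    }
    AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    CGSize size = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
    dispatch_async(dispatch_get_main_queue(), ^{
        // fabs() because preferredTransform can make a dimension negative for rotated video
        CGFloat width = fabs(size.width);
        CGFloat height = fabs(size.height);
        // ... size the player layer here using width and height ...
    });
}];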

There are two problems:
You are asking for this information too soon. Like everything else about video, gathering the video's size takes time.
You have the dependency order backwards. The videoRect depends upon the layer's bounds. But you have not given it any bounds; it has zero size.
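A sketch of the corrected order: give the layer real bounds first, then read videoRect only once the layer is ready for display (readyForDisplay on AVPlayerLayer is KVO-observable):
self.playerLayer.frame = self.view.bounds; // non-zero bounds, so there is something to fit the video into
[self.playerLayer addObserver:self forKeyPath:@"readyForDisplay"
                      options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"readyForDisplay"] && self.playerLayer.readyForDisplay) {
        NSLog(@"videoRect: %@", NSStringFromCGRect(self.playerLayer.videoRect)); // non-zero now
        [self.playerLayer removeObserver:self forKeyPath:@"readyForDisplay"];
    }
}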

Related

AVFoundation blur background in video

In my application I have a fixed composition render size of 1280 x 720. So if I import a portrait video, I have to show a blurred, aspect-filled background with the video itself aspect-fit in the centre, like this:
https://www.youtube.com/watch?v=yCOrqUA0ws4
I managed to play both videos using AVMutableComposition, but I don't know how to blur a particular background track. I did the following in my code:
self.composition = [AVMutableComposition composition];
AVAsset *firstAsset = [AVAsset assetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"ScreenFlow_Blend" ofType:@"mp4"]]];
[self addAsset:firstAsset toComposition:self.composition withTrackID:1];
[self addAsset:firstAsset toComposition:self.composition withTrackID:2];
// [self addAsset:ThirdAsset toComposition:self.composition withTrackID:3];
AVAssetTrack *backVideoTrack = [firstAsset tracksWithMediaType:AVMediaTypeVideo][0];
self.videoComposition = [AVMutableVideoComposition videoComposition];
self.videoComposition.renderSize = CGSizeMake(1280, 720);
self.videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = [backVideoTrack timeRange];
CGFloat scale = 1280/backVideoTrack.naturalSize.width;
CGAffineTransform t = CGAffineTransformMakeScale(scale, scale);
t = CGAffineTransformTranslate(t, 0, -backVideoTrack.naturalSize.height/2 + self.videoComposition.renderSize.height/2);
AVMutableVideoCompositionLayerInstruction *frontLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
frontLayerInstruction.trackID = 1;
[frontLayerInstruction setTransform:t atTime:kCMTimeZero];
CGFloat scaleSmall = 720/backVideoTrack.naturalSize.height;
CGAffineTransform translate = CGAffineTransformMakeTranslation(self.videoComposition.renderSize.width/2 - ((backVideoTrack.naturalSize.width/2)*scaleSmall),0);
CGAffineTransform scaleTransform = CGAffineTransformMakeScale(scaleSmall,scaleSmall);
CGAffineTransform finalTransform = CGAffineTransformConcat(scaleTransform, translate);
CGAffineTransform t1 = CGAffineTransformMakeScale(scaleSmall,scaleSmall);
t1 = CGAffineTransformTranslate(t1,1280, 0);
AVMutableVideoCompositionLayerInstruction *backLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
backLayerInstruction.trackID = 2;
[backLayerInstruction setTransform:finalTransform atTime:kCMTimeZero];
// AVMutableVideoCompositionLayerInstruction *maskLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
// maskLayerInstruction.trackID = 3;
// [maskLayerInstruction setTransform:t atTime:kCMTimeZero];
instruction.layerInstructions = @[backLayerInstruction, frontLayerInstruction];
self.videoComposition.instructions = @[instruction];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
playerItem.videoComposition = self.videoComposition;
self.player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
// [newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
Using the above code I can achieve this:
https://drive.google.com/open?id=0B2jCvCt5fosyOVNOcGZ1MU1laEU
I know about using a custom video compositor class to filter composition frames. I tried it, but with a custom compositor I lose my transformations on the composition layers. Also, from the custom compositor I don't know how to filter a particular track ID.
If anyone has documentation links or suggestions, it would be really appreciated.
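For the track-ID part: inside a custom compositor, each source frame is fetched by the track ID you assigned when building the composition, so the background track can be blurred on its own. A minimal sketch, assuming Core Image for the blur (the class name, pixel format, and radius are illustrative):
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

@interface BlurBackgroundCompositor : NSObject <AVVideoCompositing>
@end

@implementation BlurBackgroundCompositor {
    CIContext *_ciContext;
}

- (instancetype)init {
    if ((self = [super init])) {
        _ciContext = [CIContext context];
    }
    return self;
}

- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newContext {
    // nothing cached in this sketch
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Track IDs are the ones assigned when building the composition: 2 = background, 1 = front.
    CVPixelBufferRef backBuffer = [request sourceFrameByTrackID:2];
    CVPixelBufferRef frontBuffer = [request sourceFrameByTrackID:1];

    // Blur only the background track, then composite the front track over it.
    CIImage *blurred = [[CIImage imageWithCVPixelBuffer:backBuffer]
        imageByApplyingFilter:@"CIGaussianBlur"
          withInputParameters:@{ kCIInputRadiusKey : @30 }];
    CIImage *composed = [[CIImage imageWithCVPixelBuffer:frontBuffer]
        imageByCompositingOverImage:blurred];

    CVPixelBufferRef output = [request.renderContext newPixelBuffer];
    [_ciContext render:composed toCVPixelBuffer:output];
    [request finishWithComposedVideoFrame:output];
    CVBufferRelease(output);
}

@end
It would be wired up with self.videoComposition.customVideoCompositorClass = [BlurBackgroundCompositor class];. Note that a custom compositor replaces the built-in one, so the layer-instruction transforms are no longer applied for you; that is why they appear to be lost, and the same scaling/centering would have to be redone on the CIImages inside startVideoCompositionRequest:.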
Before adding the second video layer, which is in the centre of the screen, add this code:
UIVisualEffect *blurEffect;
blurEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleExtraLight]; // change to the effect you want
UIVisualEffectView *visualEffectView;
visualEffectView = [[UIVisualEffectView alloc] initWithEffect:blurEffect];
visualEffectView.frame = self.view.bounds;
[self.view addSubview:visualEffectView];
In Swift
let blurEffect = UIBlurEffect(style: UIBlurEffectStyle.Light) // change to the style that suits you
let blurEffectView = UIVisualEffectView(effect: blurEffect)
blurEffectView.frame = view.bounds
blurEffectView.autoresizingMask = [.FlexibleWidth, .FlexibleHeight] // for supporting device rotation
view.addSubview(blurEffectView)
A way to achieve that is using two different AVPlayers and a blur overlay view: backgroundLayer -> blur overlay view -> frontLayer. You only need to make sure both players start and stop at the same time.
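To start them together, one sketch is to schedule both players against the same host time (setRate:time:atHostTime: requires automaticallyWaitsToMinimizeStalling to be disabled on iOS 10+; frontPlayer and backPlayer are assumed properties):
CMTime startTime = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()), CMTimeMake(1, 2)); // 0.5 s from now
self.frontPlayer.automaticallyWaitsToMinimizeStalling = NO;
self.backPlayer.automaticallyWaitsToMinimizeStalling = NO;
[self.frontPlayer setRate:1.0 time:kCMTimeZero atHostTime:startTime];
[self.backPlayer setRate:1.0 time:kCMTimeZero atHostTime:startTime];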
Another option is using one AVPlayer and a time observer. Extract the current image of the frontLayer on every frame, blur it, and display it in a backgroundLayer. The blur function can be found in the same link I provided above.
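A sketch of that second option. One caveat: an AVPlayerLayer cannot be snapshotted with renderInContext:, so this pulls frames with AVPlayerItemVideoOutput instead and blurs them with Core Image (backgroundLayer and timeObserver are assumed properties):
AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
[self.player.currentItem addOutput:output];

CIContext *ciContext = [CIContext context];
__weak typeof(self) weakSelf = self;
self.timeObserver = [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 30) // roughly once per frame
                                                              queue:dispatch_get_main_queue()
                                                         usingBlock:^(CMTime time) {
    if (![output hasNewPixelBufferForItemTime:time]) return;
    CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    if (!buffer) return;
    CIImage *frame = [CIImage imageWithCVPixelBuffer:buffer];
    CIImage *blurred = [frame imageByApplyingFilter:@"CIGaussianBlur"
                                withInputParameters:@{ kCIInputRadiusKey : @25 }];
    CGImageRef cgImage = [ciContext createCGImage:blurred fromRect:frame.extent];
    weakSelf.backgroundLayer.contents = (__bridge id)cgImage; // layer sitting behind the player layer
    CGImageRelease(cgImage);
    CVBufferRelease(buffer);
}];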

AVPlayer frame animation

I am developing an application that includes functionality to play video with per-frame animation.
You can see an example of such functionality.
I already tried adding a CAKeyframeAnimation to a sublayer of an AVSynchronizedLayer and had some trouble with it.
I also tried pre-rendering the video with AVAssetExportSession, and that works perfectly, but it's very slow: it needs up to 3 minutes to render such a video.
Maybe there are other approaches to achieve this?
Update:
This is how I implement animation with AVSynchronizedLayer:
let fullScreenAnimationLayer = CALayer()
fullScreenAnimationLayer.frame = videoRect
fullScreenAnimationLayer.geometryFlipped = true
var values: [NSValue] = []
var times: [NSNumber] = []
for j in 0..<frameCount {
    // fill values with the transform for each frame (e.g. the position of the face center)
    values.append(NSValue(CATransform3D: t))
    // fill times with the corresponding normalized time for each frame
    times.append(NSNumber(double: (Double(j) / fps) / videoDuration)) // fps = 25, per the video file
}
...
let transform = CAKeyframeAnimation(keyPath: "transform")
transform.duration = videoDuration
transform.calculationMode = kCAAnimationDiscrete
transform.beginTime = AVCoreAnimationBeginTimeAtZero
transform.values = values
transform.keyTimes = times
transform.removedOnCompletion = false
transform.fillMode = kCAFillModeForwards
fullScreenAnimationLayer.addAnimation(transform, forKey: "a_transform")
...
if let currentItem = player.currentItem {
    let syncLayer = AVSynchronizedLayer(playerItem: currentItem)
    syncLayer.frame = CGRect(origin: CGPointZero, size: videoView.bounds.size)
    syncLayer.addSublayer(fullScreenAnimationLayer)
    videoView.layer.addSublayer(syncLayer)
}
Here's my suggestion: add the player's AVPlayerLayer as a sublayer of a UIView's layer, then animate the view.
For example,
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:blahURL options:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:urlAsset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = yourFrame;
UIView *videoView = [UIView new];
[videoView.layer addSublayer:playerLayer];
then animate videoView.
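A minimal sketch (the duration and offset are arbitrary):
[UIView animateWithDuration:0.5 animations:^{
    // slide the whole video view up; the player layer moves with it
    videoView.center = CGPointMake(videoView.center.x, videoView.center.y - 100.0);
}];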

iOS - Playing video through UIWebView: Player controls appear in awkward position

I'm trying to play an .m3u8 through a webView inside an HTML video tag.
The video plays correctly and has the functionality I need, but the presentation is off: the player controls appear at the bottom of the video, overlaid half on and half off the video content.
When I extended the frame of the webview, the video stopped at the same size, but the controls appeared even lower.
Does anyone know if there is a way to reposition the player's controls?
Here's my webview code:
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
_webView.delegate = self;
_webView.allowsInlineMediaPlayback = YES;
NSString *urlAddress = @"http://www.pulpwoodpress.com/playVidTest.html";
NSURL *url = [NSURL URLWithString:urlAddress];
NSURLRequest *requestObj = [NSURLRequest requestWithURL:url];
[_webView loadRequest:requestObj];
_webView.scrollView.scrollEnabled = NO;
}
- (void)webViewDidFinishLoad:(UIWebView *)theWebView
{
NSLog(#"%f, %f", theWebView.scrollView.contentSize.width, theWebView.scrollView.contentSize.height);
CGSize contentSize = CGSizeMake(theWebView.scrollView.contentSize.width, theWebView.scrollView.contentSize.height);
CGSize viewSize = self.view.bounds.size;
float rw = viewSize.width / contentSize.width;
theWebView.scrollView.minimumZoomScale = rw;
theWebView.scrollView.maximumZoomScale = rw;
theWebView.scrollView.zoomScale = rw;
theWebView.scrollView.contentOffset = CGPointMake(0, 17);
}

AVSynchronizedLayer not synchronizing animation

I'm having issues making the animation use the AVPlayer time instead of the system time. The synchronized layer does not work properly: animations stay synchronized to the system time instead of the player time. I know the player does play, and if I pass CACurrentMediaTime() to the beginTime, the animation starts right away, as it should when not synchronized.
EDIT
I can see the red square in its final state from the beginning, which means the animation has already reached its end: it is synchronized to the system time and not to the AVPlayerItem's time.
// play the composition live in order to modify it
AVPlayerItem * playerItem = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer * player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer * playerLayer = [AVPlayerLayer playerLayerWithPlayer:player ];
playerLayer.frame = [UIScreen mainScreen].bounds;
if (!playerItem) {
NSLog(#"playerItem empty");
}
// dummy time
playerItem.forwardPlaybackEndTime = totalDuration;
playerItem.videoComposition = videoComposition;
CALayer * aLayer = [CALayer layer];
aLayer.frame = CGRectMake(100, 100, 60, 60);
aLayer.backgroundColor = [UIColor redColor].CGColor;
aLayer.opacity = 0.f;
CAKeyframeAnimation *keyframeAnimation2 = [CAKeyframeAnimation animationWithKeyPath:@"opacity"];
keyframeAnimation2.removedOnCompletion = NO;
keyframeAnimation2.beginTime = 0.1;
keyframeAnimation2.duration = 4.0;
keyframeAnimation2.fillMode = kCAFillModeBoth;
keyframeAnimation2.keyTimes = @[@0.0, @1.0];
keyframeAnimation2.values = @[@0.0f, @1.0f];
NSLog(#"%f current media time", CACurrentMediaTime());
[aLayer addAnimation:keyframeAnimation2 forKey:@"opacity"];
[self.parentLayer addSublayer:aLayer];
AVSynchronizedLayer * synchronizedLayer =
[AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
synchronizedLayer.frame = [UIScreen mainScreen].bounds;
[synchronizedLayer addSublayer:self.parentLayer];
[playerLayer addSublayer:synchronizedLayer];
The solution was that AVSynchronizedLayer doesn't work on the Simulator but works fine on a device.

Animation doesn't begin when using AVSynchronizedLayer

I've set up a project as a test using AVSynchronizedLayer to move a red line (CALayer) across the screen as a movie plays.
When doing this I referenced the answer given here and have included that solution, but the animation doesn't start when the video does.
If anyone has any ideas about where I'm going wrong, that would be really helpful. The code is:
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
//create the red line sublayer
redLine = [CALayer layer];
redLine.backgroundColor = [UIColor redColor].CGColor;
redLine.frame = CGRectMake(engravingControl.frame.origin.x, engravingControl.center.y - (engravingControl.frame.size.height/2), 3, engravingControl.frame.size.height);
//create the AVplayer object
NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"IMG_1306" withExtension:@"mov"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = CGRectMake(0, 0,400,300);
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
//add the subViews
[self.view.layer addSublayer:playerLayer];
[self.view.layer addSublayer:redLine];
CABasicAnimation *redLineMove;
redLineMove = [CABasicAnimation animationWithKeyPath:@"transform.translation.x"];
redLineMove.duration = 10;
redLineMove.fromValue = [NSNumber numberWithFloat:engravingControl.frame.origin.x];
redLineMove.toValue = [NSNumber numberWithFloat:engravingControl.frame.size.width + engravingControl.frame.origin.x];
redLineMove.beginTime = AVCoreAnimationBeginTimeAtZero;
//create the sync layer
AVSynchronizedLayer *syncLayer = [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
[syncLayer addSublayer:redLine];
[redLine addAnimation:redLineMove forKey:@"redLineMove"];
[self.view.layer addSublayer:syncLayer];
[player play];
}
I discovered a solution to this. If you don't have the following line of code:
animation.removedOnCompletion = NO;
then the animation never starts (I presume it is removed by Core Animation before the movie starts to play).
Add that in, and it works as you would expect.
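Applied to the code in the question, that is one extra line before the animation is added to the layer:
redLineMove.removedOnCompletion = NO; // keep the animation around so the player item's timing can drive it
[redLine addAnimation:redLineMove forKey:@"redLineMove"];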
