In my project I am using AVSampleBufferDisplayLayer and AVPlayerLayer. Both of them have a similar interface and a videoGravity property.
When I change the videoGravity property of AVPlayerLayer, the video is resized immediately with an animation effect. With AVSampleBufferDisplayLayer, changing videoGravity has no effect until I change the device orientation, and then the video is resized without animation.
How can I change the videoGravity of AVSampleBufferDisplayLayer so that it behaves like AVPlayerLayer?
The only solution I've found is to reinitialize the AVSampleBufferDisplayLayer, for example:
var displayBufferLayer: AVSampleBufferDisplayLayer?
...
func reinitBufferLayer(videoGravity: AVLayerVideoGravity) {
    // Tear down the existing layer.
    displayBufferLayer?.flush()
    displayBufferLayer?.stopRequestingMediaData()
    displayBufferLayer?.removeFromSuperlayer()

    // Recreate it with the new gravity.
    let bufferLayer = AVSampleBufferDisplayLayer()
    bufferLayer.frame = view.bounds
    bufferLayer.videoGravity = videoGravity // (e.g. .resizeAspectFill)
    bufferLayer.isOpaque = true

    view.layer.insertSublayer(bufferLayer, at: 0)
    self.displayBufferLayer = bufferLayer
}
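For example, you might call it from whatever UI action toggles the fill mode. The call site below is only an illustration, not part of the original answer:
// Hypothetical call site: switch to aspect-fill when the user taps a button.
@objc func fillModeTapped() {
    reinitBufferLayer(videoGravity: .resizeAspectFill)
}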
Related
How to add an overlay UIImage in PNG format (such as a logo) with an alpha channel on top of video playback, especially in the case of AVPlayer's external playback mode when the video is cast to Apple TV via AirPlay?
I would like to add an overlay UIImage in PNG format with an alpha channel (such as a logo) on top of video playback. This can easily be done on the phone by using the contentOverlayView of AVPlayerViewController. However, when the video is cast and played on Apple TV via AirPlay, the contentOverlayView is not displayed.
I also tried applying a custom UIImageView to the external screen once UIScreen detects one. However, the image still does not show; only the video playback appears on the external screen. Here is my code for this approach:
if UIScreen.screens.count > 1 {
    let externalScreen = UIScreen.screens[1]
    print("Playing: externalScreen.bounds: \(externalScreen.bounds)")

    let secondWindow = UIWindow(frame: externalScreen.bounds)
    secondWindow.screen = externalScreen

    let overlayImage = UIImage(named: "rain.png")
    let overlayImageView = UIImageView(frame: externalScreen.bounds)
    overlayImageView.image = overlayImage

    secondWindow.addSubview(overlayImageView)
    secondWindow.isHidden = false
    secondWindow.makeKeyAndVisible()
}
It does not sound like a difficult problem, as adding overlay views such as subtitles or an image logo seems to be a very common operation. Can somebody help?
Thanks.
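This question has no answer attached, but one direction that is sometimes suggested (purely an illustrative sketch, not from the original post; player and secondWindow are assumed to exist) is to keep rendering the video in the app rather than handing it off to AirPlay video playback, so that the player layer and the overlay can both live in the second window:
// Sketch only: disable external (AirPlay video) playback so the app keeps rendering frames,
// then put an AVPlayerLayer and the overlay image view into the external-screen window.
player.allowsExternalPlayback = false

let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = secondWindow.bounds
playerLayer.videoGravity = .resizeAspectFill

let videoView = UIView(frame: secondWindow.bounds)
videoView.layer.addSublayer(playerLayer)

let overlayImageView = UIImageView(frame: secondWindow.bounds)
overlayImageView.image = UIImage(named: "rain.png")

secondWindow.addSubview(videoView)
secondWindow.addSubview(overlayImageView) // overlay sits above the video
secondWindow.isHidden = false
The idea is that, with external playback disabled and AirPlay mirroring active, the Apple TV shows whatever the second UIWindow contains, at the cost of native AirPlay video handoff.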
I am using MPMoviePlayerController to show a video inside a UIView, with the code below:
self.moviePlayerSmall = MPMoviePlayerController(contentURL: self.objUrl)
self.moviePlayerSmall.view.frame = self.videoView.bounds
self.videoView.addSubview(self.moviePlayerSmall.view)
self.moviePlayerSmall.view.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]
for subV: UIView in self.moviePlayerSmall.view.subviews {
    subV.backgroundColor = UIColor.whiteColor()
}
self.moviePlayerSmall.fullscreen = false
self.moviePlayerSmall.controlStyle = MPMovieControlStyle.Default
self.moviePlayerSmall.scalingMode = .AspectFill
Here videoView is the UIView to which I am adding the MPMoviePlayer view.
The issue I am facing is that whenever the video player comes out of full screen, the player view shows spacing on the left and right sides.
I have tried setting contentMode, scalingMode, and even autoresizingMask, but nothing works. Has anyone experienced the same issue?
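No answer is attached to this question. One workaround that is sometimes suggested (untested here, written in the same Swift 2 style as the snippet above) is to re-apply the embedded frame and scaling mode when the player leaves full screen, since MPMoviePlayerController restores its own layout at that point:
// Hypothetical workaround: re-apply the embedded layout after full screen is dismissed.
NSNotificationCenter.defaultCenter().addObserver(self,
    selector: #selector(playerDidExitFullscreen),
    name: MPMoviePlayerDidExitFullscreenNotification,
    object: self.moviePlayerSmall)

func playerDidExitFullscreen() {
    self.moviePlayerSmall.view.frame = self.videoView.bounds
    self.moviePlayerSmall.scalingMode = .AspectFill
    self.videoView.clipsToBounds = true
}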
I'm using AVFoundation to do some video recording and I've looked all over for how to get the video to aspect fill. I've read through the AVFoundation programming guide by Apple and the class reference for AVCaptureVideoPreviewLayer. I also read this question: AVFoundation camera preview layer not working. Here is my code:
videoPreviewLayer.frame = captureView.frame
videoPreviewLayer.frame.origin = CGPoint(x: 0, y: 0)
videoPreviewLayer.backgroundColor = UIColor.redColor().CGColor
videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer.masksToBounds = true
captureView.layer.addSublayer(videoPreviewLayer)
I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame, which is the UIView my preview layer is inside of. Any clue why aspect fill isn't working? As you can see from the red background, the layer is the correct size, but contentsGravity isn't filling it.
Found my answer in this link: AVCaptureVideoPreviewLayer. I needed to use videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill instead of videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill.
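In other words, only the gravity line in the setup above changes:
// videoGravity, not contentsGravity, is what AVCaptureVideoPreviewLayer uses to scale its video.
videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill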
For me, the answer to my problem was provided by NSGangster in the question:
I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame which is the UIView my previewLayer is inside of.
I originally had the code in viewDidLoad(), which meant the layer was resized before the true bounds of the view were determined.
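A minimal sketch of that fix (names taken from the question above):
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // By this point the view has its final bounds, so the preview layer can safely adopt them.
    videoPreviewLayer.frame = captureView.bounds
}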
Thanks for sharing.
I'm trying to get the video output on my screen in Swift, but the screen stays completely white. I found this tutorial in Obj-C and followed it (only in Swift-style syntax).
In it there is a line previewLayer.frame = myView.bounds;
But the .frame field seems to be read-only in Swift, and I think this might be why I don't see anything on the screen.
How can I set the frame for the previewLayer in Swift?
I see three points in that tutorial where you could end up not displaying the preview, and thus getting a white screen. Below are the Obj-C and Swift counterparts.
1) You might have missed adding the input to the capture session:
// [session addInput:input];
session.addInput(input)
2) You might not have set the preview layer's frame to the bounds of your view controller's view:
// UIView *myView = self.view;
// previewLayer.frame = myView.bounds;
previewLayer.frame = self.view.bounds
3) You might not have added the preview layer as a sublayer of your view:
// [self.view.layer addSublayer:previewLayer];
self.view.layer.addSublayer(previewLayer)
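Putting the three points together, a minimal capture-and-preview setup might look like this (current Swift syntax; a sketch rather than production code, with error handling reduced to a guard and camera-permission handling omitted):
import AVFoundation
import UIKit

final class PreviewViewController: UIViewController {
    private let session = AVCaptureSession()
    private var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // 1) Add the camera input to the session.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // 2) + 3) Create the preview layer, give it the view's bounds,
        // and add it as a sublayer so it actually appears on screen.
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.frame = view.bounds
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        previewLayer = layer

        session.startRunning()
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Keep the preview in sync with the view's final bounds (see the answers above).
        previewLayer?.frame = view.bounds
    }
}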
I have implemented a custom movie player with AVPlayer. On setting the value of videoGravity in AVPlayerLayer to AVLayerVideoGravityResizeAspectFill, I see the desired effect on iOS 4.2 and 4.3. But somehow on iOS 5.0 it has no effect. Is anybody seeing a similar issue? Am I doing something wrong?
On iOS 5 you should reset the layer's bounds after setting videoGravity.
This worked for me:
((AVPlayerLayer *)[self layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
((AVPlayerLayer *)[self layer]).bounds = ((AVPlayerLayer *)[self layer]).bounds;
EDITED: "self" points to a PlayerView (subclass of UIView) object from example "Putting all together":
https://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
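The PlayerView in that guide is written in Objective-C; a rough Swift equivalent (a sketch, assuming only what the guide describes) backs the view with an AVPlayerLayer via layerClass:
import AVFoundation
import UIKit

// A UIView whose backing layer is an AVPlayerLayer, as in the "Putting It All Together" example.
final class PlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }

    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    var player: AVPlayer? {
        get { playerLayer.player }
        set { playerLayer.player = newValue }
    }
}
With such a view, the workaround above amounts to setting playerLayer.videoGravity and then re-assigning playerLayer.bounds to itself.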
Found the solution to this issue: tick the "Clip Subviews" checkbox in IB for the view whose layer you are going to attach the video player to, then set the videoGravity of your AVPlayerLayer object to AVLayerVideoGravityResizeAspectFill. If you don't have the view in IB but are creating it programmatically, set its clipsToBounds property to YES.
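The programmatic variant, sketched in current Swift (containerView and playerLayer are assumed names):
// Equivalent of ticking "Clip Subviews" in IB, plus the aspect-fill gravity.
containerView.clipsToBounds = true
playerLayer.videoGravity = .resizeAspectFill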