Partial-screen video preview distorted regardless of UIViewContentMode selection - ios

I'm trying to make a simple app that shows, in the top half of the iPhone screen, a raw preview of what the back camera sees, and in the bottom half the same preview with various filters applied.
I got the raw preview part working first, which wasn't too hard thanks to several SO and blog posts; for that version the UIImageView I display into takes up the entire screen.
To get a half-screen view I just divide the image view's height by two, then set its contentMode to show everything while keeping the same aspect ratio:
imageView = UIImageView(frame: CGRectMake(0,0, self.view.frame.size.width, self.view.frame.size.height/2))
imageView.contentMode = UIViewContentMode.ScaleAspectFit
The height reduction works, but the image in the view is compressed vertically (e.g. a coin viewed straight-on looks like a horizontal oval). I don't think it's a coincidence that the preview looks exactly like the contentMode default, ScaleToFill, yet nothing I've tried changes the mode.
The complete code is below - the project has one scene with one view controller class; everything's done programmatically.
Thanks!
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate
{
    var imageView : UIImageView!

    override func viewDidLoad()
    {
        super.viewDidLoad()

        imageView = UIImageView(frame: CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height/2))
        imageView.contentMode = UIViewContentMode.ScaleAspectFit
        view.addSubview(imageView)

        let captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        do
        {
            let input = try AVCaptureDeviceInput(device: backCamera)
            captureSession.addInput(input)
        }
        catch
        {
            print("Camera not available")
            return
        }

        // Unused but required for AVCaptureVideoDataOutputSampleBufferDelegate:captureOutput() events to be fired
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        view.layer.addSublayer(previewLayer)

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("SampleBufferDelegate", DISPATCH_QUEUE_SERIAL))
        if captureSession.canAddOutput(videoOutput)
        {
            captureSession.addOutput(videoOutput)
        }
        videoOutput.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = AVCaptureVideoOrientation.Portrait

        captureSession.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!)
    {
        let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

        dispatch_async(dispatch_get_main_queue())
        {
            self.imageView.image = UIImage(CIImage: cameraImage)
        }
    }
}

I suspect this line is a problem because it is in viewDidLoad(). I'm not sure it is directly related to your issue, but it is an important note:
imageView = UIImageView(frame: CGRectMake(0,0, self.view.frame.size.width, self.view.frame.size.height/2))
You cannot get the view's correct size inside viewDidLoad(). Furthermore, from iOS 10 I have found that controls laid out in a storyboard are initially sized (0, 0, 1000, 1000) before the layout engine lays them out correctly.
Also, the size you get in viewDidLoad() can be the size of the view controller as designed in the storyboard. So if you laid out the controls for a 4-inch screen, viewDidLoad() will return the 4-inch size even when the app runs on an iPhone 6 or a bigger screen.
Also, please set imageView.layer.masksToBounds to true to prevent the image from drawing outside the view's bounds.
One more thing: you should place the code that lays out your image view where it runs whenever the view's bounds change (for example on rotation).
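For example, a minimal sketch (in current Swift syntax) of doing that layout once the real bounds are known, reusing the question's imageView:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Top half of the screen, computed from the bounds this device actually has
    imageView.frame = CGRect(x: 0, y: 0,
                             width: view.bounds.width,
                             height: view.bounds.height / 2)
    imageView.contentMode = .scaleAspectFit
    imageView.layer.masksToBounds = true   // clip anything drawn outside the half-screen frame
}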

Related

iPhone Xs - Why is there a huge padding between the top border of my UIView and the top border of the safe area when using AVCaptureVideoPreviewLayer?

I am developing a custom camera on the iPhone Xs Max; my layout is below. The only UIView's top, left, right, and bottom edges are anchored to the safe area. Yet what I am seeing is a huge black space between the top border of my video capture output and the top border of the safe area. What is this black space and how do I calculate its height?
Layout:
UIView constraints:
code:
class NewCapturViewController: UIViewController, UIImagePickerControllerDelegate, AVCaptureVideoDataOutputSampleBufferDelegate {

    var previewLayer = AVCaptureVideoPreviewLayer.init()
    var captureSession: AVCaptureSession!

    override func viewWillAppear(_ animated: Bool) {
        startAVCaptureSession()
    }

    func startAVCaptureSession() {
        print("START CAPTURE SESSION!!")

        // Setting Up a Capture Session
        self.captureSession = AVCaptureSession()
        captureSession.beginConfiguration()

        // Configure input
        let videoDevice = AVCaptureDevice.default(for: .video)
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput.init(device: videoDevice!) as AVCaptureInput,
            self.captureSession.canAddInput(videoDeviceInput) else { return }
        self.captureSession.addInput(videoDeviceInput)

        // Capture video output
        let videoOutput = AVCaptureVideoDataOutput.init()
        guard self.captureSession.canAddOutput(videoOutput) else { return }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.init(label: "videoQueue"))
        self.captureSession.addOutput(videoOutput)

        // start
        self.captureSession.commitConfiguration()
        self.captureSession.startRunning()

        // Display camera preview
        self.previewLayer = AVCaptureVideoPreviewLayer.init(session: self.captureSession)
        // Use 'insertSublayer' to enable button to be viewable
        self.camViewOutlet.layer.insertSublayer(self.previewLayer, at: 0)
        self.previewLayer.frame = self.camViewOutlet.frame
        self.previewFrame = previewLayer.frame
        print("previewLayer.frame: \(previewLayer.frame)")
    }
}
It's because you have aligned the top to the Safe Area. Just remember that this phone has a notch, so the safe area starts lower than the physical top of the screen.
If you change the phone in your Storyboard to the iPhone Xs (as mentioned in the question), you will see that you have this gap. All you need to do is set the top constraint to the Superview rather than the Safe Area.
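If it helps to know how big that gap is: it is simply the top safe-area inset, which you can read once layout has happened. A small sketch:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // The black space above the preview is the top safe-area inset
    // (roughly 44 pt in portrait on notched phones).
    print("top safe-area inset: \(view.safeAreaInsets.top)")
}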
You should set layer frames in viewDidLayoutSubviews(), not in viewDidLoad() or viewWillAppear(). Note that the sizes of views change, but layers you add yourself stay static, because they are not resized automatically along with their container views:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    self.previewLayer.frame = self.camViewOutlet.bounds
    self.previewFrame = previewLayer.frame
}
Also note the difference between frame and bounds. camViewOutlet.frame is relative to its superview (self.view), but previewLayer sits inside camViewOutlet, so you have to use camViewOutlet.bounds. Basically, because camViewOutlet is placed X points below the top of the screen (the height of the top safe-area inset), previewLayer was also being placed X points below the top of camViewOutlet.
There are other small issues in your code.
Note that viewWillAppear must call super.viewWillAppear, and it can be called multiple times, so you should never add views and layers inside it.
You should probably also not start capturing before viewDidAppear has been called, as sketched below.
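A rough sketch of those lifecycle fixes, keeping the question's property names and assuming the session and previewLayer are configured once (for example from viewDidLoad, with startRunning() moved out of startAVCaptureSession()):
override func viewDidLoad() {
    super.viewDidLoad()
    startAVCaptureSession()          // configure the session and add previewLayer once
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)   // always call super; don't add views/layers here
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    if !captureSession.isRunning {   // start capturing only once the view is on screen
        captureSession.startRunning()
    }
}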

Custom camera view preview layer Plus size phone issue

I have a custom camera view in a UIViewController that does not take up the full screen. Beneath it is a collection view that fills up with little thumbnails of the images taken, each with an "X" on it so the user can delete it before the kept ones are uploaded to Firebase.
This all works nicely on iPhone 7-sized screens. However, on the Plus-size phones the preview layer of the custom camera view does not fill the entire UIView the camera is placed in.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let deviceSession = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInDuoCamera, .builtInTelephotoCamera, .builtInWideAngleCamera],
                                                        mediaType: AVMediaTypeVideo,
                                                        position: .unspecified)

    for device in (deviceSession?.devices)! {
        if device.position == AVCaptureDevicePosition.back {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if captureSession.canAddInput(input) {
                    captureSession.addInput(input)
                    if captureSession.canAddOutput(sessionOutput) {
                        captureSession.addOutput(sessionOutput)

                        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                        previewLayer.connection.videoOrientation = .portrait

                        cameraView.layer.addSublayer(previewLayer)
                        cameraView.addSubview(takePhotoButton)

                        previewLayer.position = CGPoint(x: self.view.frame.width/2, y: self.cameraView.frame.height/2)
                        previewLayer.bounds = cameraView.frame

                        captureSession.startRunning()
                    }
                }
            } catch let avError {
                print(avError)
            }
        }
    }
}
Here is where I add the preview layer as a sublayer of cameraView, which is just a UIView added in Interface Builder with constraints that leave room in the view for the collection view beneath it:
previewLayer.position = CGPoint(x: self.cameraView.frame.width/2, y: self.cameraView.frame.height/2)
previewLayer.bounds = cameraView.frame
I don't understand why a preview layer added as a sublayer would behave any differently on an iPhone 7 than on an iPhone 7 Plus.
Especially because you can see, on the iPhone 7 Plus, the grey around the preview layer, which suggests that cameraView is being constrained correctly; so why would its frame.width be different?
It seems that on the Plus-size phone the preview layer stays the same size as it would be on a regular-size phone, while cameraView itself adapts to the larger screen correctly.
Here is the view in question in Interface Builder.
And this is what seems to happen on a Plus-size phone.
TL;DR: Set the coordinates and frames of your views and layers in the viewDidAppear() method.
At the time viewWillAppear() is executed, the view controller's subviews still have the frame values you set in your storyboard. What you need is the updated cameraView.frame, recalculated for the actual device size, to set previewLayer.position and previewLayer.bounds properly. viewDidLayoutSubviews() exists for exactly this purpose: by the time it runs, all views have their actual sizes recalculated for the particular device the app is running on.
viewDidLayoutSubviews() can be called multiple times though, which might lead to multiple instances of your views/layers being created. There is another method in the view controller's lifecycle that is called with auto-laid-out subviews just once: viewDidAppear().
So what you need is to move your previewLayer-related code from viewWillAppear() to viewDidAppear(), as sketched below.
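A minimal sketch of that change, reusing the question's previewLayer and cameraView (note the use of cameraView.bounds rather than cameraView.frame for the layer's bounds):
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // By now Auto Layout has given cameraView its real size for this device
    previewLayer.position = CGPoint(x: cameraView.bounds.midX,
                                    y: cameraView.bounds.midY)
    previewLayer.bounds = cameraView.bounds
}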

Save what user see in AVCaptureVideoPreviewLayer

I'm trying to develop an app with a custom camera where the user can add filters or stickers (like in the TextCamera app) and share to a social feed.
But I have hit my first problem.
I show the preview to the user with AVCaptureVideoPreviewLayer, take the photo, and pass it to another view controller in a UIImageView, but the second picture is bigger than the first one.
I tried to resize the picture with this function:
func resize(image: UIImage) -> UIImage {
    let size = image.size
    let newWidth = CGFloat(size.width)
    let newHeight = CGFloat(size.height - blackBottomTab.bounds.size.height)
    let newSize = CGSizeMake(newWidth, newHeight)
    let rect = CGRectMake(0, 0, newSize.width, newSize.height)

    UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
    image.drawInRect(rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage
}
In this function I subtract the height of the black view (under the button) from the image height, but the result I get is different (see the attached photos).
This is my preview with a black view under the button
This is the photo taken larger than preview one
I also tried using Aspect Fit on the second view controller's image view in the storyboard, but the result is the same.
Where is my error? Thanks to everyone who helps!
I think the AVCaptureVideoPreviewLayer frame is the same as the screen's frame (UIScreen.mainScreen().bounds), and you added the "Shoot Photo" black view on top of it. Instead, you should change the frame of the AVCaptureVideoPreviewLayer itself.
Your case (as I understand it): the green rectangle is the AVCaptureVideoPreviewLayer frame and the red one is the black view's frame, so the black view covers part of the preview.
Make them look like this instead:
Hope that helps.
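A rough sketch of that idea in current Swift syntax, assuming the black bar is the blackBottomTab view from the resize function above and previewLayer is your AVCaptureVideoPreviewLayer:
// Make the preview layer stop above the black bar instead of filling the
// whole screen and being hidden behind it.
let previewHeight = view.bounds.height - blackBottomTab.bounds.height
previewLayer.frame = CGRect(x: 0, y: 0,
                            width: view.bounds.width,
                            height: previewHeight)
previewLayer.videoGravity = .resizeAspectFill   // fill the visible rectangle, cropping the edges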
I had to solve a similar problem. As the question notes, there does not appear to be an easy way to detect the size of the video preview.
My solution is hinted at near the end of the answer at https://stackoverflow.com/a/57996039/10449843, which explains in detail how I take the snapshot and create the combined snapshot with the sticker.
Here is an elaboration of that hint.
While I use AVCapturePhotoCaptureDelegate to take the snapshot, I also use AVCaptureVideoDataOutputSampleBufferDelegate to sample the buffer once when the preview is first shown to detect the proportions of the snapshot.
// The preview layer is displayed with resize-aspect (letterboxed) video gravity
let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.videoGravity = .resizeAspect
previewLayer.frame = self.cameraView.bounds
Detect the size of the preview:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // Only need to do this once
    guard self.sampleVideoSize == true else {
        return
    }
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return
    }

    // Sample the size of the preview layer and then store it
    DispatchQueue.main.async {
        let width = CGFloat(CVPixelBufferGetWidth(pixelBuffer))
        let height = CGFloat(CVPixelBufferGetHeight(pixelBuffer))

        // Store the video size in a local variable
        // We don't care which side is which, we just need the
        // picture ratio to decide how to align it on different
        // screens.
        self.videoSize = CGSize.init(width: width, height: height)

        // Now we can set up filters and stickers etc
        ...

        // And we don't need to sample the size again
        self.sampleVideoSize = false
    }
    return
}
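As a follow-up, once videoSize is stored you can work out where the aspect-fit video actually lands inside the camera view, for example with AVMakeRect. A small sketch (cameraView is an assumed outlet name, and depending on the connection's videoOrientation you may need to swap width and height first):
// Rectangle the letterboxed video occupies inside cameraView; stickers and
// filter overlays can be positioned relative to this rect so they line up.
let visibleRect = AVMakeRect(aspectRatio: self.videoSize,
                             insideRect: self.cameraView.bounds)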

AVCaptureVideoPreviewLayer isn't showing centre of preview

I'm trying to display a video preview for a custom camera view in my app; however, at the moment the preview is very much off-centre, which is especially noticeable when using the front camera.
I'm positioning it like so:
if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
    previewLayer.bounds = scanningView.bounds
    previewLayer.position = CGPointMake(CGRectGetMidX(scanningView.bounds), CGRectGetMidY(scanningView.bounds))
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    scanningView.layer.insertSublayer(previewLayer, atIndex: 0)
}
However, the centre of the preview is not at the centre of the view it sits in. How can I fix this?
Not sure you need the CGRectGetMidX/Y positioning; just give the preview layer a frame equal to the hosting view's bounds.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspect
scanningView.layer.addSublayer(previewLayer)
Also, are you using Auto Layout? If so, you might want to adjust the frame when viewWillLayoutSubviews gets called:
override func viewWillLayoutSubviews() {
    super.viewWillLayoutSubviews()
    previewLayer.frame = scanningView.bounds
}
Instead of adding a layer and setting its frame manually, it seems better to use a custom UIView and override its layerClass to be AVCaptureVideoPreviewLayer. See the example below.
final class PreviewView: UIView {
    // 1
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
    .
    .
I found the solution this way.
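The elided part of such a class typically just exposes the layer and its session. A sketch of the usual pattern (the videoPreviewLayer accessor name is illustrative, not from the answer above):
final class PreviewView: UIView {
    // Back the view with an AVCaptureVideoPreviewLayer instead of a plain CALayer
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }

    // Convenience accessor for the typed layer
    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }
}

// Usage: the layer now tracks the view's size automatically, so Auto Layout
// constraints on the view keep the preview centred.
// previewView.videoPreviewLayer.session = captureSession
// previewView.videoPreviewLayer.videoGravity = .resizeAspectFill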

How should I properly add constraints to an AVCaptureVideoPreviewLayer?

I have a Navigation Bar at the top of my view. I'm adding an AVCaptureVideoPreviewLayer but the way it gets positioned, there is a gap between the bottom of the Nav Bar and the top of the Preview Layer.
self.view.backgroundColor = UIColor.redColor()
var navBarFrame = CGRectMake(0, 0, self.view.frame.width, 64.0)
var navBar = UINavigationBar(frame: navBarFrame)
var navItem = UINavigationItem()
navItem.title = "zzzz"
navBar.pushNavigationItem(navItem, animated: false)
self.view.addSubview(navBar)
// init device input
var error: NSErrorPointer!
var deviceInput: AVCaptureInput = AVCaptureDeviceInput.deviceInputWithDevice(captureDevice, error: error) as AVCaptureInput
self.stillImageOutput = AVCaptureStillImageOutput()
// init session
self.session = AVCaptureSession()
self.session.sessionPreset = AVCaptureSessionPresetPhoto
self.session.addInput(deviceInput as AVCaptureInput)
self.session.addOutput(self.stillImageOutput)
// layer for preview
var previewLayer: AVCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer.layerWithSession(self.session) as AVCaptureVideoPreviewLayer
previewLayer.frame = self.view.bounds
self.view.layer.addSublayer(previewLayer)
How can I ensure that the Preview Layer is constrained to the bottom of the Nav Bar?
Design a view in IB with the constraints you want (pinned below the nav bar) and add the preview layer to that view instead. You also need to update the layer's size whenever the view's size changes; use viewDidLayoutSubviews() in your view controller (or layoutSubviews() in a custom view) to get notified when that happens, as sketched below.
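A sketch of that approach in current Swift syntax, where previewContainer is a hypothetical IB outlet constrained below the navigation bar and previewLayer is stored in a property:
override func viewDidLoad() {
    super.viewDidLoad()
    previewLayer.videoGravity = .resizeAspectFill
    previewContainer.layer.addSublayer(previewLayer)
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Keep the layer in sync with the view that Auto Layout is resizing
    previewLayer.frame = previewContainer.bounds
}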
