camera roll image is bigger than the camera preview - ios

I am creating a camera app. I can take an image, and the image is passed back to a view controller to display it. It is also saved to the camera roll.
If I compare the camera preview with the saved image, it seems that the camera preview is slightly zoomed in.
Image from camera preview:
Image from Gallery:
This is my code so far from the camera preview:
let cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
cameraPreviewLayer.videoGravity = .resizeAspectFill
cameraPreviewLayer.connection?.videoOrientation = .portrait
cameraPreviewLayer.frame = view.frame
view.layer.insertSublayer(cameraPreviewLayer, at: 0)
captureSession.startRunning()
How can I make the camera preview the same size as the saved image?

The culprit is the videoGravity setting. .resizeAspectFill will resize the preview to fill the whole view, even when that means cropping content.
If you want to see the whole frame, set it to .resizeAspect. This will introduce black bars, though.

If you want the captured image to have the same framing as the camera preview, you should set videoGravity = .resize.
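For reference, here is a minimal sketch of the three gravity options, assuming the same cameraPreviewLayer as above (pick whichever matches the behaviour you want):
cameraPreviewLayer.videoGravity = .resizeAspectFill // fills the view and crops the edges, so the preview looks zoomed in
// cameraPreviewLayer.videoGravity = .resizeAspect // shows the whole frame at its native aspect ratio, with black bars
// cameraPreviewLayer.videoGravity = .resize // stretches the frame to the view's bounds, which can distort it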

Related

Overlay an image (UIImageView) into photo

Is there a way to integrate a UIImageView when taking photos?
I can't seem to find a topic discussing this.
What I want to do is have a UIImageView overlaid on the preview. When the shutter is tapped, it should take the picture with the UIImageView embedded in it.
So here's where I am right now. The "test image" part will be made draggable on top of the preview view. I want to include that image with the image I'll be taking when I push the shutter button.
Is this possible? Or is there a better way?
I'm using AVCaptureVideoPreviewLayer, AVCaptureSession, and AVCapturePhotoOutput. I'm just putting the UIImageView above the UIView that also hosts the AVCaptureVideoPreviewLayer.
EDIT: Tested saving the UIView as an image, and it doesn't include the preview layer (only returns the UIView + UIImageView overlay).
There are at least two ways of doing this.
1) Capture the photo from the camera and add the overlay image based on the draggable image view's position. This will produce the best possible quality.
2) If you don't need full photo quality, you can use another solution: render the draggable view's layer and the video preview layer into a graphics context:
import AVFoundation
import UIKit

// Option 2: render the live preview and the overlay into a single image.
func takePhotoWithOverlay(overlayImageView: UIImageView,
                          videoPreviewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // The overlay must be in a view hierarchy so we know the size to draw at.
    guard let superview = overlayImageView.superview else { return nil }
    UIGraphicsBeginImageContextWithOptions(superview.bounds.size, false, 0.0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    // Draw the preview first, then the overlay on top of it.
    videoPreviewLayer.render(in: context)
    overlayImageView.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}
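For option 1, a minimal sketch could look like the following, assuming you already have the captured photo as a UIImage, the overlay image, the overlay's frame in the preview's coordinate space, and a preview that shows the whole frame; the names composite, overlayFrameInPreview, and previewSize are placeholders for illustration:
import UIKit

// Sketch for option 1: draw the overlay onto the full-resolution photo.
func composite(photo: UIImage,
               overlay: UIImage,
               overlayFrameInPreview: CGRect,
               previewSize: CGSize) -> UIImage? {
    // Scale factors from preview coordinates to photo coordinates.
    let scaleX = photo.size.width / previewSize.width
    let scaleY = photo.size.height / previewSize.height
    let target = CGRect(x: overlayFrameInPreview.origin.x * scaleX,
                        y: overlayFrameInPreview.origin.y * scaleY,
                        width: overlayFrameInPreview.width * scaleX,
                        height: overlayFrameInPreview.height * scaleY)
    UIGraphicsBeginImageContextWithOptions(photo.size, false, photo.scale)
    defer { UIGraphicsEndImageContext() }
    // Draw the photo at full size, then the overlay at its scaled position.
    photo.draw(in: CGRect(origin: .zero, size: photo.size))
    overlay.draw(in: target)
    return UIGraphicsGetImageFromCurrentImageContext()
}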

How to display image overlay on video in external playback mode of AVPlayer via AirPlay?

How to add an overlay UIImage in PNG format (such as a logo) with an alpha channel on top of video playback, especially in the case of AVPlayer's external playback mode, when the video is cast to Apple TV via AirPlay?
I would like to add an overlay UIImage in PNG format with an alpha channel (such as a logo) on top of video playback. This can easily be done on the phone by using the contentOverlayView of AVPlayerViewController. However, when the video is cast and played on Apple TV via AirPlay, the contentOverlayView is not displayed.
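For reference, the on-device case works with something like this minimal sketch, assuming playerViewController is your AVPlayerViewController and "logo.png" is in the bundle:
// Shown during local playback; during AirPlay external playback this overlay is not mirrored to the TV.
let logoView = UIImageView(image: UIImage(named: "logo.png"))
logoView.frame = CGRect(x: 16, y: 16, width: 120, height: 60)
playerViewController.contentOverlayView?.addSubview(logoView)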
I also tried applying a custom UIImageView to the external screen once UIScreen detects an external screen. However, the image still does not show; only the video playback appears on the external screen. Here is my code for this approach:
if UIScreen.screens.count > 1 {
let externalScreen = UIScreen.screens[1]
print("Playing: externalScreen.bounds: \(externalScreen.bounds)")
let secondWindow = UIWindow(frame: externalScreen.bounds)
secondWindow.screen = externalScreen
let overlayImage = UIImage(named: "rain.png")
let overlayImageView = UIImageView(frame: externalScreen.bounds)
overlayImageView.image = overlayImage
secondWindow.addSubview(overlayImageView)
secondWindow.isHidden = false
secondWindow.makeKeyAndVisible()
}
It does not sound like a difficult problem, since adding overlay views such as subtitles or an image logo seems to be a very common operation. Can somebody help?
Thanks.

swift: avfoundation to capture images

I basically have two UIImages: one called previewImage that displays what the AVFoundation camera shows, and one called captureImage that displays the image taken with the camera. When I had:
previewLayer!.videoGravity = AVLayerVideoGravityResize
the two images displayed with the same formatting, but when I changed it to:
previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
it seems that previewImage is horizontally flattened. Is there a way to adjust the captureImage UIImage so that it displays what previewImage shows through the video feed?
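A sketch of one possible approach, assuming previewLayer is the AVCaptureVideoPreviewLayer (using aspect fill) and captured is the photo as a UIImage backed by a CGImage, is to crop the photo to the region the preview actually shows:
import AVFoundation
import UIKit

// Crop the captured photo to the portion that an aspect-fill preview displays.
func cropToPreview(_ captured: UIImage,
                   previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    guard let cgImage = captured.cgImage else { return nil }
    // Normalized (0...1) rect of the preview's visible area in the sensor's coordinate space.
    let outputRect = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.width * width,
                          height: outputRect.height * height)
    guard let croppedCGImage = cgImage.cropping(to: cropRect) else { return nil }
    // Keep the original scale and orientation so the result displays correctly.
    return UIImage(cgImage: croppedCGImage,
                   scale: captured.scale,
                   orientation: captured.imageOrientation)
}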

AVPreviewLayer contentsGravity not filling Layer

I'm using AVFoundation to do some video recording, and I've looked all over for how to get the video to aspect fill. I've read through Apple's AVFoundation guide and the class reference for AVCaptureVideoPreviewLayer. I also read this question: AVFoundation camera preview layer not working. Here is my code:
videoPreviewLayer.frame = captureView.frame
videoPreviewLayer.frame.origin = CGPoint(x: 0, y: 0)
videoPreviewLayer.backgroundColor = UIColor.redColor().CGColor
videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer.masksToBounds = true
captureView.layer.addSublayer(videoPreviewLayer)
I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame, which is the UIView my previewLayer is inside of. Any clue why aspect fill isn't working? As you can see from the red background, the layer is the correct size, but the contentsGravity isn't filling the layer.
Found my answer in this link: AVCaptureVideoPreviewLayer. I needed to use videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill instead of videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill.
For me, the answer to my problem was provided by NSGangster in the question:
I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame, which is the UIView my previewLayer is inside of.
I originally had the code in viewDidLoad(), which meant the layer was resizing before the true bounds of the view were determined.
Thanks for sharing.
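Putting both fixes together, a minimal sketch of the corrected setup (assuming captureView and videoPreviewLayer are properties of the view controller, written in current Swift syntax) might look like this:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Size the layer only after Auto Layout has produced the real bounds.
    videoPreviewLayer.frame = captureView.bounds
    // videoGravity (not contentsGravity) controls how the video fills the layer.
    videoPreviewLayer.videoGravity = .resizeAspectFill
    if videoPreviewLayer.superlayer == nil {
        captureView.layer.addSublayer(videoPreviewLayer)
    }
}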

iPhone: Capture iOS Camera with Overlay View

In my application I am capturing a picture through the camera with an overlay view, and in the overlay view there is a custom button through which I want to capture the whole screen. The overlay view is transparent at some points where I want to capture the image. I am doing it like this:
- (IBAction)captue:(id)sender
{
    [self setBackgroundColor:[UIColor clearColor]];
    UIGraphicsBeginImageContext(self.frame.size);
    [self.layer.presentationLayer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}
It captures the image of the overlay view, but where the camera view should appear (the area where the overlay is transparent and I want the camera view to show through), it captures black instead of the photo. Can anyone please tell me if I am doing something wrong?
I found that screen capture is one way to capture the camera view together with an overlay. However, I didn't get the preview layer in the screen-captured video (in the video-recording case). Look at the MyAVControllerDemo code to get a clearer idea; I used IAScreenCaptureView to capture the video, or a simple snapshot. It is now working properly.
Use the AVFoundation framework to fix your problem.
Referred to this link: http://code4app.net/ios/Camera-With-AVFoundation/5003cb1d6803fa9a2c000000
