I am creating a view controller in which I want a somewhat small UIView in the corner of the screen to display the camera preview. I am using a function to do this. However, when I pass the small UIView into the function, the camera preview does not show up. The strange thing is that if I tell the function to display the preview on self.view, everything works fine and I can see the camera preview. For this reason I think the problem is with the way I insert the layer, or something similar.
Here is the function I am using to display the preview...
func displayPreview(on view: UIView) throws {
    guard let captureSession = self.captureSession, captureSession.isRunning else {
        throw CameraControllerError.captureSessionIsMissing
    }
    self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    self.previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
    self.previewLayer?.connection?.videoOrientation = .portrait
    view.layer.insertSublayer(self.previewLayer!, at: 0)
    self.previewLayer?.frame = view.frame
}
I call this function from inside another function which handles setting up the capture session and other similar things.
func configureCameraController() {
    cameraController.prepare { (error) in
        if let error = error {
            print("ERROR")
            print(error)
        }
        print("hello")
        try! self.cameraController.displayPreview(on: self.mirrorView)
    }
}
configureCameraController()
How can I get the camera preview layer to show up on the smaller UIView?
Can you try adding the following
let rootLayer: CALayer = self.yourSmallerView.layer
rootLayer.masksToBounds = true
self.previewLayer.frame = rootLayer.bounds
rootLayer.addSublayer(self.previewLayer)
in place of
view.layer.insertSublayer(self.previewLayer!, at: 0)
Also ensure that yourSmallerView.contentMode = UIViewContentMode.scaleToFill.
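For completeness, here is a sketch of displayPreview with that change folded in (keeping the OP's CameraControllerError and previewLayer property). The underlying issue is that a sublayer's frame must come from the host view's bounds (its own coordinate space), not its frame (the superview's coordinate space); that is why the original code happened to work on self.view, where frame and bounds coincide:
func displayPreview(on view: UIView) throws {
    guard let captureSession = self.captureSession, captureSession.isRunning else {
        throw CameraControllerError.captureSessionIsMissing
    }
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.videoGravity = .resizeAspectFill
    previewLayer.connection?.videoOrientation = .portrait
    previewLayer.frame = view.bounds        // bounds, not frame
    view.layer.masksToBounds = true         // clip the feed to the small view
    view.layer.insertSublayer(previewLayer, at: 0)
    self.previewLayer = previewLayer
}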
I'm trying to get camera preview and photos working in a SwiftUI app. I can display the preview and snap a photo, but the proportions come out wrong: the preview screen looks good, but as soon as the capture happens, the image gets distorted and doesn't stay in portrait.
Some code:
class CameraController: UIViewController {
    //...
    override func viewDidLoad() {
        //...
        // Setup preview layer
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = .resizeAspectFill
        previewLayer?.connection?.videoOrientation = .portrait
        previewLayer?.frame = view.frame
        view.layer.insertSublayer(previewLayer!, at: 0)
        //...
    }
}
But after I snap a photo, the Image I'm displaying in a VStack comes out in landscape and distorted.
The code is:
VStack {
    Image(uiImage: self.inputImage!)
        .resizable()
        .aspectRatio(contentMode: .fit)
        .edgesIgnoringSafeArea(.all)
}
I'm using .fit because I want to see the whole thing, and it's just coming out all wrong. Also, here's my camera setup code happening inside viewDidLoad:
captureSession.beginConfiguration()
// Get device
captureDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)
do {
    // Create input
    let captureDeviceInput = try AVCaptureDeviceInput(device: captureDevice!)
    captureSession.addInput(captureDeviceInput)
    // Create output
    photoOutput = AVCapturePhotoOutput()
    captureSession.sessionPreset = .photo
    photoOutput?.setPreparedPhotoSettingsArray(
        [AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])],
        completionHandler: nil
    )
    captureSession.connections.first?.videoOrientation = .portrait // tested with and without this line
    captureSession.addOutput(photoOutput!)
    captureSession.commitConfiguration()
} catch {
    print(error)
}
Any help much appreciated!
The photo comes out in landscape because an image taken with the in-app camera carries an orientation flag. You need to normalize the image before displaying it.
I made an extension of UIImage for this:
extension UIImage {
    func fixOrientation(img: UIImage) -> UIImage {
        if img.imageOrientation == .up {
            return img
        }
        // Redraw the image so the EXIF orientation is baked into the pixels.
        UIGraphicsBeginImageContextWithOptions(img.size, false, img.scale)
        let rect = CGRect(x: 0, y: 0, width: img.size.width, height: img.size.height)
        img.draw(in: rect)
        let normalizedImage = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return normalizedImage
    }
}
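A hypothetical call site for the extension above (the photoOutput delegate method and the inputImage property are assumptions matching the question's AVCapturePhotoOutput setup, not code from the original post):
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard let data = photo.fileDataRepresentation(), let raw = UIImage(data: data) else { return }
    // Normalize before handing the image to SwiftUI.
    self.inputImage = raw.fixOrientation(img: raw)
}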
Original answer from here:
Normalize Pictures taken by camera
I also made a few changes to turn it into a copy-and-paste solution for the ImagePickerManager:
ImagePickerManager
I am looking to display a UIView subclass within a UIStackView. The subclass is called PLBarcodeScannerView and uses AVCaptureMetadataOutput to detect barcodes within the camera's field of view. Because this view is not the entire screen, I need to set the region of interest to match the frame of the PLBarcodeScannerView: the user only sees a portion of the camera view, and we want to be sure that the barcode visible in that view is the one being scanned.
Issue
I cannot seem to set the metadataOutputRectOfInterest properly, nor does the "zoom level" of the preview layer on this view seem correct, although the aspect ratio is correct. The system does receive barcodes successfully, but they are not always visible within the preview window. Codes are still scanned when they reside outside the visible preview window.
(Screenshot: the colorful photo is the PLBarcodeScannerView. Only codes which are fully visible inside this view should be considered.)
Below is the code that initializes the view:
This is called within the init methods of PLBarcodeScannerView:UIView
func setupView() {
    session = AVCaptureSession()
    let tap = UITapGestureRecognizer(target: self, action: #selector(self.resume))
    addGestureRecognizer(tap)
    // Set the captureDevice.
    let videoCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    // Create input object.
    let videoInput: AVCaptureDeviceInput?
    do {
        videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
    } catch {
        return
    }
    // Add input to the session.
    if (session!.canAddInput(videoInput)) {
        session!.addInput(videoInput)
    } else {
        scanningNotPossible()
    }
    // Create output object.
    let metadataOutput = AVCaptureMetadataOutput()
    // Add output to the session.
    if (session!.canAddOutput(metadataOutput)) {
        session!.addOutput(metadataOutput)
        // Send captured data to the delegate object via a serial queue.
        metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
        // Set barcode type for which to scan: EAN-13.
        metadataOutput.metadataObjectTypes = [
            AVMetadataObjectTypeCode128Code
        ]
    } else {
        scanningNotPossible()
    }
    // Determine the size of the region of interest
    let x = self.frame.origin.x/UIScreen.main.bounds.width
    let y = self.frame.origin.y/UIScreen.main.bounds.height
    let width = self.frame.width/UIScreen.main.bounds.height
    let height = self.frame.height/UIScreen.main.bounds.height
    let scanRectTransformed = CGRect(x: x, y: y, width: 1, height: height)
    metadataOutput.metadataOutputRectOfInterest(for: scanRectTransformed)
    // Add previewLayer and have it show the video data.
    previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = self.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    layer.addSublayer(previewLayer)
    // Begin the capture session.
    session!.startRunning()
}
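For the region of interest, note three issues in the block above: width is divided by the screen height, scanRectTransformed hard-codes width: 1, and the rect returned by metadataOutputRectOfInterest(for:) is discarded instead of being assigned to anything. A sketch of the usual approach (my suggestion, not from the original post): let the preview layer do the coordinate conversion and assign the result to rectOfInterest, after the session is running and the layer has its final size (e.g. in layoutSubviews):
// Convert the layer's visible rect into metadata-output coordinates.
metadataOutput.rectOfInterest =
    previewLayer.metadataOutputRectOfInterest(for: previewLayer.bounds)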
Of course with iOS 10, you now have to add a camera usage description (the NSCameraUsageDescription key) to your Info.plist to use the phone's camera. On first launch, the user is asked for permission, and all is well.
BUT we have a client app that is a "camera app": when you launch the app, it immediately launches the camera, and while the app is running, the camera is running and shown fullscreen. The code to do so is the usual way; see below.
The problem: on the first launch of the app on a phone, the user is asked the question and says yes. But then the camera is just black on the devices we have tried. It does not crash (as it would if you forgot the plist item); it goes black and stays black.
If the user quits the app and launches it again, it's fine; everything works.
What the heck is the workflow for a "camera app"? I can't see a good solution, but there must be one, since the various camera apps out there go straight to the fullscreen camera when you launch them.
class CameraPlane: UIViewController
{
    ...
    func cameraBegin()
    {
        captureSession = AVCaptureSession()
        captureSession!.sessionPreset = AVCaptureSessionPresetPhoto
        let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        var input: AVCaptureDeviceInput!
        do {
            input = try AVCaptureDeviceInput(device: backCamera)
        } catch {
            print("probably on simulator? no camera?")
            return
        }
        if ( captureSession!.canAddInput(input) == false )
        {
            print("capture session problem?")
            return
        }
        captureSession!.addInput(input)
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if ( captureSession!.canAddOutput(stillImageOutput) == false )
        {
            print("capture session with stillImageOutput problem?")
            return
        }
        captureSession!.addOutput(stillImageOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
        // or, AVLayerVideoGravityResizeAspect
        fixConnectionOrientation()
        view.layer.addSublayer(previewLayer!)
        captureSession!.startRunning()
        previewLayer!.frame = view.bounds
    }
}
Note: it's likely OP's code was actually working correctly in terms of the new iOS 10 permission string, and OP had another problem causing the black screen.
From the code you've posted, I can't tell why you experience this kind of behavior. I can only give you the code that is working for me.
This code also runs on iOS 9. Note that I am loading the camera in viewDidAppear to make sure that all constraints are set.
import AVFoundation
class ViewController: UIViewController {
    // The view where the camera feed is shown.
    @IBOutlet weak var cameraView: UIView!

    var captureSession: AVCaptureSession = {
        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSessionPresetPhoto
        return session
    }()
    var sessionOutput = AVCaptureStillImageOutput()
    var previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice]
        guard let backCamera = (devices?.first { $0.position == .back }) else {
            print("The back camera is currently not available")
            return
        }
        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                sessionOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
                if captureSession.canAddOutput(sessionOutput) {
                    captureSession.addOutput(sessionOutput)
                    captureSession.startRunning()
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                    previewLayer.connection.videoOrientation = .portrait
                    cameraView.layer.addSublayer(previewLayer)
                    previewLayer.position = CGPoint(x: cameraView.frame.width / 2, y: cameraView.frame.height / 2)
                    previewLayer.bounds = cameraView.frame
                }
            }
        } catch {
            print("Could not create an AVCaptureDeviceInput: \(error)")
        }
    }
}
If you want the camera to show fullscreen, you can simply use view instead of cameraView. In my camera implementation the camera feed does not cover the entire view; there's still some navigation stuff.
What's happening is that on that first launch, you're activating the camera before the permission check has run and the appropriate UIAlertController has been displayed. What you want to do is wrap this code in a check of the camera permission status (AVAuthorizationStatus), and make sure that if permission has not yet been determined, you request it before displaying the camera. See this question for more help.
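Here is a minimal sketch of that gating, using the Swift 3-era API names that match the rest of this code (cameraBegin() being the OP's setup function):
let status = AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)
switch status {
case .authorized:
    cameraBegin()
case .notDetermined:
    AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { granted in
        // The completion handler may run on an arbitrary queue.
        DispatchQueue.main.async {
            if granted { self.cameraBegin() }
        }
    }
case .denied, .restricted:
    // Show your own explanation UI instead of a black preview.
    break
}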
Is there any way to record the iOS screen programmatically? I mean capture whatever activity you are doing, like tapping buttons or scrolling table views, and even if a video is playing, capture that along with the other activity.
I have tried these:
https://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
https://github.com/alskipp/ASScreenRecorder
but these libraries don't provide good video quality, and I need quality video. The issue is that with a video playing in the background, the screen capture is not smooth: it shows one frame of the video, then the second frame 3-4 seconds later, and so on. The quality of the video is also poor and blurred.
As of iOS 9, it looks like ReplayKit is available to greatly simplify this.
https://developer.apple.com/reference/replaykit
https://code.tutsplus.com/tutorials/ios-9-an-introduction-to-replaykit--cms-25458
Update: This may be less relevant now that iOS 11 has a built-in screen recorder, but the following Swift 3 code worked for me:
import ReplayKit
@IBAction func toggleRecording(_ sender: UIBarButtonItem) {
    let r = RPScreenRecorder.shared()
    guard r.isAvailable else {
        print("ReplayKit unavailable")
        return
    }
    if r.isRecording {
        self.stopRecording(sender, r)
    } else {
        self.startRecording(sender, r)
    }
}

func startRecording(_ sender: UIBarButtonItem, _ r: RPScreenRecorder) {
    r.startRecording(handler: { (error: Error?) -> Void in
        if error == nil { // Recording has started
            sender.title = "Stop"
        } else {
            // Handle error
            print(error?.localizedDescription ?? "Unknown error")
        }
    })
}

func stopRecording(_ sender: UIBarButtonItem, _ r: RPScreenRecorder) {
    r.stopRecording(handler: { previewViewController, error in
        sender.title = "Record"
        if let pvc = previewViewController {
            if UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiom.pad {
                pvc.modalPresentationStyle = UIModalPresentationStyle.popover
                pvc.popoverPresentationController?.sourceRect = CGRect.zero
                pvc.popoverPresentationController?.sourceView = self.view
            }
            pvc.previewControllerDelegate = self
            self.present(pvc, animated: true, completion: nil)
        } else if let error = error {
            print(error.localizedDescription)
        }
    })
}

// MARK: RPPreviewViewControllerDelegate
func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
    previewController.dismiss(animated: true, completion: nil)
}
ReplayKit is available, although you are not allowed to access the resulting video file directly. The only way I've found so far is to take a number of screenshots (stored in an array of images) and then convert those images to a video. This is not very efficient from a performance standpoint, but it might work when you don't really need 30/60 fps screen recording and are OK with 6-20 fps. Here's the full example.
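As a rough sketch of that screenshot-collection idea (the class and names are mine, not from the linked example; turning the frames array into a movie would be a separate AVAssetWriter step):
final class FrameGrabber {
    private(set) var frames: [UIImage] = []
    private var timer: Timer?

    // Grab a snapshot of the view several times per second.
    func start(capturing view: UIView, fps: Double = 10) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / fps, repeats: true) { [weak self] _ in
            UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
            view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
            if let image = UIGraphicsGetImageFromCurrentImageContext() {
                self?.frames.append(image)
            }
            UIGraphicsEndImageContext()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}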
Check out ScreenCaptureView, which has video-recording support built in (see link).
What this does is save the contents of a UIView to a UIImage. The author suggests you can save a video of the app in use by passing the frames through AVCaptureSession.
I believe it hasn't been tested with an OpenGL subview, but assuming it works, you might be able to modify it slightly to include audio, and then you'd be set.
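If you'd rather roll your own, the "UIView to UIImage" part is small enough to sketch directly with UIKit's snapshot API (a generic helper, not ScreenCaptureView's actual code):
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    // Renders the view hierarchy (including subviews) into the current context.
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}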
AVCaptureSession Sample
AVCaptureSession Reference
import UIKit
import AVFoundation

class ViewController: UIViewController {
    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Find the back camera among the available video devices.
        let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice] ?? []
        if let captureDevice = devices.first(where: { $0.position == .back }) {
            do {
                captureSession.addInput(try AVCaptureDeviceInput(device: captureDevice))
            } catch {
                print(error)
                return
            }
            captureSession.sessionPreset = AVCaptureSessionPresetPhoto
            captureSession.startRunning()
            stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
            if captureSession.canAddOutput(stillImageOutput) {
                captureSession.addOutput(stillImageOutput)
            }
            if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
                previewLayer.bounds = view.bounds
                previewLayer.position = CGPoint(x: view.bounds.midX, y: view.bounds.midY)
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                let cameraPreview = UIView(frame: view.bounds)
                cameraPreview.layer.addSublayer(previewLayer)
                cameraPreview.addGestureRecognizer(
                    UITapGestureRecognizer(target: self, action: #selector(ViewController.saveToCamera(sender:))))
                view.addSubview(cameraPreview)
            }
        }
    }

    func saveToCamera(sender: UITapGestureRecognizer) {
        if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo) {
            stillImageOutput.captureStillImageAsynchronously(from: videoConnection) {
                (imageDataSampleBuffer, error) -> Void in
                if let buffer = imageDataSampleBuffer,
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer),
                    let image = UIImage(data: imageData) {
                    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
                }
            }
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
You can use this library to record a view: screen-cap-view, available on GitHub and written in Objective-C.
And to use it in Swift:
--> Drag and drop the .m and .h files into your Xcode project.
--> Make a bridging header and import the file there: #import "IAScreenCaptureView.h"
--> Then assign this class to a view in the Identity Inspector and make an IBOutlet for that view, something like this: @IBOutlet weak var contentView: IAScreenCaptureView!
--> Then finally, simply start and stop recording the view wherever and whenever you want:
For starting the recording: contentView.startRecording()
For stopping the recording: contentView.stopRecording()
Hope this helps. Happy coding.
I am using AVFoundation for the camera.
My live preview looks good. When the user presses the "Button", I create a snapshot on the same screen (like Snapchat).
I am using the following code to capture the image and show it on the screen:
self.stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
    (imageSampleBuffer: CMSampleBuffer!, _) in
    let imageDataJpeg = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
    let pickedImage: UIImage = UIImage(data: imageDataJpeg)!
    self.captureSession.stopRunning()
    self.previewImageView.frame = CGRect(x: 0, y: 0, width: UIScreen.mainScreen().bounds.width, height: UIScreen.mainScreen().bounds.height)
    self.previewImageView.image = pickedImage
    self.previewImageView.layer.zPosition = 100
}
After the user captures an image, the captured photo shows an area (marked in my screenshot) that was not visible on the live preview screen (screenshot 1). I mean the live preview does not show everything, yet I am sure my live preview works well, because I compared it with other camera apps and everything matched my live preview screen. I guess I have a problem with the captured image.
I am creating the live preview with the following code:
override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video.
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera.
            if (device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if captureDevice != nil {
        beginSession()
    }
}
func beginSession() {
    do {
        try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
    } catch {
        print("error: \(error.localizedDescription)")
    }
    captureSession.addOutput(stillOutput)
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.cameraLayer.layer.addSublayer(previewLayer!)
    previewLayer?.frame = self.cameraLayer.frame
    captureSession.startRunning()
}
How can I resolve this problem?
Presumably you are using an AVCaptureVideoPreviewLayer. So it sounds like this layer is incorrectly placed or incorrectly sized, or it has the wrong AVLayerVideoGravity setting. Part of the image is off the screen or cropped; that's why you don't see that part of what the camera sees while you are previewing.
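One common culprit, worth checking against the code above: previewLayer?.frame = self.cameraLayer.frame uses the container's frame where its bounds is needed, and sizes the layer before Auto Layout has finished. A sketch of the safer pattern, assuming the OP's cameraLayer outlet:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Re-fit the preview whenever layout changes; use bounds, not frame.
    previewLayer?.frame = cameraLayer.bounds
}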
OK, I found the solution. I used
captureSession.sessionPreset = AVCaptureSessionPresetHigh
instead of
captureSession.sessionPreset = AVCaptureSessionPresetPhoto
and the problem was fixed. Presumably the photo preset captures the sensor's full 4:3 frame, which covers more of the scene than the 16:9 aspect-fill preview shows, while the high preset matches the preview's video frame.