I have looked through a lot of documentation and can't find an answer at all.
Every time I rotate my device, my UI rotates (as expected), but the camera preview stays stuck at that rotation.
It's as if the preview's coordinates are still local instead of facing global north.
What else can I look up?
import UIKit
import AVFoundation

class scannerViewController: UIViewController {

    var session: AVCaptureSession!
    var input: AVCaptureInput!
    var previewLayer: AVCaptureVideoPreviewLayer!
    var camera: AVCaptureDevice!

    @IBOutlet weak var cameraPreview: UIView!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Build the session and attach the default video device.
        session = AVCaptureSession()
        camera = AVCaptureDevice.default(for: AVMediaType.video)
        input = try? AVCaptureDeviceInput(device: camera!)
        session.addInput(input)

        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        cameraPreview.layer.addSublayer(previewLayer)

        // Start the session off the main thread, then size the layer back on the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
            DispatchQueue.main.async {
                self.previewLayer.frame = self.cameraPreview.bounds
            }
        }
    }
}
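One concrete thing to look up is AVCaptureConnection.videoOrientation: the preview layer does not follow interface rotation by itself, so you typically update its connection's orientation (and its frame) when the interface rotates. A minimal sketch, assuming the previewLayer and cameraPreview from the code above; statusBarOrientation is used here for brevity:

    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        coordinator.animate(alongsideTransition: { _ in
            // Re-fit the layer to the rotated view.
            self.previewLayer.frame = self.cameraPreview.bounds
            guard let connection = self.previewLayer.connection,
                  connection.isVideoOrientationSupported else { return }
            // Map the new interface orientation onto the preview connection.
            switch UIApplication.shared.statusBarOrientation {
            case .portrait:           connection.videoOrientation = .portrait
            case .portraitUpsideDown: connection.videoOrientation = .portraitUpsideDown
            case .landscapeLeft:      connection.videoOrientation = .landscapeLeft
            case .landscapeRight:     connection.videoOrientation = .landscapeRight
            default:                  connection.videoOrientation = .portrait
            }
        })
    }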
I've been trying to write TrueDepth data as a QuickTime movie. I have examined the example from https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/streaming_depth_data_from_the_truedepth_camera
I understand that it is possible to use AVCaptureMovieFileOutput() to output a QuickTime movie, but I have no idea how to implement this. I have been trying to start with something simple, such as just saving a capture session from the front-facing camera to a QuickTime file. Any help would be appreciated. This is what I have so far:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var previewView: UIView!

    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var videoCaptureDevice: AVCaptureDevice?
    var input: AnyObject?
    var movieOutput = AVCaptureMovieFileOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        videoCaptureDevice = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                     for: .video, position: .front)
        do {
            input = try AVCaptureDeviceInput(device: videoCaptureDevice!)
        } catch {
            print("video device error")
        }

        captureSession = AVCaptureSession()
        captureSession?.addInput(input as! AVCaptureInput)
        captureSession?.addOutput(movieOutput)

        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
        videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoPreviewLayer?.frame = previewView.layer.bounds
        previewView.layer.addSublayer(videoPreviewLayer!)

        captureSession?.startRunning()
    }
}
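To actually get a QuickTime file out of the movieOutput above, you start and stop the recording and receive the result through AVCaptureFileOutputRecordingDelegate. A minimal sketch, assuming the ViewController above; the file name under the temporary directory is an assumption for illustration, and this records the plain video track only (depth data needs the separate plumbing from the linked sample):

    extension ViewController: AVCaptureFileOutputRecordingDelegate {

        func startRecording() {
            // Hypothetical destination; any writable URL works.
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("capture.mov")
            movieOutput.startRecording(to: url, recordingDelegate: self)
        }

        func stopRecording() {
            movieOutput.stopRecording()
        }

        // Called once the movie file has been written out.
        func fileOutput(_ output: AVCaptureFileOutput,
                        didFinishRecordingTo outputFileURL: URL,
                        from connections: [AVCaptureConnection],
                        error: Error?) {
            if let error = error {
                print("recording failed: \(error)")
            } else {
                print("movie saved to \(outputFileURL)")
            }
        }
    }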
I am currently working on a QR scan view in my Swift application.
I want to center the video preview in the middle of my view.
The view (white) is called ScanView, and I want to make the camera preview the same size as the ScanView and center it inside it.
Thanks for any help!
Here is a working solution:
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureMetadataOutputObjectsDelegate {

    @IBOutlet weak var innerView: UIView!

    var session: AVCaptureSession?
    var input: AVCaptureDeviceInput?
    var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        createSession()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // The view has its final size here, so the preview layer can match it.
        self.previewLayer?.frame.size = self.innerView.frame.size
    }

    private func createSession() {
        do {
            self.session = AVCaptureSession()
            if let device = AVCaptureDevice.default(for: AVMediaType.video) {
                self.input = try AVCaptureDeviceInput(device: device)
                self.session?.addInput(self.input!)

                self.previewLayer = AVCaptureVideoPreviewLayer(session: self.session!)
                self.previewLayer?.frame.size = self.innerView.frame.size
                self.previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
                self.innerView.layer.addSublayer(self.previewLayer!)

                //______ 1. solution: raw video frames ______//
                let videoOutput = AVCaptureVideoDataOutput()
                videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
                if self.session?.canAddOutput(videoOutput) == true {
                    self.session?.addOutput(videoOutput)
                }

                //______ 2. solution: QR code metadata ______//
                let metadataOutput = AVCaptureMetadataOutput()
                metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
                if self.session?.canAddOutput(metadataOutput) == true {
                    self.session?.addOutput(metadataOutput)
                    // metadataObjectTypes must be set after the output is added;
                    // explanation here: https://stackoverflow.com/a/35642852/2450755
                    metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]
                }

                self.session?.startRunning()
            }
        } catch {
            print("session setup failed: \(error)")
        }
    }

    // MARK: AVCaptureVideoDataOutputSampleBufferDelegate
    public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
            // awesome stuff here
        }
    }

    // MARK: AVCaptureMetadataOutputObjectsDelegate
    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
        // handle the detected QR codes here
    }
}
Requirements:
- Info.plist: add Privacy - Camera Usage Description
- innerView must be initialized; I did it in the Storyboard with constraints.
I had the same problem as Philip Dz. I finally fixed the issue by moving the setupVideo() function from viewDidLoad to viewDidAppear (the original answer showed before/after screenshots: setupVideo() called in viewDidLoad versus in viewDidAppear).
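In code form, the change looks like this; setupVideo() stands in for the poster's own session/preview setup routine:

    override func viewDidLoad() {
        super.viewDidLoad()
        // setupVideo()   // too early: the view's final bounds are not known yet
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupVideo()      // the view now has its final size, so the preview layer fits
    }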
Perhaps I am chiming in a bit late, but I have just implemented a QR scanner and, depending on the device running it, the video stream can be zoomed. This is achieved via the AVCaptureDevice.videoZoomFactor property. So, to enhance the user experience for a small square QR scanner, the above code can be slightly modified by setting device.videoZoomFactor = min(YOUR_ZOOM_FACTOR_VALUE, device.activeFormat.videoMaxZoomFactor) before self.session?.startRunning().
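A minimal sketch of that insertion, using the device constant from the working solution above; the factor 2.0 is an assumed example value, and videoZoomFactor must be set between lockForConfiguration() and unlockForConfiguration():

    do {
        try device.lockForConfiguration()
        // 2.0 is an assumed example value; clamp it to what the active format supports.
        device.videoZoomFactor = min(2.0, device.activeFormat.videoMaxZoomFactor)
        device.unlockForConfiguration()
    } catch {
        print("could not lock device for zoom configuration: \(error)")
    }
    self.session?.startRunning()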
I'm trying to add a button over a camera preview, but it doesn't show up when I run the program (I do have constraints set). I looked into the code and tried to debug, but I'm new to Swift, Xcode, and debugging in general. I noticed that when I commented out the camera preview layer, the button showed up. Thanks!
import UIKit
import AVFoundation
import QuartzCore

class View1: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    let captureSession = AVCaptureSession()
    var previewLayer: CALayer!
    var captureDevice: AVCaptureDevice!

    @IBOutlet weak var cameraView: UIView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        prepareCamera()
    }

    func prepareCamera() {
        captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
        if let availableDevices = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaTypeVideo, position: .back).devices {
            captureDevice = availableDevices.first
            beginSession()
        }
    }

    func beginSession() {
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
            captureSession.addInput(captureDeviceInput)
        } catch {
            print(error.localizedDescription)
            // Figure out what to do here
        }

        if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
            self.previewLayer = previewLayer
            self.view.layer.addSublayer(self.previewLayer)
            self.previewLayer.frame = self.view.layer.frame
            captureSession.startRunning()

            let dataOutput = AVCaptureVideoDataOutput()
            dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)]
            dataOutput.alwaysDiscardsLateVideoFrames = true
            if captureSession.canAddOutput(dataOutput) {
                captureSession.addOutput(dataOutput)
                captureSession.commitConfiguration()
            }

            let queue = DispatchQueue(label: "com.PhotoAllergy.captureQueue")
            dataOutput.setSampleBufferDelegate(self, queue: queue)
        }
    }
}
Maybe you could try setting the zPosition of the button to 1 or higher: yourButtonName.layer.zPosition = 2. The preview layer is added at the default z position of 0, so it covers the button.
Apple Documentation on zPosition
You just have to add the button as a subview of the view that acts as the preview for your AV camera.
class RecordVC: UIViewController {

    @IBOutlet weak var vwRecordVideo: UIView!
    @IBOutlet weak var btnGallary: UIButton!
    @IBOutlet weak var cameraButton: UIButton!   // not declared in the original snippet; assumed to be another outlet

    override func viewDidLoad() {
        super.viewDidLoad()
        // Re-adding the buttons as subviews of the preview view keeps them above the camera layer.
        vwRecordVideo.addSubview(cameraButton)
        vwRecordVideo.addSubview(btnGallary)
    }
}
I'm making an iPhone app with an AVFoundation camera, but the camera is not scaling properly.
I think I have done a lot to make it the same size: I changed the video gravity to ResizeAspectFill, and I changed previewLayer.frame.size to self.layer.frame.size.
Why isn't my preview layer stretching over the entire view? Is it something I typed wrong, or something I forgot to type? Thanks!
Image: http://imgur.com/O713SoE
Code:
import AVFoundation
import UIKit
import QuartzCore

class View1: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer: CALayer!
    var captureDevice: AVCaptureDevice!

    @IBOutlet weak var photoButton: UIButton!
    @IBOutlet weak var cameraView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        photoButton.layer.zPosition = 1
    }

    @IBAction func photoButtonpressed(_ sender: UIButton) {
        let button = sender as UIButton
        if (button.tag == 1) {
            print("Photobutton clicked")
        }
    }

    func prepareCamera() {
        captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
        if let availableDevices = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                                  mediaType: AVMediaTypeVideo,
                                                                  position: .back).devices {
            captureDevice = availableDevices.first
            beginSession()
        }
    }

    func beginSession() {
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
            captureSession.addInput(captureDeviceInput)
        } catch {
            print(error.localizedDescription)
        }

        if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
            self.previewLayer = previewLayer
            self.view.layer.addSublayer(self.previewLayer)
            self.previewLayer.frame = self.view.layer.frame
            self.previewLayer.bounds = self.view.bounds
            self.previewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill
            captureSession.startRunning()

            let dataOutput = AVCaptureVideoDataOutput()
            dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)]
            dataOutput.alwaysDiscardsLateVideoFrames = true
            if captureSession.canAddOutput(dataOutput) {
                captureSession.addOutput(dataOutput)
                captureSession.commitConfiguration()
            }
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        prepareCamera()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
I copied your code and ran it on iOS 10.1.1, iPhone 6, Xcode 8.2.1, and it works.
How do you load View1? Programmatically? Instantiated from a storyboard? The view of View1 might have a different size than your device screen.
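If the sizes do differ, one way to rule that out (my suggestion, not part of the original answer) is to resize the layer whenever the view lays out:

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Track the view's actual size, however the controller was loaded.
        previewLayer?.frame = view.bounds
    }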
I am trying to show the camera's feed in a UIView. Later I will need to analyze video frames, so as I understand it, I need to do this using AVFoundation.
What I have so far:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var camView: UIView!

    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var videoCaptureDevice: AVCaptureDevice?
    var input: AnyObject?

    override func viewDidLoad() {
        super.viewDidLoad()

        videoCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        do {
            input = try AVCaptureDeviceInput(device: videoCaptureDevice)
        } catch {
            print("video device error")
        }

        captureSession = AVCaptureSession()
        captureSession?.addInput(input as! AVCaptureInput)

        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        videoPreviewLayer?.frame = camView.layer.bounds

        captureSession?.startRunning()
    }
}
The camView is visible, but it doesn't show anything.
The app asked for permission to use the camera on first run, and that permission has been granted.
Setting a breakpoint and inspecting captureSession, videoPreviewLayer, videoCaptureDevice and input confirms they have all been set.
The video preview layer is never added to the camView, which is why you cannot see the camera session running in it.
Add this line:
camView.layer.addSublayer(videoPreviewLayer!)
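Placed in context, the end of viewDidLoad would look roughly like this (the force unwrap matches how the optionals are used elsewhere in the snippet):

    videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    videoPreviewLayer?.frame = camView.layer.bounds
    camView.layer.addSublayer(videoPreviewLayer!)  // without this, the layer is never on screen
    captureSession?.startRunning()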