Of course with iOS 10, you now have to add the NSCameraUsageDescription key to your Info.plist to use the phone's camera. On first launch, the user is shown the permission alert with your usage string, taps OK, and all is well.
BUT we have a client app that is a "camera app": when you launch the app, it immediately brings up the camera, and for as long as the app is running the camera is running and shown fullscreen. The code to do so is the usual approach; see below.
The problem: on the very first launch on a phone, the user is asked the permission question and says yes. But then the camera is just black on every device we have tried. It does not crash (as it would if you forgot the plist item); it simply goes black and stays black.
If the user quits the app and launches it again - it's fine, everything works.
What the heck is the workflow for a "camera app"? I can't see a good solution, but there must be one for the various camera apps out there - which immediately go to fullscreen camera when you launch the app.
class CameraPlane: UIViewController {
    ...
    func cameraBegin() {
        captureSession = AVCaptureSession()
        captureSession!.sessionPreset = AVCaptureSessionPresetPhoto

        let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        var error: NSError?
        var input: AVCaptureDeviceInput!
        do {
            input = try AVCaptureDeviceInput(device: backCamera)
        } catch let error1 as NSError {
            error = error1
            input = nil
        }

        if error != nil {
            print("probably on simulator? no camera?")
            return
        }

        if captureSession!.canAddInput(input) == false {
            print("capture session problem?")
            return
        }
        captureSession!.addInput(input)

        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

        if captureSession!.canAddOutput(stillImageOutput) == false {
            print("capture session with stillImageOutput problem?")
            return
        }
        captureSession!.addOutput(stillImageOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
        // or, AVLayerVideoGravityResizeAspect
        fixConnectionOrientation()
        view.layer.addSublayer(previewLayer!)
        captureSession!.startRunning()
        previewLayer!.frame = view.bounds
    }
Note: it's likely the OP's code was actually working correctly with respect to the new iOS 10 permission string, and the OP had another problem causing the black screen.
From the code you've posted, I can't tell why you experience this kind of behavior. I can only give you the code that is working for me.
This code also runs on iOS 9. Note that I am loading the camera in viewDidAppear to make sure that all constraints are set.
import AVFoundation

class ViewController: UIViewController {

    // the view where the camera feed is shown
    @IBOutlet weak var cameraView: UIView!

    var captureSession: AVCaptureSession = {
        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSessionPresetPhoto
        return session
    }()

    var sessionOutput = AVCaptureStillImageOutput()
    var previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice]
        guard let backCamera = (devices?.first { $0.position == .back }) else {
            print("The back camera is currently not available")
            return
        }

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                sessionOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

                if captureSession.canAddOutput(sessionOutput) {
                    captureSession.addOutput(sessionOutput)
                    captureSession.startRunning()

                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                    previewLayer.connection.videoOrientation = .portrait
                    cameraView.layer.addSublayer(previewLayer)
                    previewLayer.position = CGPoint(x: cameraView.frame.width / 2, y: cameraView.frame.height / 2)
                    previewLayer.bounds = cameraView.frame
                }
            }
        } catch {
            print("Could not create an AVCaptureDeviceInput")
        }
    }
}
If you want the camera to show fullscreen you can simply use view instead of cameraView. In my camera implementation the camera feed does not cover the entire view, there's still some navigation stuff.
What's happening is that on that first launch, you're activating the camera before the permission check has completed and the alert has been answered. What you want to do is wrap this code in a check of the camera permission status (AVAuthorizationStatus), and make sure that if permission has not been determined yet, you ask for it before displaying the camera. See this question for more help.
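For example, something along these lines (a minimal sketch against the Swift 3 era API; cameraBegin() stands in for the setup method from the question):

    func startCameraWhenAuthorized() {
        switch AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo) {
        case .authorized:
            // Already granted on a previous launch: safe to start immediately.
            cameraBegin()
        case .notDetermined:
            // First launch: only start the session after the user has
            // answered the alert, and hop back to the main queue to do it.
            AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { granted in
                DispatchQueue.main.async {
                    if granted { self.cameraBegin() }
                }
            }
        case .denied, .restricted:
            // No point starting the session; consider pointing the user at Settings.
            print("camera access denied or restricted")
        }
    }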
Related
I am creating a ViewController in which I want a somewhat small UIView in the corner to display the camera preview. I am using a function to do this. However, when I pass the small UIView into the function, the camera preview does not show up. The weird thing is that if I tell the function to display the preview on self.view, everything works fine and I can see the camera preview. For this reason I think the problem is with the way I insert the layer, or something similar.
Here is the function I am using to display the preview...
func displayPreview(on view: UIView) throws {
    guard let captureSession = self.captureSession, captureSession.isRunning else {
        throw CameraControllerError.captureSessionIsMissing
    }

    self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    self.previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
    self.previewLayer?.connection?.videoOrientation = .portrait

    view.layer.insertSublayer(self.previewLayer!, at: 0)
    self.previewLayer?.frame = view.frame
}
I call this function from inside another function which handles setting up the capture session and other similar things.
func configureCameraController() {
    cameraController.prepare { (error) in
        if let error = error {
            print("ERROR")
            print(error)
        }
        print("hello")
        try! self.cameraController.displayPreview(on: self.mirrorView)
    }
}
configureCameraController()
How can I get the camera preview layer to show up on the smaller UIView?
Can you try adding the following
let rootLayer: CALayer = self.yourSmallerView.layer
rootLayer.masksToBounds = true
self.previewLayer.frame = rootLayer.bounds
rootLayer.addSublayer(self.previewLayer)
in place of
view.layer.insertSublayer(self.previewLayer!, at: 0)
Also ensure that yourSmallerView.contentMode = UIViewContentMode.scaleToFill. Note the use of rootLayer.bounds rather than the view's frame: a sublayer's frame is expressed in its superlayer's coordinate space, so sizing the preview layer from the small view's frame (its position within the parent) rather than its bounds can push the preview off-screen.
Here's the code I'm working with:
func toggleflash(On on: Bool) {
    guard let session = captureSession where session.running == true else {
        return
    }

    session.beginConfiguration()
    let input = session.inputs[0] as! AVCaptureDeviceInput
    if input.device.isFlashModeSupported(.On) {
        do {
            try input.device.lockForConfiguration()
            input.device.flashMode = .On
            input.device.unlockForConfiguration()
        } catch {}
    }
    session.commitConfiguration()
}
When I use this function to turn on flash for my front camera, the iPhone screen does everything you'd expect: it flashes. But it doesn't illuminate the subject the way the iPhone camera app, Snapchat, or Instagram do. So I wonder whether it's not bright enough or whether it's not working correctly.
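For what it's worth, the front camera on iPhones of that era has no physical flash at all; apps like Snapchat simulate one by maxing out the screen brightness and covering the screen with white while the still is taken. A rough sketch of that technique (the helper below is hypothetical, not from the question; Swift 2 style to match):

    // Hypothetical screen-flash helper: brighten the display and cover the
    // preview with white before capturing. Call the returned closure once
    // captureStillImageAsynchronously completes, to restore the screen.
    func beginScreenFlash(on view: UIView) -> (() -> Void) {
        let oldBrightness = UIScreen.mainScreen().brightness
        UIScreen.mainScreen().brightness = 1.0

        let overlay = UIView(frame: view.bounds)
        overlay.backgroundColor = UIColor.whiteColor()
        view.addSubview(overlay)

        return {
            overlay.removeFromSuperview()
            UIScreen.mainScreen().brightness = oldBrightness
        }
    }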
I am using AVFoundation for the camera.
This is my live preview:
It looks good. When the user presses the button, I create a snapshot on the same screen (like Snapchat).
I am using the following code to capture the image and show it on the screen:
self.stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
    (imageSampleBuffer: CMSampleBuffer!, _) in

    let imageDataJpeg = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
    let pickedImage: UIImage = UIImage(data: imageDataJpeg)!

    self.captureSession.stopRunning()
    self.previewImageView.frame = CGRect(x: 0, y: 0, width: UIScreen.mainScreen().bounds.width, height: UIScreen.mainScreen().bounds.height)
    self.previewImageView.image = pickedImage
    self.previewImageView.layer.zPosition = 100
}
After the user captures an image, the screen looks like this:
Please look at the marked area: it was not visible on the live preview screen (screenshot 1). So the live preview is not showing everything, yet I am sure my live preview works correctly, because I compared it with other camera apps and everything matched my live preview screen. I guess the problem is with the captured image.
I am creating live preview with following code:
override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)

    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera
            if (device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }

    if captureDevice != nil {
        beginSession()
    }
}
func beginSession() {
    do {
        try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
    } catch {
        print("error: \(error)")
    }

    captureSession.addOutput(stillOutput)

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.cameraLayer.layer.addSublayer(previewLayer!)
    previewLayer?.frame = self.cameraLayer.frame
    captureSession.startRunning()
}
My cameraLayer:
How can I resolve this problem?
Presumably you are using an AVCaptureVideoPreviewLayer. It sounds like this layer is incorrectly placed, incorrectly sized, or has the wrong AVLayerVideoGravity setting: part of the image is off the screen or cropped, which is why you don't see that part of what the camera sees while you are previewing.
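If the sizing is the issue, one common fix (a sketch, assuming the layer is hosted in cameraLayer as in the question) is to set the layer's frame from the host view's bounds whenever layout changes:

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // bounds, not frame: a sublayer's frame is expressed in its
        // superlayer's own coordinate space.
        previewLayer?.frame = cameraLayer.bounds
    }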
OK, I found the solution. I used

captureSession.sessionPreset = AVCaptureSessionPresetHigh

instead of

captureSession.sessionPreset = AVCaptureSessionPresetPhoto

and the problem was fixed. This makes sense: the photo preset captures 4:3 stills while the aspect-fill preview crops the feed to the screen, so the saved image contained regions that were never visible in the preview; the high preset's 16:9 output matches the preview far more closely.
Hello, I had working code with AVCaptureVideoPreviewLayer; it's part of a barcode reader:
let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

do {
    let input = try AVCaptureDeviceInput(device: captureDevice) as AVCaptureDeviceInput
    session.addInput(input)
    print("input done..")
} catch let error as NSError {
    print(error)
}

let output = AVCaptureMetadataOutput()
output.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
session.addOutput(output)
output.metadataObjectTypes = output.availableMetadataObjectTypes

previewLayer = AVCaptureVideoPreviewLayer(session: session) as AVCaptureVideoPreviewLayer
previewLayer.frame = self.view.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
self.view.layer.addSublayer(previewLayer)

self.view.bringSubviewToFront(self.highlightView)
session.startRunning()
It was starting and running; there wasn't any error message, and when I used it on my old iPhone I could see the picture from the camera. But two days ago my iPhone was replaced, and I didn't change anything in the code. Now the app starts, but I can see only a black screen.
Does anybody know what could cause this?
Thank you!
This is the code I used to test this out:
override func viewDidLoad() {
    super.viewDidLoad()

    var session = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPresetMedium

    let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    let input = AVCaptureDeviceInput(device: captureDevice, error: nil) as AVCaptureDeviceInput
    session.addInput(input)
    print("input done..")

    var previewLayer = AVCaptureVideoPreviewLayer(session: session) as AVCaptureVideoPreviewLayer
    previewLayer.frame = self.view.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.view.layer.addSublayer(previewLayer)

    session.startRunning()
}
It appears that the problem was that you did not pass in the NSError required by the AVCaptureDeviceInput initializer, and Xcode was not sure what the try statement was trying to do either. I've simply passed nil into the constructor for now, but you will want to make sure that you handle it. I also removed the part that sets up the output, as it was not relevant to my testing; feel free to add that back in.
I tested this on my iPhone 4 running iOS 7.1, for reference.
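If you do want to handle that error rather than passing nil, a minimal sketch in the same pre-Swift 2 style (this fragment replaces the input lines inside viewDidLoad above):

    var error: NSError?
    let input = AVCaptureDeviceInput(device: captureDevice, error: &error) as AVCaptureDeviceInput
    if error != nil {
        // Initialization failed (no device, no permission, etc.); bail out.
        println("could not open the camera: \(error!.localizedDescription)")
        return
    }
    session.addInput(input)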
From iOS 8 you need to ask for permission; you can do something like this:
if ([AVCaptureDevice respondsToSelector:@selector(requestAccessForMediaType:completionHandler:)]) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (granted) {
            // Got the permission
        } else {
            // Permission has been denied.
        }
    }];
} else {
    // Running on iOS 6 or earlier, where this selector does not exist
}
You can execute the permission method in Swift:
if AVCaptureDevice.respondsToSelector(Selector("requestAccessForMediaType:completionHandler:")) {
    AVCaptureDevice.requestAccessForMediaType(AVMediaTypeVideo, completionHandler: { (granted: Bool) -> Void in
        // Check if granted; if true, do the camera work, otherwise permission was denied
    })
} else {
    // Running on iOS 6 or earlier
}
I've managed to find some code that gives me access to the devices of a phone (such as the camera). The issue is that when I run the code from Xcode (and print the different devices), I get an empty array.
Here is what I wrote:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer: AVCaptureVideoPreviewLayer?

    // If we find a device we'll store it here for later use
    var captureDevice: AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()
        println(devices)

        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if (device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        println("Capture device found")
                        beginSession()
                    }
                }
            }
        }
    }

    func beginSession() {
        var err: NSError? = nil
        captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.view.layer.addSublayer(previewLayer)
        previewLayer?.frame = self.view.layer.frame
        captureSession.startRunning()
    }
}
Do you have any ideas as to why I am getting an empty array?
If you're only running it in the simulator, the array will always be empty because it has no physical hardware to choose from. In fact, if you try to access physical hardware inside the Simulator, it will crash. If you plug a device in and still get an empty array, let me know.
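If you want to make that explicit in code, one option (a sketch; before Swift 4.1 there is no targetEnvironment check, so older projects key off the Simulator's architectures instead):

    override func viewDidLoad() {
        super.viewDidLoad()
        #if arch(i386) || arch(x86_64)
            // Simulator build: no capture hardware, so don't touch AVCaptureDevice.
            println("Running in the Simulator: no capture devices available")
        #else
            captureSession.sessionPreset = AVCaptureSessionPresetHigh
            let devices = AVCaptureDevice.devices()
            println(devices)
            // ... device loop and beginSession() as in the question ...
        #endif
    }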
First, check the current status of the authorization:

AVCaptureDevice.authorizationStatusForMediaType(AVMediaTypeVideo)

For more detail, you can read this article.
For Swift 4, check the current authorization status with:
AVCaptureDevice.authorizationStatus(for: AVMediaType.video)
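A fuller Swift 4 sketch that acts on each case (startCamera() and showSettingsPrompt() are hypothetical placeholders for your own code):

    import AVFoundation

    func startCamera() { /* placeholder: configure and start your session */ }
    func showSettingsPrompt() { /* placeholder: direct the user to Settings */ }

    func checkCameraAuthorization() {
        switch AVCaptureDevice.authorizationStatus(for: AVMediaType.video) {
        case .authorized:
            startCamera()
        case .notDetermined:
            // Ask, then continue on the main queue once the user has answered.
            AVCaptureDevice.requestAccess(for: AVMediaType.video) { granted in
                DispatchQueue.main.async {
                    if granted { startCamera() } else { showSettingsPrompt() }
                }
            }
        case .denied, .restricted:
            showSettingsPrompt()
        }
    }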