Errors while building a custom camera App in swift 2.0 - ios

I am building a custom camera application with a custom view for the camera. When I add the following code in viewWillAppear, I get the following error on the line marked with stars:
Binary operator '!=' cannot be applied to operands of type 'Bool' and 'NilLiteralConvertible'
Any help is greatly appreciated.
captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
var error: NSError?
var input: AVCaptureDeviceInput!
do {
    input = try AVCaptureDeviceInput(device: backCamera)
} catch let error1 as NSError {
    error = error1
    input = nil
}
if error == nil && captureSession.canAddInput(input) != nil { // *********
    captureSession.addInput(input)
    stillImageOutPut = AVCaptureStillImageOutput()
    stillImageOutPut.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if (captureSession.canAddOutput(stillImageOutPut) != nil) {
        captureSession.addOutput(stillImageOutPut)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspect
        previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
        cameraView.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }
}

captureSession.canAddInput(input) returns a Bool, so there is no need to check != nil. Your if statement can become:
if error == nil && captureSession.canAddInput(input) {
Also, you declared input as AVCaptureDeviceInput!. The ! marks it as an implicitly unwrapped optional, meaning it is assumed never to be nil, yet in the catch block you assign nil to it. Any later access will then crash. You should declare input as AVCaptureDeviceInput? and unwrap it where necessary.
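A minimal sketch of that change, using the question's own identifiers (Swift 2 syntax; assumes backCamera and captureSession are set up as above):

```swift
var input: AVCaptureDeviceInput?
do {
    input = try AVCaptureDeviceInput(device: backCamera)
} catch let error as NSError {
    print("Could not create device input: \(error.localizedDescription)")
}
// Unwrap before use; the error check and the canAddInput check
// collapse into a single `if let ... where` (Swift 2).
if let input = input where captureSession.canAddInput(input) {
    captureSession.addInput(input)
}
```

This removes the need for the separate error variable entirely: if input is non-nil, no error occurred.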

Related

Custom camera view with UIImage as new overlay

I want to make a custom cameraView overlay. I want to use the overlay, which is an image, as a template, but the rect of the clear space changes depending on the phone.
Template:
I tried to create a view as a container behind the image, but the captured image includes the part that I don't want.
self.session = AVCaptureSession()
self.session!.sessionPreset = AVCaptureSession.Preset.photo
let backCamera = AVCaptureDevice.default(for: AVMediaType.video)
var error: NSError?
var input: AVCaptureDeviceInput!
do {
    input = try AVCaptureDeviceInput(device: backCamera!)
} catch let error1 as NSError {
    error = error1
    input = nil
    print(error!.localizedDescription)
}
if error == nil && session!.canAddInput(input) {
    session!.addInput(input)
    stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if session!.canAddOutput(stillImageOutput!) {
        session!.addOutput(stillImageOutput!)
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.session!)
        videoPreviewLayer!.videoGravity = AVLayerVideoGravity.resizeAspect
        videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
        previewView.layer.addSublayer(videoPreviewLayer!)
        session!.startRunning()
    }
}
// in viewDidLoad
videoPreviewLayer!.frame = previewView.bounds // previewView is a UIView behind the image
The expected result is that the white space would be the custom camera view. Also, it seems that AVCaptureStillImageOutput was deprecated in iOS 10.0.
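As noted, AVCaptureStillImageOutput is deprecated since iOS 10; a rough sketch of the AVCapturePhotoOutput replacement might look like the following (the session setup and delegate conformance are assumed from the question's context, so this is an outline rather than a drop-in fix):

```swift
import AVFoundation

// Replaces the deprecated AVCaptureStillImageOutput.
let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}

// Capturing: the delegate receives the photo asynchronously, so `self`
// must conform to AVCapturePhotoCaptureDelegate.
let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
photoOutput.capturePhoto(with: settings, delegate: self)
```

Cropping the captured photo to the clear rect of the overlay would still need to be done separately, since the session always captures the full sensor frame.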

AVCaptureDeviceInput initialization exception

When I try to create an instance of AVCaptureDeviceInput I get the following error.
What I have done so far:
1) I have granted permissions for camera and microphone use.
2) I have tested the code on an iPhone 7 and an iPhone 5s.
3) When the error is displayed I printed the value of session.isRunning, and it returns true.
4) All the properties are strong references.
Error Domain=AVFoundationErrorDomain Code=-11814 "Cannot Record" UserInfo={NSLocalizedDescription=Cannot Record, NSLocalizedRecoverySuggestion=Try recording again.
This is the code:
let session = AVCaptureSession()
self.session = session
session.sessionPreset = AVCaptureSessionPresetPhoto
do {
    let input = try AVCaptureDeviceInput(device: device)
    session.addInput(input)
    if session.canAddInput(input) {
        let stillImageOutput = AVCapturePhotoOutput()
        self.stillImageOutput = stillImageOutput
        let settings = AVCapturePhotoSettings()
        stillImageOutput.capturePhoto(with: settings, delegate: self)
        if session.canAddOutput(stillImageOutput) {
            session.addOutput(stillImageOutput)
            if let previewLayer = AVCaptureVideoPreviewLayer(session: session) {
                self.previewLayer = previewLayer
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspect
                previewLayer.connection!.videoOrientation = AVCaptureVideoOrientation.portrait
                viewController.centerView.layer.insertSublayer(previewLayer, at: 0)
                session.startRunning()
            }
        }
    }
} catch {
    print(error.localizedDescription)
}
There are several issues in your code, mostly around the order of operations. An updated version with comments would look like this:
let session = AVCaptureSession()
self.session = session
// "This method is used to start the flow of data from the inputs to the outputs
// connected to the AVCaptureSession instance that is the receiver."
// session.startRunning() // Don't call startRunning until everything is configured
session.sessionPreset = AVCaptureSessionPresetPhoto
do {
    let input = try AVCaptureDeviceInput(device: device)
    // session.addInput(input) // This has to come after you check `canAddInput`
    if session.canAddInput(input) {
        session.addInput(input)
        let stillImageOutput = AVCapturePhotoOutput()
        self.stillImageOutput = stillImageOutput
        let settings = AVCapturePhotoSettings()
        // stillImageOutput.capturePhoto(with: settings, delegate: self) // Call this after adding it as an output to the `session`
        if session.canAddOutput(stillImageOutput) {
            session.addOutput(stillImageOutput)
            if let previewLayer = AVCaptureVideoPreviewLayer(session: session) {
                self.previewLayer = previewLayer
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspect
                previewLayer.connection!.videoOrientation = AVCaptureVideoOrientation.portrait
                viewController.centerView.layer.insertSublayer(previewLayer, at: 0)
                session.startRunning()
            }
            stillImageOutput.capturePhoto(with: settings, delegate: self)
        }
    }
} catch {
    print(error.localizedDescription)
}

No error printed however function doesn't run

I am attempting to make a camera view appear. As you can see in my code below, I have set it up to display any errors without breaking the program, yet when I run this code no error occurs and no camera view is displayed. I am running it on an actual phone, and the phone did request permission to use the camera. Below is the code:
override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    captureSession = AVCaptureSession()
    captureSession?.sessionPreset = AVCaptureSessionPreset1920x1080
    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    var input: AVCaptureDeviceInput?
    let error: NSError?
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error as NSError? {
        print(error)
        if error == nil && (captureSession?.canAddInput(input))! {
            captureSession?.addInput(input)
            videoOutput = AVCaptureVideoDataOutput()
            //videoOutput?.outputSettings = [AVVideoCodecKey : AVVideoCodecKey]
            if ((captureSession?.canAddOutput(videoOutput)) != nil) {
                captureSession?.addOutput(videoOutput)
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
                cameraView.layer.addSublayer(previewLayer!)
                captureSession?.startRunning()
            }
        }
    }
}
The body of a catch block is executed only if an error has occurred. Since your code is inside the catch block, it never executes when no error is thrown.
So remove the code from the catch block and put it outside, as suggested by @penatheboss.
Don't put code in the catch block; that only runs if something goes wrong.
do {
    input = try AVCaptureDeviceInput(device: backCamera)
} catch let error as NSError? {
    print(error)
    return // Stop the rest of the code
}
if (captureSession?.canAddInput(input))! {
    captureSession?.addInput(input)
    videoOutput = AVCaptureVideoDataOutput()
    //videoOutput?.outputSettings = [AVVideoCodecKey : AVVideoCodecKey]
    if ((captureSession?.canAddOutput(videoOutput)) != nil) {
        captureSession?.addOutput(videoOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
        previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
        cameraView.layer.addSublayer(previewLayer!)
        captureSession?.startRunning()
    }
}

Error: 'Call can throw but is not marked with try and error not handled'

I received the error stated above and have tried to fix it by adding a do/catch block, but for some reason the error won't go away. Does anyone know why this might be?
override func viewDidAppear(animated: Bool) {
    super.viewWillAppear(animated)
    captureSession = AVCaptureSession()
    captureSession?.sessionPreset = AVCaptureSessionPreset1920x1080
    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    do {
        let input = AVCaptureDeviceInput(device: backCamera)
        captureSession?.addInput(input)
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput?.outputSettings = [AVVideoCodecKey : AVVideoCodecJPEG]
        if (captureSession?.canAddOutput(stillImageOutput) != nil) {
            captureSession?.addOutput(stillImageOutput)
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
            previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
            oview.layer.addSublayer(previewLayer!)
            captureSession?.startRunning()
        }
    } catch {
    }
}
The clue is in the error description: 'Call can throw but is not marked with try and error not handled'.
You have not marked the call that can throw with a try.
I don't know which call you have in there that throws, but find the one that does and put try in front of it. If you are assigning a value, the try needs to go on the right side of the =.
EDIT
Just looked in the docs, looks like it's your
let input = AVCaptureDeviceInput(device: backCamera)
statement that can throw. Put a try after the = like this
let input = try AVCaptureDeviceInput(device: backCamera)
Then you can print(error) inside your catch block to see any potential errors.
You are writing Swift code, not Java or C++ code. Error handling works differently: you need to use try on the single call that can throw, not around a large block of code.
I recommend you download the free Swift 2 book and learn how error handling works in Swift 2. The similarity with other languages is only superficial.
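A minimal sketch of the pattern the answers describe, using the question's identifiers (Swift 2 syntax):

```swift
do {
    // `try` marks the one call that can throw; the assignment keeps its result.
    let input = try AVCaptureDeviceInput(device: backCamera)
    captureSession?.addInput(input)
} catch {
    // An `error` constant is implicitly available inside a bare `catch`.
    print(error)
}
```

Only the throwing call needs the try keyword; the rest of the block runs normally once it succeeds.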

iOS : How to activate image stabilization on an AVCaptureSession?

I'd like to capture stabilized images in my app, but I haven't found the configuration required to achieve it.
This is my code :
let frontCamera = cameraWithPosition(AVCaptureDevicePosition.Front)
let captureSession = AVCaptureSession()
if captureSession.canSetSessionPreset(AVCaptureSessionPresetPhoto) {
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    print("Session preset has been set ")
} else {
    print("Session preset couldn't be set ")
}
var error: NSError?
var input: AVCaptureDeviceInput!
do {
    input = try AVCaptureDeviceInput(device: frontCamera)
} catch let error1 as NSError {
    error = error1
    input = nil
}
if error == nil && captureSession!.canAddInput(input) {
    captureSession.addInput(input)
    let stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if stillImageOutput.stillImageStabilizationSupported {
        stillImageOutput.automaticallyEnablesStillImageStabilizationWhenAvailable = true
        print("Stabilization supported ")
    } else {
        print("Stabilization is not supported ")
    }
}
So the session preset is set correctly, but image stabilization is not supported.
What can I do to support image stabilization?
2nd attempt, after Rhythmic Fistman's response:
I switched to the back camera and added the output to the captureSession before querying its capabilities, but I still don't get a stabilized image:
let backCamera = cameraWithPosition(AVCaptureDevicePosition.Back)
let captureSession = AVCaptureSession()
if captureSession.canSetSessionPreset(AVCaptureSessionPresetPhoto) {
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    print("Session preset has been set ")
} else {
    print("Session preset couldn't be set ")
}
var error: NSError?
var input: AVCaptureDeviceInput!
do {
    input = try AVCaptureDeviceInput(device: backCamera)
} catch let error1 as NSError {
    error = error1
    input = nil
}
if error == nil && captureSession.canAddInput(input) {
    captureSession.addInput(input)
    let stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if captureSession.canAddOutput(stillImageOutput) {
        captureSession.addOutput(stillImageOutput)
        if stillImageOutput.stillImageStabilizationSupported == true {
            stillImageOutput.automaticallyEnablesStillImageStabilizationWhenAvailable = true
            print("Stabilization supported ")
        } else {
            print("Stabilization is not supported ")
        }
        if stillImageOutput.stillImageStabilizationActive == true {
            print("Stabilization is active ")
        } else {
            print("Stabilization is not active ")
        }
    }
}
The result is :
Stabilization is not supported
Stabilization is not active
Firstly, you've forgotten to add your AVCaptureStillImageOutput to the AVCaptureSession. You must do that before querying its capabilities!
captureSession.addOutput(stillImageOutput)
Secondly, neither Digital nor Optical Image Stabilisation are supported on the front camera.
Thirdly, on the back camera, on supported platforms (digital stabilization appears to be available from the 5S up), AVCaptureStillImageOutput's automaticallyEnablesStillImageStabilizationWhenAvailable defaults to YES, so if you switch to the back camera you will already be using some form of image stabilisation.
NB: Optical Image Stabilisation is only available on the 6+ and 6S+ (although the linked technote has not been updated for the 6S models yet).
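Putting the answer's points together, a sketch of the intended order of operations (back camera, Swift 2 era API; cameraWithPosition is the question's own helper):

```swift
let backCamera = cameraWithPosition(AVCaptureDevicePosition.Back)
let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetPhoto

let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

// Add the output to the session BEFORE querying its capabilities;
// stillImageStabilizationSupported reports false for an unattached output.
if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
}
if stillImageOutput.stillImageStabilizationSupported {
    // Defaults to true on supported devices; shown explicitly here.
    stillImageOutput.automaticallyEnablesStillImageStabilizationWhenAvailable = true
}
```

Note that stillImageStabilizationActive only becomes meaningful during an actual capture, so checking it immediately after configuration can still report inactive.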
