Device torch turns off when starting AVCaptureSession - iOS

I'm using AVCaptureSession to capture video.
I want to keep the torch on during the whole session, but once the session starts, the light automatically turns off.
There are a lot of posts here showing how to turn on the torch. They work, unless the capture session has been started.
Here's how I start the session:
guard let camera = AVCaptureDevice.default(for: .video) else { return }
self.captureSession.beginConfiguration()
let deviceInput = try AVCaptureDeviceInput(device: camera)
self.captureSession.addInput(deviceInput)
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "com.axelife.axcapturer.samplebufferdelegate"))
self.captureSession.addOutput(videoOutput)
try camera.setLight(on: true)
self.captureSession.commitConfiguration()
DispatchQueue(label: "capturesession").async {
    self.captureSession.startRunning()
}
And here is my code to turn on the torch:
extension AVCaptureDevice {
    func setLight(on: Bool) throws {
        try self.lockForConfiguration()
        if on {
            try self.setTorchModeOn(level: 1)
        } else {
            self.torchMode = .off
        }
        self.unlockForConfiguration()
    }
}
With that code, the light turns on for less than 0.5 seconds, then turns back off automatically.

OK, I figured it out.
The torch simply has to be turned on after the session starts.
So instead of:
try camera.setLight(on: true)
self.captureSession.commitConfiguration()
DispatchQueue(label: "capturesession").async {
    self.captureSession.startRunning()
}
just do:
self.captureSession.commitConfiguration()
DispatchQueue(label: "capturesession").async {
    self.captureSession.startRunning()
    try? camera.setLight(on: true) // a bare `try` isn't allowed in this non-throwing closure
}

For me, the torch turns off when I change the videoOrientation of the capture preview layer. I turn it on again after that happens. (The DispatchQueue.main.async is important; for some reason it doesn't work if I leave it out.)
previewConnection.videoOrientation = orientation

do {
    try captureDevice.lockForConfiguration()
} catch {
    print(error)
}

DispatchQueue.main.async { [weak self] in
    self?.captureDevice.torchMode = .on
    self?.captureDevice.unlockForConfiguration()
}

Related

Set resolution for camera in iOS for AI

I am working on a project where I start the camera and then capture images live for image processing. For this project I have been following TensorFlow - Image Classification. One thing I can't figure out, despite looking in the documentation and searching on Google, is how to set the resolution of the camera.
Is it possible to either get it or set it programmatically? Here is the code where I create the camera:
private func addVideoDeviceInput() -> Bool {
    /** Tries to get the default back camera. */
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
        return false
    }
    do {
        let videoDeviceInput = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(videoDeviceInput) {
            session.addInput(videoDeviceInput)
            return true
        } else {
            return false
        }
    } catch {
        fatalError("Cannot create video device input")
    }
}
Yes, check out sessionPreset.
session.sessionPreset = .photo /// here!
if session.canAddInput(videoDeviceInput) {
    session.addInput(videoDeviceInput)
    return true
}
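If you need a specific resolution rather than a quality preset, the fixed-size presets plus the device's activeFormat can help. A minimal sketch (assuming `session` and `camera` are the session and device created above, not names from the question):

import AVFoundation
import CoreMedia

func configureResolution(session: AVCaptureSession, camera: AVCaptureDevice) {
    session.beginConfiguration()
    // Presets such as .vga640x480, .hd1280x720 and .hd1920x1080 map to fixed sizes,
    // while .photo picks the highest-quality still-image format.
    if session.canSetSessionPreset(.hd1280x720) {
        session.sessionPreset = .hd1280x720
    }
    session.commitConfiguration()

    // Read back the resolution the device actually ended up with.
    let dimensions = CMVideoFormatDescriptionGetDimensions(camera.activeFormat.formatDescription)
    print("Capturing at \(dimensions.width)x\(dimensions.height)")
}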

How to switch camera while video recording? (e.g. Snapchat, Facebook, etc.)

I'm trying to switch cameras while recording video, like the 'Snapchat' and 'Facebook' behavior. Switching cameras works fine before recording starts, but it needs to be handled separately while recording.
Any help is appreciated.
public func switchCamera() {
    guard isVideoRecording != true else {
        //TODO: Handle switch camera when recording in here
        return
    }
    guard session.isRunning == true else {
        return
    }
    switch currentCamera {
    case .front:
        currentCamera = .rear
    case .rear:
        currentCamera = .front
    }
    session.stopRunning()
    sessionQueue.async { [unowned self] in
        // remove and re-add inputs and outputs
        for input in self.session.inputs {
            self.session.removeInput(input as! AVCaptureInput)
        }
        // add new input
        self.addInputs()
        self.session.startRunning()
    }
}
You can try to write your frames with AVAssetWriter. Here you can find an example.
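The linked example is not reproduced here, but a rough sketch of that approach might look like the following (all names are illustrative, not from the question): frames delivered to the AVCaptureVideoDataOutput delegate are appended to an AVAssetWriter, so the output file keeps growing even while the session's inputs are swapped.

import AVFoundation

final class FrameWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private var sessionStarted = false

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])
        input.expectsMediaDataInRealTime = true
        if writer.canAdd(input) { writer.add(input) }
    }

    // Call from captureOutput(_:didOutput:from:) for every video sample buffer.
    func append(_ sampleBuffer: CMSampleBuffer) {
        if !sessionStarted {
            guard writer.startWriting() else { return }
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            sessionStarted = true
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}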

Keep torch on while taking video iOS swift

I built a camera app for auto capture. I want to keep the flash on as long as the camera is on. I set the following code:
cameraDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
if (cameraDevice.hasTorch) {
    do {
        try cameraDevice.lockForConfiguration()
        if cameraDevice.isTorchActive {
            cameraDevice.torchMode = AVCaptureTorchMode.on
        } else {
            // sets the torch intensity to 80%
            try cameraDevice.setTorchModeOnWithLevel(0.8)
        }
        cameraDevice.unlockForConfiguration()
    } catch {
        print(error)
    }
}
But when I run the app, it only flashes once and then goes off. How can I solve this problem?
Call this method inside your camera setup/activation function, or whenever the device camera becomes active:
func flashActive() {
    if let currentDevice = AVCaptureDevice.default(for: AVMediaType.video), currentDevice.hasTorch {
        do {
            try currentDevice.lockForConfiguration()
            let torchOn = !currentDevice.isTorchActive
            try currentDevice.setTorchModeOn(level: 1.0) // or whatever level you want
            currentDevice.torchMode = torchOn ? .on : .off
            currentDevice.unlockForConfiguration()
        } catch {
            print("error")
        }
    }
}
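As with the first question above, the torch setting only sticks once the capture session is already running. A small sketch of the call order (assuming a serial `sessionQueue` and a configured `session`, both illustrative names):

sessionQueue.async {
    session.startRunning()
    // Enable the torch only after the session has taken over the device;
    // doing it before startRunning() gets reset and the light goes off again.
    flashActive()
}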

AVCaptureDevice flashmode for front camera is not bright like iPhone camera app

Here's the code I'm working with:
func toggleflash(On on: Bool) {
    guard let session = captureSession where session.running == true else {
        return
    }
    session.beginConfiguration()
    let input = session.inputs[0] as! AVCaptureDeviceInput
    if input.device.isFlashModeSupported(.On) {
        do {
            try input.device.lockForConfiguration()
            input.device.flashMode = .On
            input.device.unlockForConfiguration()
        } catch {}
    }
    session.commitConfiguration()
}
When I use this function to turn on the flash for my front camera, the iPhone screen does everything you'd expect. It flashes, but it doesn't illuminate the subject the way the iPhone camera app/Snapchat/Instagram does. So I wonder if it's not bright enough or if it's not working correctly.

Crash in AVCaptureSession when adding an AVCaptureDeviceInput

I have a weird crash showing on Crashlytics when setting up a camera session.
The stacktrace shows that the crash occurred at the method addInput.
func setupCamSession() {
    self.captureSession = AVCaptureSession()
    self.cameraView.setSession(self.captureSession)
    self.sessionQueue = dispatch_queue_create("com.myapp.camera_queue", DISPATCH_QUEUE_SERIAL)
    self.setupResult = .Success

    switch AVCaptureDevice.authorizationStatusForMediaType(AVMediaTypeVideo) {
    case .Authorized:
        break // already set to success
    case .NotDetermined:
        // The user has not yet been presented with the option to grant video access.
        // We suspend the session queue to delay session setup until the access request has completed.
        dispatch_suspend(self.sessionQueue)
        AVCaptureDevice.requestAccessForMediaType(AVMediaTypeVideo, completionHandler: { (granted) -> Void in
            if !granted {
                self.setupResult = .CameraNotAuthorized
            }
            dispatch_resume(self.sessionQueue)
        })
    default:
        self.setupResult = .CameraNotAuthorized
    }

    dispatch_async(self.sessionQueue) {
        if self.setupResult != .Success {
            return
        }

        // link input to captureSession
        guard let videoDevice = self.deviceWithMediaType(AVMediaTypeVideo, position: AVCaptureDevicePosition.Back) else {
            AppLog("Video Device Unavailable")
            self.setupResult = .SessionConfigurationFailed
            return
        }
        var videoDeviceInput: AVCaptureDeviceInput!
        do {
            videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
        } catch {
            AppLog("Could not create video device input")
        }

        /////////////////////////////////////////////////////
        self.captureSession.beginConfiguration()

        if self.captureSession.canAddInput(videoDeviceInput) {
            self.captureSession.addInput(videoDeviceInput)
            self.videoDeviceInput = videoDeviceInput
            self.videoDevice = videoDevice

            dispatch_async(dispatch_get_main_queue()) {
                // update the cameraView layer on the main thread
                let previewLayer: AVCaptureVideoPreviewLayer = self.cameraView.layer as! AVCaptureVideoPreviewLayer
                previewLayer.connection.videoOrientation = AVCaptureVideoOrientation(ui: UIApplication.sharedApplication().statusBarOrientation)
            }
        } else {
            AppLog("Could not add video device input to the session")
            self.setupResult = .SessionConfigurationFailed
        }

        // link output to captureSession
        let stillImageOutput = AVCaptureStillImageOutput()
        if self.captureSession.canAddOutput(stillImageOutput) {
            self.captureSession.addOutput(stillImageOutput)
            self.stillImageOutput = stillImageOutput
            stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        } else {
            AppLog("Could not add still image output to the session")
            self.setupResult = .SessionConfigurationFailed
        }

        self.captureSession.commitConfiguration()
        /////////////////////////////////////////////////////
    }
}
func runSession() {
    dispatch_async(self.sessionQueue) {
        switch self.setupResult! {
        case .Success:
            self.videoDeviceInput!.device.addObserver(self, forKeyPath: "adjustingFocus", options: NSKeyValueObservingOptions.New, context: nil)
            self.captureSession.addObserver(self, forKeyPath: "running", options: [.New], context: &SessionRunningContext)
            self.captureSession.startRunning()
            self.captureSessionRunning = self.captureSession.running
            if !self.captureSessionRunning {
                self.captureSession.removeObserver(self, forKeyPath: "running", context: &SessionRunningContext)
                self.videoDeviceInput?.device?.removeObserver(self, forKeyPath: "adjustingFocus", context: nil)
            }
        default:
            // Handle errors.
            break
        }
    }
}

func stopCaptureSession() {
    dispatch_async(self.sessionQueue) {
        if self.setupResult == .Success {
            if self.captureSessionRunning {
                self.captureSession.stopRunning()
                self.videoDeviceInput?.device?.removeObserver(self, forKeyPath: "adjustingFocus", context: nil)
                self.captureSession.removeObserver(self, forKeyPath: "running", context: &SessionRunningContext)
            }
            self.captureSessionRunning = false
        }
    }
}
setupCamSession is called in viewDidLoad, runSession in viewWillAppear, and I also have a stopSession method in viewWillDisappear. Everything related to the camera session is dispatched on a background serial queue.
The crash doesn't happen 100% of the time and I am unable to reproduce the crash on the device I use.
Thanks
Make sure you are removing the observers in deinit. I saw this occur when coming back to the camera capture screen, because I didn't remove the observer for adjustingFocus. Once I removed it in deinit, all was well.
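A minimal sketch of that cleanup, assuming the same key paths and properties used in runSession above:

deinit {
    // Mirror the observers added in runSession(); removing an observer that was never
    // added raises an exception, so reuse the flag the session code already maintains.
    if captureSessionRunning {
        videoDeviceInput?.device?.removeObserver(self, forKeyPath: "adjustingFocus", context: nil)
        captureSession.removeObserver(self, forKeyPath: "running", context: &SessionRunningContext)
    }
}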
I had the same problem. It was resolved after adding a usage description for the Privacy - Camera Usage Description key in the Info.plist file. This answer contains tips on how to set up the description:
Request Permission for Camera and Library in iOS 10 - Info.plist
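For reference, the raw key is NSCameraUsageDescription ("Privacy - Camera Usage Description" in the Xcode editor); without it, iOS 10+ terminates the app as soon as the camera is accessed. A small sketch of the matching runtime check, written in current Swift rather than the question's Swift 2:

import AVFoundation

// NSCameraUsageDescription must be present in Info.plist before this prompt can be
// shown; otherwise the app is killed the moment the capture device is touched.
func requestCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    default:
        completion(false)
    }
}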
