I am working on a project where I start the camera and then capture images live for image processing. I have been following the TensorFlow - Image Classification guide for this project. One thing I can't figure out, despite looking in the documentation and searching on Google, is how to set the resolution of the camera.
Is it possible to either get or set it programmatically? Here is the code where I create the camera:
private func addVideoDeviceInput() -> Bool {
    /** Tries to get the default back camera. */
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
        return false
    }
    do {
        let videoDeviceInput = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(videoDeviceInput) {
            session.addInput(videoDeviceInput)
            return true
        } else {
            return false
        }
    } catch {
        fatalError("Cannot create video device input")
    }
}
Yes, check out sessionPreset.
session.sessionPreset = .photo // here!

if session.canAddInput(videoDeviceInput) {
    session.addInput(videoDeviceInput)
    return true
}
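If you need the actual pixel dimensions rather than a preset, a minimal sketch (not from the original answer; the helper name is just illustrative) is to read them from the device's activeFormat once you have the camera:

import AVFoundation

// Hypothetical helper: returns the dimensions the camera is currently delivering.
func currentResolution(of device: AVCaptureDevice) -> (width: Int, height: Int) {
    let dimensions = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
    return (Int(dimensions.width), Int(dimensions.height))
}

// Example usage with the camera created in addVideoDeviceInput():
// let size = currentResolution(of: camera)
// print("Capturing at \(size.width) x \(size.height)")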
Related
I'm using AVCaptureSession to capture video.
I want to keep the torch on during the whole session, but once the session is started, the light automatically turns off.
There are a lot of posts here showing how to turn on the torch. It works, but only until the capture session is started.
Here's how I start the session:
guard let camera = AVCaptureDevice.default(for: .video) else { return }
self.captureSession.beginConfiguration()
let deviceInput = try AVCaptureDeviceInput(device: camera)
self.captureSession.addInput(deviceInput)
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "com.axelife.axcapturer.samplebufferdelegate"))
self.captureSession.addOutput(videoOutput)
try camera.setLight(on: true)
self.captureSession.commitConfiguration()
DispatchQueue(label: "capturesession").async {
    self.captureSession.startRunning()
}
And here's my code to turn on the light:
extension AVCaptureDevice {
    func setLight(on: Bool) throws {
        try self.lockForConfiguration()
        if on {
            try self.setTorchModeOn(level: 1)
        } else {
            self.torchMode = .off
        }
        self.unlockForConfiguration()
    }
}
With that code, the light turns on for less than 0.5 seconds, then turns back off automatically.
OK, I figured it out.
The torch simply has to be turned on after the session starts.
So instead of:
try camera.setLight(on: true)
self.captureSession.commitConfiguration()
DispatchQueue(label: "capturesession").async {
    self.captureSession.startRunning()
}
just do
self.captureSession.commitConfiguration()
DispatchQueue(label: "capturesession").async {
    self.captureSession.startRunning()
    // The closure doesn't throw, so handle the error here (try? for brevity).
    try? camera.setLight(on: true)
}
For me, the torch turns off when I change the videoOrientation of the capture preview layer. I turn it on again after that happens. (DispatchQueue.main.async is important; for some reason it doesn't work if I leave it out.)
previewConnection.videoOrientation = orientation
do {
    try captureDevice.lockForConfiguration()
} catch {
    print(error)
}
DispatchQueue.main.async { [weak self] in
    self?.captureDevice.torchMode = .on
    self?.captureDevice.unlockForConfiguration()
}
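A slightly more compact arrangement of the same idea (just a sketch; it assumes the same `captureDevice` and `previewConnection` as in the snippet above) keeps the lock, the torch change, and the unlock together inside the main-queue block:

previewConnection.videoOrientation = orientation

DispatchQueue.main.async { [weak self] in
    guard let device = self?.captureDevice, device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = .on // re-enable the torch after the orientation change
        device.unlockForConfiguration()
    } catch {
        print(error)
    }
}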
I built a camera app for auto capture. I want to keep the flash on as long as the camera is on. I used the following code:
cameraDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

if cameraDevice.hasTorch {
    do {
        try cameraDevice.lockForConfiguration()
        if cameraDevice.isTorchActive {
            cameraDevice.torchMode = AVCaptureTorchMode.on
        } else {
            // sets the torch intensity to 80%
            try cameraDevice.setTorchModeOnWithLevel(0.8)
        }
        cameraDevice.unlockForConfiguration()
    } catch {
        print(error)
    }
}
But when I run the app, it only flashes once and then goes off. How can I solve this problem?
Call this method inside your camera setup/open function, or whenever the device camera becomes active:
func flashActive() {
    if let currentDevice = AVCaptureDevice.default(for: AVMediaType.video), currentDevice.hasTorch {
        do {
            try currentDevice.lockForConfiguration()
            let torchOn = !currentDevice.isTorchActive
            try currentDevice.setTorchModeOn(level: 1.0) // or whatever level you want
            currentDevice.torchMode = torchOn ? .on : .off
            currentDevice.unlockForConfiguration()
        } catch {
            print(error)
        }
    }
}
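If the goal is to keep the torch on for the whole session rather than toggle it, one possible call site (a sketch, assuming a configured `captureSession` property) is right after the session starts running, mirroring the fix from the earlier answer:

DispatchQueue(label: "capture.session.queue").async {
    self.captureSession.startRunning() // blocking call, so keep it off the main queue
    DispatchQueue.main.async {
        self.flashActive() // torch was off, so this turns it on
    }
}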
Does anyone know how to set up low-light boost for an iOS camera app? This is the code I have, but I can't make it work.
let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

do {
    try backCamera?.lockForConfiguration()
    if backCamera?.isLowLightBoostSupported == true {
        backCamera?.automaticallyEnablesLowLightBoostWhenAvailable = true
    }
    backCamera?.unlockForConfiguration()
} catch {
    print(error)
}
You don't need that. Put this in didMoveToView:
captureSession!.sessionPreset = AVCaptureSessionPresetPhoto
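Putting the two pieces together, a minimal sketch (assuming a `captureSession` property and the current `AVCaptureDevice.default(for:)` API) sets the preset first and then opts the device into low-light boost inside a configuration lock:

captureSession.sessionPreset = .photo

if let backCamera = AVCaptureDevice.default(for: .video) {
    do {
        try backCamera.lockForConfiguration()
        if backCamera.isLowLightBoostSupported {
            // Let the device enable boost automatically when the scene is dark enough.
            backCamera.automaticallyEnablesLowLightBoostWhenAvailable = true
        }
        backCamera.unlockForConfiguration()
    } catch {
        print(error)
    }
}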
Here's the code I'm working with:
func toggleflash(On on: Bool) {
    guard let session = captureSession where session.running == true else {
        return
    }
    session.beginConfiguration()
    let input = session.inputs[0] as! AVCaptureDeviceInput
    if input.device.isFlashModeSupported(.On) {
        do {
            try input.device.lockForConfiguration()
            input.device.flashMode = .On
            input.device.unlockForConfiguration()
        } catch {}
    }
    session.commitConfiguration()
}
When I use this function to use the flash on my front camera, the iPhone screen does everything you'd expect. It flashes, but it doesn't illuminate the subject the way the iPhone camera app/Snapchat/Instagram does. So I wonder if it's not bright enough or if it's not working correctly.
I am using AVFoundation for the camera.
This is my live preview:
It looks good. When the user presses the button, I create a snapshot on the same screen (like Snapchat).
I am using the following code to capture the image and show it on the screen:
self.stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
    (imageSampleBuffer: CMSampleBuffer!, _) in
    let imageDataJpeg = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
    let pickedImage: UIImage = UIImage(data: imageDataJpeg)!
    self.captureSession.stopRunning()
    self.previewImageView.frame = CGRect(x: 0, y: 0, width: UIScreen.mainScreen().bounds.width, height: UIScreen.mainScreen().bounds.height)
    self.previewImageView.image = pickedImage
    self.previewImageView.layer.zPosition = 100
}
After the user captures an image, the screen looks like this:
Please look at the marked area. It wasn't visible on the live preview screen (screenshot 1).
I mean the live preview is not showing everything. But I am sure my live preview works well, because I compared it with other camera apps and everything matched my live preview screen. I guess I have a problem with the captured image.
I am creating the live preview with the following code:
override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera
            if (device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if captureDevice != nil {
        beginSession()
    }
}
func beginSession() {
    let err: NSError? = nil
    do {
        try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
    } catch {
    }
    captureSession.addOutput(stillOutput)
    if err != nil {
        print("error: \(err?.localizedDescription)")
    }
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.cameraLayer.layer.addSublayer(previewLayer!)
    previewLayer?.frame = self.cameraLayer.frame
    captureSession.startRunning()
}
My cameraLayer:
How can I resolve this problem?
Presumably you are using an AVCaptureVideoPreviewLayer. So it sounds like this layer is incorrectly placed or incorrectly sized, or it has the wrong AVLayerVideoGravity setting. Part of the image is off the screen or cropped; that's why you don't see that part of what the camera sees while you are previewing.
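As a rough sketch of what correct placement can look like (assuming a `previewLayer` property and a container view called `cameraLayer`, as in the question), the layer's frame is usually updated in `viewDidLayoutSubviews` so it tracks the container's bounds rather than its frame:

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // The layer's coordinates are relative to its container, so use bounds, not frame.
    // Also check that the layer's videoGravity matches how you want the image cropped.
    previewLayer?.frame = cameraLayer.bounds
}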
OK, I found the solution.
I used
captureSession.sessionPreset = AVCaptureSessionPresetHigh
instead of
captureSession.sessionPreset = AVCaptureSessionPresetPhoto
and that fixed the problem. (Presumably the .photo preset captures a 4:3 still, which shows more of the scene than the 16:9, aspect-filled preview, while the high preset's output matches the preview more closely.)