How to know whether an iPhone device supports the triple-camera system? - ios

In my project, I am trying to support the triple-camera system, but I don't know how to check whether a device has a triple camera.

You can do this with the following code. It checks for the triple camera first (otherwise a triple-camera device would be matched by the dual-camera check), then falls back to the built-in dual back camera and finally to the wide-angle back camera. Note that .builtInTripleCamera requires iOS 13 or later.
var currentDevice: AVCaptureDevice?

if let device = AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back) {
    currentDevice = device
} else if let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) {
    currentDevice = device
} else if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    currentDevice = device
} else {
    print("Error: no camera available")
}
Hope this helps you 😊

In the AVFoundation framework you have AVCaptureDevice, which has a DeviceType that you can pass to the default device lookup:
if let device = AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back) {
}
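If all you need is a yes/no check rather than a device to capture with, a discovery session works as well. A minimal sketch, with hasTripleCamera as a hypothetical helper (.builtInTripleCamera is only available from iOS 13):
import AVFoundation

// Returns true when the system reports a back triple-camera device.
func hasTripleCamera() -> Bool {
    guard #available(iOS 13.0, *) else { return false }
    let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInTripleCamera],
                                                     mediaType: .video,
                                                     position: .back)
    return !discovery.devices.isEmpty
}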

Related

Set resolution for camera in iOS for AI

I am working on a project where I start the camera and then capture images live for image processing. I have been following the TensorFlow image classification example. One thing I can't figure out, despite looking in the documentation and searching on Google, is how to set the resolution of the camera.
Is it possible to either get it or set it programmatically? Here is the code where I create the camera:
private func addVideoDeviceInput() -> Bool {
    /// Tries to get the default back camera.
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
        return false
    }
    do {
        let videoDeviceInput = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(videoDeviceInput) {
            session.addInput(videoDeviceInput)
            return true
        } else {
            return false
        }
    } catch {
        fatalError("Cannot create video device input")
    }
}
Yes, check out sessionPreset.
session.sessionPreset = .photo /// here!

if session.canAddInput(videoDeviceInput) {
    session.addInput(videoDeviceInput)
    return true
}
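If you need a specific resolution rather than the photo preset, the fixed-size session presets can be used, and you can read back what the device is actually delivering. A minimal sketch, with configureResolution as a hypothetical helper and .hd1280x720 just an example preset:
import AVFoundation

func configureResolution(session: AVCaptureSession, camera: AVCaptureDevice) {
    // Switch to a fixed-size preset if the session supports it.
    if session.canSetSessionPreset(.hd1280x720) {
        session.sessionPreset = .hd1280x720
    }

    // Read back the resolution of the format the camera is currently using.
    let dimensions = CMVideoFormatDescriptionGetDimensions(camera.activeFormat.formatDescription)
    print("Capturing at \(dimensions.width)x\(dimensions.height)")
}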

Error noCamerasAvailable, which occurs only on iPhones older than the 8 Plus - Swift

I have a strange issue that I can reproduce only on devices older than the iPhone 8 Plus.
I don't know how to fix it, but the error message is very simple: 'noCamerasAvailable'.
Permissions should be fine, because everything works on my iPhone XS Max and my friend's iPhone X. This is my simple code to display the camera view:
cameraController.prepare { (error) in
    if let error = error {
        print("Camera error:")
        print(error)
    }
    try? self.cameraController.displayPreview(on: self.view)
}
Fixed: I had to replace this
let session = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInDualCamera], mediaType: AVMediaType.video, position: .unspecified)
with this
let session = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .unspecified)
Older devices have no dual rear camera, so a discovery session that only asks for .builtInDualCamera finds nothing on them.
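To keep the dual camera where it exists while still working on single-camera phones, both device types can go into one discovery session. A minimal sketch:
import AVFoundation

// Every iPhone with a rear camera has at least the wide-angle one, so this never comes back empty on a phone.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .unspecified)

// Prefer the dual camera when present, otherwise fall back to the wide-angle camera.
let camera = discovery.devices.first { $0.deviceType == .builtInDualCamera }
    ?? discovery.devices.first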

I get this ridiculous value when reading the iOS camera's lens position

Using this code with the builtInWideAngleCamera in Swift on an iPhone XS Max running iOS 12.1.2:
let lensPos: Float = AVCaptureDevice.currentLensPosition
lockCameraForSettings()
self.inputDevice?.setFocusModeLocked(lensPosition: lensPos, completionHandler: { (time) -> Void in })
unlockCameraForShooting()
results in a crash:
[AVCaptureDevice setFocusModeLockedWithLensPosition:completionHandler:] The passed lensPosition -340282346638528859811704183484516925440.000000 is out-of-range [0, 1]'
The camera is running and visibly in focus on the screen preview. How is it possible for it to be in this configuration?
Inserting a constant value between 0 and 1 works, at least in that it does not throw an error.
I believe you mean to use .lensPosition instead of .currentLensPosition, which is a special sentinel constant rather than the actual position of the lens. You can only read .lensPosition on an instance of AVCaptureDevice.
var captureDevice: AVCaptureDevice?

// Plus models and X's
if let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) {
    captureDevice = device
// Single-lens devices.
} else if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    captureDevice = device
} else {
    // No camera was found, is it broke?
    print("Missing expected back camera device.")
}

if let device = captureDevice {
    // We have a device, do something with it.
    print(device.lensPosition)
}
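To lock focus at the position the lens is actually at, the device also has to be locked for configuration first. A minimal sketch, with lockFocusAtCurrentPosition as a hypothetical helper:
import AVFoundation

func lockFocusAtCurrentPosition(of device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.locked) {
            // lensPosition is the real 0...1 value reported by the device;
            // AVCaptureDevice.currentLensPosition is only a sentinel constant.
            device.setFocusModeLocked(lensPosition: device.lensPosition) { _ in
                print("Focus locked")
            }
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}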

AVCaptureDevice DiscoverySession for iOS 9.0

I'm developing a QR and Data Matrix code reader app. I'm getting the AVCaptureDeviceInput with AVCaptureDevice.DiscoverySession. My problem is that it is only available from iOS 10.0. How can I get a capture device on earlier versions?
// Get the back-facing camera for capturing videos
if #available(iOS 10.0, *) {
    let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .back)
    guard let captureDevice = deviceDiscoverySession.devices.first else {
        print("Failed to get the camera device")
        return
    }
} else {
    // Fallback on earlier versions
}

do {
    // Get an instance of the AVCaptureDeviceInput class using the previous device object.
    let input = try AVCaptureDeviceInput(device: captureDevice)
    // Set the input device on the capture session.
    captureSession.addInput(input)
    // Initialize an AVCaptureMetadataOutput object and set it as the output device for the capture session.
    let captureMetadataOutput = AVCaptureMetadataOutput()
    captureSession.addOutput(captureMetadataOutput)
    // Set the delegate and use the default dispatch queue to execute the callback.
    captureMetadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
    // TODO: Decide the data types!
    captureMetadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.dataMatrix]
    //captureMetadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]
} catch {
    // If any error occurs, simply print it out and don't continue any more.
    print(error)
    return
}
If you want to get the list of AVCaptureDevices on iOS 9, you can use
let cameras = AVCaptureDevice.devices(for: AVMediaType.video)
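Putting the two paths together, a minimal sketch of an availability-based fallback, with backCamera as a hypothetical helper (the rest of the session setup stays the same):
import AVFoundation

func backCamera() -> AVCaptureDevice? {
    if #available(iOS 10.0, *) {
        let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                         mediaType: .video,
                                                         position: .back)
        return discovery.devices.first
    } else {
        // Deprecated in iOS 10, but still the way to enumerate cameras on iOS 9.
        return AVCaptureDevice.devices(for: .video).first { $0.position == .back }
    }
}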

iOS device not listed by AVCaptureDevice.devices() unless QuickTime is opened

I am trying to list the devices connected to my machine using AVCaptureDevice.devices() in a Swift playground.
import Cocoa
import Foundation
import AVFoundation
import CoreMediaIO

var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
var allow: UInt32 = 1
let dataSize: UInt32 = 4
let zero: UInt32 = 0
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject), &prop, zero, nil, dataSize, &allow)

var session = AVCaptureSession()
session.sessionPreset = AVCaptureSession.Preset.low

let devices = AVCaptureDevice.devices()
for device in devices {
    let deviceID = device.uniqueID
    let deviceName = device.localizedName
    print("\(deviceID): \(deviceName)")
}
This gives me the following result even though my iPhone is connected to my computer
04-52-c7-c1-65-c4:input: Bose Tito
AppleHDAEngineInput:1B,0,1,0:1: Built-in Microphone
CC26311ECFEG1HNBA: FaceTime HD Camera
Now, something I've noticed is that if I launch QuickTime Player, select New Movie Recording, and select my device as the camera source, then my device is listed:
04-52-c7-c1-65-c4:input: Bose Tito
AppleHDAEngineInput:1B,0,1,0:1: Built-in Microphone
CC26311ECFEG1HNBA: FaceTime HD Camera
12345b7406eeb053e2d5cded2527315d6110a16e: tito
Is there any way to avoid having to do this?
You need to wait a bit for the device to appear.
Register for AVCaptureDeviceWasConnectedNotification to be notified once it becomes available.
How did you implement checking that the iOS device was ready to be used?
I have the same issue with a project I am working on.
At the moment it works if I have QuickTime open (I also hard-coded which device to use, but I need to get rid of that).
override func viewDidLoad() {
    super.viewDidLoad()
    enableDalDevices()
    camera.layer = CALayer()

    let session: AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSession.Preset.high

    let listdevices: Array = (AVCaptureDevice.devices())
    // Grabs the iOS device on the system; change the index in listdevices[X] depending on the devices connected to your own Mac.
    let device: AVCaptureDevice = listdevices[6]

    do {
        try session.addInput(AVCaptureDeviceInput(device: device))
        // Preview
        let previewLayer: AVCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
        let myView: NSView = self.view
        previewLayer.frame = myView.bounds
        previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        previewLayer.connection?.videoOrientation = AVCaptureVideoOrientation.landscapeRight
        self.camera.layer?.addSublayer(previewLayer)
        session.startRunning()
        print(listdevices)
    } catch {
    }
}
Using @Valerian's answer to help answer @adamprocter, here is how I ended up solving it.
override func viewDidLoad() {
    super.viewDidLoad()
    enableDalDevices()
    camera.layer = CALayer()

    let session: AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSession.Preset.high

    let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.externalUnknown], mediaType: .muxed, position: .unspecified)
    // You get the devices and their IDs here and can do whatever you want with them.
    showDevices(devices: discoverySession.devices)

    // Register for a notification, just like @Valerian said earlier.
    NotificationCenter.default.addObserver(self, selector: #selector(newDevice), name: NSNotification.Name.AVCaptureDeviceWasConnected, object: nil)
}

@objc func newDevice() {
    let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.externalUnknown], mediaType: .muxed, position: .unspecified)
    showDevices(devices: discoverySession.devices)
}
