iOS Native Camera vs. AVFoundation brightness/exposure difference

AVFoundation brightness/exposure is different from the native camera. Is it possible to replicate the native camera's settings with the AVFoundation framework?

Auto-exposure in Swift 4.2:
// MARK: Exposure Methods

// Supporting declarations for the KVO plumbing used below:
let kExposure = "adjustingExposure"
var adjustingExposureContext = 0

@objc
func tapToExpose(recognizer: UIGestureRecognizer) {
    if activeInput.device.isExposurePointOfInterestSupported {
        let point = recognizer.location(in: camPreview)
        // Convert the tap location from the gesture recognizer's coordinates
        // to the coordinate space of the capture device.
        let pointOfInterest = previewLayer.captureDevicePointConverted(fromLayerPoint: point)
        showMarkerAtPoint(point: point, marker: exposureMarker)
        exposeAtPoint(pointOfInterest)
    }
}

func exposeAtPoint(_ point: CGPoint) {
    let device = activeInput.device
    if device.isExposurePointOfInterestSupported, device.isExposureModeSupported(.continuousAutoExposure) {
        do {
            try device.lockForConfiguration()
            device.exposurePointOfInterest = point
            device.exposureMode = .continuousAutoExposure
            if device.isExposureModeSupported(.locked) {
                // Observe "adjustingExposure" so exposure can be locked once it
                // settles; see observeValue(forKeyPath:...) below.
                device.addObserver(self, forKeyPath: kExposure, options: .new, context: &adjustingExposureContext)
            }
            device.unlockForConfiguration()
        } catch {
            print("Error exposing on POI: \(String(describing: error.localizedDescription))")
        }
    }
}
// MARK: KVO
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    // First, check that the context matches the adjusting-exposure context;
    // otherwise, pass the observation on to super.
    if context == &adjustingExposureContext {
        let device = object as! AVCaptureDevice
        // Determine whether the camera has stopped adjusting exposure,
        // and whether locked mode is supported.
        if !device.isAdjustingExposure, device.isExposureModeSupported(.locked) {
            // Remove self as an observer to stop subsequent notifications.
            device.removeObserver(self, forKeyPath: kExposure, context: &adjustingExposureContext)
            DispatchQueue.main.async {
                do {
                    try device.lockForConfiguration()
                    device.exposureMode = .locked
                    device.unlockForConfiguration()
                } catch {
                    print("Error exposing on POI: \(String(describing: error.localizedDescription))")
                }
            }
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}

If you're looking for auto-exposure, set a key-value observer for "adjustingExposure" and set the exposure mode to .continuousAutoExposure (AVCaptureExposureModeContinuousAutoExposure in Objective-C). If you're trying to implement manual exposure, have a look at the WWDC session on camera controls.
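For the manual route, the core API is setExposureModeCustom(duration:iso:completionHandler:). Here is a minimal sketch, assuming the same activeInput as in the code above, with the requested values clamped to the active format's supported ranges:

func setCustomExposure(duration: CMTime, iso: Float) {
    let device = activeInput.device
    guard device.isExposureModeSupported(.custom) else { return }
    do {
        try device.lockForConfiguration()
        // CMTime is Comparable in Swift, so min/max clamping works directly.
        let format = device.activeFormat
        let clampedDuration = min(max(duration, format.minExposureDuration), format.maxExposureDuration)
        let clampedISO = min(max(iso, format.minISO), format.maxISO)
        device.setExposureModeCustom(duration: clampedDuration, iso: clampedISO) { _ in
            // Called once the custom duration/ISO have been applied to the sensor.
        }
        device.unlockForConfiguration()
    } catch {
        print("Error setting custom exposure: \(error.localizedDescription)")
    }
}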

Related

How to get WhiteBalance (Kelvin) from captureOutput

Apple provides several ways to read and change the white balance temperature (Kelvin) of an AVCaptureDevice:
https://developer.apple.com/documentation/avfoundation/avcapturedevice/white_balance
Example:
guard let videoDevice = AVCaptureDevice
    .default(.builtInWideAngleCamera, for: .video, position: .back) else {
        return
}
guard
    let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
    captureSession.canAddInput(videoDeviceInput) else {
        print("There seems to be a problem with the camera on your device.")
        return
}
captureSession.addInput(videoDeviceInput)

let kelvin = videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains)
print("Kelvin temp \(kelvin.temperature)")
print("Kelvin tint \(kelvin.tint)")

let captureOutput = AVCaptureVideoDataOutput()
captureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
captureOutput.setSampleBufferDelegate(self, queue: DispatchQueue.global(qos: .default))
captureSession.addOutput(captureOutput)
This will always return:
Kelvin temp 3900.0889
Kelvin tint 4.966322
How can I get the White Balance (Kelvin value) through the live camera feed?
It gives you one value because videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains) is called only once. To get a value that updates with the camera feed, you have two options:
Key-value observing, which will notify you when the white balance changes.
Calling videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains) for each frame.
I would suggest the second; key-value observing is somewhat annoying. In that case, I assume you have already implemented func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) of AVCaptureVideoDataOutputSampleBufferDelegate. That method is called every time a frame is delivered, so you can update videoDevice.temperatureAndTintValues for each frame of the live camera feed, as in the sketch below.
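A sketch of the per-frame option (here CameraController stands in for whatever class owns the session and is registered as the sample buffer delegate):

extension CameraController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Called once per delivered frame, on the delegate queue.
        let kelvin = videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains)
        print("Kelvin temp \(kelvin.temperature), tint \(kelvin.tint)")
    }
}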
For the key-value observing, you first set up the observer (e.g. in viewDidAppear), for example:
func addObserver() {
    self.addObserver(self, forKeyPath: "videoDevice.deviceWhiteBalanceGains", options: .new, context: &DeviceWhiteBalanceGainsContext)
}
Keep a reference to the videoDevice, declaring it this way:
@objc dynamic var videoDevice: AVCaptureDevice!
The @objc and dynamic are needed for the key-value observing.
Now you can implement this function, which will be called every time the observed value changes:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let context = context else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: nil)
        return
    }
    if context == &DeviceWhiteBalanceGainsContext {
        // do your work on WB here
    }
}
Finally, you can define the context this way (I have it outside my ViewController):
private var DeviceWhiteBalanceGainsContext = 0
I have implemented both methods in my apps and they both work well.
WARNING: sometimes the WB values will be outside the allowed range (especially at startup), and the API raises an exception. Make sure to handle this; otherwise, the app will crash.
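One way to guard against that crash (a sketch; the documented valid range for each gain is 1.0 through the device's maxWhiteBalanceGain):

func safeTemperatureAndTint(for device: AVCaptureDevice) -> AVCaptureDevice.WhiteBalanceTemperatureAndTintValues {
    var gains = device.deviceWhiteBalanceGains
    let maxGain = device.maxWhiteBalanceGain
    // Clamp each channel into the valid range before converting; otherwise
    // temperatureAndTintValues(for:) can raise an Objective-C exception.
    gains.redGain = max(1.0, min(gains.redGain, maxGain))
    gains.greenGain = max(1.0, min(gains.greenGain, maxGain))
    gains.blueGain = max(1.0, min(gains.blueGain, maxGain))
    return device.temperatureAndTintValues(for: gains)
}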

AVPictureInPictureController cannot be activated programmatically during scrolling

I have a video in a cell. If I put it into PiP mode with the button, everything works fine, but when the cell goes off screen it does not enter PiP mode programmatically. Can PiP only be activated via a button?
func scrollViewDidScroll(_ scrollView: UIScrollView) {
    guard let cell = self.playerCell, self.playerController != nil else {
        return
    }
    let rect = self.view.convert(cell.frame, from: scrollView)
    if rect.origin.y < 0, self.pipController == nil {
        self.pipController = self.startPictureInPicture()
    }
}

func startPictureInPicture() -> AVPictureInPictureController? {
    guard AVPictureInPictureController.isPictureInPictureSupported() else {
        return nil
    }
    let layer = AVPlayerLayer(player: self.playerController.player)
    try? AVAudioSession.sharedInstance().setActive(true)
    if let pipController = AVPictureInPictureController(playerLayer: layer) {
        if pipController.isPictureInPicturePossible {
            pipController.startPictureInPicture()
        } else {
            pipController.addObserver(self, forKeyPath: "isPictureInPicturePossible", options: [.new, .initial], context: nil)
        }
        // Return the controller so the caller can retain it in self.pipController.
        return pipController
    }
    return nil
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "isPictureInPicturePossible", let pipController = object as? AVPictureInPictureController, pipController.isPictureInPicturePossible {
        pipController.startPictureInPicture()
    }
}
UPDATE: the debug console always shows the warning Unbalanced calls to begin/end appearance transitions for <UIViewController>., but I have worked around that with:
DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
    pipController.startPictureInPicture()
}
As per Apple's guidelines, we should not start PiP mode without a user action:
Only begin Picture in Picture playback in response to user interaction and never programmatically. The App Store review team rejects apps that fail to follow this requirement.
Source: https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_in_a_custom_player
I would suggest you create your own custom PIP view inside the app to avoid rejection.
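A minimal sketch of that idea: rather than real PiP, shrink the playback into a floating overlay when the cell scrolls off screen (all names here are illustrative, and a real version would need resize/removal handling):

func showFloatingPlayer() {
    guard let player = playerController?.player else { return }
    // An in-app "PiP-like" overlay pinned to the bottom-right corner.
    let miniFrame = CGRect(x: view.bounds.width - 196, y: view.bounds.height - 126, width: 180, height: 110)
    let miniPlayerView = UIView(frame: miniFrame)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = miniPlayerView.bounds
    playerLayer.videoGravity = .resizeAspectFill
    miniPlayerView.layer.addSublayer(playerLayer)
    view.addSubview(miniPlayerView)
}

You could call this from scrollViewDidScroll in place of startPictureInPicture().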

Is it possible to use KVO to track changes to the `MKMapView.camera.heading` property?

I am trying to track changes to the map orientation using the camera heading property.
// Register for notifications of changes to the camera
if let camera = self.mapView?.camera {
    self.camera = camera
    camera.addObserver(self, forKeyPath: "heading", options: NSKeyValueObservingOptions.new, context: &MapViewController.myContext)
}
...
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if context == &MapViewController.myContext {
        if keyPath == "heading" {
            if let heading = change?[NSKeyValueChangeKey.newKey] as? CLLocationDirection {
                self.heading = heading
            }
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
The camera property's attribute is copy, so you are not observing the map view's current camera instance, just a copy made when you called the getter. You need to observe the key path "camera.heading" or "camera" on the map view itself, and hope that a new camera object is set internally when the heading changes.
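A minimal sketch of that suggestion (same observeValue handler as in the question, but comparing keyPath against "camera.heading"):

// Observe the key path on the map view itself, not on a camera copy.
self.mapView?.addObserver(self,
                          forKeyPath: "camera.heading",
                          options: .new,
                          context: &MapViewController.myContext)

If that proves unreliable (MKMapView does not document these key paths as KVO-compliant), the conventional fallback is MKMapViewDelegate's mapView(_:regionDidChangeAnimated:), reading mapView.camera.heading there.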

Key value observing exposureDuration

I'm building a camera app, and I'm trying to expose the current exposure duration to the user. Since this value constantly changes until manually set, I need to use KVO to stream the values to the user. I've successfully done this with the ISO, and can observe changes to the exposureDuration, but cannot coerce the new value to a CMTime object (which is what exposureDuration is). Below is the code I'm using to try to accomplish this:
override init() {
    super.init()
    captureDevice = self.selectCamera()
    captureDevice?.addObserver(self, forKeyPath: "ISO", options: .New, context: &isoContext)
    captureDevice?.addObserver(self, forKeyPath: "exposureDuration", options: .New, context: &shutterContext)
}

deinit {
    captureDevice?.removeObserver(self, forKeyPath: "ISO")
    captureDevice?.removeObserver(self, forKeyPath: "exposureDuration")
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    let newValue = change?[NSKeyValueChangeNewKey]
    if context == &isoContext {
        store.iso.value = newValue as! Float
    } else if context == &shutterContext {
        // The app crashes at this line:
        // Thread 1: EXC_BREAKPOINT (code=1, subcode=0x100091670)
        // newValue is "AnyObject" in the debug area.
        store.shutterSpeed.value = newValue as! CMTime
    }
}
Am I doing something wrong, or is this a legitimate bug that I need to file with Apple?
exposureDuration's newValue is not a CMTime but an NSValue.
Here is the fixed code (Swift 3):
store.shutterSpeed.value = (newValue as! NSValue).timeValue
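Alternatively, on Swift 4 and later, block-based KVO sidesteps the cast entirely, since the typed \.exposureDuration key path delivers change.newValue as a CMTime. A sketch (the observation object must be kept alive, e.g. as a property):

// Keep a strong reference; observation stops when this is deallocated.
var exposureObservation: NSKeyValueObservation?
...
exposureObservation = captureDevice?.observe(\.exposureDuration, options: .new) { _, change in
    if let duration = change.newValue {
        store.shutterSpeed.value = duration // already a CMTime, no NSValue unwrap
    }
}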

Detect when camera is auto-focusing

I am working on a video application. I want to discard the video frames captured while the camera is autofocusing: during autofocus the captured image becomes blurred and image processing for that frame suffers, but once autofocus is done the image processing is excellent again. Can anybody suggest a solution?
Use the adjustingFocus property:
Indicates whether the device is currently adjusting its focus setting. (read-only)
Note: You can observe changes to the value of this property using key-value observing.
iOS 4.0 and later
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instp/AVCaptureDevice/adjustingFocus
The following is sample code in Swift 3.x.
First, an observer should be added to the selected capture device during camera initialization:
captureDevice.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
Then the observeValue method is overridden. By reading the new value delivered to the method, auto-focusing frames can be identified:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let key = keyPath, let changes = change else {
        return
    }
    if key == "adjustingFocus" {
        if let isAdjusting = changes[.newKey] as? Bool, isAdjusting {
            // camera is auto-focusing
        } else {
            // camera is not auto-focusing
        }
    }
}
Example in Swift 4+:
class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    // @objc dynamic is required for the "captureDevice.adjustingFocus" key path to fire.
    @objc dynamic var captureDevice: AVCaptureDevice?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.addObservers()
    }

    func addObservers() {
        self.addObserver(self, forKeyPath: "captureDevice.adjustingFocus", options: .new, context: nil)
    }

    func removeObservers() {
        self.removeObserver(self, forKeyPath: "captureDevice.adjustingFocus")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "captureDevice.adjustingFocus" {
            print("============== adjustingFocus: \(String(describing: self.captureDevice?.lensPosition))")
        }
    }
} // End of class
Observing adjustingFocus was not working for me; it was always NO. Then I found this:
Note that when traditional contrast detect auto-focus is in use, the AVCaptureDevice adjustingFocus property flips to YES when a focus is underway, and flips back to NO when it is done. When phase detect autofocus is in use, the adjustingFocus property does not flip to YES, as the phase detect method tends to focus more frequently, but in small, sometimes imperceptible amounts. You can observe the AVCaptureDevice lensPosition property to see lens movements that are driven by phase detect AF.
from Apple
I have not tried it yet; I will try it and update later.
Edit: I have tried it, and I can confirm this is right.
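A minimal sketch of the lensPosition approach (same string-based KVO pattern as above; lensPosition is a Float where 0.0 is the shortest focus distance and 1.0 the furthest):

captureDevice.addObserver(self, forKeyPath: "lensPosition", options: [.new], context: nil)
...
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "lensPosition", let position = change?[.newKey] as? Float {
        // Any movement here implies the lens is refocusing; consider debouncing,
        // since phase-detect AF moves in small, frequent steps.
        print("lensPosition: \(position)")
    }
}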
