Apple provides various ways to view and change the white balance temperature (Kelvin) of an AVCaptureDevice:
https://developer.apple.com/documentation/avfoundation/avcapturedevice/white_balance
Example:
guard let videoDevice = AVCaptureDevice
    .default(.builtInWideAngleCamera, for: .video, position: .back) else {
    return
}
guard
    let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
    captureSession.canAddInput(videoDeviceInput) else {
        print("There seems to be a problem with the camera on your device.")
        return
}
captureSession.addInput(videoDeviceInput)

// Read the current white balance as temperature and tint.
let kelvin = videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains)
print("Kelvin temp \(kelvin.temperature)")
print("Kelvin tint \(kelvin.tint)")

let captureOutput = AVCaptureVideoDataOutput()
captureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
captureOutput.setSampleBufferDelegate(self, queue: DispatchQueue.global(qos: .default))
captureSession.addOutput(captureOutput)
This will always return
Kelvin temp 3900.0889
Kelvin tint 4.966322
How can I get the White Balance (Kelvin value) through the live camera feed?
It gives you a single value because videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains) is called only once. To get a value that updates with the camera feed, you have two options:
Key-value observing, which will notify when the WB changes
Call the function videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains) for each frame.
I would suggest the second; key-value observing is somewhat annoying. In that case, I assume you have already implemented func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) of the AVCaptureVideoDataOutputSampleBufferDelegate. That method is called every time a frame is delivered, so you can read videoDevice.temperatureAndTintValues for each frame of the live camera feed, as sketched below.
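For example, a minimal sketch of the per-frame approach (assuming videoDevice is the device configured above; in practice you would probably throttle the printing rather than log every frame):

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // Convert the device's current gains to temperature/tint on every frame.
    let kelvin = videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains)
    print("Kelvin temp \(kelvin.temperature), tint \(kelvin.tint)")
}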
For key-value observing, you first set up the observer (e.g. in viewDidAppear):
func addObserver() {
    self.addObserver(self, forKeyPath: "videoDevice.deviceWhiteBalanceGains", options: .new, context: &DeviceWhiteBalanceGainsContext)
}
Keep a reference to the videoDevice, declaring it this way:
@objc dynamic var videoDevice: AVCaptureDevice!
The @objc and dynamic attributes are needed for key-value observing.
Now you can implement this function, which will be called every time the observed value changes:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let context = context else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: nil)
        return
    }
    if context == &DeviceWhiteBalanceGainsContext {
        // do your work on WB here
    }
}
Finally, you can define the context this way (I have it outside my ViewController):
private var DeviceWhiteBalanceGainsContext = 0
I have implemented both methods in my apps and they both work well.
WARNING: sometimes the WB values will be outside the allowed range (especially at startup) and the API raises an exception. Make sure to handle this; otherwise, the app will crash.
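One way to guard against that (a sketch of my own, not from the original answer): clamp the gains into the documented valid range [1.0, maxWhiteBalanceGain] before converting.

func normalizedGains(for device: AVCaptureDevice) -> AVCaptureDevice.WhiteBalanceGains {
    // Clamp each channel so temperatureAndTintValues(for:) is never handed
    // out-of-range gains, which would raise an Objective-C exception.
    var gains = device.deviceWhiteBalanceGains
    gains.redGain = min(max(gains.redGain, 1.0), device.maxWhiteBalanceGain)
    gains.greenGain = min(max(gains.greenGain, 1.0), device.maxWhiteBalanceGain)
    gains.blueGain = min(max(gains.blueGain, 1.0), device.maxWhiteBalanceGain)
    return gains
}

You would then call videoDevice.temperatureAndTintValues(for: normalizedGains(for: videoDevice)) instead of passing the raw gains.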
I've posted a more in-depth question to try and get to the bottom of the issue, but in brief:
I'm attempting to show a PHP/JS-based web application (Laravel) through a WKWebView. However, due to the nature of the script's redirecting properties, the only code I've gotten to actually detect the URL change is with #keyPath(WKWebView.url):
override func viewDidLoad() {
    super.viewDidLoad()
    webView.navigationDelegate = self
    webView.uiDelegate = self
    webView.addObserver(self, forKeyPath: #keyPath(WKWebView.url), options: .new, context: nil)
}
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == #keyPath(WKWebView.url) {
        print("URL Change:", self.webView.url?.absoluteString ?? "# No value provided")
    }
}
However, the output to console is always the same:
URL Change: # No value provided
So I know that the KVO for WKWebView.url does fire upon script-based redirection within the web view. In fact, if you take a look at my other question, it is the only code that can detect this sort of redirection. That is strange, because when launched in Safari (both iOS and macOS) the URL bar reflects those redirected changes to the URL's value, yet in the WKWebView none of the WKNavigationDelegate functions detect such a change.
Is there any way to obtain the URL directly from the keyPath value of WKWebView.url when fired? Are there any alternatives, not described in my previously-mentioned question, that could obtain the URL?
Trying to obtain the URL value from webView.url seems to always return nil.
EDIT: I am able to get the exact URL value with this observeValue function code:
if let key = change?[NSKeyValueChangeKey.newKey] {
    print("URL: \(key)") // url value
}
However, I am unable to cast it to a String or otherwise pass it to another function. Is there any way to assign this key to a variable if it .contains("https://")?
I was able to assign the KVO WKWebView.url to a String variable. From there, I was able to pass the String value to a function that then handles each output I'm looking for:
var cURL = ""

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if let key = change?[NSKeyValueChangeKey.newKey] {
        cURL = "\(key)" // Assign key-value to String
        print("cURL:", cURL) // Print key-value
        cURLChange(url: cURL) // Pass key-value to function
    }
}

func cURLChange(url: String) {
    if url.contains("/projects/new") {
        print("User launched new project view")
        // Do something
    } else {
        // Do something else
    }
}
A similar solution, using a more modern method (with less hassle), was provided here.
var cURL = ""
var webView: WKWebView!
var webViewURLObserver: NSKeyValueObservation?

override func viewDidLoad() {
    super.viewDidLoad()
    // Assign the changed value to a variable and print it.
    // Note: keep a single observation per property; assigning a second one
    // to the same variable would release (and thereby cancel) the first.
    webViewURLObserver = webView.observe(\.url, options: .new) { webView, change in
        self.cURL = "\(String(describing: change.newValue))"
        print("URL: \(self.cURL)")
    }
}
By using NSKeyValueObservation of an object, you don't need to remove observers or check observed values by keyPath. You can simply set it to observe an object (i.e. the WKWebView) and run code when a change is observed.
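One caveat worth noting (my addition, not part of the original answer): the observation only lives as long as the NSKeyValueObservation token, so keep a strong reference to it; you can also stop it explicitly if needed:

deinit {
    // Optional: the observation also ends automatically when
    // webViewURLObserver itself is deallocated.
    webViewURLObserver?.invalidate()
}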
I'm building a camera app, and I'm trying to expose the current exposure duration to the user. Since this value constantly changes until manually set, I need to use KVO to stream the values to the user. I've successfully done this with the ISO, and I can observe changes to exposureDuration, but I cannot coerce the new value into a CMTime object (which is what exposureDuration is). Below is the code I'm using to try to accomplish this:
override init() {
    super.init()
    captureDevice = self.selectCamera()
    captureDevice?.addObserver(self, forKeyPath: "ISO", options: .New, context: &isoContext)
    captureDevice?.addObserver(self, forKeyPath: "exposureDuration", options: .New, context: &shutterContext)
}

deinit {
    captureDevice?.removeObserver(self, forKeyPath: "ISO")
    captureDevice?.removeObserver(self, forKeyPath: "exposureDuration")
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    let newValue = change?[NSKeyValueChangeNewKey]
    if context == &isoContext {
        store.iso.value = newValue as! Float
    } else if context == &shutterContext {
        // The app crashes at this line.
        // Thread 1: EXC_BREAKPOINT (code=1, subcode=0x100091670)
        // newValue is "AnyObject" in the debug area
        store.shutterSpeed.value = newValue as! CMTime
    }
}
Am I doing something wrong, or is this a legitimate bug that I need to file with Apple?
exposureDuration's newValue is not a CMTime but an NSValue.
Here is the fixed code (Swift 3):
store.shutterSpeed.value = (newValue as! NSValue).timeValue
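If you prefer to avoid the force casts, a slightly safer variant of the same fix (a sketch using the same newValue from the change dictionary):

if let value = newValue as? NSValue {
    store.shutterSpeed.value = value.timeValue
}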
AVFoundation brightness/exposure is different from the native camera. Is it possible to replicate the native camera settings with the AVFoundation framework?
Auto-exposure in Swift 4.2
// MARK: Exposure Methods
@objc
func tapToExpose(recognizer: UIGestureRecognizer) {
    if activeInput.device.isExposurePointOfInterestSupported {
        // The tap location is converted from the gesture recognizer's
        // coordinates to the preview's, and then to the camera's coordinate space.
        let point = recognizer.location(in: camPreview)
        let pointOfInterest = previewLayer.captureDevicePointConverted(fromLayerPoint: point)
        showMarkerAtPoint(point: point, marker: exposureMarker)
        exposeAtPoint(pointOfInterest)
    }
}

func exposeAtPoint(_ point: CGPoint) {
    let device = activeInput.device
    // Note: the original checked isFocusModeSupported here, but since this
    // method configures exposure, the exposure-mode checks are what's intended.
    if device.isExposurePointOfInterestSupported, device.isExposureModeSupported(.continuousAutoExposure) {
        do {
            try device.lockForConfiguration()
            device.exposurePointOfInterest = point
            device.exposureMode = .continuousAutoExposure
            if device.isExposureModeSupported(.locked) {
                // Observe the device so observeValue(forKeyPath:) is notified
                // when it finishes adjusting exposure.
                // kExposure is presumably the key path string "adjustingExposure".
                device.addObserver(self, forKeyPath: kExposure, options: .new, context: &adjustingExposureContext)
            }
            device.unlockForConfiguration()
        } catch {
            print("Error Exposing on POI: \(String(describing: error.localizedDescription))")
        }
    }
}
// MARK: KVO
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    // First, check that the context matches the adjusting-exposure context;
    // otherwise, pass the observation on to super.
    if context == &adjustingExposureContext {
        let device = object as! AVCaptureDevice
        // Determine whether the camera has stopped adjusting exposure,
        // and whether the locked mode is supported.
        if !device.isAdjustingExposure, device.isExposureModeSupported(.locked) {
            // Remove self as observer to stop subsequent notifications.
            device.removeObserver(self, forKeyPath: kExposure, context: &adjustingExposureContext)
            DispatchQueue.main.async {
                do {
                    try device.lockForConfiguration()
                    device.exposureMode = .locked
                    device.unlockForConfiguration()
                } catch {
                    print("Error exposing on POI: \(String(describing: error.localizedDescription))")
                }
            }
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
If you're looking for auto-exposure, set a key-value observer for "adjustingExposure" and set the exposure mode to AVCaptureExposureModeContinuousAutoExposure. If you're trying to implement manual exposure, have a look at this WWDC session on camera controls.
When using an AVPlayer, is there a way to get the progress of playbackLikelyToKeepUp? I was thinking I could look at loadedTimeRanges to see how much has been buffered so far, but from what I understand, the playbackLikelyToKeepUp property is some internally made prediction and does not provide a value of how much data is needed for it to be true.
To put this into perspective, what I'm trying to do is to have a progress view that reaches 100% just as the video starts playing.
To begin playing when the AVPlayer will play continuously, you can observe key-value changes on playbackLikelyToKeepUp, like this:
let PlayerKeepUp = "playbackLikelyToKeepUp"
var isPlayerReady: Bool = false
and then in your initializer you add:
// Adding the observer for status. PlayerItemObserverContext is a private
// variable used as the KVO context, e.g.: private var PlayerItemObserverContext = 0
self.player?.currentItem?.addObserver(self, forKeyPath: PlayerKeepUp, options: ([NSKeyValueObservingOptions.New, NSKeyValueObservingOptions.Old]), context: &PlayerItemObserverContext)
And finally, to track when the player is ready to play without getting stuck:
// MARK: KVO Observing Methods
override public func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    // Match both the key path and the context (a pointer can't be matched
    // in a switch case, so a plain if/else is used here).
    if keyPath == PlayerKeepUp && context == &PlayerItemObserverContext {
        if self.player.currentItem?.playbackLikelyToKeepUp == true {
            self.isPlayerReady = true
            // HERE YOU FILL UP YOUR PROGRESS VIEW :-)
            self.delegate?.playerReady(self.playerURL! as String)
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Last but not least, remember to remove your KVO observer, or else you'll crash when deallocating the player:
deinit {
    // Remove observer:
    self.player?.currentItem?.removeObserver(self, forKeyPath: PlayerKeepUp, context: &PlayerItemObserverContext)
}
Hope this helps :-)
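As a side note, the same observation can be written with the newer closure-based KVO API (a sketch in current Swift; player, the progress update, and the delegate call are assumptions carried over from the answer above):

var keepUpObserver: NSKeyValueObservation?

func observeKeepUp() {
    keepUpObserver = player?.currentItem?.observe(\.isPlaybackLikelyToKeepUp, options: .new) { item, _ in
        if item.isPlaybackLikelyToKeepUp {
            // Fill up the progress view / notify the delegate here.
        }
    }
    // No removeObserver call is needed; the observation ends when the token is released.
}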
I am working on a video application. I want to discard video frames while the camera is autofocusing. During autofocus the captured image becomes blurred, and image processing on that frame suffers, but once autofocus is done image processing works well again. Can anybody give me a solution?
adjustingFocus property.
Indicates whether the device is currently adjusting its focus setting. (read-only)
Note: You can observe changes to the value of this property using key-value observing.
iOS 4.0 and later
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instp/AVCaptureDevice/adjustingFocus
The following is sample code in Swift 3.x.
First, an observer should be added to the selected capture device during camera initialization.
captureDevice.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
Then the observeValue method is overridden. By inspecting the optional value in the change dictionary, autofocusing frames can be identified.
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let key = keyPath, let changes = change else {
        return
    }
    if key == "adjustingFocus" {
        // Safely unwrap the new value instead of force-casting.
        if changes[.newKey] as? Bool == true {
            // camera is autofocusing
        } else {
            // camera is not autofocusing
        }
    }
}
Example in Swift 4+
class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    // The key path "captureDevice.adjustingFocus" requires this property
    // to be exposed to Objective-C:
    @objc dynamic var captureDevice: AVCaptureDevice?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.addObservers()
    }

    func addObservers() {
        self.addObserver(self, forKeyPath: "captureDevice.adjustingFocus", options: .new, context: nil)
    }

    func removeObservers() {
        self.removeObserver(self, forKeyPath: "captureDevice.adjustingFocus")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "captureDevice.adjustingFocus" {
            print("============== adjustingFocus: \(String(describing: self.captureDevice?.lensPosition))")
        }
    }
} // End of class
Observing adjustingFocus was not working for me. It's always NO. And I found this:
Note that when traditional contrast detect auto-focus is in use, the AVCaptureDevice adjustingFocus property flips to YES when a focus is underway, and flips back to NO when it is done. When phase detect autofocus is in use, the adjustingFocus property does not flip to YES, as the phase detect method tends to focus more frequently, but in small, sometimes imperceptible amounts. You can observe the AVCaptureDevice lensPosition property to see lens movements that are driven by phase detect AF.
from Apple
I have not tried it yet; I will try and update later.
Edit: I have tried it, and can confirm this is right.
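Based on that note, here is a sketch of observing lensPosition instead (an assumption on my part: captureDevice is your configured AVCaptureDevice):

// At camera setup:
captureDevice.addObserver(self, forKeyPath: "lensPosition", options: [.new], context: nil)

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "lensPosition", let position = change?[.newKey] as? Float {
        // The lens is moving; frames arriving around these changes may be blurred.
        print("lensPosition: \(position)")
    }
}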