AVCapturePhotoOutput isFlashScene key-value observing - iOS

I am following Apple's latest sample code, AVCam for Swift, which is updated to use AVCapturePhotoOutput.
var isFlashScene: Bool { get }
A Boolean value indicating whether the scene currently being previewed
by the camera warrants use of the flash. This property’s value changes
depending on the scene currently visible to the camera. For example,
you might use this property to highlight the flash control in your
app’s camera UI, indicating to the user that the scene is dark enough
that enabling the flash might be desirable. If the photo capture
output’s supportedFlashModes value is off, this property’s value is
always false. This property supports Key-value observing.
I am trying to key-value observe this so that when auto flash mode indicates the flash will fire for the current scene (just like the stock iOS Camera app), I can change the UI, as the documentation suggests.
So I set it up like this:
private let photoOutput = AVCapturePhotoOutput()
private var FlashSceneContext = 0

self.addObserver(self, forKeyPath: "photoOutput.isFlashScene", options: .new, context: &FlashSceneContext)

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if context == &FlashSceneContext {
        print("Flash Scene Changed")
    }
}
The above never shows a change. Even if I log the value directly:
print(self.photoOutput.isFlashScene)
it comes out as false all the time, throughout the app.
I also tried:
self.photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
... still no change in isFlashScene; it's stuck at false.

self.photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
The above was the proper way to set up the KVO.
In addition, photoSettingsForSceneMonitoring has to be set:
let photoSettings = AVCapturePhotoSettings()
photoSettings.flashMode = .auto
photoSettings.isAutoStillImageStabilizationEnabled = true
self.photoOutput.photoSettingsForSceneMonitoring = photoSettings
Works!
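For reference, here is a minimal sketch of the full setup in newer Swift (4+), using block-based KVO instead of the string key path. It is untested, and the CameraController/flashSceneObservation names are only illustrative:
import AVFoundation

final class CameraController: NSObject {
    private let photoOutput = AVCapturePhotoOutput()
    private var flashSceneObservation: NSKeyValueObservation? // illustrative name, not from the answer

    func startFlashSceneMonitoring() {
        // Scene monitoring only reports flash scenes for settings whose flashMode is .auto or .on.
        let monitoringSettings = AVCapturePhotoSettings()
        monitoringSettings.flashMode = .auto
        photoOutput.photoSettingsForSceneMonitoring = monitoringSettings

        // Block-based KVO avoids the string key path and context pointer entirely.
        flashSceneObservation = photoOutput.observe(\.isFlashScene, options: [.new]) { _, change in
            if let isFlashScene = change.newValue {
                print("Flash scene changed: \(isFlashScene)")
                // Update the flash indicator in your camera UI here.
            }
        }
    }
}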

Related

How to get notified when Flash is ON or OFF in Auto flash mode due to low light

I am using UIImagePickerController to present the camera and initially set the flash mode to auto.
videoCapturer.sourceType = UIImagePickerControllerSourceType.Camera
videoCapturer.mediaTypes = [kUTTypeMovie as String]
videoCapturer.cameraFlashMode = UIImagePickerControllerCameraFlashMode.Auto
self.presentViewController(videoCapturer, animated: true, completion: nil)
I want to get notified when the flash is set to on or off according to the lighting.
Just use KVO.
let capture = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
capture.addObserver(self, forKeyPath: "torchActive", options: NSKeyValueObservingOptions.New.union(.Initial), context: nil)
And implement this method:
public override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "torchActive" {
        // do something when torchActive changed
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Here is Apple's description of torchActive:
@property torchActive
@abstract
Indicates whether the receiver's torch is currently active.
@discussion
The value of this property is a BOOL indicating whether the receiver's torch is
currently active. If the current torchMode is AVCaptureTorchModeAuto and isTorchActive
is YES, the torch will illuminate once a recording starts (see AVCaptureOutput.h
-startRecordingToOutputFileURL:recordingDelegate:). This property is key-value observable.
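In newer Swift the same observation can be written with block-based KVO. Here is a rough, untested sketch; the TorchMonitor/torchObservation names are only illustrative:
import AVFoundation

final class TorchMonitor {
    private var torchObservation: NSKeyValueObservation? // illustrative name

    func startMonitoring() {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        // isTorchActive flips when the automatic torch decides to illuminate the scene.
        torchObservation = device.observe(\.isTorchActive, options: [.initial, .new]) { _, change in
            print("Torch active: \(change.newValue ?? false)")
        }
    }
}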

Why does AVPlayerItem's canPlayFastForward return false?

I really want to implement fast forward and reverse play with AVFoundation.
As far as I know, I can only play at rates between 0.0 and 2.0 with AVPlayer if AVPlayerItem's canPlayReverse and canPlayFastForward return false.
But I need -1.0, and also rates over 2.0.
My problem is that I just can't find when and why the result is false.
There is no mention in Apple's docs of when canPlayFastForward returns false.
Can anyone explain when and why canPlayFastForward and canPlayReverse are false, and how I can change them to true?
The likely cause is that you are checking the AVPlayerItem's canPlayReverse or canPlayFastForward before the AVPlayerItem's status property changes to .readyToPlay. If you are doing so, you will always get false.
Don't do it like this:
import AVFoundation

let anAsset = AVAsset(url: <#A URL#>)
let playerItem = AVPlayerItem(asset: anAsset)
let canPlayFastForward = playerItem.canPlayFastForward
if canPlayFastForward {
    print("This line won't execute")
}
Instead, observe the AVPlayerItem's status property. The following is the documentation from Apple:
AVPlayerItem objects are dynamic. The value of
AVPlayerItem.canPlayFastForward will change to YES for all file-based
assets and some streaming based assets (if the source playlist offers
media that allows it) at the time the item becomes ready to play. The
way to get notified when the player item is ready to play is by
observing the AVPlayerItem.status property via Key-Value Observing
(KVO).
import AVFoundation

var songItem: AVPlayerItem! // make it an instance variable

let anAsset = AVAsset(url: <#A URL#>)
songItem = AVPlayerItem(asset: anAsset)
songItem.addObserver(self, forKeyPath: "status", options: .new, context: nil)
Override the observeValue method in the same class:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if let status = change?[.newKey] as? Int {
        if status == AVPlayerItemStatus.readyToPlay.rawValue {
            yourPlayer.rate = 2.0 // or whatever you want
        }
    }
}
Don't forget to remove this class as an observer of songItem's status:
deinit {
    songItem.removeObserver(self, forKeyPath: "status")
}

Progress of playbackLikelyToKeepUp

When using an AVPlayer, is there a way to get the progress of playbackLikelyToKeepUp? I was thinking I could look at loadedTimeRanges to see how much has been buffered so far, but from what I understand, the playbackLikelyToKeepUp property is some internally made prediction and does not provide a value of how much data is needed for it to be true.
To put this into perspective, what I'm trying to do is to have a progress view that reaches 100% just as the video starts playing.
To begin playing only when the AVPlayer is likely to play through without stalling, you can key-value observe playbackLikelyToKeepUp, like this:
let PlayerKeepUp = "playbackLikelyToKeepUp"
var isPlayerReady: Bool = false
private var PlayerItemObserverContext = 0
and then in your initializer you add:
// adding the observers for status:
self.player?.currentItem?.addObserver(self, forKeyPath: PlayerKeepUp, options: ([NSKeyValueObservingOptions.New, NSKeyValueObservingOptions.Old]), context: &PlayerItemObserverContext)
And finally, to track when the player is ready to play without getting stuck:
// MARK: KVO observing methods
override public func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == PlayerKeepUp && context == &PlayerItemObserverContext {
        if self.player.currentItem?.playbackLikelyToKeepUp == true {
            self.isPlayerReady = true
            // HERE YOU FILL UP YOUR PROGRESS VIEW :-)
            self.delegate?.playerReady(self.playerURL! as String)
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Last but not least, remember to remove your KVO observer or else you'll crash when deallocating the player:
deinit {
    // Remove observer:
    self.player?.currentItem?.removeObserver(self, forKeyPath: PlayerKeepUp, context: &PlayerItemObserverContext)
}
Hope this helps :-)
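In newer Swift the same observation can be written with the block-based KVO API. A minimal, untested sketch (BufferMonitor is just an illustrative name, and the progress-view update is left as a comment):
import AVFoundation

final class BufferMonitor {
    private var keepUpObservation: NSKeyValueObservation? // illustrative name

    func monitor(_ item: AVPlayerItem) {
        keepUpObservation = item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { item, _ in
            if item.isPlaybackLikelyToKeepUp {
                // The player predicts it can play through without stalling:
                // jump the progress view to 100% and start playback here.
                print("Ready to keep up")
            }
        }
    }
}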

How to check status of AVPlayer?

I thought that I could check the status of an AVPlayer simply by the property "rate".
This is how I create a player instance:
player = AVPlayer(URL: stream) // streaming from the internet
player!.play()
At some later point I would do something like this
println(player!.rate)
This is what I discovered:
In Simulator I get "0.0" in case the player is not running or "1.0" if it is running.
If I start the player but interrupt the internet connection it changes values from 1 to 0.
However, on my iPhone the property keeps value 1 even if I enter Airplane Mode?!
Do you have any idea why that happens and how I could check the stream condition otherwise?
I have tried an observer so far:
player!.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.New, context: nil)
But even the method "observeValueForKeyPath" does not get fired in my iPhone test case.
Check out the Apple docs here
and scroll to the "Key-Value Observing" section. Especially #3 in that section.
It helped me get my implementation to work. My resulting code looks like this:
//Global
var player = AVPlayer()

func setUpPlayer() {
    //...
    // Setting up your player code goes here
    self.player.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions(), context: nil)
    //...
}

// catch changes to status
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    print("observed")
}
I could not make it work by adding an observer on the currentItem as user #gabbler suggested.
However, it helped to use the notification center like this:
NSNotificationCenter.defaultCenter().addObserverForName(
    AVPlayerItemFailedToPlayToEndTimeNotification,
    object: nil,
    queue: nil,
    usingBlock: { notification in
        self.stop()
    })
Note that stop() is a method in the same class which stops the stream as if a stop button were clicked.
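For newer Swift, roughly the same setup with block-based KVO and the modern NotificationCenter API might look like this. It is an untested sketch; StreamPlayer is just an illustrative name, and pause() stands in for the answer's stop():
import AVFoundation

final class StreamPlayer: NSObject {
    private let player: AVPlayer
    private var statusObservation: NSKeyValueObservation? // illustrative name

    init(url: URL) {
        player = AVPlayer(url: url)
        super.init()

        // Observe the player's status without string key paths.
        statusObservation = player.observe(\.status, options: [.new]) { player, _ in
            print("Player status changed: \(player.status.rawValue)")
        }

        // React when the current item fails to play to its end (e.g. the connection drops).
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(itemFailedToPlayToEnd(_:)),
            name: .AVPlayerItemFailedToPlayToEndTime,
            object: nil
        )
        player.play()
    }

    @objc private func itemFailedToPlayToEnd(_ notification: Notification) {
        player.pause() // stand-in for the answer's stop()
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }
}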

Detect when camera is auto-focusing

I am working on a video application. I want to discard the video frames captured while the camera is autofocusing. During autofocus the captured image becomes blurred and image processing for that frame suffers, but once autofocus is done the image processing is excellent. Can anybody give me a solution?
Use the adjustingFocus property.
Indicates whether the device is currently adjusting its focus setting. (read-only)
Note: You can observe changes to the value of this property using key-value observing.
iOS 4.0 and later
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instp/AVCaptureDevice/adjustingFocus
The following is sample code in Swift 3.x.
First, an observer should be added to the selected capture device at camera initialization.
captureDevice.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
Then the observeValue method is overridden. By inspecting the change dictionary it receives, autofocusing frames can be identified.
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let key = keyPath, let changes = change else {
        return
    }
    if key == "adjustingFocus" {
        if let isAdjusting = changes[.newKey] as? Bool, isAdjusting {
            // camera is autofocusing
        } else {
            // camera is not autofocusing
        }
    }
}
Example in Swift 4+:
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    @objc dynamic var captureDevice: AVCaptureDevice?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.addObservers()
    }

    func addObservers() {
        self.addObserver(self, forKeyPath: "captureDevice.adjustingFocus", options: .new, context: nil)
    }

    func removeObservers() {
        self.removeObserver(self, forKeyPath: "captureDevice.adjustingFocus")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "captureDevice.adjustingFocus" {
            print("============== adjustingFocus: \(String(describing: self.captureDevice?.lensPosition))")
        }
    }
} // End of class
Observing adjustingFocus is not working for me; it's always false. And I found this:
Note that when traditional contrast detect auto-focus is in use, the AVCaptureDevice adjustingFocus property flips to YES when a focus is underway, and flips back to NO when it is done. When phase detect autofocus is in use, the adjustingFocus property does not flip to YES, as the phase detect method tends to focus more frequently, but in small, sometimes imperceptible amounts. You can observe the AVCaptureDevice lensPosition property to see lens movements that are driven by phase detect AF.
from Apple
I have not tried it yet; I will try and update later.
Edit: I have tried it, and I confirm this is right.
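A minimal, untested sketch of observing lensPosition with block-based KVO (FocusMonitor is just an illustrative name; how you decide that the lens has settled is up to you):
import AVFoundation

final class FocusMonitor {
    private var lensObservation: NSKeyValueObservation? // illustrative name

    func monitor(_ device: AVCaptureDevice) {
        // lensPosition changes continuously during phase-detect autofocus,
        // even when adjustingFocus never flips to true.
        lensObservation = device.observe(\.lensPosition, options: [.new]) { _, change in
            if let position = change.newValue {
                print("Lens position: \(position)")
                // Treat frames captured while the position is still changing as potentially blurred.
            }
        }
    }
}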
