I am working on a video application, and I want to discard video frames while the camera is autofocusing. During autofocus the captured image becomes blurred and image processing on that frame gives bad results, but once autofocus is done, the image processing is excellent. Can anybody give me a solution?
Use the adjustingFocus property:
Indicates whether the device is currently adjusting its focus setting. (read-only)
Note: You can observe changes to the value of this property using key-value observing.
Available in iOS 4.0 and later.
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instp/AVCaptureDevice/adjustingFocus
The following is sample code in Swift 3.x.
First, an observer should be added to the selected capture device during camera initialization:
captureDevice.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
Then the observeValue method is overridden. By checking the optional value delivered in the change dictionary, autofocusing frames can be identified:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    guard keyPath == "adjustingFocus",
          let isAdjusting = change?[.newKey] as? Bool else {
        return
    }
    if isAdjusting {
        // camera is autofocusing
    } else {
        // camera is not autofocusing
    }
}
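To actually discard the blurred frames, one approach (a sketch, assuming frames arrive through an AVCaptureVideoDataOutputSampleBufferDelegate, and where isAdjustingFocus is a hypothetical flag you set and clear inside the KVO callback above) is to skip processing while the flag is set:

```swift
// Hypothetical flag, updated from the observeValue callback above.
var isAdjustingFocus = false

// AVCaptureVideoDataOutputSampleBufferDelegate
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Skip frames captured while the camera is refocusing (likely blurred).
    guard !isAdjustingFocus else { return }
    // ... run image processing on sampleBuffer ...
}
```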
Example in Swift 4+:
class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    // Must be @objc dynamic so the key path "captureDevice.adjustingFocus" is KVO-compliant
    @objc dynamic var captureDevice: AVCaptureDevice?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.addObservers()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        self.removeObservers()
    }

    func addObservers() {
        self.addObserver(self, forKeyPath: "captureDevice.adjustingFocus", options: .new, context: nil)
    }

    func removeObservers() {
        self.removeObserver(self, forKeyPath: "captureDevice.adjustingFocus")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "captureDevice.adjustingFocus" {
            print("adjustingFocus changed; lensPosition: \(String(describing: self.captureDevice?.lensPosition))")
        }
    }
} // End of class
Observing adjustingFocus is not working for me; it's always NO. And I found this:
Note that when traditional contrast detect auto-focus is in use, the AVCaptureDevice adjustingFocus property flips to YES when a focus is underway, and flips back to NO when it is done. When phase detect autofocus is in use, the adjustingFocus property does not flip to YES, as the phase detect method tends to focus more frequently, but in small, sometimes imperceptible amounts. You can observe the AVCaptureDevice lensPosition property to see lens movements that are driven by phase detect AF.
from Apple
I have not tried it yet; I will try and update later.
Edit: I have tried it, and I confirm this is right.
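A minimal sketch of observing lensPosition instead, using block-based KVO (Swift 4+), assuming captureDevice is the active AVCaptureDevice:

```swift
import AVFoundation

// Keep the observation token alive for as long as you want updates.
var lensPositionObservation: NSKeyValueObservation?

func startObservingLensPosition(on captureDevice: AVCaptureDevice) {
    lensPositionObservation = captureDevice.observe(\.lensPosition, options: [.new]) { _, change in
        if let position = change.newValue {
            // 0.0 is the nearest focus, 1.0 the farthest; movement implies the lens is refocusing.
            print("lensPosition: \(position)")
        }
    }
}
```

With phase-detect AF the movements can be small and frequent, so you may want to treat rapid lensPosition changes, rather than any change, as "focusing in progress".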
Screenshot prevention is not possible, as I understand it, but we can do the same thing Snapchat does: we can detect it.
My application consists of more than 10 view controllers, so adding an observer on every page is a bit tedious. I want a solution where I can place it in the AppDelegate/SceneDelegate (or somewhere similar) so that I am notified on whichever controller the screenshot is captured. The placement is the main requirement here.
Something like Reachability, which works in a similar way for network detection.
Here is the code:
func detectScreenShot(action: @escaping () -> ()) {
    // Note: KVO on "captured" reports screen recording/mirroring, not screenshots
    UIScreen.main.addObserver(self, forKeyPath: "captured", options: .new, context: nil)

    // This notification fires after a screenshot is taken
    let mainQueue = OperationQueue.main
    NotificationCenter.default.addObserver(forName: UIApplication.userDidTakeScreenshotNotification, object: nil, queue: mainQueue) { notification in
        print(notification)
        action()
    }
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "captured" {
        let isCaptured = UIScreen.main.isCaptured
        print(isCaptured)
    }
}
I think you can implement this by making a BaseViewController, and having all your other view controllers inherit from it. That way you only observe the screenshot detection in BaseViewController, and you don't have to write the code in every view controller.
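Alternatively, since the notification is app-wide, a single observer registered in the AppDelegate also works; here is a sketch, assuming you want to be notified regardless of which view controller is visible:

```swift
import UIKit

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    // Keep the token so the observer can be removed later if needed.
    private var screenshotObserver: NSObjectProtocol?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // One app-wide observer: fires no matter which view controller is on screen.
        screenshotObserver = NotificationCenter.default.addObserver(
            forName: UIApplication.userDidTakeScreenshotNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("Screenshot taken")
        }
        return true
    }
}
```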
I am following Apple's latest sample code AVCam Swift, which is updated to use AVCapturePhotoOutput.
var isFlashScene: Bool { get }
A Boolean value indicating whether the scene currently being previewed
by the camera warrants use of the flash. This property’s value changes
depending on the scene currently visible to the camera. For example,
you might use this property to highlight the flash control in your
app’s camera UI, indicating to the user that the scene is dark enough
that enabling the flash might be desirable. If the photo capture
output’s supportedFlashModes value is off, this property’s value is
always false. This property supports Key-value observing.
I am trying to key-value observe this so that when auto flash mode indicates the flash will fire for this scene (just like the stock iOS Camera app), I can change the UI, as the documentation notes.
So I set it up like this:
private let photoOutput = AVCapturePhotoOutput()
private var FlashSceneContext = 0
self.addObserver(self, forKeyPath: "photoOutput.isFlashScene", options: .new, context: &FlashSceneContext)
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if context == &FlashSceneContext {
        print("Flash Scene Changed")
    }
}
The above never shows a change, even if I put in a log to check:
print(self.photoOutput.isFlashScene)
This comes out as false all the time throughout the app.
I also tried:
self.photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
... still no change in isFlashScene; it's stuck on false.
self.photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
The above was the proper way to set up the KVO. In addition, photoSettingsForSceneMonitoring has to be implemented:
let photoSettings = AVCapturePhotoSettings()
photoSettings.flashMode = .auto
photoSettings.isAutoStillImageStabilizationEnabled = true
self.photoOutput.photoSettingsForSceneMonitoring = photoSettings
Works!
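Putting both halves together, a sketch of the full setup, assuming photoOutput is already attached to a running AVCaptureSession:

```swift
import AVFoundation

private let photoOutput = AVCapturePhotoOutput()
private var FlashSceneContext = 0

func startFlashSceneMonitoring() {
    // Scene monitoring only runs once settings with flashMode == .auto are supplied.
    let settings = AVCapturePhotoSettings()
    settings.flashMode = .auto
    photoOutput.photoSettingsForSceneMonitoring = settings

    // Observe the output directly, not a key path through self.
    photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    guard context == &FlashSceneContext else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        return
    }
    // Update the UI here, e.g. highlight the flash control.
    print("isFlashScene changed: \(photoOutput.isFlashScene)")
}
```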
I am trying to track changes to the map orientation using the camera heading property.
// Register for notifications of changes to the camera
if let camera = self.mapView?.camera {
    self.camera = camera
    camera.addObserver(self, forKeyPath: "heading", options: NSKeyValueObservingOptions.new, context: &MapViewController.myContext)
}
...
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if context == &MapViewController.myContext {
        if keyPath == "heading" {
            if let heading = change?[NSKeyValueChangeKey.newKey] as? CLLocationDirection {
                self.heading = heading
            }
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
The camera property's memory semantics are copy, so you are not observing the map view's current camera instance, just one that was copied when you called the getter. You need to observe the key path "camera.heading" or "camera" on the map view itself, and hope that a new camera object is set internally when the heading changes.
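A minimal sketch of that suggestion, reusing the same myContext token but observing on the map view rather than on a camera copy (whether MapKit emits KVO for this internal change is not guaranteed):

```swift
// Observe on the map view itself so heading changes can reach the observer.
mapView.addObserver(self, forKeyPath: "camera.heading",
                    options: .new, context: &MapViewController.myContext)

override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    if context == &MapViewController.myContext, keyPath == "camera.heading",
       let heading = change?[.newKey] as? CLLocationDirection {
        self.heading = heading
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
```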
I am using UIImagePickerController to present the camera, and initially I set the flash mode to auto:
videoCapturer.sourceType = UIImagePickerControllerSourceType.Camera
videoCapturer.mediaTypes = [kUTTypeMovie as String]
videoCapturer.cameraFlashMode = UIImagePickerControllerCameraFlashMode.Auto
self.presentViewController(videoCapturer, animated: true, completion: nil)
I want to get notified when the flash is set to on or off according to the lighting.
Just use KVO.
let capture = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
capture.addObserver(self, forKeyPath: "torchActive", options: NSKeyValueObservingOptions.New.union(.Initial), context: nil)
And implement this method:
public override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "torchActive" {
        // do something when torchActive changed
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Here is Apple's description of torchActive:
@property torchActive
@abstract
    Indicates whether the receiver's torch is currently active.
@discussion
    The value of this property is a BOOL indicating whether the receiver's torch is
    currently active. If the current torchMode is AVCaptureTorchModeAuto and isTorchActive
    is YES, the torch will illuminate once a recording starts (see AVCaptureOutput.h
    -startRecordingToOutputFileURL:recordingDelegate:). This property is key-value observable.
When using an AVPlayer, is there a way to get the progress of playbackLikelyToKeepUp? I was thinking I could look at loadedTimeRanges to see how much has been buffered so far, but from what I understand, the playbackLikelyToKeepUp property is some internally made prediction and does not provide a value of how much data is needed for it to be true.
To put this into perspective, what I'm trying to do is to have a progress view that reaches 100% just as the video starts playing.
To begin playing when the AVPlayer will play continuously, you can observe key-value changes on playbackLikelyToKeepUp, like this:
let PlayerKeepUp = "playbackLikelyToKeepUp"
var isPlayerReady:Bool = false
and then in your initializer you add:
// adding the Observers for Status:
self.player?.currentItem?.addObserver(self, forKeyPath: PlayerKeepUp, options: ([NSKeyValueObservingOptions.New, NSKeyValueObservingOptions.Old]), context: &PlayerItemObserverContext)
And finally, to track when the player is ready to play without getting stuck:
// MARK: KVO Observing Methods:
override public func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == PlayerKeepUp && context == &PlayerItemObserverContext {
        if self.player?.currentItem?.playbackLikelyToKeepUp == true {
            self.isPlayerReady = true
            // HERE YOU FILL UP YOUR PROGRESS VIEW :-)
            self.delegate?.playerReady(self.playerURL! as String)
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Last but not least, remember to remove your KVO observer, or else you'll crash when deallocating the player:
deinit {
    // Remove observer:
    self.player?.currentItem?.removeObserver(self, forKeyPath: PlayerKeepUp, context: &PlayerItemObserverContext)
}
Hope this helps :-)
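As for the progress value itself: since playbackLikelyToKeepUp is an internal heuristic, an exact percentage is not exposed, but a rough progress number can be approximated from loadedTimeRanges. A sketch in current Swift syntax, where targetSeconds is an assumed buffer-ahead goal, not an Apple-defined threshold:

```swift
import AVFoundation

// Approximate buffering progress in [0, 1] from how far ahead of the
// playhead the first loaded time range extends.
func bufferedProgress(for item: AVPlayerItem, targetSeconds: Double = 5.0) -> Double {
    guard let range = item.loadedTimeRanges.first?.timeRangeValue else { return 0 }
    let bufferedEnd = CMTimeGetSeconds(CMTimeRangeGetEnd(range))
    let ahead = bufferedEnd - CMTimeGetSeconds(item.currentTime())
    return min(max(ahead / targetSeconds, 0), 1)
}
```

You would update the progress view from this value on a timer or on loadedTimeRanges KVO, and snap it to 100% when playbackLikelyToKeepUp flips to true.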