When using an AVPlayer, is there a way to get the progress of playbackLikelyToKeepUp? I was thinking I could look at loadedTimeRanges to see how much has been buffered so far, but from what I understand, the playbackLikelyToKeepUp property is some internally made prediction and does not provide a value of how much data is needed for it to be true.
To put this into perspective, what I'm trying to do is to have a progress view that reaches 100% just as the video starts playing.
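For context, this is roughly what I had in mind for deriving a progress value from loadedTimeRanges; the 5-second target below is just a guess on my part, since playbackLikelyToKeepUp does not expose its actual threshold:
import AVFoundation

// Rough estimate of buffering progress: seconds buffered ahead of the playhead,
// divided by an assumed target of 5 seconds (an arbitrary guess, not a real threshold).
func estimatedBufferProgress(item: AVPlayerItem, targetSeconds: Double = 5.0) -> Double {
    guard let range = item.loadedTimeRanges.first?.CMTimeRangeValue else { return 0.0 }
    let bufferedEnd = CMTimeGetSeconds(CMTimeRangeGetEnd(range))
    let current = CMTimeGetSeconds(item.currentTime())
    return min(max(bufferedEnd - current, 0.0) / targetSeconds, 1.0)
}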
To begin playing when AVPlayer will play continuously, you can observe the Key-Value changes on the playbackLikelyToKeepUp, like this:
let PlayerKeepUp = "playbackLikelyToKeepUp"
var isPlayerReady: Bool = false
// Private context used to identify this KVO registration:
private var PlayerItemObserverContext = 0
Then, in your initialiser, you add:
// Adding the observer for playbackLikelyToKeepUp:
self.player?.currentItem?.addObserver(self, forKeyPath: PlayerKeepUp, options: [.New, .Old], context: &PlayerItemObserverContext)
And finally, to track when the player is ready to play without getting stuck:
// MARK: KVO Observing Methods:
override public func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if context == &PlayerItemObserverContext && keyPath == PlayerKeepUp {
        if self.player?.currentItem?.playbackLikelyToKeepUp == true {
            self.isPlayerReady = true
            // HERE YOU FILL UP YOUR PROGRESS VIEW :-)
            self.delegate?.playerReady(self.playerURL! as String)
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Last but not least, remember to remove your KVO observer, or else you'll crash when the player is deallocated:
deinit {
    // Remove observer:
    self.player?.currentItem?.removeObserver(self, forKeyPath: PlayerKeepUp, context: &PlayerItemObserverContext)
}
Hope this helps :-)
I am using UIImagePickerController to present the camera, and initially I set the flash mode to Auto.
videoCapturer.sourceType = UIImagePickerControllerSourceType.Camera
videoCapturer.mediaTypes = [kUTTypeMovie as String]
videoCapturer.cameraFlashMode = UIImagePickerControllerCameraFlashMode.Auto
self.presentViewController(videoCapturer, animated: true, completion: nil)
I want to get notified when the flash is set to on or off according to the lighting.
Just use KVO.
let capture = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
capture.addObserver(self, forKeyPath: "torchActive", options: NSKeyValueObservingOptions.New.union(.Initial), context: nil)
And implement this method:
public override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "torchActive" {
        // do something when torchActive changed
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
Here is Apple's description of torchActive:
@property torchActive
@abstract
    Indicates whether the receiver's torch is currently active.
@discussion
    The value of this property is a BOOL indicating whether the receiver's torch is currently active. If the current torchMode is AVCaptureTorchModeAuto and isTorchActive is YES, the torch will illuminate once a recording starts (see AVCaptureOutput.h -startRecordingToOutputFileURL:recordingDelegate:). This property is key-value observable.
I need to be notified when a control button (on a video) is pressed. For example, if I tap on the "pause" or "full screen" button, I need to implement some logic. Can I override methods of AVPlayerViewController? I found AVPlayerViewControllerDelegate, but I can't find any methods to override.
I also tried to add an observer to the AVPlayer
player.addObserver(self, forKeyPath: "status", options:NSKeyValueObservingOptions(), context: nil)
and I used:
override func observeValueForKeyPath(keyPath: String,
ofObject object: AnyObject, change: [String : AnyObject],
context: UnsafeMutablePointer<Void>) {
...
}
but I get a notification only when the video is played: this method isn't called if I tap on a control button.
Thanks
The key path is different: in Swift, to check play/pause after a control is tapped, observe the rate key path:
player.addObserver(self, forKeyPath: "rate", options: NSKeyValueObservingOptions.New, context: nil)
and in observeValueForKeyPath, check the new value like this:
if (change!["new"] as! Int) == 1
This is true when the video starts playing and false when it is paused.
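Put together, the observer body might look something like this (a sketch in the same Swift 2 syntax as the line above; it simply reports whether the new rate indicates playing or paused):
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "rate" {
        // The new rate arrives as an NSNumber: 0.0 means paused, 1.0 means playing.
        let newRate = (change?[NSKeyValueChangeNewKey] as? NSNumber)?.floatValue ?? 0.0
        if newRate == 1.0 {
            print("playing")
        } else {
            print("paused")
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}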
AVPlayer emits an NSNotification when pause is pressed, AVPlayerDidPauseNotification (or something like that). You can register a listener with NSNotificationCenter for this notification.
I'm using the AVPlayer class to read streams.
I have to monitor playback.
Here is my question: is it possible to detect when the player is stopped by the user?
I looked at MPMoviePlayerController. If the user stopped the video, this controller sends a notification: MPMovieFinishReasonUserExited. Is there an equivalent?
You can monitor the rate property by adding an observer on the player for the key rate.
A value of 0.0 means the video is paused, while a value of 1.0 means it plays at the natural rate of the current item.
See the Apple documentation and this topic.
Hope this helps.
Here's the Swift 3 code for @Thlbaut's answer:
self.avPlayer?.addObserver(self, forKeyPath: "rate", options: NSKeyValueObservingOptions(rawValue: 0), context: nil)
then
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "rate" {
        if let playRate = self.avPlayer?.rate {
            if playRate == 0.0 {
                print("playback paused")
            } else {
                print("playback started")
            }
        }
    }
}
I thought that I could check the status of an AVPlayer simply by the property "rate".
This is how I create a player instance:
player = AVPlayer(URL: stream) // streaming from the internet
player!.play()
At some later point I would do something like this
println(player!.rate)
This is what I discovered:
In the Simulator I get "0.0" if the player is not running, or "1.0" if it is running.
If I start the player but interrupt the internet connection, the value changes from 1 to 0.
However, on my iPhone the property keeps value 1 even if I enter Airplane Mode?!
Do you have any idea why that happens and how I could check the stream condition otherwise?
I have tried an observer so far:
player!.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.New, context: nil)
But even the method "observeValueForKeyPath" does not get fired in my iPhone test case.
Check out the Apple docs here
and scroll to the "Key-Value Observing" section. Especially #3 in that section.
It helped me get my implementation to work. My resulting code looks like this:
//Global
var player = AVPlayer()
func setUpPlayer() {
    //...
    // Setting up your player code goes here
    self.player.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions(), context: nil)
    //...
}
// catch changes to status
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
print("obsrved")
}
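To go beyond the bare print, the observer body could inspect the player's status; here is a sketch in the same Swift 2-era syntax, using the player property from above (a .Failed status is one way a dropped stream shows up):
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "status" {
        switch self.player.status {
        case .ReadyToPlay:
            print("ready to play")
        case .Failed:
            // e.g. the stream could not be loaded or the connection dropped
            print("failed: \(self.player.error)")
        case .Unknown:
            print("status not yet determined")
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}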
I could not make it work by adding an observer on the currentItem as user @gabbler suggested.
However it helped using the notification center like this:
NSNotificationCenter.defaultCenter().addObserverForName(
    AVPlayerItemFailedToPlayToEndTimeNotification,
    object: nil,
    queue: nil,
    usingBlock: { notification in
        self.stop()
    })
Note that stop() is a method in the same class which stops the stream as if a stop button were clicked.
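stop() is not shown here; a hypothetical minimal version (my assumption, not the original answer's code) could simply pause and drop the current item:
// Hypothetical stop(): pause playback and release the current item,
// as if a stop button had been pressed. Not from the original answer.
func stop() {
    player.pause()
    player.replaceCurrentItemWithPlayerItem(nil)
}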
I am working on a video application. I want to discard the video frames captured while the camera is autofocusing. During autofocus the captured image becomes blurred and image processing for that frame suffers, but once autofocus is done the image processing is excellent again. Can anybody give me a solution?
Use the adjustingFocus property:
Indicates whether the device is currently adjusting its focus setting. (read-only)
Note: You can observe changes to the value of this property using key-value observing.
Available in iOS 4.0 and later.
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instp/AVCaptureDevice/adjustingFocus
The following is sample code in Swift 3.x.
First, an observer should be added to the selected capture device at camera initialization.
captureDevice.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
Then the observeValue method is overridden. By reading the new value from the change dictionary, auto-focusing frames can be identified.
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let key = keyPath, let changes = change else {
        return
    }
    if key == "adjustingFocus" {
        let changedValue = changes[.newKey]
        if changedValue as? Bool == true {
            // camera is auto-focusing
        } else {
            // camera is not auto-focusing
        }
    }
}
Example in Swift 4+:
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    // The observed property must be exposed to the Objective-C runtime for string-based KVO:
    @objc dynamic var captureDevice: AVCaptureDevice?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.addObservers()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        self.removeObservers()
    }

    func addObservers() {
        self.addObserver(self, forKeyPath: "captureDevice.adjustingFocus", options: .new, context: nil)
    }

    func removeObservers() {
        self.removeObserver(self, forKeyPath: "captureDevice.adjustingFocus")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "captureDevice.adjustingFocus" {
            print("============== adjustingFocus: \(String(describing: self.captureDevice?.lensPosition))")
        }
    }
} // End of class
Observing adjustingFocus was not working for me; it was always NO. Then I found this:
Note that when traditional contrast detect auto-focus is in use, the AVCaptureDevice adjustingFocus property flips to YES when a focus is underway, and flips back to NO when it is done. When phase detect autofocus is in use, the adjustingFocus property does not flip to YES, as the phase detect method tends to focus more frequently, but in small, sometimes imperceptible amounts. You can observe the AVCaptureDevice lensPosition property to see lens movements that are driven by phase detect AF.
from Apple
I have not tried it yet; I will try and update later.
Edit: I have tried it, and I can confirm this is right.
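As a minimal sketch of that alternative (assuming captureDevice is the already-configured AVCaptureDevice, and using the same string-based KVO style as the answers above):
// At camera setup: observe lens movements driven by phase-detect AF.
captureDevice.addObserver(self, forKeyPath: "lensPosition", options: [.new], context: nil)

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "lensPosition", let position = change?[.newKey] as? Float {
        // A changing lensPosition means the lens is moving, i.e. focusing.
        print("lens position changed to \(position)")
    }
}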