My app crashes when I tap either volume button. In my view controller, I call setActive on AVAudioSession.sharedInstance(), and when the user taps a button I play a song with AVAudioPlayer. In this view controller, whenever a volume button is pressed, whether the player is playing or not, the app crashes with EXC_BAD_ACCESS. I have occasionally seen an error message in the debugger complaining about key-value observing for outputVolume.
Any ideas why my app is crashing?
A CLUE: There are two ways I can get to the view controller that is causing the crash. One way causes the crash and the other does not. Either way, I push the view controller onto the navigation controller in the same way.
Have you added something like this?
let audioSession = AVAudioSession.sharedInstance()
audioSession.addObserver(self, forKeyPath: "outputVolume",
options: NSKeyValueObservingOptions.new, context: nil)
If you did, then you need to override this method and do whatever you need in it:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
if keyPath == "outputVolume"{
}
}
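If you did add the observer, also check that you remove it again. A view controller that is deallocated while still registered for outputVolume is a common cause of exactly this kind of EXC_BAD_ACCESS, which would also fit your clue: one navigation path may leave a stale observer behind. Here is a minimal sketch of the full pattern, assuming the observer is added in viewDidLoad; the deinit part is the piece most often missed:

override func viewDidLoad() {
    super.viewDidLoad()
    let audioSession = AVAudioSession.sharedInstance()
    try? audioSession.setActive(true)
    // Register for volume changes; this pairs with the removal in deinit below.
    audioSession.addObserver(self, forKeyPath: "outputVolume", options: [.new], context: nil)
}

deinit {
    // Without this, a volume press after this controller is deallocated
    // sends a KVO message to a dangling observer and can crash.
    AVAudioSession.sharedInstance().removeObserver(self, forKeyPath: "outputVolume")
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "outputVolume", let volume = change?[.newKey] as? Float {
        // React to the volume change here.
        print("outputVolume is now \(volume)")
    }
}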
I understand that screenshot prevention is not possible, but we can do what Snapchat does and detect it.
My application consists of more than 10 view controllers, so adding an observer on every screen is tedious. I want a solution where I can place the observer in the AppDelegate/SceneDelegate (or somewhere else central) so that I am notified no matter which controller the screenshot is captured on. Where to place the observer is the main requirement here.
Something like Reachability, which works in a similar way for network detection.
Here is the code:
func detectScreenShot(action: @escaping () -> ()) {
UIScreen.main.addObserver(self, forKeyPath: "captured", options: .new, context: nil)
let mainQueue = OperationQueue.main
NotificationCenter.default.addObserver(forName: UIApplication.userDidTakeScreenshotNotification, object: nil, queue: mainQueue) { notification in
// executes after screenshot
print(notification)
action()
}
}
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
if (keyPath == "captured") {
let isCaptured = UIScreen.main.isCaptured
print(isCaptured)
}
}
I think you can implement this by making a BaseViewController and having all your other view controllers inherit from it. That way you only have to observe the screenshot detection in the BaseViewController, and you don't have to write the code in every view controller.
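As a rough sketch of that idea, using the same userDidTakeScreenshotNotification as above (class names other than BaseViewController are just examples):

import UIKit

class BaseViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Every subclass automatically starts listening for screenshots.
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(didTakeScreenshot),
                                               name: UIApplication.userDidTakeScreenshotNotification,
                                               object: nil)
    }

    @objc func didTakeScreenshot() {
        // Override in a subclass for per-screen behaviour.
        print("Screenshot captured on \(type(of: self))")
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }
}

// A hypothetical screen: inheriting is all that is needed.
class HomeViewController: BaseViewController { }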
I am following Apple's latest sample code AVCam Swift, which is updated to use AVCapturePhotoOutput.
var isFlashScene: Bool { get }
A Boolean value indicating whether the scene currently being previewed
by the camera warrants use of the flash. This property’s value changes
depending on the scene currently visible to the camera. For example,
you might use this property to highlight the flash control in your
app’s camera UI, indicating to the user that the scene is dark enough
that enabling the flash might be desirable. If the photo capture
output’s supportedFlashModes value is off, this property’s value is
always false. This property supports Key-value observing.
I am trying to key-value observe this so that when auto flash mode indicates the flash will fire for the current scene (just like the stock iOS Camera app), I can update the UI, just as the documentation describes.
So I set it up like this:
private let photoOutput = AVCapturePhotoOutput()
private var FlashSceneContext = 0
self.addObserver(self, forKeyPath: "photoOutput.isFlashScene", options: .new, context: &FlashSceneContext)
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
if context == &FlashSceneContext {
    print("Flash Scene Changed")
}
}
The above never shows a change. Even if I add a log to check:
print(self.photoOutput.isFlashScene)
it comes out as false all the time throughout the app.
I also tried:
self.photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
... still no change in isFlashScene; it's stuck on false.
self.photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &FlashSceneContext)
The above was the proper way to set up the KVO. photoSettingsForSceneMonitoring also has to be set:
let photoSettings = AVCapturePhotoSettings()
photoSettings.flashMode = .auto
photoSettings.isAutoStillImageStabilizationEnabled = true
self.photoOutput.photoSettingsForSceneMonitoring = photoSettings
Works!
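For reference, here is how the two pieces can fit together in one place. This is just a sketch; startFlashSceneMonitoring and flashSceneContext are names I made up, and it assumes it lives in the object that owns photoOutput, with AVFoundation imported:

private let photoOutput = AVCapturePhotoOutput()
private var flashSceneContext = 0

func startFlashSceneMonitoring() {
    // Scene monitoring only starts once monitoring settings are provided.
    let monitoringSettings = AVCapturePhotoSettings()
    monitoringSettings.flashMode = .auto
    photoOutput.photoSettingsForSceneMonitoring = monitoringSettings

    // Observe the output itself, not a key path through self.
    photoOutput.addObserver(self, forKeyPath: "isFlashScene", options: .new, context: &flashSceneContext)
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard context == &flashSceneContext else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        return
    }
    // Update the camera UI here, e.g. highlight the flash control when true.
    print("isFlashScene is now \(photoOutput.isFlashScene)")
}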
I need to be notified when a control button (on a video) is pressed. For example, if I tap the "pause" or "full screen" button I need to run some logic. Can I override methods of AVPlayerViewController? I found AVPlayerViewControllerDelegate, but I can't find any methods to override.
I also tried to add an observer to the AVPlayer
player.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions(), context: nil)
and I used:
override func observeValueForKeyPath(keyPath: String?,
ofObject object: AnyObject?, change: [String : AnyObject]?,
context: UnsafeMutablePointer<Void>) {
...
}
but I get a notification only when the video is played: this method isn't called if I tap on a control button.
Thanks
The key paths are different. In Swift, to check play/pause after a tap:
player.addObserver(self, forKeyPath: "rate", options: NSKeyValueObservingOptions.New, context: nil)
and in observeValueForKeyPath check like this:
if (change!["new"] as! Float) == 1.0
The check is true while the video is playing (rate 1.0) and false while it is paused (rate 0.0).
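In current Swift syntax the same rate-based check would look roughly like this (assuming a player property already exists):

player.addObserver(self, forKeyPath: "rate", options: [.new], context: nil)

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "rate", let newRate = change?[.newKey] as? Float {
        // rate is 0.0 while paused and non-zero (usually 1.0) while playing.
        print(newRate == 0.0 ? "paused" : "playing")
    }
}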
AVPlayer emits an NSNotification when pause is pressed, AVPlayerDidPauseNotification (or something like that). You can register a listener with NSNotificationCenter for this notification.
I thought that I could check the status of an AVPlayer simply via its rate property.
This is how I create a player instance:
player = AVPlayer(URL: stream) // streaming from the internet
player!.play()
At some later point I would do something like this
println(player!.rate)
This is what I discovered:
In the Simulator I get 0.0 when the player is not running and 1.0 when it is running.
If I start the player but interrupt the internet connection it changes values from 1 to 0.
However, on my iPhone the property keeps value 1 even if I enter Airplane Mode?!
Do you have any idea why that happens and how I could check the stream condition otherwise?
I have tried an observer so far:
player!.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.New, context: nil)
But even the observeValueForKeyPath method does not get called in my iPhone test case.
Check out the Apple docs here
and scroll to the "Key-Value Observing" section. Especially #3 in that section.
It helped me get my implementation to work. My resulting code looks like this:
//Global
var player = AVPlayer()
func setUpPlayer() {
//...
// Setting up your player code goes here
self.player.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions(), context: nil)
//...
}
// catch changes to status
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
print("observed")
}
I could not make it work by adding an observer on the currentItem as user #gabbler suggested.
However, it helped to use the notification center like this:
NSNotificationCenter.defaultCenter().addObserverForName(
AVPlayerItemFailedToPlayToEndTimeNotification,
object: nil,
queue: nil,
usingBlock: { notification in
self.stop()
})
Note that stop() is a method in the same class which stops the stream as if a stop button were clicked.
I am working on a video application. I want to discard the video frames captured while the camera is autofocusing. During autofocus the captured image becomes blurred and image processing for that frame is poor, but once autofocus is done the image processing is excellent. Can anybody give me a solution?
adjustingFocus property.
Indicates whether the device is currently adjusting its focus setting. (read-only)
Note: You can observe changes to the value of this property using key-value observing.
iOS 4.0 and later
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instp/AVCaptureDevice/adjustingFocus
The following is sample code in Swift 3.x.
First, an observer should be added to the selected capture device during camera initialization.
captureDevice.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
Then the observeValue method is overridden. By inspecting the change dictionary passed to the method, autofocusing frames can be identified.
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard let key = keyPath, let changes = change else {
        return
    }
    if key == "adjustingFocus" {
        let isAdjustingFocus = changes[.newKey] as? Bool ?? false
        if isAdjustingFocus {
            // camera is autofocusing; discard this frame
        } else {
            // camera is not autofocusing
        }
    }
}
Example in Swift 4+:
class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    // Must be @objc dynamic so that KVO on the "captureDevice.adjustingFocus"
    // key path (observed on self) actually fires.
    @objc dynamic var captureDevice: AVCaptureDevice?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.addObservers()
    }

    func addObservers() {
        self.addObserver(self, forKeyPath: "captureDevice.adjustingFocus", options: .new, context: nil)
    }

    func removeObservers() {
        self.removeObserver(self, forKeyPath: "captureDevice.adjustingFocus")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "captureDevice.adjustingFocus" {
            print("============== adjustingFocus: \(String(describing: self.captureDevice?.lensPosition))")
        }
    }
} // End of class
Observing adjustingFocus is not working for me; it's always NO. And I found this:
Note that when traditional contrast detect auto-focus is in use, the AVCaptureDevice adjustingFocus property flips to YES when a focus is underway, and flips back to NO when it is done. When phase detect autofocus is in use, the adjustingFocus property does not flip to YES, as the phase detect method tends to focus more frequently, but in small, sometimes imperceptible amounts. You can observe the AVCaptureDevice lensPosition property to see lens movements that are driven by phase detect AF.
from Apple
I have not tried it yet; I will try and update later.
Edit: I have tried it, and I can confirm this is right.
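In case it helps, a minimal sketch of observing lensPosition instead, assuming a configured captureDevice and a view controller as the observer (observeLensPosition and lensPositionContext are just names I chose):

private var lensPositionContext = 0

func observeLensPosition(of device: AVCaptureDevice) {
    device.addObserver(self, forKeyPath: "lensPosition", options: [.new], context: &lensPositionContext)
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    guard context == &lensPositionContext, keyPath == "lensPosition" else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        return
    }
    if let position = change?[.newKey] as? Float {
        // Lens movements driven by phase-detect AF show up here
        // even when adjustingFocus never flips to true.
        print("lensPosition changed to \(position)")
    }
}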