Trying to learn AVFoundation by debugging and understanding all the properties of AVPlayerItem.
I've noticed that when I play a video on the simulator (or a device), AVPlayerItem's isPlaybackBufferFull property is always false.
On the player view, I print all the properties and update them via KVO.
On the debug view, the loaded time ranges update correctly, but once the item is fully loaded, isPlaybackBufferFull is never set to true.
let newItem: AVPlayerItem = .init(url: url)
player.replaceCurrentItem(with: newItem)

// A strong reference keeps the observation alive.
self.isPlaybackBufferFullObserver = newItem.observe(\.isPlaybackBufferFull, changeHandler: self.onIsPlaybackBufferFullObserverChanged)

private func onIsPlaybackBufferFullObserverChanged(playerItem: AVPlayerItem, change: NSKeyValueObservedChange<Bool>) {
    if playerItem.isPlaybackBufferFull {
        // A breakpoint here should be hit, but never is
    }
}
Am I missing anything? Shouldn't isPlaybackBufferFull be true at some point?
I am trying to observe a time in the timeline of my AVPlayer.
I tried this on the main queue, which did not work. I then switched to a background queue, as advised in this Stack Overflow post; that did not work either. I'm looking for a working solution, or an explanation as to why this isn't working.
//add boundary time observer on the main queue
avLayer.player!.addBoundaryTimeObserver(forTimes: [NSValue(time: avLayer.player!.currentItem!.duration)], queue: DispatchQueue.main){
self.avLayer.player!.pause()
}
//add boundary time observer on a background queue
avLayer.player!.addBoundaryTimeObserver(forTimes: [NSValue(time: avLayer.player!.currentItem!.duration)], queue: DispatchQueue.global(qos: .userInteractive)){
self.avLayer.player!.pause()
}
Update: After retaining a strong reference to the value returned by the observer call, I set a breakpoint in the callback. It is still never hit.
//add boundary time observer, keeping a strong reference to the returned token
boundaryTimeObserver = avLayer.player!.addBoundaryTimeObserver(forTimes: [NSValue(time: avLayer.player!.currentItem!.duration)], queue: DispatchQueue.main){
self.avLayer.player!.pause()
}
A simple 2019 example:
var player = AVPlayer()
var token: Any?

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    let u = "https... "
    let playerItem = AVPlayerItem(url: URL(string: u)!)
    player = AVPlayer(playerItem: playerItem)
    player.play()
    token = player.addBoundaryTimeObserver(
        forTimes: [NSValue(time: CMTime(seconds: 0.5, preferredTimescale: 600))],
        queue: DispatchQueue.main) { [weak self] in
            self?.spinner.stopAnimating() // `spinner` is an activity indicator assumed to exist on this view
            print("The audio is in fact beginning about now...")
    }
}
Works perfectly.
Important: it won't fire for time "0".
Use a small value to catch the "beginning" as a quick solution.
There may be two problems:
As the documentation for addBoundaryTimeObserver states:
You must maintain a strong reference to the returned value as long as you want the time observer to be invoked by the player
As your initial code does not keep a reference to the returned opaque time-observer token, the observer is probably released immediately and thus never called.
Make sure the time you register for observing actually has the correct value:
playerItem.duration may be indefinite (see the documentation of this property)
even the duration of the playerItem's asset may be unknown, or an imprecise estimate, depending on the type of the asset and its loading state (again, see the documentation of AVAsset.duration on this).
As a consequence, the time you register for observing may never be reached (note that the value can easily be checked by inserting a CMTimeShow(duration)).
Approaches to resolve this:
if you just want to stop the player when the playerItem's end is reached, setting player.actionAtItemEnd to pause may be sufficient
if you need to execute custom logic when the item's end is reached, register an observer for AVPlayerItemDidPlayToEndTime notifications with the playerItem as the object, as sketched below. This mechanism is independent of possibly imprecise durations and so should be more reliable.
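A minimal sketch of the notification-based approach (the PlayerController class and method names are illustrative, not from the question):

import AVFoundation

final class PlayerController: NSObject {
    let player = AVPlayer()

    func observeEnd(of item: AVPlayerItem) {
        // Option 1: simply stop at the end of the item.
        player.actionAtItemEnd = .pause

        // Option 2: run custom logic when the item finishes playing.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(itemDidFinishPlaying(_:)),
            name: .AVPlayerItemDidPlayToEndTime,
            object: item)
    }

    @objc func itemDidFinishPlaying(_ notification: Notification) {
        // Custom end-of-item logic goes here; this fires even when
        // the item's duration was reported as indefinite.
    }
}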
I am testing this using iOS 10.2 on my actual iPhone 6s device.
I am playing streamed audio and am able to play/pause audio, skip tracks, etc. I also have enabled background modes and the audio plays in the background and continues through a playlist properly. The only issue I am having is getting the lock screen controls to show up. Nothing displays at all...
In viewDidLoad() of my MainViewController, right when my app launches, I call this...
func setupAudioSession(){
    UIApplication.shared.beginReceivingRemoteControlEvents()
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: AVAudioSessionCategoryOptions.mixWithOthers)
        self.becomeFirstResponder()
        do {
            try AVAudioSession.sharedInstance().setActive(true)
            print("AVAudioSession is Active")
        } catch let error as NSError {
            print(error.localizedDescription)
        }
    } catch let error as NSError {
        print(error.localizedDescription)
    }
}
and then in my AudioPlayer class after I begin playing audio I call ...
func setupLockScreen(){
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.nextTrackCommand.isEnabled = true
    commandCenter.nextTrackCommand.addTarget(self, action:#selector(skipTrack))
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: "TESTING"]
}
When I lock my iPhone and then tap the power button again to go to the lock screen, the audio controls are not displayed at all. It is as if no audio is playing; I just see my normal background photo. No controls are displayed in the control panel either (swiping up on the home screen and then swiping left to where the music controls should be).
Is the issue that I am not using AVAudioPlayer or AVPlayer? But then how does, for example, Spotify get the lock screen controls to display using their own custom audio player? Thanks for any advice / help.
The issue turned out to be this line...
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: AVAudioSessionCategoryOptions.duckOthers)
Once I changed it to
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: [])
everything worked fine. So it seems that passing any option in the AVAudioSessionCategoryOptions argument causes the lock screen controls not to display. I also tried passing in .mixWithOthers, and that too caused the lock screen controls not to be displayed.
In Swift 4. This example only shows the player on the lock screen; it works with iOS 11. For how to play audio on the device you can follow this thread: https://stackoverflow.com/a/47710809/1283517
import MediaPlayer
import AVFoundation
Declare player
var player : AVPlayer?
Now create a function
func setupLockScreen(){
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.nextTrackCommand.isEnabled = true
    commandCenter.togglePlayPauseCommand.addTarget(self, action: #selector(controlPause))
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: currentStation]
}
Now create a function to handle the play and pause events. I created a Bool isPlaying to track the status of the player.
@objc func controlPause() {
    if isPlaying {
        player?.pause()
        isPlaying = false
    } else {
        player?.play()
        isPlaying = true
    }
}
And that's it. The player will now be displayed on the lock screen.
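If you also want the lock screen to show a progress bar, the same nowPlayingInfo dictionary can carry elapsed time, duration, and playback rate. A hedged sketch, reusing the player and currentStation from above:

func updateNowPlayingInfo() {
    var info: [String: Any] = [MPMediaItemPropertyTitle: currentStation]
    if let item = player?.currentItem {
        // duration may be NaN for live streams; guard before publishing it
        let duration = CMTimeGetSeconds(item.duration)
        if duration.isFinite {
            info[MPMediaItemPropertyPlaybackDuration] = duration
        }
        info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = CMTimeGetSeconds(item.currentTime())
    }
    info[MPNowPlayingInfoPropertyPlaybackRate] = player?.rate ?? 0
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}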
Yes, for the lock screen controls to work you need to use the iOS APIs to play audio. I'm not sure how Spotify does it, but they may be running a second audio session in parallel for this purpose and using the controls to drive both. Your background handler (a singleton in my case) could start playing the second audio at volume 0 when the app goes into the background and stop it when it returns to the foreground. I haven't tested it myself, but it's an option to try.
Question:
In Swift code, apart from using an NSTimer, how can I get animations to start at exact points during playback of a music file played using AVFoundation?
Background
I have a method that plays a music file using AVFoundation (below). I also have UIView animations that I want to start at exact points during the music file being played.
One way I could achieve this is using an NSTimer, but that has the potential to get out of sync or not be exact enough.
Is there a way to tap into AVFoundation and read the music file's elapsed time (a time counter), so that when certain points in the playback arrive, animations start?
Is there an event or notification that AVFoundation triggers that gives a constant stream of the time elapsed since the music file started playing?
For example
At 0:52.50 (52 seconds and 1/2), call startAnimation1(), at 1:20.75 (1 minute, 20 seconds and 3/4), call startAnimation2(), and so on?
switch musicPlayingTimeElapsed {
case 52.50:   // 0:52.50
    startAnimation1()
case 80.75:   // 1:20.75
    startAnimation2()
default:
    break
}
Playing music using AVFoundation
import AVFoundation
var myMusic : AVAudioPlayer?
func playMusic() {
    if let musicFile = self.setupAudioPlayerWithFile("fileName", type:"mp3") {
        self.myMusic = musicFile
    }
    myMusic?.play()
}

func setupAudioPlayerWithFile(file:NSString, type:NSString) -> AVAudioPlayer? {
    let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
    let url = NSURL.fileURLWithPath(path!)
    var audioPlayer:AVAudioPlayer?
    do {
        try audioPlayer = AVAudioPlayer(contentsOfURL: url)
    } catch {
        print("AVAudioPlayer not available")
    }
    return audioPlayer
}
If you use AVPlayer instead of AVAudioPlayer, you can use the (TBH slightly awkward) addBoundaryTimeObserverForTimes method:
let times = [
    NSValue(CMTime:CMTimeMake(...)),
    NSValue(CMTime:CMTimeMake(...)),
    NSValue(CMTime:CMTimeMake(...)),
    // etc
]

var observer: AnyObject? = nil // instance variable

self.observer = self.player.addBoundaryTimeObserverForTimes(times, queue: nil) {
    // currentTime() returns a CMTime, so convert to seconds before matching;
    // in practice, compare with a small tolerance rather than exact equality.
    switch CMTimeGetSeconds(self.player.currentTime()) {
    case 52.50:
        startAnimation1()
    case 80.75:
        startAnimation2()
    default:
        break
    }
}

// call this to stop the observer
self.player.removeTimeObserver(self.observer!)
The way I solve this is to divide the music into separate segments beforehand. I then use one of two approaches:
I play the segments one at a time, each in its own audio player. The audio player's delegate is notified when a segment finishes, and so starting the next segment — along with accompanying action — is up to me.
Alternatively, I queue up all the segments onto an AVQueuePlayer. I then use KVO on the queue player's currentItem property, so I am notified exactly when we move to a new segment, as sketched below.
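A minimal sketch of the AVQueuePlayer variant, using modern block-based KVO (the class name and the segment-handling closure body are illustrative):

import AVFoundation

final class SegmentPlayer {
    private let queuePlayer: AVQueuePlayer
    private var currentItemObserver: NSKeyValueObservation?

    init(segmentURLs: [URL]) {
        queuePlayer = AVQueuePlayer(items: segmentURLs.map { AVPlayerItem(url: $0) })

        // Fires exactly when the queue advances to the next segment.
        currentItemObserver = queuePlayer.observe(\.currentItem, options: [.new]) { player, _ in
            // Start the animation that belongs to the segment that just began.
            print("now playing: \(String(describing: player.currentItem))")
        }
        queuePlayer.play()
    }
}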
You might try using Key-Value Observing to observe the duration property of your sound as it plays. When the duration reaches your time thresholds, you'd trigger each animation. You'd need to trigger on times >= the threshold, since you will likely not get a perfect match with your desired time.
I don't know how well that would work, however. First, I'm not sure whether the sound player's duration is KVO-compliant.
Next, KVO is somewhat resource-intensive, and if your KVO listener gets called thousands of times a second it might bog things down. It would at least be worth a try.
I use AVAudioPlayer to play a click sound when the user taps a button.
Because there is a delay between the tap and the sound, I play the sound once in viewDidAppear with volume = 0.
I found that if the user taps the button within a certain time period, the sound plays immediately, but after that period there is a delay between the tap and the sound in this case as well.
It seems that in the first case the sound comes from the cache of the initial play, and in the second case the app has to load the sound again.
Therefore I now play the sound every 2 seconds with volume = 0, and when the user actually taps the button the sound comes right away.
My question: is there a better approach for this?
My goal is to keep the sound cached for the whole lifetime of the app.
Thank you,
To avoid audio lag, use the .prepareToPlay() method of AVAudioPlayer.
Apple's Documentation on Prepare To Play
Calling this method preloads buffers and acquires the audio hardware
needed for playback, which minimizes the lag between calling the
play() method and the start of sound output.
If player is declared as an AVAudioPlayer then player.prepareToPlay() can be called to avoid the audio lag. Example code:
struct AudioPlayerManager {
    var player: AVAudioPlayer?

    // SoundType is assumed to be an enum whose raw values are file
    // extensions, e.g. case mp3 = "mp3".
    mutating func setupPlayer(soundName: String, soundType: SoundType) {
        if let soundURL = Bundle.main.url(forResource: soundName, withExtension: soundType.rawValue) {
            do {
                player = try AVAudioPlayer(contentsOf: soundURL)
                player?.prepareToPlay()
            }
            catch {
                print(error.localizedDescription)
            }
        } else {
            print("Sound file was missing, name is misspelled or wrong case.")
        }
    }
}
Then play() can be called with minimal lag:
player?.play()
If you keep a reference to the AVAudioPlayer, the sound stays in memory and no further lag will occur.
The first delay is caused by loading the sound, so your first playback in viewDidAppear is the right idea.
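A usage sketch of the manager above, assuming SoundType is an enum of file extensions (e.g. case mp3 = "mp3") and "click" is a hypothetical bundled sound file:

var audioManager = AudioPlayerManager()

override func viewDidLoad() {
    super.viewDidLoad()
    // Load and prepare once, so the buffers stay warm for the whole session.
    audioManager.setupPlayer(soundName: "click", soundType: .mp3)
}

@IBAction func buttonTapped(_ sender: UIButton) {
    // play() now starts with minimal lag.
    audioManager.player?.play()
}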
I have been streaming music from a remote source using AVPlayer. I get URLs and use one to create an AVPlayerItem, which I then associate with my instance of AVPlayer. I add an observer to that item to detect when it finishes playing (AVPlayerItemDidPlayToEndTimeNotification). When the observer notifies me at the item's end, I create a new AVPlayerItem and do it all over again. This works well in the foreground AND in the background on iOS 9.2.
Problem: since updating to iOS 9.3 this no longer works in the background. Here is the relevant code:
var portionToBuffer = Double()
var player = AVPlayer()

func prepareAudioPlayer(songNSURL: NSURL, portionOfSongToBuffer: Double){
    self.portionToBuffer = portionOfSongToBuffer
    //create AVPlayerItem
    let createdItem = AVPlayerItem(URL: songNSURL)
    //Associate createdItem with AVPlayer
    player = AVPlayer(playerItem: createdItem)
    //Add item end observer
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: player.currentItem)
    //Use KVO to see how much is loaded
    player.currentItem?.addObserver(self, forKeyPath: "loadedTimeRanges", options: .New, context: nil)
}
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "loadedTimeRanges" {
        if let loadedRangeAsNSValueArray = player.currentItem?.loadedTimeRanges {
            let loadedRangeAsCMTimeRange = loadedRangeAsNSValueArray[0].CMTimeRangeValue
            let endPointLoaded = CMTimeRangeGetEnd(loadedRangeAsCMTimeRange)
            let secondsLoaded = CMTimeGetSeconds(endPointLoaded)
            print("the endPointLoaded is \(secondsLoaded) and the duration is \(CMTimeGetSeconds((player.currentItem?.duration)!))")
            if secondsLoaded >= portionToBuffer {
                player.currentItem?.removeObserver(self, forKeyPath: "loadedTimeRanges")
                player.play()
            }
        }
    }
}
func playerItemDidReachEnd(notification: NSNotification){
    recievedItemEndNotification()
}

func recievedItemEndNotification() {
    //register background task
    bgTasker.registerBackgroundTask()
    if session.playlistSongIndex == session.playlistSongTitles.count-1 {
        session.playlistSongIndex = 0
    } else {
        session.playlistSongIndex += 1
    }
    prepareAudioPlayer(session.songURLs[session.playlistSongIndex], portionOfSongToBuffer: 30.00)
}
I have set breakpoints to confirm that player.play() IS being called in the background. When I print player.rate it reads 0.0. I have checked the playbackLikelyToKeepUp property of the AVPlayerItem and it is true. I have also confirmed that the new URL is successfully used to create the new AVPlayerItem and associate it with the AVPlayer while the app is in the background. I have turned on the audio and AirPlay background capability and have even opened a finite-length background task (bgTasker.registerBackgroundTask in the code above). No idea what is going on.
I found THIS but I'm not sure it helps. Any advice would be great, thanks.
When the observer notifies me at the item end, I then create a new AVPlayerItem and do it all over again
But the problem is that meanwhile playback stops, and the rule is that background playing is permitted only so long as you were playing in the foreground and continue to play in the background.
I would suggest using AVQueuePlayer instead of AVPlayer. This will allow you to enqueue the next item while the current item is still playing — and thus, we may hope, this will count as continuing to play.
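A minimal sketch of that idea (the session bookkeeping is carried over from the question; written in current Swift rather than the question's Swift 2):

let queuePlayer = AVQueuePlayer()

func enqueue(_ url: URL) {
    // after: nil appends the item to the end of the queue.
    queuePlayer.insert(AVPlayerItem(url: url), after: nil)
}

// While the current song is still audible, append the next one so the
// player is never idle in the background:
enqueue(session.songURLs[session.playlistSongIndex + 1])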
I encountered a similar problem, and I searched lots of websites on Google, but didn't find the answer.
The Phenomenon
The problem in my app is that when I start playing audio and send the app to the background, it finishes playing the current track; then, while playing the second track, it loads some data but stops. If I bring the app to the foreground, it plays the audio.
Solution
My solution is to add the following call.
UIApplication.shared.beginReceivingRemoteControlEvents()
So enabling the background audio capability is not enough; we also need to begin receiving remote control events.
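Putting the pieces together, a sketch of the complete setup (the function name is illustrative; the category constant matches the code in the questions above):

func activateBackgroundAudio() {
    do {
        // The "Audio, AirPlay and Picture in Picture" background mode must
        // also be enabled in the target's capabilities.
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print(error.localizedDescription)
    }
    // Without this call, playback of a freshly queued item can stall
    // once the app is in the background.
    UIApplication.shared.beginReceivingRemoteControlEvents()
}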