In my application, I use AVPlayer to play streams (m3u8 files) over HLS. I need to know how many times, during a streaming session, the client switches bitrate.
Let's assume the client's bandwidth is increasing, so the client switches to a higher-bitrate segment.
Can AVPlayer detect this switch?
Thanks.
I have had a similar problem recently. The solution felt a bit hacky but it worked as far as I saw. First I set up an observer for new Access Log notifications:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleAVPlayerAccess:)
                                             name:AVPlayerItemNewAccessLogEntryNotification
                                           object:nil];
That observer calls the following function. It can probably be optimised, but here is the basic idea:
- (void)handleAVPlayerAccess:(NSNotification *)notif {
    AVPlayerItemAccessLog *accessLog = [((AVPlayerItem *)notif.object) accessLog];
    AVPlayerItemAccessLogEvent *lastEvent = accessLog.events.lastObject;
    float lastEventNumber = lastEvent.indicatedBitrate;
    if (lastEventNumber != self.lastBitRate) {
        // Here is where you can increment a variable to keep track of
        // the number of times you switch your bit rate.
        NSLog(@"Switch indicatedBitrate from: %f to: %f", self.lastBitRate, lastEventNumber);
        self.lastBitRate = lastEventNumber;
    }
}
Every time there is a new entry in the access log, the handler reads the indicated bitrate from the most recent entry (the lastObject in the access log for the player item) and compares it with a property that stores the bitrate from the last change.
BoardProgrammer's solution works great! In my case, I needed the indicated bitrate to detect when the content quality switched from SD to HD. Here is the Swift 3 version.
// Add observer.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleAVPlayerAccess),
                                       name: NSNotification.Name.AVPlayerItemNewAccessLogEntry,
                                       object: nil)
// Handle notification.
func handleAVPlayerAccess(notification: Notification) {
    guard let playerItem = notification.object as? AVPlayerItem,
          let lastEvent = playerItem.accessLog()?.events.last else {
        return
    }
    let indicatedBitrate = lastEvent.indicatedBitrate
    // Use bitrate to determine bandwidth decrease or increase.
}
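To also get the switch count the original question asks about, the handler can track the previous bitrate and a counter. A minimal sketch; lastBitrate and switchCount are my own property names:

var lastBitrate: Double = 0   // previous indicatedBitrate
var switchCount = 0           // bitrate switches seen this session

func handleAVPlayerAccess(notification: Notification) {
    guard let playerItem = notification.object as? AVPlayerItem,
          let lastEvent = playerItem.accessLog()?.events.last else {
        return
    }
    let indicatedBitrate = lastEvent.indicatedBitrate
    if lastBitrate != 0 && indicatedBitrate != lastBitrate {
        switchCount += 1   // the player just moved to a different variant
    }
    lastBitrate = indicatedBitrate
}

Comparing against the previous value means repeated access-log entries for the same variant don't inflate the count.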
I'm trying to build a sequencer app on iOS. There's a sample on the Apple Developer website that makes an audio unit play a repeating scale, here:
https://developer.apple.com/documentation/audiotoolbox/incorporating_audio_effects_and_instruments
In the sample code, there's a file "SimplePlayEngine.swift", with a class "InstrumentPlayer" which handles sending MIDI events to the selected audio unit. It spawns a thread with a loop that iterates through the scale. It sends a MIDI Note On message by calling the audio unit's AUScheduleMIDIEventBlock, sleeps the thread for a short time, sends a Note Off, and repeats.
Here's an abridged version:
DispatchQueue.global(qos: .default).async {
    ...
    while self.isPlaying {
        // cbytes is set to a MIDI Note On message
        ...
        self.audioUnit.scheduleMIDIEventBlock!(AUEventSampleTimeImmediate, 0, 3, cbytes)
        usleep(useconds_t(0.2 * 1e6))
        ...
        // cbytes is now a MIDI Note Off message
        self.noteBlock(AUEventSampleTimeImmediate, 0, 3, cbytes)
        ...
    }
    ...
}
This works well enough for a demonstration, but it doesn't enforce strict timing, since the events will be scheduled whenever the thread wakes up.
How can I modify it to play the scale at a certain tempo with sample-accurate timing?
My assumption is that I need a way to make the synthesizer audio unit call a callback in my code before each render with the number of frames that are about to be rendered. Then I can schedule a MIDI event every "x" number of frames. You can add an offset, up to the size of the buffer, to the first parameter to scheduleMIDIEventBlock, so I could use that to schedule the event at exactly the right frame in a given render cycle.
I tried using audioUnit.token(byAddingRenderObserver: AURenderObserver), but the callback I gave it was never called, even though the app was making sound. That method sounds like the Swift version of AudioUnitAddRenderNotify, and from what I read here, that sounds like what I need to do: https://stackoverflow.com/a/46869149/11924045. Why wouldn't it be called? Is it even possible to make this "sample accurate" using Swift, or do I need to use C for that?
Am I on the right track? Thanks for your help!
You're on the right track. MIDI events can be scheduled with sample-accuracy in a render callback:
let sampler = AVAudioUnitSampler()
...
let renderCallback: AURenderCallback = {
    (inRefCon: UnsafeMutableRawPointer,
     ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
     inTimeStamp: UnsafePointer<AudioTimeStamp>,
     inBusNumber: UInt32,
     inNumberFrames: UInt32,
     ioData: UnsafeMutablePointer<AudioBufferList>?) -> OSStatus in

    if ioActionFlags.pointee.contains(.unitRenderAction_PreRender) {
        let sampler = Unmanaged<AVAudioUnitSampler>.fromOpaque(inRefCon).takeUnretainedValue()
        let bpm = 960.0
        let samples = UInt64(44000 * 60.0 / bpm)   // samples per note at a nominal 44 kHz rate
        let sampleTime = UInt64(inTimeStamp.pointee.mSampleTime)
        let cbytes = UnsafeMutablePointer<UInt8>.allocate(capacity: 3)
        cbytes[0] = 0x90   // note on, channel 0
        cbytes[1] = 64     // note number
        cbytes[2] = 127    // velocity
        for i: UInt64 in 0..<UInt64(inNumberFrames) {
            if (sampleTime + i) % samples == 0 {
                // AUEventSampleTimeImmediate plus a frame offset schedules the
                // event at that exact frame within the current render cycle.
                sampler.auAudioUnit.scheduleMIDIEventBlock!(AUEventSampleTimeImmediate + Int64(i), 0, 3, cbytes)
            }
        }
        cbytes.deallocate()   // the event bytes have been consumed by the schedule call
    }
    return noErr
}

AudioUnitAddRenderNotify(sampler.audioUnit,
                         renderCallback,
                         Unmanaged.passUnretained(sampler).toOpaque())
That used AURenderCallback and scheduleMIDIEventBlock. You can swap in AURenderObserver and MusicDeviceMIDIEvent, respectively, with similar sample-accurate results:
let audioUnit = sampler.audioUnit

let renderObserver: AURenderObserver = {
    (actionFlags: AudioUnitRenderActionFlags,
     timestamp: UnsafePointer<AudioTimeStamp>,
     frameCount: AUAudioFrameCount,
     outputBusNumber: Int) -> Void in

    if actionFlags.contains(.unitRenderAction_PreRender) {
        let bpm = 240.0
        let samples = UInt64(44000 * 60.0 / bpm)   // samples per note at a nominal 44 kHz rate
        let sampleTime = UInt64(timestamp.pointee.mSampleTime)
        for i: UInt64 in 0..<UInt64(frameCount) {
            if (sampleTime + i) % samples == 0 {
                // 144 == 0x90 (note on); the final argument is the frame offset
                // within the current buffer, which is what makes this sample-accurate.
                MusicDeviceMIDIEvent(audioUnit, 144, 64, 127, UInt32(i))
            }
        }
    }
}

// Keep the token if you later need to remove the observer.
let _ = sampler.auAudioUnit.token(byAddingRenderObserver: renderObserver)
Note that these are just examples of how it's possible to do sample-accurate MIDI sequencing on the fly. You should still follow the rules of rendering to reliably implement these patterns.
Sample accurate timing generally requires using the RemoteIO Audio Unit, and manually inserting samples at the desired sample position in each audio callback block using C code.
(A WWDC session on core audio a few years back recommended against using Swift in the audio real-time context. Not sure if anything has changed that recommendation.)
Or, for MIDI, use a precisely incremented time value in each successive scheduleMIDIEventBlock call, instead of AUEventSampleTimeImmediate, and set these calls up slightly ahead of time, as in the sketch below.
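A rough sketch of that schedule-ahead idea, reusing self.audioUnit and self.isPlaying from the sample's InstrumentPlayer; the 44100 Hz sample rate, the 120 bpm tempo, and the startTime parameter (the unit's current mSampleTime, captured for example in a render observer) are my own assumptions:

func playNotesAhead(startTime: AUEventSampleTime) {
    let sampleRate = 44100.0
    let bpm = 120.0
    let samplesPerBeat = AUEventSampleTime(sampleRate * 60.0 / bpm)
    let noteOn: [UInt8] = [0x90, 60, 127]   // note on, middle C, velocity 127

    DispatchQueue.global(qos: .userInitiated).async {
        var eventTime = startTime + samplesPerBeat   // start one beat ahead of "now"
        while self.isPlaying {
            noteOn.withUnsafeBufferPointer { bytes in
                self.audioUnit.scheduleMIDIEventBlock?(eventTime, 0, 3, bytes.baseAddress!)
            }
            eventTime += samplesPerBeat   // exact, drift-free increment
            // The sleep only keeps this loop roughly one beat ahead; the
            // audible timing comes from the sample times passed above.
            usleep(useconds_t(60.0 / bpm * 1e6))
        }
    }
}

Because each event carries an absolute sample time, jitter in the thread's wake-ups no longer affects when the notes actually sound.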
I am trying to observe a time in the timeline of my AVPlayer.
I tried this on the main queue, which did not work. I then switched to a background queue, as advised in this Stack Overflow post; that did not work either. Looking for a working solution, or an explanation as to why this isn't working.
//add boundary time notification on the main queue
avLayer.player!.addBoundaryTimeObserver(forTimes: [NSValue(time: avLayer.player!.currentItem!.duration)], queue: DispatchQueue.main){
self.avLayer.player!.pause()
}
//add boundary time notification to background queue
avLayer.player!.addBoundaryTimeObserver(forTimes: [NSValue(time: avLayer.player!.currentItem!.duration)], queue: DispatchQueue.global(qos: .userInteractive)){
self.avLayer.player!.pause()
}
Update: After retaining a strong reference to the return value of the observer, I set a breakpoint in the callback. It is still not working.
//add boundary time notification
boundaryTimeObserver = avLayer.player!.addBoundaryTimeObserver(forTimes: [NSValue(time: avLayer.player!.currentItem!.duration)], queue: DispatchQueue.main){
self.avLayer.player!.pause()
}
A simple 2019 example:
var player = AVPlayer()
var token: Any?

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    let u = "https... "
    let playerItem = AVPlayerItem(url: URL(string: u)!)
    player = AVPlayer(playerItem: playerItem)
    player.play()
    // Boundary times must be NSValues wrapping a CMTime:
    let halfSecond = NSValue(time: CMTime(seconds: 0.5, preferredTimescale: 600))
    token = player.addBoundaryTimeObserver(
        forTimes: [halfSecond],
        queue: DispatchQueue.main) { [weak self] in
            self?.spinner.stopAnimating()
            print("The audio is in fact beginning about now...")
    }
}
Works perfectly.
Important: it won't fire for time "0", so use a small value like 0.5 to catch the "beginning" as a quick solution.
There may be two problems:
As the documentation for addBoundaryTimeObserver states:
You must maintain a strong reference to the returned value as long as you want the time observer to be invoked by the player
As your initial code does not keep a reference to the returned opaque time observer, the observer is probably released immediately and thus never called.
Make sure the time you register for observing actually has the correct value:
playerItem.duration may be indefinite (see the documentation of this property)
even the duration of the playerItem's asset may be unknown, or an imprecise estimate, depending on the type of the asset and its loading state (again, see the documentation of AVAsset.duration on this)
As a consequence, the time you register for observing may never be reached. This is easy to check with CMTimeShow, as sketched below.
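A quick check before registering the observer, using the player item from the question:

let duration = avLayer.player!.currentItem!.duration
CMTimeShow(duration)   // prints the raw CMTime for inspection
if duration.isIndefinite {
    // The boundary time can never be reached yet: the duration is unknown,
    // so register the observer later (e.g. once the item's status
    // becomes .readyToPlay).
}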
Approaches to resolve this:
if you just want to stop the player when the playerItem's end is reached, setting player.actionAtItemEnd to pause may be sufficient
if you need to execute custom logic when the item's end is reached, register an observer for AVPlayerItemDidPlayToEndTime notifications with the playerItem as the object. This mechanism is independent of possibly imprecise durations and so should be more reliable; see the sketch below
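A minimal sketch of that notification-based approach, again with the question's player item:

// Fires when the item plays to its end, regardless of whether the
// duration was known when playback started.
let endObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: avLayer.player!.currentItem,
    queue: .main) { [weak self] _ in
        self?.avLayer.player?.pause()   // custom end-of-playback logic here
}
// Keep endObserver alive for as long as you need the callback.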
I'm currently facing a problem, and I'd be glad if any of you have been through it and found a simple solution.
So basically, the question is:
Is there any way I can pause an ongoing AVAudioPlayer when some external sound comes in, such as music, an incoming call, a notification, etc.?
I know that for detecting a call there's the new CXCallObserverDelegate that helps us, but for music and the rest, is there any simple solution that covers all of this?
In terms of code, everything works fine; there's not a single problem with playing the audio. I have a custom path and an error, and the error returns nil.
NSError *avPlayerError = nil;
NSData* mp3Data = [[NSData alloc] initWithContentsOfFile: path options: NSDataReadingMappedIfSafe error: &avPlayerError];
I'm using this code in my application:
In the AppDelegate's didFinishLaunchingWithOptions, add an observer for any incoming interruption:
NotificationCenter.default.addObserver(self, selector: #selector(self.onAudioSessionEvent(noti:)), name: Notification.Name.AVAudioSessionInterruption, object: nil)
When an interruption occurs, this function will be called:
@objc func onAudioSessionEvent(noti: Notification) {
    if noti.name == Notification.Name.AVAudioSessionInterruption {
        if let value = noti.userInfo?[AVAudioSessionInterruptionTypeKey] as? NSNumber {
            if value.uintValue == AVAudioSessionInterruptionType.began.rawValue {
                print("start interrupt")
            } else {
                print("end interrupt")
            }
        }
    }
}
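To actually pause and resume playback rather than just logging, the handler could be extended along these lines; audioPlayer stands in for the AVAudioPlayer from the question:

@objc func onAudioSessionEvent(noti: Notification) {
    guard noti.name == Notification.Name.AVAudioSessionInterruption,
          let value = noti.userInfo?[AVAudioSessionInterruptionTypeKey] as? NSNumber,
          let type = AVAudioSessionInterruptionType(rawValue: value.uintValue) else {
        return
    }
    switch type {
    case .began:
        audioPlayer?.pause()   // a call, alarm, or other audio took over
    case .ended:
        // Resume only when the system indicates resuming is appropriate.
        if let optionValue = noti.userInfo?[AVAudioSessionInterruptionOptionKey] as? NSNumber,
           AVAudioSessionInterruptionOptions(rawValue: optionValue.uintValue).contains(.shouldResume) {
            audioPlayer?.play()
        }
    }
}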
Question:
In Swift code, apart from using an NSTimer, how can I get animations to start at exact points during playback of a music file played using AVFoundation?
Background
I have a method that plays a music file using AVFoundation (below). I also have UIView animations that I want to start at exact points during playback of the music file.
One way I could achieve this is using an NSTimer, but that has the potential to get out of sync or not be exact enough.
Is there a way to tap into AVFoundation to access the music file's elapsed time (a time counter), so that when certain points in the playback arrive, animations start?
Is there an event / notification that AVFoundation triggers that gives a constant stream of time elapsed since the music file has started playing?
For example
At 0:52.50 (52 seconds and 1/2), call startAnimation1(), at 1:20.75 (1 minute, 20 seconds and 3/4), call startAnimation2(), and so on?
switch musicPlayingTimeElapsed {
case 52.50:            // 0:52.50
    startAnimation1()
case 80.75:            // 1:20.75
    startAnimation2()
default:
    break
}
Playing music using AVFoundation
import AVFoundation
var myMusic : AVAudioPlayer?
func playMusic() {
if let musicFile = self.setupAudioPlayerWithFile("fileName", type:"mp3") {
self.myMusic = musicFile
}
myMusic?.play()
}
func setupAudioPlayerWithFile(file:NSString, type:NSString) -> AVAudioPlayer? {
let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
let url = NSURL.fileURLWithPath(path!)
var audioPlayer:AVAudioPlayer?
do {
try audioPlayer = AVAudioPlayer(contentsOfURL: url)
} catch {
print("AVAudioPlayer not available")
}
return audioPlayer
}
If you use AVPlayer instead of AVAudioPlayer, you can use the (TBH slightly awkward) addBoundaryTimeObserverForTimes method:
let times = [
    NSValue(CMTime: CMTimeMake(...)),
    NSValue(CMTime: CMTimeMake(...)),
    NSValue(CMTime: CMTimeMake(...)),
    // etc
]

var observer: AnyObject? = nil // instance variable

self.observer = self.player.addBoundaryTimeObserverForTimes(times, queue: nil) {
    // NB: exact floating-point matches are fragile in practice; matching
    // a small range around each boundary time is more robust.
    switch CMTimeGetSeconds(self.player.currentTime()) {
    case 52.50:            // 0:52.50
        startAnimation1()
    case 80.75:            // 1:20.75
        startAnimation2()
    default:
        break
    }
}
// call this to stop observer
self.player.removeTimeObserver(self.observer)
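If you want the "constant stream of time elapsed" mentioned in the question rather than fixed boundaries, the companion addPeriodicTimeObserverForInterval API may fit better. A rough sketch in the same style; the didStartAnimation1 flag is my own:

let interval = CMTimeMake(1, 4)   // a callback four times per second
self.observer = self.player.addPeriodicTimeObserverForInterval(interval, queue: nil) { time in
    let seconds = CMTimeGetSeconds(time)
    // Compare with >= since callbacks won't land exactly on 52.50,
    // and use the flag so the animation only fires once.
    if seconds >= 52.50 && !self.didStartAnimation1 {
        self.didStartAnimation1 = true
        startAnimation1()
    }
}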
The way I solve this is to divide the music up into separate segments beforehand. I then use one of two approaches:
I play the segments one at a time, each in its own audio player. The audio player's delegate is notified when a segment finishes, and so starting the next segment — along with accompanying action — is up to me.
Alternatively, I queue up all the segments onto an AVQueuePlayer. I then use KVO on the queue player's currentItem, so I am notified exactly when we move to a new segment; see the sketch below.
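A rough sketch of that second approach, in the Swift 2 style of the question; segmentURLs is my own placeholder for the pre-cut segment files:

// One AVPlayerItem per musical segment; KVO on currentItem fires
// exactly when playback crosses into the next segment.
let items = segmentURLs.map { AVPlayerItem(URL: $0) }
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.addObserver(self, forKeyPath: "currentItem", options: .New, context: nil)
queuePlayer.play()

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?,
    change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "currentItem" {
        // A new segment just started: trigger its accompanying animation here.
    }
}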
You might try using Key-Value Observing to observe the duration property of your sound as it plays. When the duration reaches your time thresholds, you'd trigger each animation. You'd need to match against times >= the trigger time, since you will likely not get a perfect match with your desired time.
I don't know how well that would work however. First, I'm not sure if the sound player's duration is KVO-compliant.
Next, KVO is somewhat resource-intensive, and if your KVO listener gets called thousands of times a second it might bog things down. It would at least be worth a try.
There seems to be a problem with MPMoviePlayerController where, once you're in fullscreen mode, you can hold down the fast-forward button and let it seek forward (playing at fast speed) all the way to the end of the video.
Thereafter you just get a black screen and it's stuck. In other words, it does not respond to any taps or gestures, and you cannot get out of this situation. Has anyone else encountered this problem?
Is there any way to work around it in code?
It seems to be an iOS bug, since fast-rewinding to the very beginning won't cause the black screen but fast-forwarding to the end will, and after that the play/pause calls to the video player never work. I temporarily fix this by adding protective logic to the scrubber-refresh callback:
Let's assume monitorPlaybackTime is called every PLAY_BACK_TIME_MONITOR_INTERVAL seconds to refresh the scrubber; in it I add this check:
NSTimeInterval duration = self.moviePlayer.duration;
NSTimeInterval current = self.moviePlayer.currentPlaybackTime;
if (isnan(current) || current > duration) {
    current = duration;
} else if (self.moviePlayer.playbackState == MPMoviePlaybackStateSeekingForward) {
    // If the next refresh interval would seek past the end, stop seeking now.
    if (current + self.moviePlayer.currentPlaybackRate * PLAY_BACK_TIME_MONITOR_INTERVAL > duration) {
        [self.moviePlayer endSeeking];
    }
}
This is a workaround for the black screen, not a perfect fix, but I hope it helps.
I'm guessing you are not handling the MPMoviePlayerPlaybackDidFinishNotification. You really should if you're not.
Still, it's unexpected to me that the movie player would go into a "stuck" state like you describe. I would more readily expect it to stop playback automatically and reset when it reaches the end. Anyway, I think your problem will go away if you observe the MPMoviePlayerPlaybackDidFinishNotification and handle the movie controller appropriately.
Ran into the same issue on iOS6. Managed to fix it by registering for the MPMoviePlayerPlaybackDidFinishNotification (as suggested by Leuguimerius) with the following implementation:
- (void)playbackDidFinish:(NSNotification *)notif {
    if (self.player.currentPlaybackTime <= 0.1) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.player stop];
            [self.player play];
            [self.player pause];
        });
    }
}
Where self.player is the associated MPMoviePlayerController instance. The check against currentPlaybackTime distinguishes the standard invocations of playbackDidFinish (where the movie plays at normal speed until its end) from the scenario where the user fast-forwards to the end. Stopping, then playing and pausing, results in a usable, visually consistent interface even when fast-forwarding to the end.
None of the aforementioned solutions worked for me, so this is what I ended up doing:
NSNotificationCenter.defaultCenter().addObserver(self,
    selector: Selector("moviePlayerLoadStateDidChange"),
    name: MPMoviePlayerLoadStateDidChangeNotification,
    object: nil)

func moviePlayerLoadStateDidChange() {
    let loadState = moviePlayerController?.loadState
    if loadState == MPMovieLoadState.Unknown {
        moviePlayerController?.contentURL = currentmovieURL
        moviePlayerController?.prepareToPlay()
    }
}
I think the issue is that when the seek-forward button is single-pressed, the player wants to skip to the next video; that's why a loading indicator appears. By listening for the load-state change event you can specify what the next video should be, and if you don't have one, you can just give it the same URL.