Get notified of HLS segment requests in Swift

I am writing an iOS app in Swift for HLS live streaming. I want my app logic to be notified when each HLS segment request is initiated during playback (and what the respective URL is). I have tried observing various properties of AVPlayer and AVPlayerItem using KVO, but that only tells me when playback is initiated. For example, adding the following observer triggers an invocation of the observeValue method when playback starts, but I have not found a way to be continuously notified of each segment request.
playerItem.addObserver(self, forKeyPath: "status", options:NSKeyValueObservingOptions(), context: nil)
Is there a KVO approach that would notify me of each segment request? Are there other objects/APIs, not necessarily related to AVFoundation, that I should consider?
/George

I follow Fabian's approach when I need to debug HLS streams. It logs useful info every time there is an update related to the stream currently being played. Here's the code I use; hope it helps anyone facing a similar issue!
func trackPlayerLogs() {
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(handleAVPlayerAccess),
                                           name: NSNotification.Name.AVPlayerItemNewAccessLogEntry,
                                           object: nil)
}

@objc func handleAVPlayerAccess(notification: Notification) {
    guard let playerItem = notification.object as? AVPlayerItem,
          let lastEvent = playerItem.accessLog()?.events.last else {
        return
    }
    let indicatedBitrate = lastEvent.indicatedBitrate
    print("--------------PLAYER LOG--------------")
    print("EVENT: \(lastEvent)")
    print("INDICATED BITRATE: \(indicatedBitrate)")
    print("PLAYBACK RELATED LOG EVENTS")
    print("PLAYBACK START DATE: \(lastEvent.playbackStartDate)")
    print("PLAYBACK SESSION ID: \(lastEvent.playbackSessionID)")
    print("PLAYBACK START OFFSET: \(lastEvent.playbackStartOffset)")
    print("PLAYBACK TYPE: \(lastEvent.playbackType)")
    print("STARTUP TIME: \(lastEvent.startupTime)")
    print("DURATION WATCHED: \(lastEvent.durationWatched)")
    print("NUMBER OF DROPPED VIDEO FRAMES: \(lastEvent.numberOfDroppedVideoFrames)")
    print("NUMBER OF STALLS: \(lastEvent.numberOfStalls)")
    print("SEGMENTS DOWNLOADED DURATION: \(lastEvent.segmentsDownloadedDuration)")
    print("DOWNLOAD OVERDUE: \(lastEvent.downloadOverdue)")
    print("--------------------------------------")
}

I don't know of an easy way to be notified of each segment request as it happens. You should look at AVPlayerItem's accessLog property and at the AVPlayerItemAccessLogEvents in the log; these describe both network and playback events. I highly recommend this approach if it fits your needs.
Another way is to set up your app as an HTTP server and point an AVURLAsset/AVPlayerItem at the local server, which then has to translate those requests to an external server. This is much more complex, difficult, and error-prone, and is nearly guaranteed to perform badly. Please do not do this.
Addendum:
You may be tempted to look at AVAssetResourceLoaderDelegate, since it says you can handle resource requests on behalf of an AVURLAsset. Unfortunately, segment requests do not go through the loader; it seems to be intended for playlists, decryption keys and other such assets.
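To make the addendum concrete, here is a minimal sketch of the playlist-level interception that AVAssetResourceLoaderDelegate does support. The `streamx://` scheme and the `PlaylistInterceptor` class are made-up names for the example; a real implementation would also fill in contentInformationRequest for the playlist. Segment requests will not arrive at this delegate; AVFoundation fetches those directly once it has the real URLs.

```swift
import AVFoundation

// Sketch: intercept playlist/key requests by giving the asset a custom URL
// scheme that AVFoundation doesn't recognize, forcing it to ask the delegate.
final class PlaylistInterceptor: NSObject, AVAssetResourceLoaderDelegate {

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url,
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
            return false
        }

        // Swap the custom scheme back to https and fetch the playlist ourselves.
        components.scheme = "https"
        guard let realURL = components.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            if let data = data {
                loadingRequest.dataRequest?.respond(with: data)
                loadingRequest.finishLoading()
            } else {
                loadingRequest.finishLoading(with: error)
            }
        }.resume()
        return true
    }
}

// Usage (hypothetical URL): the custom scheme routes the master playlist
// request through the delegate; the .ts/.fmp4 segments still bypass it.
// let asset = AVURLAsset(url: URL(string: "streamx://example.com/master.m3u8")!)
// asset.resourceLoader.setDelegate(interceptor, queue: .main)
```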

Related

AudioKit bounce recording sync

I need help with an issue. I am recording a bounce with renderToFile in AudioKit 4, and an artist has complained that the different tracks are not aligned to the sample. In preRender, we have a list of players with the different tracks and recordings that are set to play in succession. The problem is that I cannot set the AVAudioTime scheduling because it crashes, presumably because the engine is in manual rendering mode. Is there a way to sync them to the sample? I suppose this is an issue tied to the underlying AVAudioEngine...
I cannot use AVMutableComposition because we need the recording to be exactly like the one played by AudioKit, and the volume would differ.
I've experienced random crashes after setting offline rendering mode, as well as after going back to online rendering mode after being offline.
This situation seems to be triggered by these kinds of conditions:
Setting offline rendering mode when the engine hasn't completely finished processing.
Starting processing right after calling disableManualRenderingMode, without giving the engine enough time to start.
A partial workaround I've found is to wait some time before setting offline mode, as well as waiting a small time interval after going online. So I have functions in my code as follows:
func setOnlineMode(successCompletion: @escaping () -> Void, failCompletion: @escaping () -> Void) {
    AKManager.engine.stop()
    DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 0.5, execute: {
        AKManager.engine.disableManualRenderingMode()
        do {
            try AKManager.engine.start()
        } catch {
            print("failed to start engine")
            failCompletion()
        }
        DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 0.5, execute: {
            successCompletion()
        })
    })
}
I also try to avoid resetting the engine after going to offline mode:
/* From AudioKit's renderToFile function */
try AKTry {
    // Engine can't be running when switching to offline render mode.
    if self.isRunning { self.stop() }
    try self.enableManualRenderingMode(.offline, format: audioFile.processingFormat, maximumFrameCount: maximumFrameCount)
    // This resets the sampleTime of offline rendering to 0.
    self.reset() /* I don't do this */
    try self.start()
}
To be honest, I've heavily modified my code to avoid going back and forth between online and offline mode. I only do it now at one point in my app, after a delay as explained above.
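For context, the AVAudioEngine offline-rendering flow that AudioKit's renderToFile wraps looks roughly like the sketch below. This is an illustration of the underlying API, not the AudioKit source; error handling and the node graph setup are omitted.

```swift
import AVFoundation

// Sketch: render an AVAudioEngine's output offline into a file using
// manual rendering mode, the mechanism renderToFile builds on.
func renderOffline(engine: AVAudioEngine, to file: AVAudioFile, frames: AVAudioFramePosition) throws {
    let maxFrames: AVAudioFrameCount = 4096

    engine.stop() // the engine must not be running when switching modes
    try engine.enableManualRenderingMode(.offline,
                                         format: file.processingFormat,
                                         maximumFrameCount: maxFrames)
    try engine.start()

    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!

    // Pull audio in chunks until the requested number of frames is rendered.
    while engine.manualRenderingSampleTime < frames {
        let remaining = frames - engine.manualRenderingSampleTime
        let count = AVAudioFrameCount(min(AVAudioFramePosition(buffer.frameCapacity), remaining))
        if try engine.renderOffline(count, to: buffer) == .success {
            try file.write(from: buffer)
        }
    }

    engine.stop()
    engine.disableManualRenderingMode()
}
```

Note that the poster's crash reports are consistent with scheduling players against a live render clock while the engine is in this mode; in manual rendering the sample time only advances on each renderOffline call.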

Programmatically trigger the action that a headphone pause button would do

I am trying to find a way to pause any playing media on the device, so I was thinking of triggering the same logic that is fired when a user presses the headphone "middle button".
I managed to prevent music from resuming (after I pause it within my app, which basically starts an AVAudioSession for recording) by NOT setting the AVAudioSession active property to false and leaving it hanging, but I am pretty sure that's a bad way to do it. If I deactivate the session, the music resumes. The other option I am thinking of is playing some kind of silent loop that would "imitate" the silence I need. But if what I am seeking is doable, it would be the best approach; as I understood from this question, it cannot be done using the normal means.
func stopAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        if audioSession.secondaryAudioShouldBeSilencedHint {
            print("someone is playing....")
        }
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
        isSessionActive = false
    } catch let error as NSError {
        print("Unable to deactivate audio session: \(error.localizedDescription)")
        print("retrying.......")
    }
}
In this code snippet, as the function name implies, I set active to false. I tried to find other options, but I could not find another way of stopping my recording session while preventing the other app that was already playing from resuming.
Can someone point me to the library I should look into? For example, whether I can tap into the hardware side and trigger the button event, or which framework listens for this button press and handles the pause/play functionality.
A friend of mine who is more experienced in iOS development suggested the following workaround and it worked. I am posting it here as it might help someone trying to achieve a similar behaviour.
In order to stop/pause what is currently being played on a user's device, you need to add a music player to your app. Then, at the point where you need to pause/stop the current media, you just initiate the player, play, and then pause/stop it - simple :)
like so:
let musicPlayer = MPMusicPlayerController.applicationQueuePlayer

func stopMedia() {
    MPMediaLibrary.requestAuthorization({ (newPermissionStatus: MPMediaLibraryAuthorizationStatus) in
        self.musicPlayer.setQueue(with: .songs())
        self.musicPlayer.play()
        print("Stopping music player")
        self.musicPlayer.pause()
        print("Stopped music player")
    })
}
The part with MPMediaLibrary.requestAuthorization is needed to avoid an authorisation error when accessing the user's media library, and of course you will need to add the Privacy - Media Library Usage Description key to your Info.plist file.
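In raw plist form, that privacy key looks like this (the description string below is just an example; write one that matches your app):

```xml
<key>NSAppleMusicUsageDescription</key>
<string>Used to briefly control playback so currently playing audio can be paused.</string>
```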

Error handling in AVAssetResourceLoaderDelegate

What is the recommended way of handling errors in AVAssetResourceLoaderDelegate, specifically for the network requests in shouldWaitForLoadingOfRequestedResource / shouldWaitForRenewalOfRequestedResource?
The problem is that there is no way for my app to know something went wrong until I start playing the video. I tried loadingRequest.finishLoading(with: err) and returning false in the shouldWaitForLoadingOfRequestedResource delegate call, but nothing was reported until the download finished and the user started playing the video.
The code goes like this:
let urlAsset = AVURLAsset(url: url)
urlAsset.resourceLoader.preloadsEligibleContentKeys = true
urlAsset.resourceLoader.setDelegate(self.fairPlayAssetLoaderDelegate, queue: DispatchQueue.global(qos: .default)) // fairPlayAssetLoaderDelegate is an object that implements AVAssetResourceLoaderDelegate
// send urlAsset to AVAssetDownloadURLSession.makeAssetDownloadTask
After setting preloadsEligibleContentKeys = true, it seems that shouldWaitForLoadingOfRequestedResource is called right away after the resource loader delegate is set on the AVURLAsset, but there is no way to tell whether the resource loader has successfully obtained all the information it needs.
I tried using loadValuesAsynchronously on AVURLAsset, but that's a no-go either.
AVAssetDownloadURLSession doesn't always report the error either.
Thanks in advance.
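One pattern that can surface such failures without starting playback is to fail the loading request with finishLoading(with:) and separately observe the player item's status via KVO, reacting when it becomes .failed. This is a sketch under assumptions, not a documented recipe; `fetchKey` is a hypothetical network helper standing in for the real key request.

```swift
import AVFoundation

// Sketch: propagate a network failure through the resource loader and
// surface it via KVO on the AVPlayerItem instead of waiting for playback.
final class KeyLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate {

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        fetchKey(for: loadingRequest.request.url) { result in
            switch result {
            case .success(let data):
                loadingRequest.dataRequest?.respond(with: data)
                loadingRequest.finishLoading()
            case .failure(let error):
                // Marks the request failed; the item's status should
                // eventually become .failed with this error attached.
                loadingRequest.finishLoading(with: error)
            }
        }
        return true
    }

    // Hypothetical helper; replace with your real key/playlist request.
    private func fetchKey(for url: URL?, completion: @escaping (Result<Data, Error>) -> Void) {
        completion(.failure(URLError(.notConnectedToInternet)))
    }
}

// Observe the item so the failure is reported without the user pressing play:
// let observation = playerItem.observe(\.status) { item, _ in
//     if item.status == .failed {
//         print("load failed: \(String(describing: item.error))")
//     }
// }
```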

CallKit cannot reactivate sound after swapping calls

I'm developing a CallKit application and I have a problem: call holding fails to restart audio when "swapping" calls on the CallKit screen, until the user returns to the in-app call screen. I can bypass this by setting:
supportsHolding = false
But how can I solve this problem properly? WhatsApp, for example, handles this correctly!
P.S. I'm using WebRTC to make the call.
Thanks!
EDIT:
This is the code of the provider:
public func provider(_ provider: CXProvider, perform action: CXSetHeldCallAction) {
    guard let call = conductor!.callWithUUID(uuid: action.callUUID) else {
        WebRtcConductor.debug("\(self.TAG) 🔴 failed to perform HeldAction: uuid: \(action.uuid), callUUID: \(action.callUUID)")
        action.fail()
        return
    }
    setIsHeld(call: call, isHeld: action.isOnHold)
    action.fulfill()
}
The setIsHeld function simply does:
audioTrack.isEnabled = enabled
If I use the "mute" button on the CallKit screen, everything works fine. But if I have 2 active calls and I swap from the WebRTC call to the normal call, CXSetHeldCallAction is called and the audio track is disabled. If I swap back to the WebRTC call, the audio track is enabled but I hear nothing; if I return to the main app screen, the audio works fine again!
Actually, there is a limitation in the Google WebRTC library that leads to the described problem when implementing a CallKit integration that supports swapping calls.
WebRTC Issue 8126 has been known for over a year now, but the fix is not yet integrated into the WebRTC master branch. However, you can find the necessary code changes in the original ticket.
As a workaround, you can trigger the system notification that WebRTC subscribes to internally:
Post an AVAudioSessionInterruptionType.ended notification in the "didActivate audioSession" method of the CallKit provider:
var userInfo = Dictionary<AnyHashable, Any>()
let interruptionEndedRaw = AVAudioSessionInterruptionType.ended.rawValue
userInfo[AVAudioSessionInterruptionTypeKey] = interruptionEndedRaw
NotificationCenter.default.post(name: NSNotification.Name.AVAudioSessionInterruption, object: self, userInfo: userInfo)
P.S.: Star the ticket to improve the chances of a merge ;-)
Had the same issue. If I have one active call and a new call comes in, I tap Hold & Accept. The new call works, but after using Swap in CallKit the audio stops working.
I found that the provider:performSetHeldCallAction: method from the CXProviderDelegate protocol is the spot where you can actually deactivate/activate audio for swapped calls via the CallKit native interface.
In my case I used the audioController.deactivateAudioSession() method for the call being put on hold.
But I found that the same provider:performSetHeldCallAction: method was also fired for the other call being made active (from the on-hold state) when tapping the Swap button in CallKit.
So you just need to deactivate/activate audio according to the call's state (held or not).
In general, it should look like this:
func provider(_ provider: CXProvider, perform action: CXSetHeldCallAction) {
    // Retrieve the Call instance corresponding to the action's call UUID
    guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
        action.fail()
        return
    }
    // Update the Call's underlying hold state.
    call.isOnHold = action.isOnHold
    // Stop or start audio in response to holding or unholding the call.
    if call.isOnHold {
        stopAudio()
    } else {
        startAudio()
    }
    // Signal to the system that the action has been successfully performed.
    action.fulfill()
}
P.S. It looks like you should have some class that is responsible for the audio session; it should implement something like "activate audio session" / "deactivate audio session".
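A minimal shape for such an audio-session class might look like the sketch below. This uses plain AVAudioSession; WebRTC builds often route this through RTCAudioSession instead, and the category/mode choices here are assumptions for a VoIP call, not the poster's actual code.

```swift
import AVFoundation

// Sketch: a small controller the CXProviderDelegate can call from
// startAudio()/stopAudio() when handling CXSetHeldCallAction.
final class AudioController {
    private let session = AVAudioSession.sharedInstance()

    func activateAudioSession() {
        do {
            // Typical VoIP configuration; adjust options to your needs.
            try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
            try session.setActive(true)
        } catch {
            print("Failed to activate audio session: \(error)")
        }
    }

    func deactivateAudioSession() {
        do {
            try session.setActive(false, options: .notifyOthersOnDeactivation)
        } catch {
            print("Failed to deactivate audio session: \(error)")
        }
    }
}
```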

NotificationCenter stops working when the screen is locked

I'm having trouble with an app I'm building. The app's objective is to play audio files: it requests an audio file from a public API, plays it, and waits until it ends; after it ends, it requests another audio file and starts over.
Here's a shortened version of the code that does this; I omitted the error-checking part for simplicity.
func requestEnded(audioSource: String) {
    let url = URL(string: "https://example.com/audios/" + audioSource)
    audio = AVPlayer(url: url!)
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(MediaItem.finishedPlaying(_:)),
                                           name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                           object: audio?.currentItem)
    audio?.play()
}

@objc func finishedPlaying(_ notification: NSNotification) {
    print("Audio ended")
    callAPI()
}

func callAPI() {
    // do all the http request stuff
    requestEnded(audioSource: "x.m4a")
}

// callAPI() is called when the app is initialized
It works well when the screen is unlocked. When I lock the phone, the current audio keeps playing, but when it ends, finishedPlaying() never gets called (the print statement is not shown in the console).
How can I make the app know the audio has ended and trigger another one, all while locked? In the Android version I got around the screen-lock problem by using a partial wakelock, which made it run normally even with the screen off.
It has to be done this way because the API decides the audio in real time and it's all done on the backend, so there is no way to buffer more than one audio file without breaking the requirements of the app.
What are my options here? Any help is appreciated.
