I'm currently facing a problem, and I'd be glad to hear from anyone who has been through it and came up with a simple solution.
So basically, the question is:
Is there any way to pause an ongoing AVAudioPlayer when external audio comes in, such as music, an incoming call, or a notification?
I know that for detecting a call there's the new CXCallObserverDelegate, but is there a simple solution that also covers music and the rest?
In terms of code, playback itself works fine. I load from a custom path with an error parameter, and the error comes back nil:
NSError *avPlayerError = nil;
NSData* mp3Data = [[NSData alloc] initWithContentsOfFile: path options: NSDataReadingMappedIfSafe error: &avPlayerError];
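For the call-detection part specifically, a minimal CXCallObserver sketch might look like this (an illustration only; the audioPlayer property is hypothetical):
import AVFoundation
import CallKit

final class CallMonitor: NSObject, CXCallObserverDelegate {
    private let callObserver = CXCallObserver()
    var audioPlayer: AVAudioPlayer?   // hypothetical reference to your player

    override init() {
        super.init()
        callObserver.setDelegate(self, queue: nil)   // nil queue = main queue
    }

    func callObserver(_ callObserver: CXCallObserver, callChanged call: CXCall) {
        if call.hasEnded {
            audioPlayer?.play()    // the call finished: resume
        } else {
            audioPlayer?.pause()   // incoming, dialing, or connected: pause
        }
    }
}
That said, the interruption notification shown in the answer below covers calls as well as other interruptions.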
I'm using this code in my application:
In AppDelegate's didFinishLaunchingWithOptions, add an observer for any incoming interruption:
NotificationCenter.default.addObserver(self, selector: #selector(self.onAudioSessionEvent(noti:)), name: Notification.Name.AVAudioSessionInterruption, object: nil)
When an interruption occurs, this function will be called:
@objc func onAudioSessionEvent(noti: Notification) {
    if noti.name == Notification.Name.AVAudioSessionInterruption {
        if let value = noti.userInfo![AVAudioSessionInterruptionTypeKey] as? NSNumber {
            if value.uintValue == AVAudioSessionInterruptionType.began.rawValue {
                print("start interrupt")
            } else {
                print("end interrupt")
            }
        }
    }
}
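To actually pause and resume instead of just printing, the handler can be extended along these lines (a sketch assuming an audioPlayer: AVAudioPlayer? property; the .shouldResume option keeps you from resuming when the system advises against it):
@objc func onAudioSessionEvent(noti: Notification) {
    guard let info = noti.userInfo,
          let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSessionInterruptionType(rawValue: typeValue) else { return }

    if type == .began {
        audioPlayer?.pause()                        // interruption started: pause playback
    } else if type == .ended {
        let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        let options = AVAudioSessionInterruptionOptions(rawValue: optionsValue)
        if options.contains(.shouldResume) {
            audioPlayer?.play()                     // system says it is OK to resume
        }
    }
}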
Related
I am facing a problem with audio when using CallKit with WebRTC for VoIP calls, while answering a call from the lock screen.
General functionality:
My app activates the audioSession when it's launched. For an incoming call, the SDP offer and answer are generated and exchanged, and the peer connection is set up. Both audio and video streams are generated, whether it's an audio or video call. Then the call is reported to CallKit using the following code:
callProvider.reportNewIncomingCall(with: currentCallUUID!, update: update) { error in }
If the app is in the foreground, it works fine.
But when the phone is locked and the user answers the call from the lock screen, the streams are exchanged but no audio comes through on either end until the user opens the app.
As soon as the user opens the app, audio becomes active on both ends.
All the background settings and capabilities are set properly.
I have also tried the following workaround provided by Apple staff, but even that does not work:
https://forums.developer.apple.com/thread/64544
As I mentioned, I am using WebRTC for calling. If I exchange the media streams and set up the peer connection only after the user answers the call (still on the lock screen), it works fine, but it adds a delay in making the call connection.
But if the peer connection is made before displaying the call (say, before reporting the call to CallKit), the audio stops working.
I was able to resolve this issue.
Steps that I followed:
I checked the code related to WebRTC here
I added the RTCAudioSession header file, which is actually a private class of WebRTC. Every time I receive a call event from signaling, I enable RTCAudioSession, and at the end of the call I disable it (see the sketch after these steps).
I had to render the incoming streams to a dummy view (it is not displayed while the call is in progress and the app is not yet open, but it is required to make audio work).
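For step 2, the enable/disable toggling amounts to something like this (a sketch; RTCAudioSession comes from WebRTC's private headers, so the exact surface can vary between revisions):
// On receiving the incoming-call event from signaling:
RTCAudioSession.sharedInstance().isAudioEnabled = true

// On call end:
RTCAudioSession.sharedInstance().isAudioEnabled = false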
I hope this will help if someone is facing the same issue.
@abhimanyu, are you still facing the issue or did you make it work? I am facing the same issue with CallKit.
As per my understanding, in the WebRTC M60 release they fixed an issue related to CallKit, which I think created a side effect and caused this problem.
The issue they fixed is related to the system AudioSession. Whenever CallKit presents the incoming call UI and plays the ringtone, CallKit takes control of the AudioSession, and after the user action (accept/decline) it releases control. In the WebRTC M60 release they added observers for this control exchange. That's why it works if the app is in the foreground. But if the phone is locked and an incoming call is accepted (I am assuming you are using the CallKit UI for the call and not redirecting the user to the app on accept from the lock screen), then, because of the native call UI, it is not possible for WebRTC to activate its own AudioSession instance while the call is going through the CallKit screen.
Link to the bug fixed in WebRTC M60: https://bugs.chromium.org/p/webrtc/issues/detail?id=7446
If you find a workaround for this issue, please let me know.
Please note that I'm sharing my code, which is tailored to my needs; I share it for reference. You will need to change it according to your own requirements.
When you receive a VoIP notification, create a new instance of your WebRTC handling class, and
add these two lines to the code block, because enabling the audio session from the VoIP notification fails:
RTCAudioSession.sharedInstance().useManualAudio = true
RTCAudioSession.sharedInstance().isAudioEnabled = false
The didReceive method:
func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType, completion: @escaping () -> Void) {
    let state = UIApplication.shared.applicationState
    if payload.dictionaryPayload["hangup"] == nil && state != .active {
        // I pass parameters to the WebRTC handler via a Globals singleton to create an answer according to the SDP sent in the payload.
        Globals.voipPayload = payload.dictionaryPayload as! [String: Any]
        RTCAudioSession.sharedInstance().useManualAudio = true
        RTCAudioSession.sharedInstance().isAudioEnabled = false
        Globals.sipGateway = SipGateway() // my WebRTC and Janus gateway handler class
        // I check the Janus gateway credentials stored in shared preferences, initiate the websocket connection,
        // and create a peer connection to my Janus gateway, which is the signaling server for my environment.
        Globals.sipGateway?.configureCredentials(true)
        initProvider() // creating the CallKit provider
        self.update.remoteHandle = CXHandle(type: .generic, value: String(describing: payload.dictionaryPayload["caller_id"]!))
        Globals.callId = UUID()
        Globals.provider.reportNewIncomingCall(with: Globals.callId, update: self.update, completion: { error in
        })
    }
}
func initProvider() {
    let config = CXProviderConfiguration(localizedName: "ulakBEL")
    config.iconTemplateImageData = UIImage(named: "ulakbel")!.pngData()
    config.ringtoneSound = "ringtone.caf"
    // config.includesCallsInRecents = false
    config.supportsVideo = false
    Globals.provider = CXProvider(configuration: config)
    Globals.provider.setDelegate(self, queue: nil)
    update = CXCallUpdate()
    update.hasVideo = false
    update.supportsDTMF = true
}
Modify your didActivate and didDeactivate delegate functions as below:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    print("CallManager didActivate")
    RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = true
    // self.callDelegate?.callIsAnswered()
}

func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    print("CallManager didDeactivate")
    RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = false
}
In the WebRTC handler class, configure the media senders and audio session:
private func createPeerConnection(webRTCCallbacks: PluginHandleWebRTCCallbacksDelegate) {
    let rtcConfig = RTCConfiguration.init()
    rtcConfig.iceServers = server.iceServers
    rtcConfig.bundlePolicy = RTCBundlePolicy.maxBundle
    rtcConfig.rtcpMuxPolicy = RTCRtcpMuxPolicy.require
    rtcConfig.continualGatheringPolicy = .gatherContinually
    rtcConfig.sdpSemantics = .planB
    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])
    pc = sessionFactory.peerConnection(with: rtcConfig, constraints: constraints, delegate: nil)
    self.createMediaSenders()
    self.configureAudioSession()
    if webRTCCallbacks.getJsep() != nil {
        handleRemoteJsep(webrtcCallbacks: webRTCCallbacks)
    }
}
Media senders:
private func createMediaSenders() {
    let streamId = "stream"
    // Audio
    let audioTrack = self.createAudioTrack()
    self.pc.add(audioTrack, streamIds: [streamId])
    // Video
    /* let videoTrack = self.createVideoTrack()
    self.localVideoTrack = videoTrack
    self.peerConnection.add(videoTrack, streamIds: [streamId])
    self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack
    // Data
    if let dataChannel = createDataChannel() {
        dataChannel.delegate = self
        self.localDataChannel = dataChannel
    } */
}
private func createAudioTrack() -> RTCAudioTrack {
    let audioConstraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
    let audioSource = sessionFactory.audioSource(with: audioConstraints)
    let audioTrack = sessionFactory.audioTrack(with: audioSource, trackId: "audio0")
    return audioTrack
}
Audio session:
private func configureAudioSession() {
    self.rtcAudioSession.lockForConfiguration()
    do {
        try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
        try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
    } catch let error {
        debugPrint("Error changing AVAudioSession category: \(error)")
    }
    self.rtcAudioSession.unlockForConfiguration()
}
Please bear in mind that, because I worked with callbacks and delegates, the code includes delegate and callback chunks; ignore them as appropriate.
For reference, you can also check the example at this link.
I am making an audio player in my iOS project.
I set up the play(audioOfUrl:for:) method by passing the URL and the number of times the audio file should play, as follows:
func play(audioOfUrl: URL, for times: Int) {
    loopsLeftOver = times
    do {
        let audio = try AVAudioPlayer(contentsOf: audioOfUrl)
        audioPlayer = audio
        audio.numberOfLoops = times - 1  // set the loop count before starting playback
        audio.play()
    } catch let error {
        print(error.localizedDescription)
    }
}
There will be a pause() and resume() as well. What I need is to keep track of how many loops the audio player has left.
For example, if I call the play method like this:
play(audioOfUrl:audioUrl, for times:5)
I need to keep an eye on the remaining play count while the audio player is running.
I tried the AVAudioPlayerDelegate methods, but that did not work. How can I keep track of audio.numberOfLoops? Thanks in advance.
You can subscribe to the AVPlayerItemDidPlayToEndTime notification and count how many times its handler has been called.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(playerItemDidReachEnd(_:)),
                                       name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                       object: audioPlayer.currentItem)
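A sketch of the counting handler (this assumes audioPlayer is an AVPlayer, since AVPlayerItemDidPlayToEndTime is posted by AVPlayerItem rather than AVAudioPlayer; loopsLeftOver is the counter from the question):
var loopsLeftOver = 0   // set this to the requested play count before starting playback

@objc func playerItemDidReachEnd(_ notification: Notification) {
    loopsLeftOver -= 1                      // one pass through the file has finished
    if loopsLeftOver > 0 {
        audioPlayer.seek(to: kCMTimeZero)   // rewind and start the next loop
        audioPlayer.play()
    }
}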
I have been streaming music from a remote source using AVPlayer. I get URLs and use one to create an AVPlayerItem, which I then associate with my instance of AVPlayer. I add an observer to the item to observe when it finishes playing (AVPlayerItemDidPlayToEndTimeNotification). When the observer notifies me at item end, I create a new AVPlayerItem and do it all over again. This works well in the foreground AND in the background on iOS 9.2.
Problem: Since I have updated to iOS 9.3 this does not work in the background. Here is the relevant code:
var portionToBuffer = Double()
var player = AVPlayer()
func prepareAudioPlayer(songNSURL: NSURL, portionOfSongToBuffer: Double) {
    self.portionToBuffer = portionOfSongToBuffer
    // Create AVPlayerItem
    let createdItem = AVPlayerItem(URL: songNSURL)
    // Associate createdItem with AVPlayer
    player = AVPlayer(playerItem: createdItem)
    // Add item-end observer
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: player.currentItem)
    // Use KVO to see how much is loaded
    player.currentItem?.addObserver(self, forKeyPath: "loadedTimeRanges", options: .New, context: nil)
}
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "loadedTimeRanges" {
        if let loadedRangeAsNSValueArray = player.currentItem?.loadedTimeRanges {
            let loadedRangeAsCMTimeRange = loadedRangeAsNSValueArray[0].CMTimeRangeValue
            let endPointLoaded = CMTimeRangeGetEnd(loadedRangeAsCMTimeRange)
            let secondsLoaded = CMTimeGetSeconds(endPointLoaded)
            print("the endPointLoaded is \(secondsLoaded) and the duration is \(CMTimeGetSeconds((player.currentItem?.duration)!))")
            if secondsLoaded >= portionToBuffer {
                player.currentItem?.removeObserver(self, forKeyPath: "loadedTimeRanges")
                player.play()
            }
        }
    }
}
func playerItemDidReachEnd(notification: NSNotification) {
    receivedItemEndNotification()
}

func receivedItemEndNotification() {
    // Register background task
    bgTasker.registerBackgroundTask()
    if session.playlistSongIndex == session.playlistSongTitles.count - 1 {
        session.playlistSongIndex = 0
    } else {
        session.playlistSongIndex += 1
    }
    prepareAudioPlayer(songNSURL: session.songURLs[session.playlistSongIndex], portionOfSongToBuffer: 30.00)
}
I have set breakpoints to confirm that player.play() IS being called in the background. When I print player.rate it reads 0.0. I have checked the playbackLikelyToKeepUp property of the AVPlayerItem and it is true. I have also confirmed that the new URL is successfully used to create the new AVPlayerItem and associate it with the AVPlayer while the app is in the background. I have turned on the audio and AirPlay background capabilities, and I have even opened a finite-length background task (bgTasker.registerBackgroundTask in the code above). No idea what is going on.
I found THIS but I'm not sure it helps. Any advice would be great, thanks.
When the observer notifies me at the item end, I then create a new AVPlayerItem and do it all over again
But the problem is that in the meantime playback stops, and the rule is that background playing is permitted only as long as you were playing in the foreground and continue to play in the background.
I would suggest using AVQueuePlayer instead of AVPlayer. This lets you enqueue the next item while the current item is still playing, and thus, we may hope, counts as continuing to play.
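A minimal sketch of that idea (the URLs firstURL and nextURL are placeholders for your existing playlist logic):
import AVFoundation

let queuePlayer = AVQueuePlayer()

// Enqueue the first two items up front so the player always has something queued
// when the current item finishes; after: nil appends to the end of the queue.
queuePlayer.insert(AVPlayerItem(url: firstURL), after: nil)
queuePlayer.insert(AVPlayerItem(url: nextURL), after: nil)
queuePlayer.play()

// Later, as each item ends, keep the queue topped up from the same place you
// previously created the replacement AVPlayerItem.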
I encountered a similar problem and searched many websites on Google, but didn't find an answer.
The Phenomenon
The problem in my app is that when I start playing an audio file and send the app to the background, it finishes playing the current file; when playing the second file, it loads some data but then stops. If I bring the app to the foreground, it plays the audio.
Solution
My solution is to add the following call:
UIApplication.shared.beginReceivingRemoteControlEvents()
So enabling the background audio capability is not enough; we also need to begin receiving remote control events.
In my application, I use AVPlayer to play streams (m3u8 files) over the HLS protocol. I need to know how many times, during a streaming session, the client switches bitrate.
Let's assume the client's bandwidth is increasing, so the client switches to a higher-bitrate segment.
Can AVPlayer detect this switch?
Thanks.
I have had a similar problem recently. The solution felt a bit hacky, but it worked as far as I could tell. First, I set up an observer for new access log notifications:
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(handleAVPlayerAccess:)
                                              name:AVPlayerItemNewAccessLogEntryNotification
                                            object:nil];
This calls the function below. It can probably be optimised, but here is the basic idea:
- (void)handleAVPlayerAccess:(NSNotification *)notif {
    AVPlayerItemAccessLog *accessLog = [((AVPlayerItem *)notif.object) accessLog];
    AVPlayerItemAccessLogEvent *lastEvent = accessLog.events.lastObject;
    float lastEventNumber = lastEvent.indicatedBitrate;
    if (lastEventNumber != self.lastBitRate) {
        // Here is where you can increment a variable to keep track of the number of times you switch your bit rate.
        NSLog(@"Switch indicatedBitrate from: %f to: %f", self.lastBitRate, lastEventNumber);
        self.lastBitRate = lastEventNumber;
    }
}
Every time there is a new entry in the access log, it checks the last indicated bitrate from the most recent entry (the lastObject in the access log for the player item). It compares this indicated bitrate with a property that stores the bitrate from the last change.
BoardProgrammer's solution works great! In my case, I needed the indicated bitrate to detect when the content quality switched from SD to HD. Here is the Swift 3 version.
// Add observer.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleAVPlayerAccess),
                                       name: NSNotification.Name.AVPlayerItemNewAccessLogEntry,
                                       object: nil)

// Handle notification.
@objc func handleAVPlayerAccess(notification: Notification) {
    guard let playerItem = notification.object as? AVPlayerItem,
          let lastEvent = playerItem.accessLog()?.events.last else {
        return
    }
    let indicatedBitrate = lastEvent.indicatedBitrate
    // Use bitrate to determine bandwidth decrease or increase.
}
My background audio works fine almost all the time: screen locked, or mute switch on. But when the user has the application in the background and receives a call, even if the user doesn't answer it, the background audio does not resume after the interruption ends.
The Music app properly resumes background audio if it was interrupted.
Am I missing some property, or do I need a callback or a background task to continue background audio execution after an interruption? Or is this something that we can't do?
We are resuming our audio after an outbound call and inbound call while the app is in the background.
We play audio with AVAudioPlayer and listen for AVAudioSessionInterruptionNotification. Apple automatically pauses the AVAudioPlayer for you on an interruption, and when you tell it to resume after the interruption is over, Apple will set your session active again. See Table 3-2 of Handling Audio Interruptions for recommendations if you are using other audio technologies.
Subscribe to the notification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(onAudioSessionEvent:) name:AVAudioSessionInterruptionNotification object:nil];
Handle the notification:
- (void)onAudioSessionEvent:(NSNotification *)notification
{
    // Check the type of notification, especially if you are sending multiple AVAudioSession events here
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        NSLog(@"Interruption notification received!");
        // Check to see if it was a Begin interruption
        if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeBegan]]) {
            NSLog(@"Interruption began!");
        } else {
            NSLog(@"Interruption ended!");
            // Resume your audio
        }
    }
}
In my experience I found I had to "wait a second or two" before attempting to reopen the audio device. I think that after the OS switches back to your app from the call, the phone app is still shutting down.
Something like this, when you return to the foreground after learning that your audio session has been stopped:
dispatch_time_t restartTime = dispatch_time(DISPATCH_TIME_NOW,
                                            (int64_t)(1.5 * NSEC_PER_SEC));
dispatch_after(restartTime, dispatch_get_global_queue(0, 0), ^{
    [audioSystem restart];
});
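In Swift the same delayed restart could read (a sketch; restartAudio() stands in for whatever reactivates your session and player):
DispatchQueue.global().asyncAfter(deadline: .now() + 1.5) {
    self.restartAudio()   // hypothetical helper: reactivate the session and resume playback
}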
I am using Xamarin, so this is C#, but the following code is working for me. First I declare that my AppDelegate implements the AVAudioSession delegate methods by including IAVAudioSessionDelegate:
public partial class AppDelegate: UIApplicationDelegate, IAVAudioSessionDelegate
I added a variable to the AppDelegate class for the audio session:
public static AVAudioSession AudioSession;
In the override of the FinishedLaunching method:
AudioSession = AVAudioSession.SharedInstance();
AudioSession.Delegate = this;
error = AudioSession.SetCategory( AVAudioSessionCategory.Playback, AVAudioSessionCategoryOptions.DuckOthers );
AudioSession.SetMode( new NSString( "AVAudioSessionModeMoviePlayback" ), out error );
The two relevant delegate methods in AppDelegate.cs are:
[Export( "beginInterruption" )]
public void BeginInterruption()
{
PlayerViewController.BeginSessionInterruption();
}
[Export( "endInterruptionWithFlags:" )]
public void EndInterruption( AVAudioSessionInterruptionFlags flags )
{ // ignore the flags so we're not dependent on the interrupting event saying that we can resume
PlayerViewController.ResumeAfterSessionInterruption();
}
My AppDelegate also has an override of OnActivated to enable the video tracks if it's a video asset, and an override of DidEnterBackground to disable the media's video tracks but still play the audio.
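In Swift terms, that track toggling would look roughly like this (a sketch; the original answer does it in C#/Xamarin):
func setVideoTracks(enabled: Bool, for item: AVPlayerItem) {
    for track in item.tracks {
        // Only toggle video tracks; audio keeps playing in the background.
        if track.assetTrack?.mediaType == .video {
            track.isEnabled = enabled
        }
    }
}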
In the PlayerViewController.BeginSessionInterruption() method, I can't look at Player.Rate to see if the player was running at the time, because the interrupting alarm or phone call has already paused the player. From the "Responding to Interruptions" section of Apple's Audio Session Programming Guide, with my emphasis added:
Your app is active, playing back audio.
A phone call arrives. The system activates the phone app’s audio session.
The system deactivates your audio session. At this point, *playback in your app has stopped*.
The system invokes your interruption listener callback function indicating that your session has been deactivated.
...
My PlayerViewController's Play button has a Paused property to toggle between Play and Pause and draw the appropriate button image. So instead of checking Player.Rate, I look to see if the button's Paused property is false:
public void BeginSessionInterruption()
{
    PlayerWasRunningOnInterruption = !btnPlay.Paused;
    TogglePlayPause( true ); // put the Play button into the Paused state to agree with the player being stopped
}

public void ResumeAfterSessionInterruption()
{
    NSError error;
    AppDelegate.AudioSession.SetActive( true, AVAudioSessionSetActiveOptions.NotifyOthersOnDeactivation, out error ); // always reactivate the audio session
    if ( PlayerWasRunningOnInterruption )
    {
        // rewind a little bit
        // call btnPlayClick to resume the playback as if the user pressed the Play button
    }
}
Look at:
https://developer.apple.com/documentation/avfoundation/avaudiosession#//apple_ref/doc/uid/TP40008240-CH1-DontLinkElementID_3
https://developer.apple.com/documentation/avfoundation/avaudiosession/responding_to_audio_session_interruptions
Code:
private func setupAudioSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
        try AVAudioSession.sharedInstance().setActive(true)
        setupAudioNotifications()
    } catch {
        print(error)
    }
}

private func setupAudioNotifications() {
    NotificationCenter.default.addObserver(self, selector: #selector(handleInterruption), name: .AVAudioSessionInterruption, object: nil)
}
@objc func handleInterruption(notification: Notification) {
    guard let userInfo = notification.userInfo,
          let typeValue = userInfo[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSessionInterruptionType(rawValue: typeValue) else {
        return
    }
    if type == .began {
        // Interruption began, take appropriate actions
        Player.shared.stop()
    } else if type == .ended {
        if let optionsValue = userInfo[AVAudioSessionInterruptionOptionKey] as? UInt {
            let options = AVAudioSessionInterruptionOptions(rawValue: optionsValue)
            if options.contains(.shouldResume) {
                // Interruption ended - playback should resume
                Player.shared.start()
            } else {
                // Interruption ended - playback should NOT resume
            }
        }
    }
}
If you only need to detect the call status, you can use CTCallCenter; but if you want to detect interruptions in general (including call interruptions), you can use AVAudioSessionInterruptionNotification. And if you want to support background playback, you should add code like this:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];
before you resume playing music in the background. Hope this can help.
Only two options work for me:
reload the AVPlayer or AVAudioPlayer after the interruption ends:
func interruptionNotification(_ notification: Notification) {
    guard let type = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let interruption = AVAudioSessionInterruptionType(rawValue: type) else {
        return
    }
    if interruption == .ended && playerWasPlayingBeforeInterruption {
        player.replaceCurrentItem(with: AVPlayerItem(url: radioStation.url))
        play()
    }
}
use
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)