I'm using MPRemoteCommandCenter and MPMusicPlayerController.applicationMusicPlayer on the iPhone.
I'm trying to receive remote control events when the user is playing music and double taps on the headphone button.
If I use AVAudioPlayer, the remote commands are received perfectly.
However, if I use MPMusicPlayerController with any of its players (systemMusicPlayer, applicationMusicPlayer, or applicationQueuePlayer), the commands are not received. They appear to get gobbled up. For example, when I double tap the remote, the music toggles between play and stop. Instead, I need the remote events sent to my app.
Below is a sample app with my code. In the Info.plist I've specified the required background mode for an app that plays audio (although it's not necessary).
import UIKit
import MediaPlayer

class ViewController: UIViewController {

    var mpPlayer: MPMusicPlayerController!

    @objc func remoteHandler() {
        print("success")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        mpPlayer = MPMusicPlayerController.applicationMusicPlayer()
        //mpPlayer = MPMusicPlayerController.systemMusicPlayer()
        assert(mpPlayer != nil)

        let cc = MPRemoteCommandCenter.shared()
        print("cc = \(cc)")
        cc.nextTrackCommand.isEnabled = true
        cc.nextTrackCommand.addTarget(self, action: #selector(ViewController.remoteHandler))
        cc.previousTrackCommand.isEnabled = true
        cc.previousTrackCommand.addTarget(self, action: #selector(ViewController.remoteHandler))
        cc.playCommand.isEnabled = true
        cc.playCommand.addTarget(self, action: #selector(ViewController.remoteHandler))

        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
            print("AVAudioSession successfully set AVAudioSessionCategoryPlayback")
        } catch let error as NSError {
            print("AVAudioSession setCategory error: \(error.localizedDescription)")
        }

        mpPlayer.setQueueWithStoreIDs(["270139033"]) // requires iOS 10.3
        mpPlayer.play()
    }
}
Output is:
cc = 0x123e086c0
AVAudioSession successfully set AVAudioSessionCategoryPlayback
remoteHandler is never called.
From the Apple Developer website:
When you use either the system or application player, you do not get
event notifications. Those players automatically handle events.
So there is no way to receive remote control events if you use MPMusicPlayerController. Looking forward to seeing this feature! Right now, MPMusicPlayerController is the only way to play Apple Music songs.
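One partial workaround, for what it's worth: the events themselves are swallowed, but MPMusicPlayerController will still tell you about the state changes they cause. A sketch in the same Swift 3-era style as the sample app above (add to viewDidLoad):

    // Observe the state changes caused by the (swallowed) remote events.
    mpPlayer.beginGeneratingPlaybackNotifications()
    NotificationCenter.default.addObserver(
        forName: .MPMusicPlayerControllerPlaybackStateDidChange,
        object: mpPlayer,
        queue: .main) { [weak self] _ in
            // Fires after the player has handled play/pause/skip itself,
            // including changes triggered by the headphone remote.
            guard let state = self?.mpPlayer.playbackState else { return }
            print("playback state changed: \(state.rawValue)")
    }

This doesn't deliver the raw remote events, but it at least lets you react to what the player did with them.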
Related
I am creating a music player iOS app and getting data from Firebase. I am able to control play, pause, next, and previous in the simulator or on an iPhone. But while a headset is connected to the device, the play, next, and previous functions do not work properly.
Here is the code I've used:
func setupRemoteCommandCenter() {
    // Get the shared MPRemoteCommandCenter
    let commandCenter = MPRemoteCommandCenter.shared()

    // Add handler for Play Command
    commandCenter.playCommand.addTarget { event in
        player?.play()
        print("headset play")
        return .success
    }

    // Add handler for Pause Command
    commandCenter.pauseCommand.addTarget { event in
        player?.pause()
        print("headset pause")
        return .success
    }

    // Add handler for Next Command
    commandCenter.nextTrackCommand.addTarget { event in
        return .success
    }

    // Add handler for Previous Command
    commandCenter.previousTrackCommand.addTarget { event in
        return .success
    }
}
And I call the setupRemoteCommandCenter function in viewDidLoad.
This document says that to receive player event notifications you need to:
- begin playing audio
- be the "Now playing app"
The definition of "Now playing app" is hard to pin down, but it seems to be any app that has an active, non-mixable audio session and is playing audio (or has very recently played audio; there seems to be a brief grace period here). One possible non-mixable audio session is:
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayback)
    try session.setActive(true)
} catch let err as NSError {
    print("Error setting up non mixable audio session \(err)")
}
P.S. If you want the headset controls to work while the screen is locked (or if you want the lock screen controls to work, for that matter), you will need to add audio to Background Modes, because technically, if the screen is locked, your app has been backgrounded.
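Putting this answer together, here is a minimal end-to-end sketch of the ordering; the AVPlayer parameter is my assumption, since the question doesn't show the player type:

    import AVFoundation
    import MediaPlayer

    func startPlaybackAndCommands(player: AVPlayer) {
        // 1. Activate a non-mixable audio session first.
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(AVAudioSessionCategoryPlayback)
            try session.setActive(true)
        } catch {
            print("Audio session error: \(error)")
        }

        // 2. Actually start audio, so the system treats you as the "Now playing app".
        player.play()

        // 3. Only now are the remote command handlers reliably delivered.
        let commandCenter = MPRemoteCommandCenter.shared()
        commandCenter.pauseCommand.addTarget { _ in
            player.pause()
            return .success
        }
    }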
I am facing a problem with audio when using CallKit with WebRTC for a VoIP call, while answering the call from the lock screen.
General functionality:
My app activates the audioSession when it's launched. For an incoming call, the SDP offer and answer are generated and exchanged, and the peer connection is set up. Both audio and video streams are generated, whether it's an audio call or a video call. Then the call is reported to CallKit using the following code:
callProvider.reportNewIncomingCall(with: currentCallUUID!, update: update) { error in }
If the app is in the foreground, it works fine.
But when the phone is locked and the user answers the call from the lock screen, the streams are exchanged but no audio comes through on either end until the user enters the app.
As soon as the user enters the app, audio becomes active on both ends.
All the background settings and capabilities are set properly.
I have also tried the following workaround provided by Apple staff, but even that does not work:
https://forums.developer.apple.com/thread/64544
As I mentioned, I am using WebRTC for calling. If I exchange the media streams after the user answers the call (still on the lock screen) and set up the peer connection at that time, it works fine (but it adds a delay in making the call connection).
But if the peer connection is made before displaying the call (say, before reporting the call to CallKit), the audio stops working.
I was able to resolve this issue.
Steps that I followed:
1. I checked the code related to WebRTC here.
2. I added the RTCAudioSession header file, which is actually a private class of WebRTC. So every time I receive a call event from signaling, I enable RTCAudioSession, and at the end of the call I disable it.
3. I had to render the incoming streams to a dummy view (a sketch follows below). Although the view is not displayed while the call is in progress and the app is not yet open, it is required to make the audio work.
I hope this helps if someone is facing the same issue.
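For step 3, a sketch of what "render to a dummy view" can look like; RTCEAGLVideoView and the add(_:) renderer API are from the WebRTC iOS framework, and exact names may differ between WebRTC builds:

    import WebRTC

    // Kept as a property so it outlives the call setup.
    let dummyRenderer = RTCEAGLVideoView(frame: .zero)

    func attachToDummyView(remoteVideoTrack: RTCVideoTrack) {
        // Deliberately never added to the view hierarchy; rendering to it is
        // only needed to keep media flowing while the device is locked.
        remoteVideoTrack.add(dummyRenderer)
    }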
@abhimanyu, are you still facing the issue, or did you make it work? I am facing the same issue with CallKit.
As per my understanding, in the WebRTC M60 release they fixed an issue related to CallKit, which I think created a side effect and caused this issue.
The issue they fixed is related to the system AudioSession: whenever CallKit presents the incoming call UI and plays the ringtone, CallKit takes control of the AudioSession, and after the user action (accept/decline) it releases control. In the WebRTC M60 release, they added observers for this control exchange. That's why it works if the app is in the foreground. But if the phone is locked and an incoming call is accepted (I am assuming you are using the CallKit UI for the call and not redirecting the user to the app on accept from the lock screen), then, because of the native call UI, it is not possible for WebRTC to activate its own AudioSession instance, as the call is going through the CallKit screen.
Link to the bug that was fixed in WebRTC M60: https://bugs.chromium.org/p/webrtc/issues/detail?id=7446
If you find any workaround for this issue, please let me know.
Please note that I am sharing my code; it is tailored to my needs, and I share it for reference. You will need to adapt it to your own requirements.
When you receive a VoIP notification, create a new instance of your WebRTC handling class, and
add these two lines to the code block, because enabling the audio session from a VoIP notification fails:
RTCAudioSession.sharedInstance().useManualAudio = true
RTCAudioSession.sharedInstance().isAudioEnabled = false
The didReceive method:
func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType, completion: @escaping () -> Void) {
    let state = UIApplication.shared.applicationState
    if payload.dictionaryPayload["hangup"] == nil && state != .active {
        // I pass parameters to the WebRTC handler via the Globals singleton to create
        // the answer according to the SDP sent in the payload.
        Globals.voipPayload = payload.dictionaryPayload as! [String: Any]
        RTCAudioSession.sharedInstance().useManualAudio = true
        RTCAudioSession.sharedInstance().isAudioEnabled = false
        Globals.sipGateway = SipGateway() // my WebRTC and Janus gateway handler class
        // Check the Janus gateway credentials stored in shared preferences, initiate the
        // websocket connection, and create the peer connection to my Janus gateway,
        // which is the signaling server for my environment.
        Globals.sipGateway?.configureCredentials(true)
        initProvider() // Creating the CallKit provider
        self.update.remoteHandle = CXHandle(type: .generic, value: String(describing: payload.dictionaryPayload["caller_id"]!))
        Globals.callId = UUID()
        Globals.provider.reportNewIncomingCall(with: Globals.callId, update: self.update, completion: { error in
        })
    }
}
func initProvider() {
    let config = CXProviderConfiguration(localizedName: "ulakBEL")
    config.iconTemplateImageData = UIImage(named: "ulakbel")!.pngData()
    config.ringtoneSound = "ringtone.caf"
    // config.includesCallsInRecents = false
    config.supportsVideo = false

    Globals.provider = CXProvider(configuration: config)
    Globals.provider.setDelegate(self, queue: nil)

    update = CXCallUpdate()
    update.hasVideo = false
    update.supportsDTMF = true
}
Modify your didActivate and didDeactivate delegate functions as below:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    print("CallManager didActivate")
    RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = true
    // self.callDelegate?.callIsAnswered()
}

func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    print("CallManager didDeactivate")
    RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = false
}
In the WebRTC handler class, configure the media senders and the audio session:
private func createPeerConnection(webRTCCallbacks: PluginHandleWebRTCCallbacksDelegate) {
    let rtcConfig = RTCConfiguration.init()
    rtcConfig.iceServers = server.iceServers
    rtcConfig.bundlePolicy = RTCBundlePolicy.maxBundle
    rtcConfig.rtcpMuxPolicy = RTCRtcpMuxPolicy.require
    rtcConfig.continualGatheringPolicy = .gatherContinually
    rtcConfig.sdpSemantics = .planB

    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])

    pc = sessionFactory.peerConnection(with: rtcConfig, constraints: constraints, delegate: nil)

    self.createMediaSenders()
    self.configureAudioSession()

    if webRTCCallbacks.getJsep() != nil {
        handleRemoteJsep(webrtcCallbacks: webRTCCallbacks)
    }
}
Media senders:
private func createMediaSenders() {
    let streamId = "stream"

    // Audio
    let audioTrack = self.createAudioTrack()
    self.pc.add(audioTrack, streamIds: [streamId])

    // Video
    /* let videoTrack = self.createVideoTrack()
    self.localVideoTrack = videoTrack
    self.peerConnection.add(videoTrack, streamIds: [streamId])
    self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack

    // Data
    if let dataChannel = createDataChannel() {
        dataChannel.delegate = self
        self.localDataChannel = dataChannel
    } */
}

private func createAudioTrack() -> RTCAudioTrack {
    let audioConstrains = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
    let audioSource = sessionFactory.audioSource(with: audioConstrains)
    let audioTrack = sessionFactory.audioTrack(with: audioSource, trackId: "audio0")
    return audioTrack
}
Audio session:
private func configureAudioSession() {
    self.rtcAudioSession.lockForConfiguration()
    do {
        try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
        try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
    } catch let error {
        debugPrint("Error changing AVAudioSession category: \(error)")
    }
    self.rtcAudioSession.unlockForConfiguration()
}
Please note that, because I worked with callbacks and delegates, the code includes delegate and callback chunks; you can ignore them as appropriate.
For reference, you can also check the example at this link.
I am testing this using iOS 10.2 on my actual iPhone 6s device.
I am playing streamed audio and am able to play/pause audio, skip tracks, etc. I also have enabled background modes and the audio plays in the background and continues through a playlist properly. The only issue I am having is getting the lock screen controls to show up. Nothing displays at all...
In viewDidLoad() of my MainViewController, right when my app launches, I call this...
func setupAudioSession() {
    UIApplication.shared.beginReceivingRemoteControlEvents()
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: AVAudioSessionCategoryOptions.mixWithOthers)
        self.becomeFirstResponder()
        do {
            try AVAudioSession.sharedInstance().setActive(true)
            print("AVAudioSession is Active")
        } catch let error as NSError {
            print(error.localizedDescription)
        }
    } catch let error as NSError {
        print(error.localizedDescription)
    }
}
and then in my AudioPlayer class, after I begin playing audio, I call...
func setupLockScreen() {
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.nextTrackCommand.isEnabled = true
    commandCenter.nextTrackCommand.addTarget(self, action: #selector(skipTrack))
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: "TESTING"]
}
When I lock my iPhone and then tap the power button again to go to the lock screen, the audio controls are not displayed at all. It is as if no audio is playing; I just see my normal background photo. Also, no controls are displayed in Control Center (swiping up on the home screen and then swiping left to where the music controls should be).
Is the issue because I am not using AVAudioPlayer or AVPlayer? But then how does, for example, Spotify get the lock screen controls to display using its own custom audio player? Thanks for any advice/help.
The issue turned out to be this line...
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: AVAudioSessionCategoryOptions.duckOthers)
Once I changed it to
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: [])
everything worked fine. So it seems that passing in any AVAudioSessionCategoryOptions argument causes the lock screen controls not to display. I also tried passing in .mixWithOthers, and that too caused the lock screen controls not to be displayed.
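Once the controls do appear, you can populate them more fully. A sketch using the standard MPNowPlayingInfoCenter keys (the function and its parameters here are mine, not from the question):

    import MediaPlayer

    func updateNowPlayingInfo(title: String, duration: TimeInterval, elapsed: TimeInterval, isPlaying: Bool) {
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPMediaItemPropertyPlaybackDuration: duration,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: elapsed,
            // 0.0 shows the lock screen controls as paused, 1.0 as playing.
            MPNowPlayingInfoPropertyPlaybackRate: isPlaying ? 1.0 : 0.0
        ]
    }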
In Swift 4. This example only shows the player on the lock screen and works with iOS 11. To learn how to play audio on the device, you can follow this thread: https://stackoverflow.com/a/47710809/1283517
import MediaPlayer
import AVFoundation
Declare the player:
var player : AVPlayer?
Now create a function:
func setupLockScreen() {
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.nextTrackCommand.isEnabled = true
    commandCenter.togglePlayPauseCommand.addTarget(self, action: #selector(controlPause))
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: currentStation]
}
Now create a function to handle the play and pause events. I have created a Bool, isPlaying, to track the status of the player.
@objc func controlPause() {
    if isPlaying == true {
        player?.pause()
        isPlaying = false
    } else {
        player?.play()
        isPlaying = true
    }
}
And that's it. Now the player will be displayed on the lock screen.
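As a variation, you can skip the separate isPlaying flag and ask the player directly; a sketch assuming the same AVPlayer as above:

    @objc func controlPauseUsingRate() {
        // AVPlayer's rate is 0 when paused, non-zero when playing.
        if player?.rate == 0 {
            player?.play()
        } else {
            player?.pause()
        }
    }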
Yes, for the lock screen controls to work you need to use iOS APIs to play audio. I'm not sure how Spotify does it, but they may be using a second audio session in parallel for this purpose and using the controls to control both. Your background handler (the singleton in my case) could start playing the second audio stream at 0 volume when the app goes into the background and stop it when in the foreground. I haven't tested it myself, but it's an option to try.
I am using Google Cloud Messaging to push notifications to my iOS app, written in Swift 2.0 with Xcode 7.1. GCM doesn't allow custom notification sounds.
See here: https://developers.google.com/cloud-messaging/http-server-ref
So I turned off the default sound and am trying to play a sound whenever didReceiveRemoteNotification is called. My problem is that the sound does not play while in background mode. However, if I put the code (below) in didFinishLaunchingWithOptions, it works perfectly, just not when I want it to play.
I have added the background mode in Info.plist. Like I said, it works, just not when a push notification arrives.
let alertSound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("GMNotification", ofType: "wav")!)
print(alertSound)
//var error: NSError?
do {
    try self.audioPlayer = AVAudioPlayer(contentsOfURL: alertSound, fileTypeHint: nil)
    self.audioPlayer.prepareToPlay()
    self.audioPlayer.play()
    print("PLAY !!!")
} catch {
    print("Error ???")
}
Can anyone help please?
You must create a shared instance (singleton) to play the sound. Refer back to:
Swift - AVAudioPlayer, sound doesn't play correctly
This should tell you how it can be done.
Hello, I tried the various solutions to similar questions but couldn't get my code to work. I have the following function that I call in my app:
func PlaySound(WhenToPlaySound: String) {
    AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)
    AVAudioSession.sharedInstance().setActive(true, error: nil)

    if WhenToPlaySound == "BeginningOfRound" {
        if UIApplication.sharedApplication().applicationState == UIApplicationState.Background {
            soundnotification.soundName = "BoxingBellStart.wav"
            UIApplication.sharedApplication().scheduleLocalNotification(soundnotification)
            println("timer is done in background mode")
        } else {
            // Load Sound
            soundlocation = NSBundle.mainBundle().URLForResource("BoxingBellStart", withExtension: "wav")!
            player = AVAudioPlayer(contentsOfURL: soundlocation, error: &Error)
            player.volume = 1.0

            // Play Sound
            player.play()
            println("timer is done in active mode")
        }
    } else {
        if UIApplication.sharedApplication().applicationState == UIApplicationState.Background {
            soundnotification.soundName = "Boxing.wav"
            UIApplication.sharedApplication().scheduleLocalNotification(soundnotification)
            println("timer is done in background mode")
        } else {
            // Load Sound
            soundlocation = NSBundle.mainBundle().URLForResource("Boxing", withExtension: "wav")!
            player = AVAudioPlayer(contentsOfURL: soundlocation, error: &Error)
            player.volume = 1.0

            // Play Sound
            player.play()
            println("timer is done in active mode")
        }
    }
}
Mostly it works, except for two things:
1. I can't seem to override the speaker volume. I want the system volume to be at full volume before I play my sound.
2. The local notifications that are set to fire when the app is in background mode only play when the device isn't muted.
To address the first problem, I wanted to use the following but didn't know how to use it:
AVAudioSession.sharedInstance().overrideOutputAudioPort(<#portOverride: AVAudioSessionPortOverride#>, error: <#NSErrorPointer#>)
Thanks in advance,
Ace
overrideOutputAudioPort affects the audio routing, not the volume.
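For illustration, a sketch of what overrideOutputAudioPort is actually for, in current Swift syntax: forcing the route (for example, from the receiver to the loudspeaker), not raising the volume.

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try session.setActive(true)
        // Redirect output to the built-in loudspeaker; volume is untouched.
        try session.overrideOutputAudioPort(.speaker)
    } catch {
        print("Routing error: \(error)")
    }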
When you say that you want to "override the speaker volume", I assume you mean you want to control the system output volume from your app code. This is not possible, as Apple believes that output volume should remain in the control of the user at all times.
AVAudioPlayer's volume property sets the volume relative to the system output level. It defaults to 1.0 (player volume == system volume). You can't turn it up higher, Spinal Tap-style, to 1.1...
See also my answer here ... if you want to take control of the system volume, you will need the user interface provided by MPVolumeView.
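A sketch of the MPVolumeView route (the frame values are arbitrary): it embeds the system volume slider in your UI, which is the sanctioned way to let the user move the output volume from inside your app.

    import MediaPlayer
    import UIKit

    func addSystemVolumeSlider(to containerView: UIView) {
        // The slider tracks and sets the hardware volume directly.
        let volumeView = MPVolumeView(frame: CGRect(x: 20, y: 100, width: 280, height: 40))
        containerView.addSubview(volumeView)
    }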
Similarly, regarding your notifications: if the user has muted the device, your app won't be able to ignore that.
Update
Regarding notifications, it isn't as straightforward as I suggested. It might work if you set the AVAudioSession category to AVAudioSessionCategoryPlayback (and read the Apple docs on this setting):
When using this category, your app audio continues with the Silent switch set to silent or when the screen locks
You may also need to add audio to UIBackgroundModes in your Info.plist.
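A sketch of that combination in current Swift syntax: the playback category plus the audio background mode lets the sound play past the Silent switch and while backgrounded.

    import AVFoundation

    func configureSessionForNotificationSound() {
        let session = AVAudioSession.sharedInstance()
        do {
            // Playback category: audio continues with the Silent switch set to silent.
            try session.setCategory(AVAudioSessionCategoryPlayback)
            try session.setActive(true)
        } catch {
            print("Audio session error: \(error)")
        }
        // Also add "audio" to UIBackgroundModes in Info.plist for background playback.
    }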