UIApplication -beginReceivingRemoteControlEvents causes Music app to take over audio

I have an app that speaks to the user and listens to the user's speech response. I've noticed that when I plug my phone into my car audio system and use the app, when my app is done speaking, it receives an interruption notification and the Music app starts playing music instead of allowing my app to continue.
This doesn't happen when the phone isn't attached to an external device, and it doesn't happen the moment I plug the phone in; it only happens when the speech stops while the phone is playing through the car. I have done some testing and determined that this behavior appears when I call the beginReceivingRemoteControlEvents method on my application. If I don't sign up for remote control events when my application loads, the problem does not occur, but then I cannot display 'now playing' information for my audio or use the car's controls to control playback.
Has anyone found a way to listen for remote control events without forfeiting control of the device's audio playback?

This is often caused by the car stereo rather than your iOS device. Check the stereo's manual and switch it from Audio Mode to iPod Mode (or whatever your manual names these options). Basically, your car stereo listens for the track-ended notification and uses it to trigger a 'play next track' request. That request goes to the system music player (MPMusicPlayerController), which usually picks the first track alphabetically in your device's library. There may be a workaround in software, but I've found that the easiest thing is to change the setting on the car stereo.

Use the following to intercept remote control events yourself so they don't fall through to the Music app (you may have to replace togglePlayPauseCommand with playCommand, or do both):
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.togglePlayPauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent * _Nonnull event) {
    NSLog(@"toggle button pressed");
    return MPRemoteCommandHandlerStatusSuccess;
}];
or, if you prefer to use a method instead of a block:
[commandCenter.togglePlayPauseCommand addTarget:self action:@selector(toggleButtonAction)];
To stop:
[commandCenter.togglePlayPauseCommand removeTarget:self];
or:
[commandCenter.togglePlayPauseCommand removeTarget:self action:@selector(toggleButtonAction)];
You'll need to add this to the imports at the top of your file:
@import MediaPlayer;

Related

iOS AVAudioSession interruption notification not working as expected

I want to know when my AVAudioRecorder is inaccessible (e.g. when music starts playing).
As audioRecorderEndInterruption will be deprecated in iOS 9, I am focusing on AVAudioSession's interruption notification (but neither is working as expected).
The issue is that the interruption notification is never called if the app was and remains in the foreground when the interruption occurs.
E.g: The user starts and stops playing music without moving the application into the background.
To detect any interruptions I am using:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioSessionWasInterrupted:) name:AVAudioSessionInterruptionNotification object:nil];
...
- (void)audioSessionWasInterrupted:(NSNotification *)notification {
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        NSLog(@"Interruption notification");
        if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeBegan]]) {
            NSLog(@"InterruptionTypeBegan");
        } else {
            NSLog(@"InterruptionTypeEnded");
        }
    }
}
I get InterruptionTypeBegan as expected, but InterruptionTypeEnded isn't called if the app is still in the foreground (meaning it won't be called until the app is placed in the background and back into the foreground).
How can I receive the InterruptionTypeEnded notification when the interruption occurs while the app is in the foreground?
This is a widespread problem affecting any app using AV framework components (the same goes for native iOS apps).
As explained in 's documentation on the subject of audio interruptions, the InterruptionTypeEnded should actually be applied in the scenario mentioned:
If the user dismisses the interruption ... the system invokes your callback method, indicating that the interruption has ended.
However, it also states that the InterruptionTypeEnded might not be called at all:
There is no guarantee that a begin interruption will have an end interruption.
Therefore, a different approach is needed in the scenario mentioned.
When it comes to handling music interruptions, the issue won't be around for long: iOS 9 effectively prevents outside audio sources from being used while the app's audio handler is active.
A way to handle the exact issue of media interruption could be to listen to MPMusicPlayerController's playbackState, as shown in this stackoverflow question: Detecting if music is playing?.
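As a rough illustration of that approach (the method names below are placeholders, not from the linked question), you could watch the system music player's playback state and react when the interrupting music stops:
// Sketch only: observe the iPod/system music player's playback state.
- (void)startWatchingMusicPlayer {
    MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
    [player beginGeneratingPlaybackNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(musicPlaybackStateChanged:)
                                                 name:MPMusicPlayerControllerPlaybackStateDidChangeNotification
                                               object:player];
}

- (void)musicPlaybackStateChanged:(NSNotification *)note {
    MPMusicPlaybackState state = [[MPMusicPlayerController iPodMusicPlayer] playbackState];
    if (state != MPMusicPlaybackStatePlaying) {
        // The interrupting music has stopped or paused; resume your own audio work here.
    }
}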
A more direct way to handle the issue of interruptions would be to either:
Block outside audio interruptions completely by re-invoking your audio component at the time of InterruptionTypeBegan (a rough sketch follows this list), or
Give a UI indication that an outside media source has interrupted the audio session (for example, showing an inactive microphone).
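A minimal sketch of the first option, assuming a notification handler wired up as in the question (the recorder and indicator properties are placeholders):
- (void)audioSessionWasInterrupted:(NSNotification *)notification {
    NSUInteger type = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        NSError *error = nil;
        // Option 1: immediately take the audio session back and keep recording.
        [[AVAudioSession sharedInstance] setActive:YES error:&error];
        if (error == nil) {
            [self.recorder record];
        } else {
            // Option 2: fall back to a UI hint that an outside source took the session.
            self.microphoneIndicator.enabled = NO;
        }
    }
}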
Hopefully  will come up with a better solution to the problem, but in the meantime this should give you some options to solve the interruption issue.
If you haven't already, try setting your AVCaptureSession's property usesApplicationAudioSession to NO.
This question & answer may act as a good reference if you're looking for any more detail.
I tried this and found that InterruptionTypeEnded may be called after the music pauses in some apps, but in others it is not called when playback pauses.
My solution is to update the UI to let the user know that recording has stopped, and to do related work such as file operations. When the interruption ends, activate the AVAudioSession and, if there is no error, start a new recording.
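A rough sketch of that flow (the recorder property and the URL/settings helpers below are placeholders, not the original poster's code):
- (void)handleInterruption:(NSNotification *)note {
    NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        [self.recorder stop];                  // finalize the current file
        [self showRecordingStoppedUI];         // placeholder UI update
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        NSError *error = nil;
        [[AVAudioSession sharedInstance] setActive:YES error:&error];
        if (error == nil) {
            self.recorder = [[AVAudioRecorder alloc] initWithURL:[self newRecordingURL]
                                                        settings:[self recorderSettings]
                                                           error:&error];
            if (error == nil) {
                [self.recorder record];        // start a brand new recording
            }
        }
    }
}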
If you want to join the files recorded before and after the interruption, the answer to this question may be helpful: AVAudioRecorder records only the audio after interruption.

Generating Remote Control Events from my app

I have two questions regarding Remote Control Events on iOS:
I know that music applications register for remote control events and can then receive such events from the iPhone's player widget.
Let's say I want my app to fire such events, is that possible?
How do headphones, for example, generate those events?
Without private APIs, you cannot send remote control events to your application.
The reason is that we cannot create such an event (UIEvent) ourselves to send out using:
[[UIApplication sharedApplication] sendEvent:anEvent];
You can, however, save an event you have previously received and then play it back by calling the function above.
I don't know whether this is possible for headphone events, but with private APIs you can send some events such as home button presses, power button presses, or mouse events (not tested).
You should read this book:
http://www.amazon.com/gp/product/1118057651/ref=pd_lpo_sbs_dp_ss_1?pf_rd_p=1535523702&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=0321278542&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0T2AMHJCEEKJN41YJHD5
It will be hard work to make this work.
Take a look at GSEvent to see how to send an event to the OS (iOS).
Edit: I've found 2 event types in GSEvent.GSEventType:
kGSEventHeadsetButtonDown = 1018,
kGSEventHeadsetButtonUp = 1019,
PS: this uses private APIs, so the app will be rejected if you submit it to the App Store.

CocoaLibSpotify - receiving remote control events and setting now playing info

There's not much to my question, I guess. I'm just curious about how CocoaLibSpotify works with AVFoundation, and whether it's compatible with how Apple needs me to register for remote control events and set the now-playing info in MPNowPlayingInfoCenter.
Apple says that to receive remote control events my app needs to "Begin playing audio. Your app must be the 'Now Playing' app. Restated, even if your app is the first responder and you have turned on event delivery, your app does not receive remote control events until it begins playing audio." However, that's all the documentation I can find... Does playing a track with SPPlaybackManager meet this requirement? What is the requirement anyway?
Thanks for your help again.
Remote control events work fine with CocoaLibSpotify without any modifications to the library at all, but only on the device and not in the Simulator (this includes iOS 7's Control Center).
Taking the Simple Player example, I made the following changes:
Changed Simple_PlayerAppDelegate to be a subclass of UIResponder.
Overrode canBecomeFirstResponder to return YES.
Implemented remoteControlReceivedWithEvent:.
In the callback to the playTrack: call to CocoaLibSpotify, added:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
These changes allowed Simple Player to receive remote control events when running on a device.
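For completeness, a sketch of what steps 2 and 3 might look like in the app delegate (this assumes the SPPlaybackManager instance, playbackManager, owned by the Simple Player example and its isPlaying property; treat it as illustrative):
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            self.playbackManager.isPlaying = !self.playbackManager.isPlaying;
            break;
        case UIEventSubtypeRemoteControlPlay:
            self.playbackManager.isPlaying = YES;
            break;
        case UIEventSubtypeRemoteControlPause:
            self.playbackManager.isPlaying = NO;
            break;
        default:
            break;
    }
}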

AVAudioSessionDelegate called at endInterruption, but beginInterruption not called

I'm setting up an AVAudioSession when the app launches and setting the delegate to the app delegate. Everything seems to be working (playback, etc.) except that beginInterruption on the delegate is not called when the phone receives a call. When the call ends, endInterruption is called, though.
The only thought I have is that the audio player code I'm using used to be based on AVAudioPlayer but now uses AVPlayer. The callbacks for the AVAudioPlayer delegate's interruption handling are still in there, but it seems odd that they would conflict in any way.
Looking at the header, it seems AVAudioSessionDelegate is deprecated as of iOS 6.
Use AVAudioSessionInterruptionNotification instead on iOS 6.
Update: That didn't work. I think there's a bug in the framework.
Yes, in my experience neither beginInterruption nor the newly documented AVAudioSessionInterruptionNotification works properly. What I had to do was track the status of the player with a local flag, then handle the endInterruptionWithFlags: method in order to track recovery from interruptions.
With iOS 6, resuming from an interruption will at least keep your audio player in the right place, so there was no need for me to store the last known play time of my AVAudioPlayer; I simply had to hit play.
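A sketch of that flag-based handling, using the old AVAudioSessionDelegate callbacks (the wasPlayingBeforeInterruption flag and player property are placeholders):
- (void)beginInterruption {
    // Often not delivered reliably, as noted above, but harmless to implement.
    self.wasPlayingBeforeInterruption = self.player.isPlaying;
}

- (void)endInterruptionWithFlags:(NSUInteger)flags {
    if ((flags & AVAudioSessionInterruptionFlags_ShouldResume) && self.wasPlayingBeforeInterruption) {
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
        [self.player play]; // iOS 6 keeps the playback position, so play is enough
    }
    self.wasPlayingBeforeInterruption = NO;
}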
Here's the solution that I came up with. It seems like iOS 6 kills your audio with a media reset if an AVPlayer stays resident too long. What ends up happening is that the AVPlayer plays, but no sound comes out. The rate on the AVPlayer is 1, but there's absolutely no sound. To add pain to the situation, there's no error from either the AVAudioSession setActive call or the AVPlayer itself indicating that there's a problem.
Add to that the fact that you can't depend on applicationWillResignActive:, because your app may already be in the background if you're relying on remote control gestures at all.
The final solution I implemented was to add a periodic observer on the AVPlayer, and record the last known time. When I receive the event that I've been given back control, I create a new AVPlayer, load it with the AVPlayerItem, and seekToTime to the proper time.
It's quite an annoying workaround, but at least it works, and avoids the periodic crashes that were happening.
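A condensed sketch of that workaround (lastKnownTime and currentItemURL are assumed properties, not the answerer's actual code):
- (void)observePlayer:(AVPlayer *)player {
    __weak typeof(self) weakSelf = self;
    // In real code, keep the returned observer token so you can remove it later.
    [player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                         queue:dispatch_get_main_queue()
                                    usingBlock:^(CMTime time) {
        weakSelf.lastKnownTime = time; // record the last known playback time
    }];
}

- (void)rebuildPlayerAfterMediaReset {
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:self.currentItemURL];
    self.player = [AVPlayer playerWithPlayerItem:item];
    [self.player seekToTime:self.lastKnownTime];
    [self.player play];
}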
I can confirm that when using the C API, the interruption listener is also not called when the interruption begins, only when it ends:
AudioSessionInitialize(nil, nil, interruptionListenerCallback, (__bridge void *)(self));
I've also filed a bug report with Apple for the issue.
Edit: This is fixed in iOS 6.1 (but not iOS 6.0.1)
Just call:
[[AVAudioSession sharedInstance] setDelegate: self];
I just checked on my iPhone 5 (running iOS 6.0) by setting a breakpoint in the AudioSessionInterruptionListener callback function that was declared in AudioSessionInitialize(), and this interrupt callback does, in fact, get called when the app has an active audio session and audio unit and is interrupted with an incoming phone call (Xcode shows the app stopped at the breakpoint at the beginning of the interruption, which I then continue from).
I have the app then stop its audio unit and de-activate its audio session. Then, on the end interruption callback, the app re-activates the audio session and restarts the audio unit without problems (the app is recording audio properly afterwards).
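For reference, the stop/deactivate and reactivate/restart flow described here corresponds roughly to a C interruption listener like this (gOutputUnit stands in for the app's audio unit):
static AudioUnit gOutputUnit;

static void interruptionListenerCallback(void *inClientData, UInt32 inInterruptionState) {
    if (inInterruptionState == kAudioSessionBeginInterruption) {
        AudioOutputUnitStop(gOutputUnit);      // stop the audio unit
        AudioSessionSetActive(false);          // give up the audio session
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        AudioSessionSetActive(true);           // re-activate the session
        AudioOutputUnitStart(gOutputUnit);     // restart the audio unit
    }
}

// Registered once at startup:
// AudioSessionInitialize(NULL, NULL, interruptionListenerCallback, NULL);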
I built a brand new audio streaming (AVPlayer) application atop iOS 6.0.x and found the same problem.
Delegates are now deprecated and we have to use notifications; that's great, but here are my findings:
During an incoming phone call I get only AVAudioSessionInterruptionTypeEnded in my handler, along with AVAudioSessionInterruptionOptionShouldResume. Audio session gets suspended automatically (audio fades) and I just need to resume playback of AVPlayer.
However, when launching a game such as CSR Racing, I oddly get the dreaded AVAudioSessionInterruptionTypeBegan but no indication of when my application can resume playback, not even after killing the game.
Now, this may depend on other factors, such as my audio category (in my case AVAudioSessionCategoryPlayback) and the mixing settings of both applications (kAudioSessionProperty_OverrideCategoryMixWithOthers), I'm not sure, but definitely I see something out of place.
Others have reported that this is fixed in the 6.1 beta; I have yet to upgrade, so we'll see.
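For what it's worth, the phone-call case described above can be handled with something like the following (self.player being the streaming AVPlayer; the handler name is illustrative):
- (void)handleAudioSessionInterruption:(NSNotification *)note {
    AVAudioSessionInterruptionType type =
        [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeEnded) {
        AVAudioSessionInterruptionOptions options =
            [note.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
        if (options & AVAudioSessionInterruptionOptionShouldResume) {
            [self.player play]; // the session was suspended by the system; just resume
        }
    }
    // As noted above, the Began case may or may not arrive depending on the interrupting app.
}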

Volume Control in a mostly Silent iOS app

I've asked this question before, but I feel I should start a new thread since my other thread is dated and probably poorly worded. I'm wondering what the best approach would be for adding volume control to an iOS app that is mostly silent. A good example would be a navigation app that only plays audio when you approach or miss a turn. In such an app, on hearing a turn prompt that is not loud enough, the user would want the prompts to be audible and would naturally use the side volume controls to adjust them to their liking.
There are several problems here. One is that audio is not currently playing so the user has no reference as to how much it has been increased. This is more or less expected however there are technical issues that I am more interested in. To link the side volume control to your app you have to start and manage an audio session. I have not found an authoritative reference for such a situation as most documentation assumes you are currently playing or in the process of starting audio. Managing an audio session for a mostly silent app seems to be an edge case, though I find it rather common in that two of the major apps I've worked on require such functionality.
Of the various problems associated with audio session management, you have to address killing and restoring the audio session as you move in and out of the background. You have to consider other apps playing audio as you begin and stop the session. Depending on your type of app, you may have other more advanced needs such as custom override routing to the speakers, custom mute controls, etc. If you have any experience with such an app could you elaborate on how you addressed such challenges and expound on other issues?
One very common method is to set the audio session category appropriate for the type of app at launch, regardless of whether sound is imminent or won't be played until tomorrow (as long as playing such audio is part of the app's purpose and settings).
Added:
One way to allow the user to adjust the volume while the app is silent is to provide some means for the user to have your app immediately start (and/or maybe stop) playing a sound with an amplitude typical for your app: a calibration tone or phrase, your copyright notice, a trademark jingle, or a safety message, for instance.
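A minimal sketch of the launch-time category setup mentioned above, using the AVAudioSession API (the playback category and ducking option are example choices for a navigation-style app, not a prescription):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback
             withOptions:AVAudioSessionCategoryOptionDuckOthers
                   error:&error];
    [session setActive:YES error:&error];
    return YES;
}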
The main issues I see when developing apps that are mostly silent concern moving in and out of the foreground and playing nicely with other audio. To give a better idea of what I usually do, I'll give some snippets from a recent project. (These are intentionally incomplete and only meant to illustrate a point.) For the sake of argument, let's assume we have an AudioManager class that is responsible for maintaining the audio sessions. This class is what we use to instantiate our custom audio player. In such a class we put:
@interface MyAudioManager ()
@property (nonatomic, assign) BOOL alwaysMaintainAudioSession;
@property (nonatomic, retain) MyCustomAudioPlayer *player;
@end

@implementation MyAudioManager
@synthesize alwaysMaintainAudioSession;
@synthesize player;
- (void)applicationWillEnterForeground
{
    isInBackground = NO;
    if (NO == [self anyAudioIsPlaying] && self.alwaysMaintainAudioSession) {
        [self activateAudioSession];
    }
}

- (void)activateAudioSession
{
    AudioSessionSetActive(TRUE);
    AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, AudioPropertyListener, self);
}

- (BOOL)anyAudioIsPlaying
{
    return [self otherAudioIsPlaying] || [player isPlaying];
}

- (BOOL)otherAudioIsPlaying
{
    UInt32 yesNo;
    UInt32 propertySize = sizeof(yesNo);
    OSStatus status = AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &yesNo);
    if (kAudioSessionUnsupportedPropertyError == status) {
        return MPMusicPlaybackStatePlaying == [theiPodMusicPlayer playbackState];
    } else {
        return MPMusicPlaybackStatePlaying == [theiPodMusicPlayer playbackState] || yesNo;
    }
}
The manager allows you to set a property that always keeps the volume control linked to the app's sounds, which means we always make sure that either a session is active or some other app is playing audio. In any other case the volume control reverts to controlling the ringer. So when entering the foreground we have to check for any other audio playing and conditionally activate the audio session. We also need to close the session when moving to the background to restore the ringer volume control.
- (void)applicationDidEnterBackground
{
    if (NO == [self anyAudioIsPlaying]) {
        AudioSessionSetActive(NO);
    }
}
In my solution I include a bunch of other code to handle things like responding intelligently when Bluetooth audio devices are connected, factory methods for creating the custom player, custom audio compression, and more. The main idea, however, is handling other apps playing audio while keeping the volume control linked to the app's volume while in the foreground.
