I have an audio app that plays background audio on iOS devices. I need the app to show "skip 15 seconds" buttons — a la the Apple Podcasts app and Overcast — instead of the next/previous track buttons. Does anyone know where the documentation for this is, or of some examples? This is turning out to be a tricky issue to Google.
Update: Great answer to this question for iOS 7.1 and later at https://stackoverflow.com/a/24818340/1469259.
The buttons on the lock screen trigger "remote control" events. You can handle these events and skip forward/back as you require:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    // Turn on remote control event delivery and become first responder
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    if ([self canBecomeFirstResponder]) {
        [self becomeFirstResponder];
    }
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                // add code here to play/pause audio
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                // add code here to skip back 15 seconds
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // add code here to skip forward 15 seconds
                break;
            default:
                break;
        }
    }
}
How you do the actual skipping depends on how you're playing the audio.
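For example, if you happen to be using AVPlayer (just an assumption — your setup may differ), the skip itself could look something like this sketch, where _player is a hypothetical AVPlayer ivar:

#import <AVFoundation/AVFoundation.h>

// Minimal sketch, assuming an AVPlayer ivar named _player (hypothetical).
// Pass a negative value to skip backwards, e.g. [self skipBySeconds:-15.0].
- (void)skipBySeconds:(NSTimeInterval)seconds {
    CMTime current = [_player currentTime];
    CMTime offset = CMTimeMakeWithSeconds(seconds, NSEC_PER_SEC);
    [_player seekToTime:CMTimeAdd(current, offset)];
}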
The documentation is here: https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
Potentially another way of achieving this, though not one I've used myself, is MPSkipIntervalCommand: https://developer.apple.com/library/ios/documentation/MediaPlayer/Reference/MPSkipIntervalCommand_Ref/index.html
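If you can target iOS 7.1 or later (the approach the update above links to), a sketch using MPRemoteCommandCenter might look like the following. The 15-second interval, the setupRemoteCommands method name, and the skipBySeconds: helper are all assumptions, not part of any particular API:

#import <MediaPlayer/MediaPlayer.h>

// Sketch: register 15-second skip commands so the lock screen shows skip
// buttons instead of next/previous track. Call this once after playback
// is configured.
- (void)setupRemoteCommands {
    MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];

    commandCenter.skipForwardCommand.preferredIntervals = @[@15];
    [commandCenter.skipForwardCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
        [self skipBySeconds:15.0];   // hypothetical helper, e.g. the sketch above
        return MPRemoteCommandHandlerStatusSuccess;
    }];

    commandCenter.skipBackwardCommand.preferredIntervals = @[@15];
    [commandCenter.skipBackwardCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
        [self skipBySeconds:-15.0];
        return MPRemoteCommandHandlerStatusSuccess;
    }];
}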
Related
I added the Spotify player to my app, which also plays music using MPMusicPlayerController. When Spotify audio is playing and the screen is locked, the remote control events for play/pause and FFW/RWD are not received when the user presses those buttons on the lock screen.
If music is playing from the MPMusicPlayerController, I am able to receive the remote control events based on the following code:
- (void)viewDidLoad {
    ...
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
    ...
}
and
- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    // see [event subtype] for details
    if (event.type == UIEventTypeRemoteControl) {
        // We may be receiving an event from the lock screen
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
            case UIEventSubtypeRemoteControlPlay:
            case UIEventSubtypeRemoteControlPause:
                // User pressed play or pause from the lock screen
                [self playOrPauseMusic:nil];
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // User pressed FFW from the lock screen
                [self fastForwardMusic:nil];
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                // User pressed rewind from the lock screen
                [self rewindMusic:nil];
                break;
            default:
                break;
        }
    }
}
The iPod controls are visible when the app enters the background, but they do not respond when I press pause; instead, the controls disappear. What addition is needed to detect play/pause and FFW/RWD from the lock screen when streaming audio such as Spotify is playing in the background?
I believe I ran into this in the past. If I remember correctly, I added
-(void)remoteControlReceivedWithEvent:(UIEvent *) event { ... }
as well as
- (BOOL) canBecomeFirstResponder { return YES; }
to the app delegate (which is also where my audio controller lived). I was having the issue where the UIViewControllers were not alive at the time I wanted to catch the UIEventTypeRemoteControl events.
Give that a try and see if it helps.
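A minimal sketch of that app-delegate approach, assuming the delegate owns (or can reach) the audio controller — audioController and the playOrPauseMusic: etc. selectors are placeholders borrowed from the code above, not a real API:

// AppDelegate.m — sketch only; assumes the app delegate subclasses UIResponder
// (the default Xcode template does).
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self.audioController playOrPauseMusic:nil]; // hypothetical audio controller property
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                [self.audioController fastForwardMusic:nil];
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                [self.audioController rewindMusic:nil];
                break;
            default:
                break;
        }
    }
}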
After further investigation, I have found that if I include the following code when my app enters the background and when remote control events are received, the iPod controls do not disappear.
// Set up the now-playing info center to display title, artist, and album
// artwork within the iPod controls (needed for Spotify)
MPMediaItemArtwork *ipodControlArtwork = [[MPMediaItemArtwork alloc] initWithImage:artworkImage];
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo =
    [NSDictionary dictionaryWithObjectsAndKeys:
        nowPlayingTitle, MPMediaItemPropertyTitle,
        nowPlayingArtist, MPMediaItemPropertyArtist,
        ipodControlArtwork, MPMediaItemPropertyArtwork,
        [NSNumber numberWithDouble:0.0], MPNowPlayingInfoPropertyPlaybackRate,
        nil];
I am developing an audio player app in Swift and have completed the core functionality. I would like to set it up so that pressing next track or previous track on Bluetooth headphones triggers the next and previous buttons that are already in the app.
This is straight from Apple's documentation.
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self playPauseToggle:nil];
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                [self nextTrack:nil];
                break;
            ...
And here's the corresponding method signature in Swift:
func remoteControlReceivedWithEvent(_ event: UIEvent)
I am not sure, however, whether Apple will let you get away with handling these events in any way other than their expected behavior. If pressing pause, for example, opens a URL or something like that, you can expect to be rejected or pulled from the store right away. If your previous and next buttons aren't directly related to audio playback, this isn't allowed and won't fly.
How can I make my cocoalibspotify app for iOS respond to the built-in pause/play and next/previous controls? I see no documentation of this.
In the main view controller, the app needs to subscribe to remote control events.
- (void)viewDidLoad {
    // ...
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    // ...
}
Remote control button presses will then trigger remoteControlReceivedWithEvent: on your view controller. The event can be handled as follows.
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlPause:
                // Handle pause
                break;
            case UIEventSubtypeRemoteControlPlay:
                // Handle play
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                // Handle previous
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // Handle next
                break;
            default:
                break;
        }
    }
}
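As other answers in this thread show, the view controller may also need to be in the responder chain for the events to reach it — a minimal sketch:

- (BOOL)canBecomeFirstResponder {
    return YES;
}

// e.g. in viewDidAppear:, after beginReceivingRemoteControlEvents:
[self becomeFirstResponder];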
I'm writing an app that streams music and I'm having a ton of trouble setting the icon on the music control dock screen (when you double-click the home button and swipe to the left). All the documentation said to do something like this, but the icon never appears and I don't ever receive the event notification. All of this code is in my player view controller. _radioPlayer is an instance of an AVQueuePlayer. What am I doing wrong here?
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    UIApplication *application = [UIApplication sharedApplication];
    if ([application respondsToSelector:@selector(beginReceivingRemoteControlEvents)]) {
        [application beginReceivingRemoteControlEvents];
    }
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder
{
    return YES;
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            [_radioPlayer pause];
            break;
        case UIEventSubtypeRemoteControlPlay:
            [_radioPlayer play];
            break;
        case UIEventSubtypeRemoteControlPause:
            [_radioPlayer pause];
            break;
        default:
            break;
    }
}
EDIT: I read through the documentation and followed the steps but it's still not working. Here are some of the possible reasons I've come up with:
A. My music isn't playing yet when viewDidAppear is called. The documentation states:
"Your app must be the “Now Playing” app. Restated, even if your app is the first responder and you have turned on event delivery, your app does not receive remote control events until it begins playing audio."
I tried calling beginReceivingRemoteControlEvents and becomeFirstResponder after the music starts playing, but this doesn't work either. Can these only be called in viewDidAppear:? Does iOS automatically detect when the music begins playing, so this isn't necessary?
B. There is something weird about using an AVQueuePlayer. I'm almost positive this isn't it since the event message is handled by the view controller, not the player itself.
It may not be working for you because you are calling beginReceivingRemoteControlEvents in viewWillAppear instead of viewDidAppear. Check out the documentation:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    // Turn on remote control event delivery
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
}
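For the icon/artwork part of the question, setting the now-playing info (as in the MPNowPlayingInfoCenter snippet earlier in this thread) may also be needed once audio is playing — a sketch, where streamTitle and artworkImage are hypothetical values from your stream metadata:

#import <MediaPlayer/MediaPlayer.h>

// Sketch only: publish title and artwork so they show up in the music controls.
MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:artworkImage];
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = @{
    MPMediaItemPropertyTitle   : streamTitle,
    MPMediaItemPropertyArtwork : artwork,
};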
This is a similar question to this one, but I still can't quite figure out what to do.
So, I have a tab bar app that functions similarly to the iPod app. One view controller is the "NOW PLAYING" view controller, and it is the view controller at index 1. In that VC I have:
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    // Turn on remote control event delivery
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];

    // Set itself as the first responder
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder
{
    return YES;
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    NSLog(@"Where is my event?");
    if (event.type == UIEventTypeRemoteControl)
    {
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                NSLog(@"Pause");
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                NSLog(@"Next");
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                NSLog(@"Prev");
                break;
            default:
                NSLog(@"SOMETHING WAS CLICKED");
                break;
        }
    }
}
I don't receive any events on remote clicks, either from the headphones or from the double-home-button shortcut. I am running on an actual iPhone, not in the simulator. I am using an AVQueuePlayer (which IS playing) to manage my media.
I just shoved everything into the AppDelegate and had the AppDelegate call the view controller to tell it what to do, and it worked fine. Is it poor practice to put it in the AppDelegate?
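A rough sketch of that arrangement, assuming the app delegate can reach the now-playing view controller (the nowPlayingViewController property name here is made up):

// AppDelegate.m — sketch only. nowPlayingViewController is a hypothetical
// property pointing at the view controller that owns the AVQueuePlayer.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Forward the event to the view controller that knows how to handle it.
        [self.nowPlayingViewController remoteControlReceivedWithEvent:event];
    }
}

Whether that is poor practice is debatable; keeping the handling in an object that outlives the view controllers (the app delegate or a dedicated audio controller) at least avoids the problem, mentioned in an answer above, of the view controller not being alive when the event arrives.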