Play/Pause, Next/Prev buttons are greyed out in Control Center - iOS

In my application, playback is controlled from Control Center.
While playback is in progress in AVPlayer (at this point the playback controls work fine from Control Center), I load a UIWebView with a different streaming URL.
Once that stream finishes, I start playback from AVPlayer again.
After this, the playback controls are greyed out in Control Center.
I am using [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo to enable the playback controls in Control Center.
What could the problem be?

I ran into this problem as well while working with an AVPlayer instance. You can use MPRemoteCommandCenter to set up controls on the lock screen and in Control Center.
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
commandCenter.previousTrackCommand.enabled = YES;
[commandCenter.previousTrackCommand addTarget:self action:@selector(previousTapped:)];
commandCenter.playCommand.enabled = YES;
[commandCenter.playCommand addTarget:self action:@selector(playAudio)];
commandCenter.pauseCommand.enabled = YES;
[commandCenter.pauseCommand addTarget:self action:@selector(pauseAudio)];
commandCenter.nextTrackCommand.enabled = YES;
[commandCenter.nextTrackCommand addTarget:self action:@selector(nextTapped:)];
previousTapped:, playAudio, pauseAudio, and nextTapped: are all methods in my view controller that call the corresponding methods on my AVPlayer instance. To enable an action, you must explicitly set enabled to YES and give the command a target and selector.
If you need to show a specific action as disabled (greyed out), you must explicitly set the enabled property to NO in addition to adding a target:
commandCenter.previousTrackCommand.enabled = NO;
[commandCenter.previousTrackCommand addTarget:self action:@selector(previousTapped:)];
If you never set enabled for a command, its button will not appear at all on the lock screen or in Control Center.
In addition, remember to set your app up for background playback (add the audio value to the UIBackgroundModes array in your Info.plist), activate the audio session, and check for errors:
NSError *setCategoryError;
NSError *setActiveError;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];
[[AVAudioSession sharedInstance] setActive:YES error:&setActiveError];

Google brought me here because I was having an issue with the command center when using AdMob video ads, and a comment on the OP referenced AdMob. Posting here for anyone else also having these issues.
AdMob video ads on iOS seem to use MPRemoteCommandCenter for whatever reason, which may interfere with your app's usage of the command center. Here's what I came up with as a potential workaround: https://gist.github.com/ekilah/e74683291d3e7fafb947
The "gist" of the workaround is to reset all of the shared command center's listeners and the MPNowPlayingInfoCenter's info dictionary after an ad from AdMob is fetched, and again after it has played. The way the workaround resets all of the commands is less than pretty, but it is what I came up with. Maybe someone has a better method?
This approach may also help the OP. Resetting things between different usages may be a solution.
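For reference, the core of that reset can be sketched like this (a simplified version of the gist; removeTarget: and the nowPlayingInfo property are real MPRemoteCommandCenter/MPNowPlayingInfoCenter APIs, but the exact set of commands you need to clear depends on which ones your app registered):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Call after an ad is fetched, and again after it finishes playing.
- (void)resetCommandCenter {
    MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];

    // Remove every target this object registered on the commands it uses.
    [commandCenter.playCommand removeTarget:self];
    [commandCenter.pauseCommand removeTarget:self];
    [commandCenter.togglePlayPauseCommand removeTarget:self];
    [commandCenter.previousTrackCommand removeTarget:self];
    [commandCenter.nextTrackCommand removeTarget:self];

    // Clear the now-playing info; afterwards, re-add your own targets and
    // re-set the info dictionary for your player.
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = nil;
}
```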

Finally, I solved this problem by loading the MP3 URL with AVPlayer instead of loading it in a UIWebView. The current playback time and total duration can be retrieved from AVPlayer, and it is also possible to seek the playback with a slider.
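For completeness, reading the times and seeking with AVPlayer can be sketched as below (assuming a `self.player` AVPlayer property and a UISlider whose value runs from 0 to 1; the method name is hypothetical):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (void)sliderChanged:(UISlider *)slider {
    // Total duration comes from the current item, current position from the player.
    NSTimeInterval duration = CMTimeGetSeconds(self.player.currentItem.duration);
    NSTimeInterval current  = CMTimeGetSeconds(self.player.currentTime);
    NSLog(@"%.0f / %.0f seconds", current, duration);

    // Seek to the position selected on the slider.
    CMTime target = CMTimeMakeWithSeconds(duration * slider.value, NSEC_PER_SEC);
    [self.player seekToTime:target];
}
```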

Related

Ad banners causing MPNowPlayingInfoCenter to lose state

I have an app which plays audio using AVPlayer and I touch the right APIs to get the Now Playing info to update in Control Center.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback withOptions:0 error:&categoryError];
[[AVAudioSession sharedInstance] setMode:AVAudioSessionModeSpokenAudio error:&modeError];
[[AVAudioSession sharedInstance] setActive:YES error:&activeError];
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = {...};
...
MPRemoteCommandCenter * const commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
commandCenter.playCommand.enabled = YES;
...
This API works as expected until some ads appear using Google's AdMob framework. These are the standard MREC and banner ads, presented in UIWebView instances. As soon as one appears, the Now Playing state reverts to the Music app, and the ability to use the remote controls disappears.
Once this happens, I can't even re-set the Now Playing info; it's as if it's stuck. When I print out the value of [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo, it is what I expect, it just doesn't show.
The ads play no audio or video, but somehow they corrupt the Now Playing API and it does not recover.
I've reached out to Google and Apple as to how to fix this, but in the meantime wondered if anyone had any workarounds to suggest?
I found this question while looking into a similar issue with the Google Interactive Media Ads SDK (IMA SDK), where it was automatically changing the Now Playing state to just say "Advertisement". I ultimately found the disableNowPlayingInfo flag on the IMASettings object. Setting it to true resolved my issue.
let adsLoaderSettings = IMASettings()
adsLoaderSettings.disableNowPlayingInfo = true
adsLoader = IMAAdsLoader(settings: adsLoaderSettings)

iOS Media Player remote control events (MPRemoteCommand) are not working with Unity

Recently I found out that MediaPlayer remote control event (MPRemoteCommand) handlers are not called in an iOS application with an embedded Unity player. I'm using Xcode 6.3.1 and Unity 5.0.1f1; however, it looks like this can be reproduced with any combination of currently available versions.
This is the official Apple documentation for handling remote control events on iOS: https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
The issue is very easy to reproduce - it's just 24 lines of code added to an empty project.
Create a new Single View Application in Xcode (for the working example without the Unity player), or create an empty project in Unity and immediately build it for iOS (File -> Build Settings..., iOS, Build) with all settings left at their defaults to get the Xcode project generated by Unity (for the broken example with the embedded Unity player).
Import the MediaPlayer framework header in AppDelegate.h (for the example without Unity) or UnityAppController.h (for the example with Unity):
#import <MediaPlayer/MediaPlayer.h>
Declare a property for an MPMoviePlayerController object in the class interface in AppDelegate.h/UnityAppController.h:
@property (strong, nonatomic) MPMoviePlayerController *player;
Insert the following code at the beginning of the application:didFinishLaunchingWithOptions: method in AppDelegate.m/UnityAppController.mm (adding it just before the return statement of that method works too; it doesn't matter). It creates a player with a URL (a live radio station; you can use any other HTTP live audio source you like), registers handlers for the play, pause, and togglePlayPause remote control events, and starts playing the audio, thus making the application the "Now Playing" app:
self.player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://81.9.96.34/eldoradio-64k"]];
[[MPRemoteCommandCenter sharedCommandCenter].playCommand addTarget:self action:@selector(playCommand)];
[[MPRemoteCommandCenter sharedCommandCenter].pauseCommand addTarget:self action:@selector(pauseCommand)];
[[MPRemoteCommandCenter sharedCommandCenter].togglePlayPauseCommand addTarget:self action:@selector(togglePlayPauseCommand)];
[self.player play];
Add implementation of handlers to AppDelegate.m/UnityAppController.mm:
- (void)playCommand {
    [self.player play];
}

- (void)pauseCommand {
    [self.player pause];
}

- (void)togglePlayPauseCommand {
    if (self.player.playbackState == MPMoviePlaybackStatePlaying) {
        [self.player pause];
    } else {
        [self.player play];
    }
}
Build and run the application. You should hear music playing once it is started.
Trigger some remote control events. The easiest way is to plug in standard Apple EarPods and press the play/pause button on them, which generates a togglePlayPause event; opening iOS Control Center (the UI you get by swiping up from the bottom of the screen) and pressing the Play/Pause button there is also fine (those generate separate play and pause events respectively).
In the application without Unity, audio stops and starts playing according to the remote control events being sent.
In the application with Unity, the handlers are never called (you can set breakpoints in them and confirm this under the debugger) and nothing happens at all. You will also notice that iOS Control Center looks completely different: with the embedded Unity player it shows a play button and two rewind buttons regardless of the audio state, whereas without Unity it shows just a pause button while audio is playing and just a play button when it is not.
You can also replace MPMoviePlayerController with AVPlayer from AVFoundation framework - the result will be exactly the same.
Does anybody know any reason for this behavior and/or any workaround for it? This affects our application very badly.
P.S.: I've already asked similar question before, but didn't get any answers. The example in this one is as simple as possible.
My app has pretty much the same setup as yours, and the same problem. As long as I used a mixable audio session
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.DuckOthers, .InterruptSpokenAudioAndMixWithOthers])
my app wouldn't trigger any of the MPRemoteCommand handlers.
When I turned my audio session into a non-mixable one (by removing the options)
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
my MPRemoteCommand handlers would get called whenever I pressed the play/pause button on my headset.
So the solution is to use a non-mixable audio session; PlayAndRecord without the explicit mixing options is non-mixable.

Custom remote event handling in app from iOS lock screen

How does Spotify handle custom remote events? Currently, on an iPhone 6 running iOS 8.1.3 with Spotify version 2.4.0.1822, I get the following controls on the lock screen when I turn on Spotify radio. I've tried reading all the docs pertaining to remote events, but I'm unable to find any resource that allows custom remote events from the lock screen.
That is probably implemented with MPRemoteCommandCenter.
Here is an example:
MPRemoteCommandCenter *remoteCommandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[[remoteCommandCenter skipForwardCommand] addTarget:self action:@selector(skipForward)];
[[remoteCommandCenter togglePlayPauseCommand] addTarget:self action:@selector(togglePlayPause)];
[[remoteCommandCenter pauseCommand] addTarget:self action:@selector(pause)];
[[remoteCommandCenter likeCommand] addTarget:self action:@selector(like)];
Implement this code, play music in your app, and lock your iPhone. You should see a customized lock screen.
Note: the labels of the commands can be customized, but the icon images and the number of rows cannot.

iOS 7 Dev - Minimizing app causes UIWebView inline YouTube Player to stop playing

I'm working on a project that plays YouTube videos inline in a UIWebView. What I'm trying to do is continue playing the app's audio even when the home button is pressed (or focus leaves the app in any way). I've done the following (as suggested by searches for my question here):
Made sure Audio and Fetch were included in my info.plist for Background Modes
Made sure my ViewController that contained the WebView is registered to receive remote control events:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
Messed with the AVAudioSession (sharedInstance) and set the category to playback.
None of these things worked. Either I'm doing one of them incorrectly (which I doubt, since I've found several "solutions" here that have worked for other people), or something about my particular case is different.
I have a feeling this might have something to do with the way the WebView plays audio. Any ideas?
You need to add the "Audio and AirPlay" background mode.
It's the following key in your Info.plist:
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>

AVAudioPlayer Loses "Playing" Status

I have an AVAudioPlayer that needs to continue in the background.
Audio is set as the background mode in the plist & this runs on launch:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[[RootController shared].view becomeFirstResponder];
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setDelegate:self];
[session setCategory:AVAudioSessionCategoryPlayback error:nil];
[session setActive:YES error:nil];
- (BOOL)canBecomeFirstResponder { return YES; }
The Problem
Occasionally the AVAudioPlayer gets in this strange state where:
It's playing, but the play icon in the status bar disappears
If I pause then play, the icon shows up for maybe a second, then disappears
Here's the kicker - if I call setCurrentTime while playing, the play icon shows & stays
I've sunk about 20 hours into this & would love any ideas.
Description of the Bug
If you are playing audio with an AVAudioPlayer and then create an AVPlayer, the playing icon will disappear. Apparently the AVPlayer immediately takes precedence, and since it is not playing yet, the play icon disappears from the status bar.
This causes some serious issues:
If the app is in the background, iOS will shut down your AVAudioPlayer within 5 seconds (because it doesn't realize you're playing audio)
The iOS remote shows a play button even though audio is playing
The play icon is not showing in the status bar
The Workaround
First off, if you don't have to use AVPlayer, then don't. I use it because I need to play a remote MP3 without downloading it first. I used to use this AudioStreamer class but gave up because it pops up an alert when the stream becomes disconnected along with a few other bugs that I couldn't fix.
So if you're stuck with AVPlayer, there's only one way to re-connect playing status with your AVAudioPlayer. If you call setCurrentTime on the AVAudioPlayer then it will magically re-associate itself as the current player for the app. So you'll need to call it after any AVPlayer is initialized and anytime you resume playback on your AVAudioPlayer.
I decided to subclass AVAudioPlayer so I could register each instance in a global list when it is initialized and unregister it when it is deallocated. I also overrode the play method so that any call that resumes playback also calls setCurrentTime. Then I subclassed AVPlayer so that any time one is initialized, all active AVAudioPlayers call setCurrentTime on themselves. Last thing: you'll have to call setCurrentTime after a short delay (maybe 1 second) or it will have no effect.
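A minimal sketch of the play-override part of that subclass (the class name is hypothetical, and the global registration list and the AVPlayer subclass are omitted for brevity):

```objc
#import <AVFoundation/AVFoundation.h>

@interface ReassertingAudioPlayer : AVAudioPlayer
@end

@implementation ReassertingAudioPlayer

- (BOOL)play {
    BOOL ok = [super play];
    // After a short delay, touch currentTime (a no-op seek to the same
    // position) so iOS re-associates this player with the app's
    // "playing" status; calling it immediately has no effect.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        self.currentTime = self.currentTime;
    });
    return ok;
}

@end
```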
No kidding, this is the result of nearly 40 hours of troubleshooting.
