Custom remote event handling in app from iOS lock screen

How does Spotify handle custom remote events? Currently, on an iPhone 6 running iOS 8.1.3 and Spotify version 2.4.0.1822, I get the following controls on the lock screen when I turn on Spotify radio. I've tried reading all the docs pertaining to remote events, and I'm unable to find any resources that allow custom remote events from the lock screen.

That is most likely implemented with MPRemoteCommandCenter.
Here is an example:
MPRemoteCommandCenter *remoteCommandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[[remoteCommandCenter skipForwardCommand] addTarget:self action:@selector(skipForward)];
[[remoteCommandCenter togglePlayPauseCommand] addTarget:self action:@selector(togglePlayPause)];
[[remoteCommandCenter pauseCommand] addTarget:self action:@selector(pause)];
[[remoteCommandCenter likeCommand] addTarget:self action:@selector(like)];
Implement this code, play music in your app, and lock your iPhone. You should see a customized lock screen.
Note: the labels of the commands can be customized, but you cannot customize the icon images or the number of rows.
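The target methods themselves are ordinary selectors on whatever object you registered; a minimal sketch, assuming a self.player AVPlayer property (the property and the 15-second interval are illustrative, not part of the original answer):
// Hypothetical handlers matching the selectors registered above.
// 'self.player' is assumed to be an AVPlayer (AVFoundation/CoreMedia imported).
- (void)skipForward {
    // skipForwardCommand is an interval skip (e.g. +15 seconds), not "next track".
    CMTime target = CMTimeAdd(self.player.currentTime, CMTimeMakeWithSeconds(15, NSEC_PER_SEC));
    [self.player seekToTime:target];
}
- (void)togglePlayPause {
    if (self.player.rate > 0) {
        [self.player pause];
    } else {
        [self.player play];
    }
}
- (void)pause {
    [self.player pause];
}
- (void)like {
    // likeCommand is an MPFeedbackCommand; it only reports the tap, so store
    // the "like" in your own model or backend here.
}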

Related

Play/Pause next/Prev buttons are greyed out in control center

In my application, playback is controlled from Control Center.
While playback is going on in AVPlayer, the playback controls work fine from Control Center. I then load a web view with another streaming URL.
Once that streaming is done, I start playback from AVPlayer again.
After this, the playback controls are greyed out in Control Center.
I am using [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo to enable playback control in Control Center.
What could be the problem?
I ran into this problem as well working with an AVPlayer instance. You can use MPRemoteCommandCenter to set up controls on the lock screen and command center.
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
commandCenter.previousTrackCommand.enabled = YES;
[commandCenter.previousTrackCommand addTarget:self action:@selector(previousTapped:)];
commandCenter.playCommand.enabled = YES;
[commandCenter.playCommand addTarget:self action:@selector(playAudio)];
commandCenter.pauseCommand.enabled = YES;
[commandCenter.pauseCommand addTarget:self action:@selector(pauseAudio)];
commandCenter.nextTrackCommand.enabled = YES;
[commandCenter.nextTrackCommand addTarget:self action:@selector(nextTapped:)];
previousTapped:, playAudio, pauseAudio, and nextTapped: are all methods in my view controller that call the respective methods controlling my AVPlayer instance. To enable an action, you must explicitly set enabled to YES and add a target and selector to the command.
If you need to disable a specific action, you must explicitly set the enabled property to NO in addition to adding a target:
commandCenter.previousTrackCommand.enabled = NO;
[commandCenter.previousTrackCommand addTarget:self action:@selector(previousTapped:)];
If you do not set enabled for the command, the item will not appear at all on the lock screen or in command center.
In addition, remember to set your app up for background playback (add the audio value under the UIBackgroundModes key in your Info.plist file), set the audio session active, and check for errors:
NSError *setCategoryError = nil;
NSError *setActiveError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];
[[AVAudioSession sharedInstance] setActive:YES error:&setActiveError];
if (setCategoryError) NSLog(@"Error setting category: %@", setCategoryError);
if (setActiveError) NSLog(@"Error activating session: %@", setActiveError);
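For reference, the corresponding Info.plist entry is the audio value under the UIBackgroundModes key (plist XML):
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>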
Google brought me here because I was having an issue with the command center when using AdMob video ads, and a comment on the OP referenced AdMob. I'm posting here for anyone else having these issues.
AdMob video ads on iOS seem to utilize MPRemoteCommandCenter for whatever reason. This may interfere with your app's usage of the command center. Here's what I came up with as a potential workaround: https://gist.github.com/ekilah/e74683291d3e7fafb947
The "gist" of the workaround is to reset all of the shared command center's listeners and the MPNowPlayingInfoCenter's info dictionary after an ad from AdMob is fetched, and again after it has played. The way the workaround resets all of the commands is less than pretty, but it is what I came up with. Maybe someone has a better method?
This approach may also help the OP; resetting things between different usages may be a solution.
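As an illustration, the reset amounts to something like the following sketch (abbreviated; the command list here is only a subset, and the gist walks every command):
// Sketch: remove all targets and disable each command, then clear now-playing info.
MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];
NSArray *commands = @[center.playCommand, center.pauseCommand,
                      center.togglePlayPauseCommand,
                      center.nextTrackCommand, center.previousTrackCommand];
for (MPRemoteCommand *command in commands) {
    [command removeTarget:nil]; // nil removes every registered target/action
    command.enabled = NO;
}
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = nil;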
Finally, I solved this problem by loading the MP3 URL with AVPlayer instead of loading it in a UIWebView. The current playback time and total time can be retrieved from AVPlayer, and it is also possible to seek the playback using a slider.
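A minimal sketch of that approach (the stream URL and the slider are placeholders):
// Load the stream directly with AVPlayer instead of a web view.
AVPlayer *player = [AVPlayer playerWithURL:[NSURL URLWithString:@"https://example.com/stream.mp3"]];
[player play];
// Current time and duration, e.g. for driving a UISlider:
Float64 current = CMTimeGetSeconds(player.currentTime);
Float64 total = CMTimeGetSeconds(player.currentItem.duration);
NSLog(@"%.0f / %.0f seconds", current, total);
// Seek when the slider value changes:
[player seekToTime:CMTimeMakeWithSeconds(slider.value, NSEC_PER_SEC)];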

iOS Media Player remote control events (MPRemoteCommand) are not working with Unity

Recently I found out that MediaPlayer remote control event (MPRemoteCommand) handlers do not get called in an iOS application with an embedded Unity player. I'm using Xcode 6.3.1 and Unity 5.0.1f1; however, it looks like this can be reproduced with any combination of currently available versions.
This is the official Apple documentation for handling remote control events on iOS: https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
The issue is very easy to reproduce: it's just 24 lines of code added to an empty project.
Create a new Single View Application in Xcode (to make the working example without the Unity player), or create an empty project in Unity and immediately build it for the iOS platform (File -> Build Settings..., iOS, Build) with all settings left at their default values to get the Xcode project generated by Unity (to make the broken example with the embedded Unity player).
Import the MediaPlayer framework header in AppDelegate.h (for the example without Unity) or UnityAppController.h (for the example with Unity):
#import <MediaPlayer/MediaPlayer.h>
Declare a property for the MPMoviePlayerController object in the class interface in AppDelegate.h/UnityAppController.h:
@property (strong, nonatomic) MPMoviePlayerController *player;
At the beginning of the application:didFinishLaunchingWithOptions: method in AppDelegate.m/UnityAppController.mm (adding it just before the return statement also works), insert this code. It creates a player with a URL (a live radio station; you can use any other HTTP live audio source you like), registers handlers for the play, pause, and togglePlayPause remote control events, and starts playing the audio, thus making the application the "Now Playing" app:
self.player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://81.9.96.34/eldoradio-64k"]];
[[MPRemoteCommandCenter sharedCommandCenter].playCommand addTarget:self action:@selector(playCommand)];
[[MPRemoteCommandCenter sharedCommandCenter].pauseCommand addTarget:self action:@selector(pauseCommand)];
[[MPRemoteCommandCenter sharedCommandCenter].togglePlayPauseCommand addTarget:self action:@selector(togglePlayPauseCommand)];
[self.player play];
Add the implementations of the handlers to AppDelegate.m/UnityAppController.mm:
- (void)playCommand {
    [self.player play];
}

- (void)pauseCommand {
    [self.player pause];
}

- (void)togglePlayPauseCommand {
    if (self.player.playbackState == MPMoviePlaybackStatePlaying) {
        [self.player pause];
    } else {
        [self.player play];
    }
}
Build and run the application. You should hear music playing once it is started.
Trigger some remote control events. The easiest way is to plug in standard Apple EarPods and press the play/pause button on them, which should generate a togglePlayPause event; opening iOS Control Center (the UI you get by swiping up from the bottom of the screen) and pressing the Play/Pause buttons there is also fine (those generate separate play and pause events respectively).
In the application without Unity, audio stops and starts playing according to the remote control events being sent.
In the application with Unity, the handlers never get called (you can set breakpoints in them and verify this under the debugger) and nothing happens at all. You can also notice that the appearance of iOS Control Center is completely different: in the application with the embedded Unity player it shows a play button and two rewind buttons regardless of the audio state, whereas in the application without Unity it shows just a pause button while audio is playing, or just a play button when it is not.
You can also replace MPMoviePlayerController with AVPlayer from the AVFoundation framework; the result will be exactly the same.
Does anybody know any reason for this behavior and/or any workaround for it? This affects our application very badly.
P.S.: I've already asked a similar question before but didn't get any answers. The example in this one is as simple as possible.
My app has pretty much the same setup as yours - and the same problem. As long as I used a mixable audio session
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.DuckOthers, .InterruptSpokenAudioAndMixWithOthers])
my app wouldn't trigger any of the MPRemoteCommand handlers.
When I turned my audio session into a non-mixable one (by removing the options)
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
my MPRemoteCommand handlers would get called whenever I pressed the play/pause button on my headset.
So the solution is to use a non-mixable audio session instead of a mixable one; PlayAndRecord without the explicit mixing options is non-mixable.

How to prevent UIWebView video from getting remote control events

I am using a UIWebView in an iOS app to play YouTube videos, but to provide a native experience I've implemented the playback controls with UIKit, so the UIWebView is only used to display the video.
I've also implemented -remoteControlReceivedWithEvent: to allow control from Control Center and the controller buttons on earphones. But it seems that UIWebView automatically handles remote control events from the earphones itself. This is a problem: when you toggle play/pause, my code pauses the video and then UIWebView toggles it again, resuming the video.
Is there any way to prevent this from happening?
A related issue is that UIWebView tries to set "Now Playing" information on MPNowPlayingInfoCenter, which my code also does.
I encountered the same kind of issue in my app, which plays back audio using AVAudioPlayer.
The app displays the current audio info in MPNowPlayingInfoCenter during playback.
My requirement was to display an HTML5 video advert (with audio) inside a web view on top of my player.
At first I used only UIWebView, since I needed to support iOS 7, but I ran into a lot of issues; one of them was MPNowPlayingInfoCenter displaying the URL of the ad video in place of the current native audio playback.
I tried several solutions, such as method swizzling on UIWebView, without any success.
I found only one solution that works for me: use WKWebView instead of UIWebView as the web container. HTML5 video playback then no longer interacts with MPNowPlayingInfoCenter. Since I also had to support iOS 7, I created a wrapper class that switches between UIWebView (still with the issue) on iOS 7 and WKWebView on iOS 8 and later.
Hope this helps.
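A minimal sketch of the WKWebView route, assuming iOS 8+ (the ad URL is a placeholder):
#import <WebKit/WebKit.h>

// WKWebView keeps HTML5 video playback from touching MPNowPlayingInfoCenter.
WKWebViewConfiguration *config = [[WKWebViewConfiguration alloc] init];
config.allowsInlineMediaPlayback = YES; // play the ad inline rather than fullscreen
WKWebView *webView = [[WKWebView alloc] initWithFrame:self.view.bounds configuration:config];
[self.view addSubview:webView];
[webView loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/ad.html"]]];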
I ran into this question while trying to solve somewhat the reverse problem: I had added a UIWebView to an app, and remote control events were not working while the UIWebView was on display, but worked elsewhere in the app.
In my case the solution was to add these three methods, which are present on all the other view controllers in my app, to the controller of my new UIWebView:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self resignFirstResponder];
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    // Logic to handle remote control events
}
From this, I suspect that you can prevent your UIWebView from handling remote control events in one of two ways:
1. Include logic in remoteControlReceivedWithEvent: to ignore the remote control events you don't want handled (see the sketch below).
2. Have your UIWebView's controller resign first responder by calling resignFirstResponder in its viewDidAppear method.
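A sketch of option 1, filtering in the responder method (the playVideo/pauseVideo helpers are hypothetical):
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) {
        return;
    }
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            // Deliberately ignored so playback is not toggled twice.
            break;
        case UIEventSubtypeRemoteControlPlay:
            [self playVideo];  // hypothetical helper on this controller
            break;
        case UIEventSubtypeRemoteControlPause:
            [self pauseVideo]; // hypothetical helper on this controller
            break;
        default:
            break;
    }
}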

How to display overlay on Apple TV via AirPlay

I am developing an iOS app that displays a video, e.g., a football game, on Apple TV via AirPlay. I want to display additional information, e.g., player stats, on the big screen while the video is playing.
I am aware of the Redfin approach, where they require the user to turn on AirPlay mirroring first. Unfortunately, this is not acceptable for us; we want it to be obvious to users how to show the video.
We currently present an AirPlay route button before displaying the video, letting the user set it up, using the following code:
self.airPlayPicker = [[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 50, 50)];
self.airPlayPicker.showsVolumeSlider = NO;
self.airPlayPicker.showsRouteButton = YES;
[self.view addSubview:self.airPlayPicker];
The Route button will show when there is an Apple TV around, allowing the user to turn it on. We then present the video with MPMoviePlayerController.
When AirPlay is turned on and the video is playing, in code I see only one UIScreen but two UIWindows. Both UIWindows have the same dimensions as the iPhone screen, and when I add a subview to either UIWindow, the subview always shows up on the iPhone.
Has anyone figured out how to present an overlay on top of the video on Apple TV? How do I even find the view object where the video is hosted?
I am aware that MPMoviePlayerController is built on top of AVPlayer. Would using AVPlayer give us better control of the UI?
As far as I know, this shouldn't be possible. When using AirPlay without mirroring, only the URL of the video is sent to the Apple TV. It is then up to the Apple TV to actually play the media.
Mirroring is the way to do it.
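For what it's worth, with mirroring turned on the Apple TV shows up as a second UIScreen, and you can host your own window on it instead of the mirrored image; a rough sketch (MyStatsViewController is hypothetical):
// When mirroring is active, a second UIScreen appears; give it its own window.
if ([UIScreen screens].count > 1) {
    UIScreen *external = [UIScreen screens][1];
    UIWindow *externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
    externalWindow.screen = external; // assign the screen before showing the window
    externalWindow.rootViewController = [[MyStatsViewController alloc] init]; // hypothetical
    externalWindow.hidden = NO;
}
// Observe UIScreenDidConnectNotification / UIScreenDidDisconnectNotification
// to react when the external screen comes and goes.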

How to find available screens with AirPlay?

I am trying to detect when an AirPlay device is available on the network, and if there is one, I'd like to display a list.
For example, the app "Dailymotion" does exactly what I want: when I connect my iPhone to a network with an Apple TV, an "AirPlay" icon appears: https://dl.dropbox.com/u/4534662/Photo%2004-03-13%2010%2007%2014.png (just next to the HD icon)
And then, when clicking on the AirPlay icon, a picker shows up with the available AirPlay devices: https://dl.dropbox.com/u/4534662/Photo%2004-03-13%2010%2007%2018.png
I didn't find a way to do this in the Apple documentation. So, how can I do it programmatically?
You can display an AirPlay picker view (if AirPlay is available) like so:
MPVolumeView *volumeView = [[MPVolumeView alloc] init];
[volumeView setShowsVolumeSlider:NO];
[volumeView sizeToFit];
[view addSubview:volumeView];
The MPVolumeView displays all the available AirPlay devices. The code above hides the volume slider, which you may or may not want to do. What you can't do is get programmatic access to AirPlay device information.
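If you also need to know when AirPlay routes appear or disappear (iOS 7 and later), MPVolumeView posts a notification you can observe; a sketch:
// Observe route availability changes for the volume view created above.
[[NSNotificationCenter defaultCenter]
    addObserverForName:MPVolumeViewWirelessRoutesAvailableDidChangeNotification
                object:volumeView
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                MPVolumeView *v = note.object;
                NSLog(@"AirPlay routes available: %d", v.areWirelessRoutesAvailable);
            }];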
