iOS AdMob ad starts playing with lock screen buttons - bug?

I have a music streaming app in which I use lock screen controls to play/pause/next the song.
I have Admob Interstitial ads in my app.
However, when I use the lock screen controls, the command gets passed down to the video ad as well, so the ad's video starts playing along with my app's music. Is there any way to prevent this?
Here's how I am handling the lock screen controls. I don't interact with the ads anywhere in this code, yet the control still gets passed down to AdMob's video player:
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    //NSLog(@"CustomApp:remoteControlReceivedWithEvent:%@", event.description);
    if (event.type == UIEventTypeRemoteControl)
    {
        switch (event.subtype)
        {
            case UIEventSubtypeRemoteControlPlay:
                // resume the current track
                dispatch_async(dispatch_get_main_queue(), ^{
                    [[[SoundEngine sharedInstance] audioPlayer] resume];
                    //[[SoundEngine sharedInstance] setLockScreenElapsedTime];
                });
                break;
            case UIEventSubtypeRemoteControlPause:
                // pause the current track
                dispatch_async(dispatch_get_main_queue(), ^{
                    [[[SoundEngine sharedInstance] audioPlayer] pause];
                    //[[SoundEngine sharedInstance] setLockScreenElapsedTime];
                });
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // skip to the next track
                dispatch_async(dispatch_get_main_queue(), ^{
                    [[SoundEngine sharedInstance] nextClicked];
                    //[[SoundEngine sharedInstance] setLockScreenElapsedTime];
                });
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                // go back to the previous track
                dispatch_async(dispatch_get_main_queue(), ^{
                    [[SoundEngine sharedInstance] prevClicked];
                    //[[SoundEngine sharedInstance] setLockScreenElapsedTime];
                });
                break;
            default:
                break;
        }
    }
}
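One partial mitigation, assuming you can track the interstitial's visibility via the older GADInterstitialDelegate callbacks (interstitialWillPresentScreen: / interstitialDidDismissScreen:), is to drop remote control events while the ad is on screen. This does not stop the ad's own player from reacting, but it keeps your music from resuming underneath it. A rough sketch; the interstitialVisible property is hypothetical, not something from the question:

// Hypothetical BOOL property set from the GADInterstitialDelegate callbacks (SDK 7.x naming).
- (void)interstitialWillPresentScreen:(GADInterstitial *)ad {
    self.interstitialVisible = YES;
}

- (void)interstitialDidDismissScreen:(GADInterstitial *)ad {
    self.interstitialVisible = NO;
}

// Then guard the existing handler:
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (self.interstitialVisible) {
        return; // ignore lock screen controls while the ad is showing
    }
    // ... existing handling from above ...
}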

I also recently encountered AdMob interstitial video ads that started playing after I called load, without my ever calling present. In addition, when I did present the ads, toggling the mute switch had no effect.
This was with AdMob SDK versions 7.19.1 and 7.20.0.
In the end I identified it as an AdMob issue, but it was not obvious. The culprit was registering a custom UserAgent in UserDefaults. In particular, the following lines cause the problem:
let userAgent : String = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_2) AppleWebKit/601.3.9 (KHTML, like Gecko) Version/9.0.2 Safari/601.3.9"
UserDefaults.standard.register(defaults: ["UserAgent" : userAgent])
AdMob probably reads the user agent stored in UserDefaults when loading ads. However, I absolutely need that custom user agent, so I am still unsure what to do about it, but at least we know how the AdMob SDK fails in this case.
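One possible workaround (untested, an assumption rather than anything from the AdMob docs) is to take the custom UserAgent out of the registration domain just before the interstitial is loaded and put it back afterwards, so the SDK does not see it when it builds the request. Whether this actually helps depends on when the SDK reads the value. A sketch in Objective-C; kCustomUserAgent stands in for the string above:

NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];

// Temporarily drop the registered "UserAgent" default.
NSMutableDictionary *registration =
    [[defaults volatileDomainForName:NSRegistrationDomain] mutableCopy];
[registration removeObjectForKey:@"UserAgent"];
[defaults setVolatileDomain:registration forName:NSRegistrationDomain];

// Load the ad while the custom agent is absent.
[self.interstitial loadRequest:[GADRequest request]];

// Re-register the custom agent afterwards.
[defaults registerDefaults:@{@"UserAgent" : kCustomUserAgent}];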

Related

Lock Screen iPod Controls Not Working With Spotify Music Player

I added the Spotify player to my app, which also plays music using MPMusicPlayerController. When music is playing from Spotify and the screen is locked, the remote control events for play/pause and FFW/RWD are not received when the user presses those buttons on the lock screen.
If music is playing from the MPMusicPlayerController, I am able to receive the remote control events based on the following code:
- (void)viewDidLoad {
...
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
...
}
and
- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    // see [event subtype] for details
    if (event.type == UIEventTypeRemoteControl) {
        // We may be receiving an event from the lock screen
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
            case UIEventSubtypeRemoteControlPlay:
            case UIEventSubtypeRemoteControlPause:
                // User pressed play or pause on the lock screen
                [self playOrPauseMusic:nil];
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // User pressed FFW on the lock screen
                [self fastForwardMusic:nil];
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                // User pressed rewind on the lock screen
                [self rewindMusic:nil];
                break;
            default:
                break;
        }
    }
}
While the iPod controls are visible when the app enters the background, they do not respond when I press pause; instead, the controls disappear. What addition is needed to detect play/pause and FFW/RWD from the lock screen when streaming audio such as Spotify is playing in the background?
I believe I ran into this in the past. If I remember correctly I added in the
-(void)remoteControlReceivedWithEvent:(UIEvent *) event { ... }
as well as
- (BOOL) canBecomeFirstResponder { return YES; }
to the app delegate (this is also where my audio controller lived). I was having the issue where the UIViewControllers were not alive at the time I wanted to catch the UIEventTypeRemoteControl events.
Give that a try and see if that helps.
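If it helps, a minimal sketch of that app-delegate approach (the stock Xcode app delegate already subclasses UIResponder, so it can receive the events itself; the forwarding target is a placeholder):

// AppDelegate.m
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [application beginReceivingRemoteControlEvents];
    return YES;
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // forward event.subtype to your audio controller here
    }
}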
After further investigation, I have found that if I include the following code when my app enters the background and when the remote control events are received, the iPod controls do not disappear.
// Set up the info center to display album artwork within the iPod controls (needed for Spotify)
MPMediaItemArtwork *ipodControlArtwork = [[MPMediaItemArtwork alloc] initWithImage:artworkImage];
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo =
    [NSDictionary dictionaryWithObjectsAndKeys:
        nowPlayingTitle, MPMediaItemPropertyTitle,
        nowPlayingArtist, MPMediaItemPropertyArtist,
        ipodControlArtwork, MPMediaItemPropertyArtwork,
        [NSNumber numberWithDouble:0.0], MPNowPlayingInfoPropertyPlaybackRate,
        nil];

How to get bluetooth headphone audio controls to trigger button in iOS Swift iphone App

I am developing an audio player app in Swift and have completed the core functionality. I would like to set it up so that when you hit track next or track previous on Bluetooth headphones, it triggers the previous and next buttons that are already in the app.
This is straight from Apple's documentation.
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self playPauseToggle:nil];
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                [self nextTrack:nil];
                break;
            ...
And here's the corresponding method in Swift:
override func remoteControlReceived(with event: UIEvent?)
I am not sure, however, whether Apple will let you get away with handling these events in any way other than their expected behavior. If pressing pause opens a URL or something like that, for example, you can expect to be rejected or pulled from the store right away. If your previous and next buttons aren't directly related to audio playback, this isn't allowed and will not fly.
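Note also that the handler is only called once your app has turned on remote control event delivery and something in the responder chain has become first responder, the same setup the other answers here use. Roughly:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Opt in to lock screen / headphone remote control events.
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}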

podcast "skip" buttons in iOS lock screen

I have an audio app that plays background audio on iOS devices. I need the app to have "skip 15 seconds" buttons, like the Apple Podcasts app and Overcast, instead of next/previous track buttons. Does anyone know where the documentation for this is, or of some examples? This is turning out to be a tricky thing to Google.
Update: Great answer to this question for iOS 7.1 and later at https://stackoverflow.com/a/24818340/1469259.
The buttons on the lock screen trigger "remote control" events. You can handle these events and skip forward/back as you require:
- (void)viewDidAppear:(BOOL)animated {
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    if ([self canBecomeFirstResponder]) {
        [self becomeFirstResponder];
    }
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                // add code here to play/pause audio
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                // add code here to skip back 15 seconds
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // add code here to skip forward 15 seconds
                break;
            default:
                break;
        }
    }
}
How you do the actual skipping depends on how you're playing the audio.
Documentation here https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
Potentially another way of achieving this, though not one I've used myself, is MPSkipIntervalCommand: https://developer.apple.com/library/ios/documentation/MediaPlayer/Reference/MPSkipIntervalCommand_Ref/index.html
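For what it's worth, on iOS 7.1 and later the MPRemoteCommandCenter route looks roughly like this (a sketch only; the actual seek logic depends on how you play the audio):

#import <MediaPlayer/MediaPlayer.h>

MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];

// Ask the lock screen for 15-second skip buttons instead of next/previous track.
commandCenter.skipForwardCommand.preferredIntervals = @[@15];
[commandCenter.skipForwardCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    // seek your player forward 15 seconds here
    return MPRemoteCommandHandlerStatusSuccess;
}];

commandCenter.skipBackwardCommand.preferredIntervals = @[@15];
[commandCenter.skipBackwardCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    // seek your player back 15 seconds here
    return MPRemoteCommandHandlerStatusSuccess;
}];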

iOS 7 UIImagePickerController Camera No Image

For some reason, the first time I open the UIImagePickerController in camera mode in my app it comes up blank. I have to close and reopen that view to get the camera feed to start working. I'm using the standard code that works perfectly for camera capture in iOS 6. From the sample below I'm firing the capturePhoto: method. Anyone else running into this jankiness with the iOS 7 camera? I checked the Apple dev forums but it's near impossible to find answers there.
- (IBAction)capturePhoto:(id)sender {
    [self doImagePickerForType:UIImagePickerControllerSourceTypeCamera];
}

- (void)doImagePickerForType:(UIImagePickerControllerSourceType)type {
    if (!_imagePicker) {
        _imagePicker = [[UIImagePickerController alloc] init];
        _imagePicker.mediaTypes = @[(NSString *)kUTTypeImage];
        _imagePicker.delegate = self;
    }
    _imagePicker.sourceType = type;
    [self presentViewController:_imagePicker animated:YES completion:nil];
}
I'm also using UIImagePickerController and ran into the same issue with a blank screen. I'd like to expand a little on what klaudz mentioned regarding iOS 7 authorization for the camera.
Reference:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html
"Recording audio always requires explicit permission from the user; recording video also requires user permission on devices sold in certain regions."
Here are some code fragments you can start with to check whether you have permission for the camera, and to request it if your app hasn't previously asked. If you were denied due to an earlier request, your app may need to put up a notice telling the user to go into Settings and manually enable access, as klaudz pointed out.
iOS 7 example
NSString *mediaType = AVMediaTypeVideo; // Or AVMediaTypeAudio

AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:mediaType];

// This status is normally not visible: the AVCaptureDevice class methods for discovering devices do not return devices the user is restricted from accessing.
if (authStatus == AVAuthorizationStatusRestricted) {
    NSLog(@"Restricted");
}
// The user has explicitly denied permission for media capture.
else if (authStatus == AVAuthorizationStatusDenied) {
    NSLog(@"Denied");
}
// The user has explicitly granted permission for media capture, or explicit user permission is not necessary for the media type in question.
else if (authStatus == AVAuthorizationStatusAuthorized) {
    NSLog(@"Authorized");
}
// Explicit user permission is required for media capture, but the user has not yet granted or denied such permission.
else if (authStatus == AVAuthorizationStatusNotDetermined) {
    [AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) {
        // Make sure we execute our code on the main thread so we can update the UI immediately.
        //
        // See the documentation for ABAddressBookRequestAccessWithCompletion, where it says
        // "The completion handler is called on an arbitrary queue."
        //
        // Though there is no similar mention for requestAccessForMediaType, it appears to do
        // the same thing.
        //
        dispatch_async(dispatch_get_main_queue(), ^{
            if (granted) {
                // UI updates as needed
                NSLog(@"Granted access to %@", mediaType);
            }
            else {
                // UI updates as needed
                NSLog(@"Not granted access to %@", mediaType);
            }
        });
    }];
}
else {
    NSLog(@"Unknown authorization status");
}
In iOS 7, an app has to get the user's authorization before it can access the camera.
When an app accesses the camera for the first time, iOS shows an alert view asking the user.
Users can also change the authorization in Settings > Privacy > Camera > [your app's name].
The camera will stay on a black, blank view if the switch is off.
If you access the camera using AVCaptureDeviceInput, you can check like this:
NSError *inputError = nil;
AVCaptureDeviceInput *captureInput =
    [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&inputError];
if (inputError &&
    inputError.code == AVErrorApplicationIsNotAuthorizedToUseDevice)
{
    // not authorized
}
If you call it by using UIImagePickerController, I am still looking for a way to check whether you have authorization.
I tried these two methods:
[UIImagePickerController isSourceTypeAvailable:]
[UIImagePickerController isCameraDeviceAvailable:]
but they didn't work; they both returned YES.
UPDATE
Thanks to Scott for expanding on this. [AVCaptureDevice authorizationStatusForMediaType:] is a better way to check.
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (authStatus == AVAuthorizationStatusAuthorized) {
    // successful
} else {
    // failed, such as
    // AVAuthorizationStatusNotDetermined
    // AVAuthorizationStatusRestricted
    // AVAuthorizationStatusDenied
}
But remember to check the iOS version, because [AVCaptureDevice authorizationStatusForMediaType:] and AVAuthorizationStatus are only available on iOS 7 and later.
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 7.0) {
    // code for AVCaptureDevice auth checking
}
I experienced the exact same problem and tried every solution on the Internet with no luck. But finally I found out it was a background thread that prevented the camera preview from showing up. If you happen to have a background thread running while trying to open the camera, as I did, try blocking the background thread and see what happens. Hope you can get around it.
I came across the control AQPhotoPicker. It's quite easy to use, and hopefully it will help you.

App Icon won't show in music controls in app switcher

I'm writing an app that streams music, and I'm having a ton of trouble setting the icon on the music control dock screen (the one you see when you double-click the home button and swipe to the left). All the documentation says to do something like the following, but the icon never appears and I never receive the remote control events. All of this code is in my player view controller. _radioPlayer is an instance of AVQueuePlayer. What am I doing wrong here?
- (void)viewWillAppear:(BOOL)animated
{
    UIApplication *application = [UIApplication sharedApplication];
    if ([application respondsToSelector:@selector(beginReceivingRemoteControlEvents)])
        [application beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            [_radioPlayer pause];
            break;
        case UIEventSubtypeRemoteControlPlay:
            [_radioPlayer play];
            break;
        case UIEventSubtypeRemoteControlPause:
            [_radioPlayer pause];
            break;
        default:
            break;
    }
}
EDIT: I read through the documentation and followed the steps but it's still not working. Here are some of the possible reasons I've come up with:
A. My music isn't playing yet when viewDidAppear is called. In the documentation it states
"Your app must be the “Now Playing” app. Restated, even if your app is the first responder >and you have turned on event delivery, your app does not receive remote control events until >it begins playing audio."
I tried calling beginReceivingRemoteControlEvents and becomeFirstResponder after the music starts playing, but this doesn't work either. Can these only be called in viewDidAppear:? Does iOS automatically detect when the music begins playing, so this isn't necessary?
B. There is something weird about using an AVQueuePlayer. I'm almost positive this isn't it since the event message is handled by the view controller, not the player itself.
It may not be working for you because you are calling beginReceivingRemoteControlEvents in viewWillAppear instead of viewDidAppear. Check out the documentation:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Turn on remote control event delivery
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
}
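It may also be worth publishing something to MPNowPlayingInfoCenter once _radioPlayer actually starts playing, the same trick used in the Spotify answer above, since the app switcher controls reflect whichever app is currently the "Now Playing" app. A sketch; the field values are placeholders:

#import <MediaPlayer/MediaPlayer.h>

// Call this once playback has started.
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = @{
    MPMediaItemPropertyTitle  : @"Station name",   // placeholder
    MPMediaItemPropertyArtist : @"Current artist", // placeholder
    MPNowPlayingInfoPropertyPlaybackRate : @1.0
};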
