There's not much to my question, I guess. I'm just curious about how CocoaLibSpotify works with AVFoundation, and whether it's compatible with the way Apple requires me to register for remote control events and to set the now playing info in MPNowPlayingInfoCenter.
Apple says that to receive remote control events my app needs to "Begin playing audio. Your app must be the “Now Playing” app. Restated, even if your app is the first responder and you have turned on event delivery, your app does not receive remote control events until it begins playing audio." However, that's all the documentation I can find. Does playing a track with SPPlaybackManager meet this requirement? What is the requirement, anyway?
Thanks for your help again.
Remote control events work fine with CocoaLibSpotify without any modifications to the library at all, but only on the device, not in the Simulator (including iOS 7's Control Center).
Taking the Simple Player example, I made the following changes:
Changed Simple_PlayerAppDelegate to be a subclass of UIResponder.
Overrode canBecomeFirstResponder to return YES.
Implemented remoteControlReceivedWithEvent:.
In the callback to the playTrack: call to CocoaLibSpotify, added:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
These changes allowed Simple Player to receive remote control events when running on a device.
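For reference, here's a minimal sketch of those changes. The class name comes from the Simple Player example; the remote-control handling body and the helper method name are assumptions, and wiring the events up to SPPlaybackManager is left to your own code:

// Sketch only: Simple_PlayerAppDelegate changed to subclass UIResponder so it
// can become first responder and receive remote control events.
@interface Simple_PlayerAppDelegate : UIResponder <UIApplicationDelegate>
@end

@implementation Simple_PlayerAppDelegate

// Allow the app delegate to become first responder.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

// Remote control events (play/pause, next, previous) arrive here on a device.
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl &&
        event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
        // Toggle playback via your SPPlaybackManager instance here.
    }
}

// Call this from the playTrack: completion callback once playback has started.
- (void)startListeningForRemoteControlEvents {
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

@end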
Related
I want to know when my AVAudioRecorder is inaccessible (e.g when music starts playing).
As audioRecorderEndInterruption will be deprecated with iOS 9, I am focusing on AVAudioSession's interruption notification (but neither is working as expected).
The issue is that the interruption notification is never called if the app was and remains in the foreground when the interruption occurs.
E.g: The user starts and stops playing music without moving the application into the background.
To detect any interruptions I am using:
// Register for audio session interruption notifications.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(audioSessionWasInterrupted:)
                                             name:AVAudioSessionInterruptionNotification
                                           object:nil];
...
- (void)audioSessionWasInterrupted:(NSNotification *)notification {
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        NSLog(@"Interruption notification");
        NSUInteger interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
        if (interruptionType == AVAudioSessionInterruptionTypeBegan) {
            NSLog(@"InterruptionTypeBegan");
        } else {
            NSLog(@"InterruptionTypeEnded");
        }
    }
}
I get InterruptionTypeBegan as expected, but InterruptionTypeEnded isn't called if the app is still in the foreground (meaning it won't be called until the app is placed in the background and back into the foreground).
How can I receive the InterruptionTypeEnded notification when the interruption occurs while the app is in the foreground?
This is a widespread problem affecting any app using AV framework components (the same goes for native iOS apps).
As explained in Apple's documentation on the subject of audio interruptions, InterruptionTypeEnded should actually be delivered in the scenario mentioned:
If the user dismisses the interruption ... the system invokes your callback method, indicating that the interruption has ended.
However, it also states that InterruptionTypeEnded might not be called at all:
There is no guarantee that a begin interruption will have an end interruption.
Therefore, a different approach is needed in the scenario mentioned.
When it comes to handling music interruptions, the issue won't be around for long: iOS 9 effectively prevents outside audio sources from being used while the app's audio handler is active.
One way to handle the specific issue of music interruptions is to listen to MPMusicPlayerController's playbackState, as shown in this Stack Overflow question: Detecting if music is playing?
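Here's a rough sketch of that approach, assuming you only need to know when the built-in music player starts or stops; the method names below are my own placeholders:

#import <MediaPlayer/MediaPlayer.h>

// Start listening for playback-state changes of the system music player
// (iOS 8+; use iPodMusicPlayer on earlier versions).
- (void)startObservingMusicPlayback {
    MPMusicPlayerController *musicPlayer = [MPMusicPlayerController systemMusicPlayer];
    [musicPlayer beginGeneratingPlaybackNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playbackStateDidChange:)
                                                 name:MPMusicPlayerControllerPlaybackStateDidChangeNotification
                                               object:musicPlayer];
}

- (void)playbackStateDidChange:(NSNotification *)notification {
    if ([MPMusicPlayerController systemMusicPlayer].playbackState == MPMusicPlaybackStatePlaying) {
        // Music started: treat this like the beginning of an interruption for the recorder.
    } else {
        // Music stopped or paused: it should be safe to restart recording.
    }
}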
A more direct way to handle the issue of interruptions would be to either:
Block outside audio interruptions completely by re-invoking your audio component at the time of InterruptionTypeBegan.
Or give a UI indication that an outside media source has interrupted the audio session (for example, by showing an inactive microphone).
Hopefully Apple will come up with a better solution to the problem, but in the meantime this should give you some options for solving the interruption issue.
If you haven't already, try setting your AVCaptureSession's property usesApplicationAudioSession to NO.
This question & answer may act as a good reference if you're looking for any more detail.
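In case it helps, a minimal sketch of that setting, assuming you create the capture session yourself (the "captureSession" name is just a placeholder):

#import <AVFoundation/AVFoundation.h>

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
// Available since iOS 7: opt the capture session out of the app's shared
// AVAudioSession, so capture no longer reconfigures your own session setup.
if ([captureSession respondsToSelector:@selector(setUsesApplicationAudioSession:)]) {
    captureSession.usesApplicationAudioSession = NO;
}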
I tried this and found that InterruptionTypeEnded may be called after the music pauses in some apps, but in others it is not called at all.
My solution is to update the UI to let the user know that recording has stopped, and to do the related work such as file operations. When the interruption ends, activate the AVAudioSession and, if there is no error, start a new recording.
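A rough sketch of that recovery path, reusing the audioSessionWasInterrupted: handler from the question; "recorder", "startNewRecording", and the new-file handling are placeholders for your own setup:

- (void)audioSessionWasInterrupted:(NSNotification *)notification {
    NSUInteger type = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // The system has already stopped the recorder: update the UI and finish
        // any file work (closing or moving the partial recording) here.
    } else {
        // Interruption ended (however you detect it): reactivate the session and,
        // if that succeeds, start a brand new recording.
        NSError *error = nil;
        if ([[AVAudioSession sharedInstance] setActive:YES error:&error]) {
            // Hypothetical helper: creates a fresh AVAudioRecorder and calls -record.
            [self startNewRecording];
        } else {
            NSLog(@"Could not reactivate the audio session: %@", error);
        }
    }
}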
If you want to join the files from before and after the interruption, the answer to this question may be helpful to you: AVAudioRecorder records only the audio after interruption.
Does anyone know if there is a defined list of actions/events that will cause a suspended/not running application to become active?
For example, if you call [[UIApplication sharedApplication] beginReceivingRemoteControlEvents], pressing an audio control command on the control center will resume/start the app in the background. To prevent this from occurring, [[UIApplication sharedApplication] endReceivingRemoteControlEvents] needs to be called before the app is terminated or sent to the background.
Are there other system-level events which can activate the app like this?
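For reference, here's roughly how I handle the remote control case today (a sketch only; where these calls live is just one option, and these are the standard app delegate methods):

- (void)applicationDidBecomeActive:(UIApplication *)application {
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Unregister so an audio command from Control Center does not
    // resume or relaunch the app once it has been suspended.
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
}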
There are many triggers that can activate "dead apps". These include, but are not limited to:
- Push Notifications
- GameCenter Requests
- Significant Location Changes
- iCloud (although hard to do)
- iBeacon
- Passbook location sensor
- and many more
Hope that helped
I have two questions regarding Remote Control Events on iOS:
I know that music applications register for remote control events and can then receive such events from the iPhone's player widget.
Let's say I want my app to fire such events, is that possible?
How do headphones, for example, generate those events?
Without private APIs, you cannot send a remote control event to your application.
The reason is that we cannot create such an event (a UIEvent) to send out using:
[[UIApplication sharedApplication] sendEvent:anEvent];
You can, however, save a previously captured event and then play it back by calling the function above.
I don't know if it is possible for headphone events, but with private APIs you can send some events such as a home button press, a power button press, or mouse events (not tested).
You should read this book:
http://www.amazon.com/gp/product/1118057651/ref=pd_lpo_sbs_dp_ss_1?pf_rd_p=1535523702&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=0321278542&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0T2AMHJCEEKJN41YJHD5
It'll be hard work to make this work.
Take a look at GSEvent to learn how to send an event to the OS (iOS).
Edit: I've found 2 event types in GSEvent.GSEventType:
kGSEventHeadsetButtonDown = 1018,
kGSEventHeadsetButtonUp = 1019,
PS: this uses private APIs, so the app will be rejected if you submit it to the App Store.
I have an application running in the background and I need to know if the device is sleeping in order to start a synchronisation process, but I couldn't find any information about this.
Does anyone know if it is possible, and how to do it?
Thanks.
You cannot know if the device is asleep because you have no control over the OS.
You can, however, use the App Delegate method below if you want to run code when your app resigns active (for example, when it goes to the background):
- (void)applicationWillResignActive:(UIApplication *)application
{
    // your code goes here
}
I believe you can't do this using public APIs. The only thing you can check is whether your application is active or in the background (using AppDelegate callbacks). And as Luke pointed out in the comments, checking whether the device "falls asleep" isn't an iOS best design practice.
There are some private APIs to do what you want; you can look at the following questions:
Is there a way to check if the iOS device is locked/unlocked?
Detect screen on/off from iOS service
However, you should be aware that your app won't be accepted into the App Store in that case.
I'm setting up an AVAudioSession when the app launches and setting the delegate to the appDelegate. Everything seems to be working (playback, etc) except that beginInterruption on the delegate is not being called when the phone receives a call. When the call ends endInterruption is being called though.
The only thought I have is that the audio player code I'm using used to be based on AVAudioPlayer, but is now using AVPlayer. The callbacks for the AVAudioPlayer delegate for handling interrupts are still in there, but it seems odd that they would conflict in any way.
Looking at the header, it looks like AVAudioSessionDelegate is deprecated in iOS 6.
Use AVAudioSessionInterruptionNotification instead on iOS 6.
Update: That didn't work. I think there's a bug in the framework.
Yes, in my experience neither beginInterruption nor the newly documented AVAudioSessionInterruptionNotification works properly. What I had to do was track the status of the player using a local flag, then handle the endInterruptionWithFlags: method in order to recover from interruptions.
With iOS 6, resuming from an interruption will at least keep your audio player in the right place, so there was no need for me to store the last known play time of my AVAudioPlayer; I simply had to hit play.
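Here's a rough sketch of that approach, assuming the old AVAudioSessionDelegate and an AVAudioPlayer named "player"; the wasPlayingBeforeInterruption flag is my own:

- (void)beginInterruption {
    // Not reliably delivered on iOS 6.0, so also keep this flag up to date
    // wherever you start and stop playback yourself.
    self.wasPlayingBeforeInterruption = self.player.playing;
}

- (void)endInterruptionWithFlags:(NSUInteger)flags {
    if ((flags & AVAudioSessionInterruptionFlags_ShouldResume) && self.wasPlayingBeforeInterruption) {
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
        [self.player play];
    }
}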
Here's the solution that I came up with. It seems like iOS 6 kills your audio with a Media Reset if an AVPlayer stays resident too long. What ends up happening is that the AVPlayer plays, but no sound comes out: the rate on the AVPlayer is 1, but there's absolutely no sound. To add pain to the situation, there's no error from either the AVAudioSession setActive call or the AVPlayer itself that indicates there's a problem.
Add to that the fact that you can't depend on applicationWillResignActive:, because your app may already be in the background if you're depending on remote control gestures at all.
The final solution I implemented was to add a periodic observer on the AVPlayer, and record the last known time. When I receive the event that I've been given back control, I create a new AVPlayer, load it with the AVPlayerItem, and seekToTime to the proper time.
It's quite an annoying workaround, but at least it works, and avoids the periodic crashes that were happening.
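A condensed sketch of that workaround follows; the "timeObserver", "lastKnownTime", and "streamURL" properties, and the point at which recovery is triggered, are assumptions about your own setup:

- (void)startObservingPlaybackTime {
    __weak typeof(self) weakSelf = self;
    self.timeObserver = [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                                                   queue:dispatch_get_main_queue()
                                                              usingBlock:^(CMTime time) {
        // Record the last known playback position once per second.
        weakSelf.lastKnownTime = time;
    }];
}

// Called when control comes back and the player turns out to have gone silent.
- (void)recoverFromMediaReset {
    [self.player removeTimeObserver:self.timeObserver];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:self.streamURL];
    self.player = [AVPlayer playerWithPlayerItem:item];
    [self startObservingPlaybackTime];
    [self.player seekToTime:self.lastKnownTime];
    [self.player play];
}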
I can confirm that, using the C API, the interruption callback is also not called when the interruption begins, only when it ends:
AudioSessionInitialize(NULL, NULL, interruptionListenerCallback, (__bridge void *)(self));
I've also filed a bug report with Apple for the issue.
Edit: This is fixed in iOS 6.1 (but not iOS 6.0.1)
Just call:
[[AVAudioSession sharedInstance] setDelegate:self];
I just checked on my iPhone 5 (running iOS 6.0) by setting a breakpoint in the AudioSessionInterruptionListener callback function that was declared in AudioSessionInitialize(), and this interrupt callback does, in fact, get called when the app has an active audio session and audio unit and is interrupted with an incoming phone call (Xcode shows the app stopped at the breakpoint at the beginning of the interruption, which I then continue from).
I have the app then stop its audio unit and de-activate its audio session. Then, on the end interruption callback, the app re-activates the audio session and restarts the audio unit without problems (the app is recording audio properly afterwards).
I built a brand new audio streaming (AVPlayer) application atop iOS 6.0.x and found the same problem.
Delegates are now deprecated and we have to use notifications; that's great, but here are my findings:
During an incoming phone call I get only AVAudioSessionInterruptionTypeEnded in my handler, along with AVAudioSessionInterruptionOptionShouldResume. The audio session gets suspended automatically (the audio fades) and I just need to resume playback of the AVPlayer.
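For reference, roughly what my notification handler looks like (a sketch; "player" is my AVPlayer and the method name is simply what I registered for AVAudioSessionInterruptionNotification):

- (void)handleAudioSessionInterruption:(NSNotification *)notification {
    NSUInteger type = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    NSUInteger options = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];

    if (type == AVAudioSessionInterruptionTypeEnded &&
        (options & AVAudioSessionInterruptionOptionShouldResume)) {
        // The phone call case: the session was suspended for us, so just resume.
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
        [self.player play];
    }
}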
However, when launching a game such as CSR Racing, I oddly get the dreaded AVAudioSessionInterruptionTypeBegan but no sign of when my application can resume playback, not even after killing the game.
Now, this may depend on other factors, such as my audio category (in my case AVAudioSessionCategoryPlayback) and the mixing settings of both applications (kAudioSessionProperty_OverrideCategoryMixWithOthers). I'm not sure, but something definitely seems out of place.
Hopefully this is fixed in the 6.1 beta, as others have reported; I have yet to upgrade, so we'll see.