Build Command-Center-like Media Controls - iOS

I want to build a control in my app that can play/pause and skip whatever audio is playing in the background. It should behave the same way the media control in Control Center does, working across all different media sources.
[Screenshot]
I played around with various options: AVAudioSession, MPRemoteCommandCenter, and even acquiring some Bluetooth profiles, since headphones can play/pause background music as well. Unfortunately, none of these was able to play/pause the background music.
Does anyone have an idea how to achieve this behavior?

If you are trying to make your audio app show up in Control Center (like Spotify does), you need to use MPNowPlayingInfoCenter to set the now-playing item data (title, rate, duration, elapsed time, and so on). It will look something like this:
MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyTitle: title,
    MPMediaItemPropertyArtist: artist,
    MPNowPlayingInfoPropertyElapsedPlaybackTime: position,
    MPMediaItemPropertyPlaybackDuration: duration,
    MPNowPlayingInfoPropertyPlaybackRate: rate,
]
This sets the metadata for the audio item shown in Control Center. To make the control buttons actually work, you need MPRemoteCommandCenter and must set a target for each command you want to handle. For the play/pause actions, it can be done like this:
MPRemoteCommandCenter.shared().playCommand.addTarget(handler: playActionHandler)
MPRemoteCommandCenter.shared().pauseCommand.addTarget(handler: pauseActionHandler)
Once all of this is done, call the method below so your app can receive the remote events and execute the needed actions:
UIApplication.shared.beginReceivingRemoteControlEvents()
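Putting the pieces together, a minimal closure-based setup might look like the sketch below. `PlaybackController` and its `play`/`pause`/`skipForward` methods are placeholders for your own player code, not part of any framework:

```swift
import MediaPlayer
import UIKit

// Sketch only: PlaybackController is a hypothetical wrapper around
// your own AVAudioPlayer/AVPlayer instance.
final class PlaybackController {
    func configureRemoteCommands() {
        let center = MPRemoteCommandCenter.shared()

        // Each command takes a handler that returns a status.
        center.playCommand.addTarget { [weak self] _ in
            self?.play()
            return .success
        }
        center.pauseCommand.addTarget { [weak self] _ in
            self?.pause()
            return .success
        }
        center.nextTrackCommand.addTarget { [weak self] _ in
            self?.skipForward()
            return .success
        }

        // Required so the app actually receives the remote events.
        UIApplication.shared.beginReceivingRemoteControlEvents()
    }

    func play() { /* resume your player */ }
    func pause() { /* pause your player */ }
    func skipForward() { /* advance to the next item */ }
}
```

The closure-based `addTarget` variant avoids having to declare separate selector-based handler methods.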

Related

MPMediaItem and AVPlayerItem playback sequence in background

I'm having an issue playing sequences of different kinds of items in the background.
In the app I'm working on, we've introduced playlists that contain both content provided by the app and Apple Music content.
For that we use AVPlayer and MPMusicPlayerController respectively. We observe one player or the other (depending on which content is now playing), and when the other kind of content comes next, we release the old player (as far as we can; MPMusicPlayerController is a singleton, so the best we can do is stop it) and load the item into the other player.
The problem starts when the app leaves the foreground. Once MPMusicPlayerController takes over, it doesn't want to give up control, so if any AVPlayer content comes after MPMusicPlayerController content, the music stops.
One workaround I've tried is playing with the .mixWithOthers option when I set the category on AVAudioSession; however, this creates a new category of problems: I'm losing the lock-screen controls, and therefore I'm also losing AirPlay. One dirty trick I tried was setting .mixWithOthers three seconds before the MPMediaItem ends and then disabling it again once AVPlayer starts. Besides the fact that there are probably many different things that can go wrong here, MPMusicPlayerController still doesn't want to give me back control of the lock-screen controls.
Is there any way this could ever work on iOS 13?
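For reference, the .mixWithOthers toggle described in the question can be sketched like this (`setMixing` is a hypothetical helper, not an existing API):

```swift
import AVFoundation

// Sketch of the workaround described above. Mixing lets another
// player's audio continue, but (as the question notes) a mixable
// session gives up the lock-screen controls and AirPlay routing.
func setMixing(_ mixing: Bool) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback,
                            mode: .default,
                            options: mixing ? [.mixWithOthers] : [])
    try session.setActive(true)
}
```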

Is there any way to hide the video controls from Notification Center on iOS 11.0?

In my app, I play a video from a WebView, and I don't want to allow users to stop the video, so I don't show any video controls. But the controls still appear in Notification Center and in Control Center. Is there any way to hide them?
WebView media playback is handled by the OS, so I think it's not possible there; instead, you should implement the video/audio handling natively using these classes:
MPNowPlayingInfoCenter (currently playing items)
MPRemoteCommandCenter (observe different events: play/pause/next, ...)
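If you do own the playback natively, one sketch of hiding the system controls is to disable the individual remote commands; note this disables the buttons, and whether the now-playing widget disappears entirely depends on how your audio session is configured:

```swift
import MediaPlayer

// Sketch: disable each remote command so the system UI offers
// no actionable controls for your playback.
func disableRemoteControls() {
    let center = MPRemoteCommandCenter.shared()
    center.playCommand.isEnabled = false
    center.pauseCommand.isEnabled = false
    center.togglePlayPauseCommand.isEnabled = false
    center.nextTrackCommand.isEnabled = false
    center.previousTrackCommand.isEnabled = false
}
```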

iOS 8 - Web radio application with player on locked screen

I'm trying to create a web radio application. For now, I'm able to play music from a streaming web radio, even when my application is in the background (thanks to Background Modes and the AVAudioSession singleton). I would like to know whether, with the AVAudioPlayer class I'm using, I can surface my web radio in the device's music player on the lock screen and control it there (see the screenshot I provided as an example).
Thanks for reading me,
Maël
I found a solution to my problem. All I had to do was use the MPNowPlayingInfoCenter class and set information like the song title. The audio controls show up automatically when you play sound through the AVAudioSession singleton.
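A minimal sketch of that answer, assuming a playback-category session and a `title` string of your own, might look like:

```swift
import AVFoundation
import MediaPlayer

// Sketch: activate a playback session, then publish now-playing
// metadata so the lock-screen controls appear automatically.
func showLockScreenInfo(title: String) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setActive(true)

    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title
    ]
}
```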

AVCaptureSession startRunning is unmuting device

My application has various sound effects for buttons and other actions; if the device is muted/silenced, they don't make sounds, as expected. However, one of the screens does video recording, and navigating to that screen enables all of the sound effects everywhere in the app. By commenting out some things, I determined that it is the startRunning function that does this. I'm not sure whether this is just normal behavior (starting the camera enables related things, like audio) or whether there's something weird going on that I can change.
If you're doing video recording, you're most likely using the AVAudioSessionCategoryPlayAndRecord category. This category always ignores the mute switch on the side of the device, by design; see the AVAudioSession documentation for definitions of all the categories. In short, there's no way to respect the mute switch while using this audio category. So when you switch away from that screen, you could set the audio session category to something else, like AVAudioSessionCategoryAmbient, if that won't affect your app.
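That suggestion can be sketched as follows; `setRecordingScreenActive` is a hypothetical helper you would call when entering or leaving the recording screen:

```swift
import AVFoundation

// Sketch of the answer above: use .playAndRecord only while the
// recording screen is visible, and fall back to .ambient elsewhere
// so the silent switch is respected again.
func setRecordingScreenActive(_ recording: Bool) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(recording ? .playAndRecord : .ambient)
    try session.setActive(true)
}
```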

AVComposition breaks on Airplay

I have a video composition which I'd like to play over Airplay (without mirroring). The app works as expected when using normal Airplay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using Airplay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't make comments, so I have to post this as an answer, although it might not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition, it simply doesn't display anything on the external display. That's why I had to handle it myself by listening for the UIScreen connection notifications.
I have to say that it all worked pretty well. I first check whether there is more than one screen, and if there is, I simply move the AVPlayer to that screen while displaying a simple message on the device's screen that content is being played on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it's not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
That was fine until I noticed that the composition plays really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering resolutions, lowering renderScale, and so on, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must be using a composition in order to display it), then right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from the player, it closes and starts rendering the project; after that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with issues.
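The external-screen approach described above can be sketched like this; `ExternalDisplayObserver` is a hypothetical helper, and the message view shown on the device is left out for brevity:

```swift
import UIKit
import AVFoundation

// Sketch: watch for an external screen connecting and move the
// player's layer into a window on that screen, instead of relying
// on AVPlayer's own AirPlay video support.
final class ExternalDisplayObserver {
    private var externalWindow: UIWindow?

    func startObserving(playerLayer: AVPlayerLayer) {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                // Build a window on the external screen and host the
                // player layer there.
                let window = UIWindow(frame: screen.bounds)
                window.screen = screen
                window.rootViewController = UIViewController()
                playerLayer.frame = window.bounds
                window.rootViewController?.view.layer.addSublayer(playerLayer)
                window.isHidden = false
                self?.externalWindow = window
        }

        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
                // Drop the window; the player layer should be moved
                // back into the device's own view hierarchy here.
                self?.externalWindow = nil
        }
    }
}
```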
In the app's .plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!
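In raw plist XML, the Xcode-friendly names above correspond to the following fragment:

```xml
<!-- Info.plist fragment: "Required background modes" is the
     UIBackgroundModes key, and the audio/AirPlay entry is "audio". -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```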
