How can I integrate a music app into iOS so that my own custom player can handle actions (play, pause, forward) from the lock screen, and how can I get it displayed as the player app in the double-click-Home-button menu, as Spotify does in this screenshot:
EDIT: The application itself is based on the Media Player framework, but its documentation gives no hint of how to get a custom player into this 'double-click-Home-button' menu.
If you register your application for remote-control events and start playing either audio or video, your application will automatically take over these controls. See Apple's documentation for more info. This API is available on iOS 4 and later.
To set the string below the controls ("The Butterfly Defect" in your screenshot), use MPNowPlayingInfoCenter, available on iOS 5 and later.
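As a minimal sketch in modern Swift (the original answer predates Swift; the title and artist strings here are placeholders for your own metadata, but the property keys are the real MediaPlayer constants):

```swift
import MediaPlayer

// Populate the now-playing info shown beneath the lock-screen /
// multitasking-bar playback controls.
func updateNowPlayingInfo() {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "The Butterfly Defect",  // string shown below the controls
        MPMediaItemPropertyArtist: "Your Artist Name",     // placeholder
        MPNowPlayingInfoPropertyPlaybackRate: 1.0          // 1.0 = playing, 0.0 = paused
    ]
}
```

Call this whenever the track changes or playback state flips, so the system UI stays in sync with your player.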
Related
I would like to allow my users to capture recordings of my app's interface along with Agora audio using the system screen recording feature. I know that this is possible because the app Clubhouse also uses Agora and it allows for this.
I plan to show a banner to members of the channel when the app is being recorded and allow them to leave the channel if they so choose. But it is important to me that users are able to make and share these recordings on social media platforms, as this will be one of the primary ways potential users learn about our app.
Is there some configuration option I need to enable that would allow this feature to work? Thanks!
I am also working on an Agora-based application.
You just need to hold the screen-recording button for a second; you will get a popover, as in the screenshot below, with a microphone on/off toggle.
Just turn the microphone on and record the video using the system screen-recording feature.
I have started working on my first app for iOS. (Normally I'm a Python guy.) The app has inspirational advice, and the client has recorded speech/narration that they would like to embed and play in the app. I don't want the sound files to live separately in iTunes. Is it possible to embed player functionality and sound segments so a user can simply press a button and hear the speech?
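This is possible with files bundled directly in the app; nothing needs to go through iTunes. A minimal sketch using AVAudioPlayer (the file name advice.mp3 is an assumption, standing in for a file added to the Xcode project and copied into the app bundle):

```swift
import AVFoundation

// Plays a narration file bundled inside the app itself.
// "advice.mp3" is a placeholder for a file in your app bundle.
final class NarrationPlayer {
    private var player: AVAudioPlayer?

    func play() {
        guard let url = Bundle.main.url(forResource: "advice", withExtension: "mp3") else {
            return // file not found in the bundle
        }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}
```

Hook `play()` up to a button's action and the narration plays entirely inside the app.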
Is it possible in iOS 11 with the new MusicKit API (or any other iOS API) to create a music player and have it displayed on the iPhone's lock screen, like the current Apple Music player? If so, how -- what APIs should be used?
The first part: play the user's songs in my app and allow control of my app from the lock screen while it is playing the music.
This is possible; the MusicKit API allows you to access the user's library:
MusicKit on iOS lets users play Apple Music and their local music
library natively from your apps and games. When a user provides
permission to their Apple Music account, your app can create
playlists, add songs to their library, and play any of the millions of
songs in the Apple Music catalog. If your app detects that the user is
not yet an Apple Music member, you can offer a trial from within your
app.
https://developer.apple.com/musickit/
Regarding the controls on the lock screen: that can be done without using that SDK; you simply fill in MPNowPlayingInfoCenter.
Refer to this one for it: iOS: Displaying currently played track info in LockScreen?
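For the control half (not just the displayed info), a hedged sketch using MPRemoteCommandCenter, the API Apple added for this on iOS 7.1 and later; the `myPlayer` parameter is a placeholder for whatever playback engine your app uses:

```swift
import AVFoundation
import MediaPlayer

// Wire the lock-screen play/pause buttons to your own player.
// `myPlayer` is a placeholder for your app's playback engine.
func setupRemoteCommands(myPlayer: AVAudioPlayer) {
    let center = MPRemoteCommandCenter.shared()

    center.playCommand.addTarget { _ in
        myPlayer.play()
        return .success
    }
    center.pauseCommand.addTarget { _ in
        myPlayer.pause()
        return .success
    }
}
```

Combined with a populated MPNowPlayingInfoCenter and an active audio session, the lock screen will show your app's metadata and route its buttons to these handlers.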
I'm having trouble finding the API for the playback shortcuts that appear when the Home button is double-clicked. My app plays audio, and I want similar shortcuts to pop up like they do for the iPod app, to give me the option to stop playback.
Can someone just point me to the reference?
This is a good place to start:
Remote Control Events
From the Apple documentation:
Test that your app is properly receiving and handling remote control
events with the Now Playing Controls. These controls are available on
recent iOS devices that are running iOS 4.0 or later. To access these
controls, press the Home button twice, then flick right along the
bottom of the screen until you find the audio playback controls. These
controls send remote control events to the app that is currently or
was most recently playing audio. The icon to the right of the playback
controls represents the app that is currently receiving the remote
control events.
I have a standard mp3 stream that is delivered through http. I use the approach used in this example project:
https://github.com/mattgallagher/AudioStreamer/
Basically it uses the approach described in Apple's "Audio Stream Reference". If I set UIBackgroundModes to audio, iOS keeps playing the sound while the app is in the background.
But how can I make the iPod controls in the taskbar work? How do I integrate properly with the iPod controls on iOS?
The controls in the taskbar fire remote-control events (just like the controls integrated into headphones). If you follow the instructions in Apple's documentation on Remote Control of Multimedia and set UIBackgroundModes to audio, then everything behaves as expected: when the app is suspended the sound continues to play, your app icon appears in the iPod menu in the taskbar, and the buttons trigger events.
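A minimal sketch of the registration side, using the classic UIKit responder-chain API that era's documentation describes (the view controller and player here are placeholders; on current iOS you would use MPRemoteCommandCenter instead):

```swift
import UIKit
import AVFoundation

// A view controller that receives remote-control events from the
// taskbar controls and headphone buttons (classic UIKit approach).
class PlayerViewController: UIViewController {
    var player: AVAudioPlayer? // placeholder for your streaming player

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Register for remote-control events and join the responder chain.
        UIApplication.shared.beginReceivingRemoteControlEvents()
        becomeFirstResponder()
    }

    override var canBecomeFirstResponder: Bool { true }

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlPlay:
            player?.play()
        case .remoteControlPause:
            player?.pause()
        case .remoteControlTogglePlayPause:
            player?.isPlaying == true ? player?.pause() : player?.play()
        default:
            break
        }
    }
}
```

With UIBackgroundModes set to audio in Info.plist and an active playback audio session, the taskbar controls will deliver their events to `remoteControlReceived(with:)`.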