Can third-party devs control HomePod remotely? - ios

In iOS, you can remotely control a HomePod via Control Center, the Music app, and Apple's Podcasts app.
If a third-party developer wanted to send an iTunes podcast episode or an Apple Music item to a HomePod, could they build in-app controls similar to those in Music or Podcasts?
(Yes, I know that I can use AirPlay to stream audio to a HomePod - but that’s not quite the same as the UI shown below...)
I expect that there's not a way for a third-party app to implement such a control - but I'd love to know if there is!

AFAIK there's no public API for this. I think it's currently implemented by the private framework MediaRemote.framework.
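For what it's worth, the public frameworks get you part of the way, just not the Music-style remote-control UI. A minimal sketch, assuming the goal is simply to get an Apple Music item playing where the user can route it to a HomePod (the store ID below is a placeholder):

```swift
import AVKit
import MediaPlayer

// Queue an Apple Music catalog item by store ID via the public API
// (iOS 10.3+). "1234567890" is a placeholder, not a real item.
let player = MPMusicPlayerController.systemMusicPlayer
player.setQueue(with: ["1234567890"])
player.play()

// AVRoutePickerView (iOS 11+) presents the system AirPlay picker so the
// user can target a HomePod; it is still AirPlay, not a HomePod remote.
let routePicker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
// someContainerView.addSubview(routePicker) // hypothetical container view
```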

Related

Control music of other apps within my app

I'm currently trying to find out if it's possible to control the music of other apps, e.g. Spotify, within my app. The two solutions I found are:
using the SDK of the specific service (e.g. Spotify SDK)
take over the audio session in my app, but this only allows me to stop/play the music (sketched after the answer below)
I'm trying to find a solution that provides me with pause, play, forward, and backward controls and access to the title, maybe also the cover. Then I came across the "Sony Headphones Connect" app. It has a page with media control buttons and the title of the current song, and it does not require any kind of sign-in to Spotify or anything else.
The picture is from the app's App Store listing; I tested the controls and they work.
So somehow it must be possible. Does anyone have an idea how I can achieve my desired result, or how the "Sony Headphones Connect" app handles this?
I think I found the answer: the "Sony Headphones Connect" app controls the music over Bluetooth. So it's not directly possible to control another app's music from within your own app.
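For reference, here is a minimal sketch of the second option from the question, the audio-session takeover; as noted, it only amounts to stop/play, with no skip controls or metadata:

```swift
import AVFoundation

// Activating a non-mixable playback session interrupts (pauses) whatever
// another app, e.g. Spotify, is playing. Deactivating with
// .notifyOthersOnDeactivation invites that app to resume.
let session = AVAudioSession.sharedInstance()
try? session.setCategory(.playback, mode: .default, options: [])
try? session.setActive(true)      // other apps' playback pauses here
// ... play your own audio ...
try? session.setActive(false, options: .notifyOthersOnDeactivation)
```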

What is the definition of the "Apple Music API"?

I'm trying to understand the App Store Review Guidelines around the Apple Music API. However, I can't seem to find a definition for the Apple Music API, and in particular how far it extends (i.e. how much of the entire API does it encompass?).
On this iTunes Affiliate Resources page it is suggested that the Apple Music API is a combination of:
SKCloudServiceController
MPMediaLibrary
MPMusicPlayerController
Is this truly the definition of the Apple Music API? And in which case, if you use any of the frameworks listed above are you technically using the Apple Music API?
To put some context, I'm planning to build a music player that utilises a device's local library only. Is there a way of building such an iOS music player app and avoid using the Apple Music API?
The point of the language you quote in your comment is merely that you must not try to do what Apple is already doing, i.e. make money by getting the user to subscribe to or access Apple Music (the online streaming service) through you rather than Apple. You aren't going to do that, so don't worry about it.
EDIT I'll give you an example of what Apple is saying you cannot do. Let's say you have an app that plays music from the user's library, and that you differentiate: playing the user's own music is free, but playing Apple Music songs costs an extra dollar via in-app purchase. That would be a violation of the rules.
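As for the local-library-only player, here is a minimal sketch of one way to keep Apple Music streaming items out of your queries, assuming "local" means excluding iCloud/Apple Music items; note that it still touches MPMediaLibrary-adjacent classes, which is exactly the ambiguity the question raises:

```swift
import MediaPlayer

// Query only songs that are actually on the device, filtering out
// Apple Music / iCloud streaming items.
let query = MPMediaQuery.songs()
query.addFilterPredicate(MPMediaPropertyPredicate(
    value: false,
    forProperty: MPMediaItemPropertyIsCloudItem))
let localSongs = query.items ?? []
```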

Control Spotify background music in iOS app

I'm making an app that uses gestures to change the current background music, e.g. pause, skip, etc. Currently I can do this fine for music that is playing through the native MPMusicPlayerController class.
However I am unable to control the music when it is coming from Spotify (or other music playing apps for that matter). I believe this is to do with Apple's sandboxing policy between apps.
I'm not too hopeful, but does anyone know a way to trigger a universal music control notification? Something similar to what must happen on the iOS lock screen when background music is playing? All my research tells me this is almost definitely done through a private API, but I am unable to get confirmation.
Alternatively, is there any Spotify specific way to achieve this?
Unfortunately, the answer is no.
You can't get any information or notifications about other apps, not even what's now playing.
As you suspected, it all comes down to Apple's sandboxing policy.
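To make the boundary concrete, here is a sketch of the part that does work publicly: gestures wired to MPMusicPlayerController. These calls reach the system Music app only and will never touch Spotify:

```swift
import MediaPlayer
import UIKit

// Gesture-driven control of the system Music app. Sandboxing means none
// of these calls can affect Spotify or any other third-party player.
final class GestureMusicController {
    private let player = MPMusicPlayerController.systemMusicPlayer

    @objc func handleSwipe(_ gesture: UISwipeGestureRecognizer) {
        switch gesture.direction {
        case .left:  player.skipToNextItem()
        case .right: player.skipToPreviousItem()
        case .up:    player.play()
        case .down:  player.pause()
        default:     break
        }
    }
}
```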

How To Push Music Playable Data Streamed From Spotify To A Device That Does Not Use The SDK Provided By Spotify

I apologise if the title of my question is confusing. To avoid that, I will explain my purpose in detail.
We are currently developing our own Wi-Fi speaker, which is built on a MIPS architecture. The speaker comes with an app that will be used to manage it. One of the features we would like to include in the app is accessing Spotify content and playing it on the speaker.
Unfortunately, after going through the iOS SDK documentation and running some tests in Spotify's official Web API Console, I noticed that Spotify does not allow developers to directly get the URL of a song, except for preview purposes. I also wasn't able to find any way to get the raw bytes of the music streamed from the server. Every piece of content comes with a corresponding URI which is used for requests.
For the device (Wi-Fi speaker) part, we recently contacted Spotify to ask for an SDK that can be used for development. However, Spotify told us that they have SDKs for the x86 and ARM architectures only. They don't have one for MIPS.
Now, here are my questions:
Is there any way for me to push music from an app to the Wi-Fi speaker without having to use an SDK (on the device side)?
If Spotify can provide an SDK for our device, then how can we integrate the SDK with our platform?
I'll explain my 2nd question for clarity. Android and iOS, for instance, are popular platforms widely used by mobile devices, so if Spotify provides SDKs for those two OSes, developers can use the default system frameworks to play the content (in iOS, the AVFoundation framework). However, if Spotify were able to provide the SDK that we need, how would we be able to integrate it with our own platform?
I will answer your question no. 1:
You should be able to push music from an app using a buffer that you can read via Core Audio and then forward to a device of your choice. I think what you are looking for can be found in CocoaLibSpotify.
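Independent of CocoaLibSpotify, the buffer-forwarding idea might be sketched with AVAudioEngine, a higher-level wrapper over Core Audio. sendToSpeaker(_:) is hypothetical, and note the tap only sees audio your own app decodes and renders, which is exactly where Spotify's refusal to expose raw audio bites:

```swift
import AVFoundation

// Hypothetical transport: serialize the PCM buffer and push it to the
// speaker over your own Wi-Fi protocol.
func sendToSpeaker(_ buffer: AVAudioPCMBuffer) {
    // e.g. encode `buffer` and write it to a network socket
}

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

// Tap decoded PCM from the mixer and hand each buffer to the transport.
let tapFormat = engine.mainMixerNode.outputFormat(forBus: 0)
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: tapFormat) { buffer, _ in
    sendToSpeaker(buffer)
}
try? engine.start()
```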

How does Audiobus for iOS work?

What SDKs does Audiobus use to provide inter-app audio routing? I am not aware of any Apple SDK that could facilitate inter-app communication for iOS, and I was under the impression that apps were sandboxed from each other, so I'm really intrigued to hear how they pulled this off.
iOS allows inter-app communication via MIDI SysEx messages. Audiobus works by sending audio as MIDI SysEx messages. You can read the details from the developer himself:
http://atastypixel.com/blog/thirteen-months-of-audiobus/
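For a sense of what that transport involves, here is a sketch of the general framing technique (not Audiobus's actual code): SysEx messages are delimited by 0xF0/0xF7 and their payload bytes must have the high bit clear, so 8-bit audio has to be re-packed into 7-bit groups:

```swift
// Frame raw audio bytes as a MIDI SysEx message: every run of up to 7
// bytes becomes one "MSB byte" (the stripped high bits) plus the 7-bit
// remainders, so the payload never sets the high bit.
func sysexFrame(audio: [UInt8]) -> [UInt8] {
    var frame: [UInt8] = [0xF0]                    // SysEx start
    var i = 0
    while i < audio.count {
        let chunk = audio[i ..< min(i + 7, audio.count)]
        var msbs: UInt8 = 0
        for (j, byte) in chunk.enumerated() where byte & 0x80 != 0 {
            msbs |= UInt8(1 << j)                  // record each stripped high bit
        }
        frame.append(msbs)
        frame.append(contentsOf: chunk.map { $0 & 0x7F })
        i += 7
    }
    frame.append(0xF7)                             // SysEx end
    return frame
}
```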
My guess is that they use some sort of audio-over-network transport, because I've seen log statements when our app gets started, even on a different device.
I don't really know the details of the implementation, but this could be a way of staying within the sandbox constraint.
The Audiobus SDK (probably) uses the audio session rules to "organize" all the sound output from the apps using their SDK. As you can see in their videos (at the bottom of the page), the apps have a lateral menu to switch back and forth between apps.
The Audio Session Category states:
Allows mixing: if yes, audio from other applications (such as the iPod) can continue playing when your application plays sound.
This way Audiobus can "control" the sound and allow the session to persist between the apps.
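Concretely, that "allows mixing" behaviour corresponds to the .mixWithOthers category option; a minimal sketch:

```swift
import AVFoundation

// With .mixWithOthers set, this app's audio plays alongside other apps'
// audio instead of interrupting it.
let session = AVAudioSession.sharedInstance()
try? session.setCategory(.playback, options: [.mixWithOthers])
try? session.setActive(true)
```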
