AirPlay: volume control is disabled when connected to Apple TV - iOS

I'm implementing AirPlay support in a podcast app. I added an AVRoutePickerView and the AirPlay devices are loading fine and I can connect to a device successfully.
I'm testing on an Apple TV and the audio plays fine, but it always plays at maximum volume and I can't change it. The volume slider is disabled, and I can't understand why that's happening, because it works in other apps.
For example, I can change the volume in Overcast as expected, and the audio doesn't start at maximum volume.
What am I doing wrong? Am I missing an option?
UPDATE:
I'm using an AVPlayer, and its allowsExternalPlayback property is true.
UPDATE2:
The same issue happens with MPVolumeView.
Some Reddit users told me "It assumes a person would use the volume control of the output device (TV, sound system, etc.) to control the volume" and "Like when you plug a MacBook into a TV via HDMI", which makes sense, but how can I force it not to use the output device to control the volume? It works as I expect in other podcast apps.

After discussion with a DTS engineer, he found a workaround (rdar://42881405, "Volume control is disabled when connected to Apple TV using AirPlay").
"According to engineering, the disabling of the volume control is correct behavior for certain Apple TV configurations, where the audio is being sent to the actual TV via HDMI. In that case, volume is controlled by the TV itself.
An alteration to this standard behavior is made for audio-only apps (such as Podcasts and Overcast). In those cases, the volume control is enabled anyway, and it provides a software volume adjustment of the audio in addition to the hardware volume control. The reason you weren’t getting this is that you used AVQueuePlayer, which is regarded as a video player, not a pure audio player. I modified your sample project to use AVAudioPlayer instead, and the volume control was enabled for AirPlay output as expected.
However, AVAudioPlayer cannot play streamed assets, so it may not be a viable solution in your use case. I’m still researching whether the audio-only behavior can be obtained for other playback techniques."
Solution:
Basically, setting the allowsExternalPlayback property of an AVPlayer/AVQueuePlayer to false disallows routing video playback to AirPlay and, as a side effect, enables the pure audio playback behavior.
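A minimal sketch of that setting (streamURL here is a placeholder for your own episode URL):

```swift
import AVFoundation

// Placeholder URL; substitute your own streamed asset.
let streamURL = URL(string: "https://example.com/episode.mp3")!

let player = AVPlayer(url: streamURL)

// With external playback disabled, AirPlay treats the stream as audio-only,
// so the volume control on the route stays enabled.
player.allowsExternalPlayback = false
player.play()
```

The same property applies to AVQueuePlayer, since it is a subclass of AVPlayer.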
Final note:
Even so, I think that using the new AVSampleBufferAudioRenderer and AVSampleBufferRenderSynchronizer classes would also work, but they are far more complex to set up.
rdar://42966681 has been created: Provide an API for designating an app using AVPlayer/AVQueuePlayer as "audio-only".

Related

Can an audio unit (v3) replace inter-app audio to send audio to a host app?

My music performance app plays audio with AVAudioEngine, and uses inter-app audio to publish the engine's output to other apps. This allows users to feed the audio into a mixer app running on the same device. Since IAA is deprecated on iOS and not supported on Mac, I'm trying to replace this functionality with Audio Units.
I've added an audio unit extension of type augn using the Xcode template, and I understand the internalRenderBlock is what actually returns the audio data. But how can the extension access the audio playing in the container (main) app?
Is this even possible? I would expect this to be a common use case since Audio Units are positioned as a replacement for IAA, but I haven't seen any examples of anyone doing something like this. I don't want to process input from the host app, and I don't want to generate sound from scratch; I need to tap into the sound that the containing app is playing.
UPDATE
I just read the section "How an App Extension Communicates" in the App Extension Programming Guide. It doesn't look promising:
An app extension communicates directly only with the host app. There is no direct communication between an app extension and its containing app; typically, the containing app isn’t even running while a contained extension is running.
Also:
A Today widget (and no other app extension type) can ask the system to open its containing app by calling the openURL:completionHandler: method of the NSExtensionContext class. Any app extension and its containing app can access shared data in a privately defined shared container.
If that's the extent of the data sharing between the container and the extension, I don't see how this could work. The extension would need to access an AVAudioEngine node in real time, so that if the user of the containing app changes sounds, plays, pauses, changes volume, etc., that would all be reflected in the output the host app receives.
And yet I feel like taking away IAA if AUv3 doesn't have this capability leaves a big gap in the platform. Hopefully there's another approach I'm not thinking of.
Maybe this would need to work the other way around: in my situation, the mixer app would offer the audio unit extension, and my app (an audio player) would be the host and provide the audio to the mixer's extension. But then the mixer app would have the same problem of not being able to obtain the incoming audio from its extension.
In addition to playing the audio via AVAudioEngine, an app also has to publish its audio output in an Audio Unit extension. That app extension's output can potentially be made visible to the input of other apps or to Audio Unit extensions contained in other apps.
Added: To send audio data from an app to its own app extension, you can try putting the app and its extension in the same App Group, creating a set of shared files, and perhaps memory mapping the shared file(s). Or use writeToFile:atomically: to put blocks of audio samples into a ring buffer of shared files.
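A rough sketch of that shared-file idea, assuming the app and extension already share an App Group; the group identifier, file names, and slot count below are hypothetical, and real-time concerns (e.g. keeping file I/O off the render thread) are not addressed:

```swift
import Foundation

// Hypothetical ring of shared files inside an App Group container.
struct SharedAudioRing {
    let containerURL: URL
    let slotCount = 8              // number of files in the ring (arbitrary)
    private var writeIndex = 0

    init?(groupID: String) {       // e.g. "group.com.example.audio" (hypothetical)
        guard let url = FileManager.default
            .containerURL(forSecurityApplicationGroupIdentifier: groupID) else { return nil }
        containerURL = url
    }

    // Containing app: write one block of samples into the next slot.
    mutating func write(block: Data) throws {
        let slotURL = containerURL.appendingPathComponent("audio-slot-\(writeIndex).pcm")
        try block.write(to: slotURL, options: .atomic)   // atomic write, as suggested above
        writeIndex = (writeIndex + 1) % slotCount
    }

    // Extension: read the slot it expects next, memory-mapped.
    func read(slot: Int) -> Data? {
        let slotURL = containerURL.appendingPathComponent("audio-slot-\(slot).pcm")
        return try? Data(contentsOf: slotURL, options: .alwaysMapped)
    }
}
```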
Also, the original pre-IAA method in iOS was to use MIDI SysEx data packets to pass audio sample blocks between apps. This might be possible on macOS as well, with a fairly low latency.
I contacted Apple Developer Tech Support, and they recommended continuing to use IAA on iOS.
They also mentioned exchanging data between the container and extension with files in an app group, which I assumed would be unsuitable for real-time audio, but hotpaw2's answer provides a couple hints about making that work.
I did find a couple third-party alternatives for Mac:
Loopback - costs users $100, but I tested it with the free trial and it worked
Blackhole - free to users, open-source and could potentially be licensed for integration into other apps; I didn't try it

Is there any relationship between an AVAudioEngine and an AVAudioSession?

I understand that this question might get a bad rating, but I've been looking at questions that ask how to reroute audio output to the loudspeaker on iOS devices.
In every question I looked at, the user talked about using AVAudioSession to reroute it. However, I'm not using AVAudioSession; I'm using an AVAudioEngine.
So basically my question is, even though I'm using an AVAudioEngine, should I still have an AVAudioSession?
If so, what is the relationship between these two objects? Or is there a way to connect an AVAudioEngine to an AVAudioSession?
If this is not the case, and there is no relation between an AVAudioEngine and an AVAudioSession, then how do you reroute audio so that it plays out of the main speaker on an iOS device rather than the earpiece?
Thank you!
AVAudioSession is specific to iOS and coordinates audio playback between apps, so that, for example, audio is stopped when a call comes in, or music playback stops when the user starts a movie. This API is needed to make sure an app behaves correctly in response to such events.
AVAudioEngine is a modern Objective-C API for playback and recording. It provides a level of control for which you previously had to drop down to the C APIs of the Audio Toolbox framework (for example, with real-time audio tasks). The audio engine APIs are built to interface well with lower-level APIs, so you can still drop down to Audio Toolbox if you have to.
The basic concept of this API is to build up a graph of audio nodes, ranging from source nodes (players and microphones), through processing nodes (mixers and effects), to destination nodes (hardware outputs). Each node has a certain number of input and output busses with well-defined data formats. This architecture makes it very flexible and powerful. And it even integrates with audio units.
So there is no inclusive relation between the two.
Source Link : https://www.objc.io/issues/24-audio/audio-api-overview/
It is not clearly documented; however, I found this note in the iOS developer documentation:
AVFoundation playback and recording classes automatically activate your audio session.
Document Link : https://developer.apple.com/library/content/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/ConfiguringanAudioSession/ConfiguringanAudioSession.html
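As an illustration of how the two fit together, here is a minimal sketch (assuming a play-and-record use case) that configures the shared session to prefer the loudspeaker and then starts a small AVAudioEngine graph; the session decides the route, and the engine simply renders into it:

```swift
import AVFoundation

do {
    // The session is app-wide: category and options decide the route
    // (.defaultToSpeaker sends play-and-record output to the loudspeaker
    // instead of the receiver/earpiece).
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)

    // The engine renders through whatever route the session negotiated:
    // player node -> main mixer -> hardware output.
    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)
    try engine.start()
} catch {
    print("Audio setup failed: \(error)")
}
```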
I hope this may help you.

UIEvents sent from Apple TV while using AVPlayer external playback mode

I have an app that displays videos, and it's very important to us that we intercept all pause events, and prevent users from seeking in videos.
Doing it on the device is pretty simple: we just don't expose any 'regular' controls to the user, and in -remoteControlReceivedWithEvent: we intercept the events we're actually interested in.
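For comparison, here is a rough Swift sketch of that kind of interception using MPRemoteCommandCenter rather than the UIEvent-based API mentioned above; it is a different mechanism, shown only as an illustration:

```swift
import MediaPlayer

let commandCenter = MPRemoteCommandCenter.shared()

// Handle pause explicitly so the app decides whether it is allowed.
// (addTarget returns an opaque token; keep it if you need to remove the handler later.)
_ = commandCenter.pauseCommand.addTarget { _ in
    // Perform (or refuse) the pause here.
    return .success
}

// Refuse seeking altogether.
commandCenter.changePlaybackPositionCommand.isEnabled = false
commandCenter.skipForwardCommand.isEnabled = false
commandCenter.skipBackwardCommand.isEnabled = false
```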
But we're struggling with supporting Apple TV. It's our understanding that it should forward all events sent from Apple Remote to our app, as per [0]:
When AirPlay is in use, your media might be playing in another room from your host device. The AirPlay output device might have its own controls or respond to an Apple remote control. For the best user experience, your app should listen for and respond to remote events, such as play, pause, and fast-forward requests. Enabling remote events also allows your app to respond to the controls on headphones or earbuds that are plugged into the host device.
However, as far as I can see from my debugging and pulled hair, it doesn't apply to cases where you let AVPlayer handle displaying your video. We actually don't do anything at all to make videos play on TV, since AVPlayer's allowsExternalPlayback property is YES by default.
If I'm understanding the docs correctly, while using that mode with an Apple TV, only the URL/data from the device is sent to the Apple TV, and the Apple TV does the decoding and rendering on its own, as per [1]:
External playback mode is when video data is sent to an external device such as Apple TV via AirPlay and the mini-connector-based HDMI/VGA adapters for full screen playback at its original fidelity. AirPlay Video playback is also considered as an "external playback" mode.
which could potentially explain why I don't receive any events on the device (e.g. someone at Apple thought that since the Apple TV does the heavy lifting of actually decoding and rendering, apps on the device shouldn't receive those events).
So, my question is basically this: is there any obvious tree I'm missing from the forest, or do I have no recourse other than either:
ugly hacks using KVO on playback position and playback rate, and punishing users for 'cheating'
reimplementing whole video rendering on my own, treating TV screen as second display
Any pointers will be greatly appreciated.
[0] https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AirPlayGuide/EnrichYourAppforAirPlay/EnrichYourAppforAirPlay.html
[1] https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/Chapters/Reference.html#//apple_ref/occ/cl/AVPlayer

Widevine video streaming on iOS and AirPlay

Could you please help us with the following problem related to DRM (Widevine) encrypted video stream playback and the use of AirPlay?
When we tried to play the video from an iPhone via AirPlay on an Apple TV, a "failed to load content" error was shown on the TV screen. We are not sure if that is correct behaviour. We think it is, because for encrypted video playback we cannot use AirPlay, as it transports the raw unencrypted stream, right?
So far we have found that the only possible solution is showing the video on the iPhone while playing the audio on the Apple TV; it seems that the DRM restriction does not apply to audio.
Could you confirm the above description? Could you give us some advice?
We also found the following information on the Internet (note that we are not using Brightcove, but the principle should be the same): http://support.brightcove.com/en/video-cloud/docs/widevine-plugin-brightcove-video-cloud-player-sdk-ios
Try WVUseEncryptedLoopback (set it to 1). It enables AirPlay support by securing the AirPlay stream. 1 enables encrypted loopback; it is 0 by default.
Also, enable WVPlayerDrivenAdaptationKey (set it to 1) to switch between Apple native player adaptation (1) and Widevine adaptation (0).
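Purely as an illustration, those two keys would end up in a settings dictionary handed to the Widevine classic SDK at initialization; the literal key strings below and the way the dictionary is passed to the SDK are assumptions that depend on your integration:

```swift
// Hypothetical settings dictionary; key names taken from the Brightcove note above.
let widevineSettings: [String: Any] = [
    "WVUseEncryptedLoopback": 1,        // 1 = secure the AirPlay stream via encrypted loopback
    "WVPlayerDrivenAdaptationKey": 1    // 1 = Apple native player adaptation, 0 = Widevine adaptation
]
// Pass widevineSettings to your Widevine/Brightcove initialization routine.
```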
Widevine version: 6.0.0.12792

iOS: How to choose which microphone (inbuilt/external) to use?

Let us say I have an audio iPhone app which takes input from the microphone.
Now, although I haven't tried this myself, I believe the user could use an external microphone that plugs into the phonojack socket.
This means my audio unit could be receiving its input from the internal or the external microphone.
My guess is that iOS will automatically route from an external microphone if it is connected.
But what if I don't want that?
Is there a way to specify which microphone should be used?
I have looked in the audio session guide; I can find some settings regarding a Bluetooth headset, but that is as close as I can get. It appears that it is not possible, but I find that difficult to believe.
PS: I am also curious how it detects an external microphone... If I plug my headphones in, it should continue routing from the internal microphone; my headphones are just plain stereo headphones. But if I used my mobile phone's headphones (with an extra band on the jack... they have a microphone built onto the cable where the individual earpiece strands meet), I would expect it to pick up that source instead.
You have to use the AUHAL unit to set a specific input device as the default input and then connect it to the audio queue.
Apple has a detailed Technical Note for that: Device input using the HAL Output Audio Unit.
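On iOS specifically, a hedged alternative sketch that selects among the available inputs with AVAudioSession (this is a different technique from the AUHAL approach described above, shown only as an illustration):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    // Prefer the built-in microphone even if a headset microphone is plugged in.
    if let builtInMic = session.availableInputs?
        .first(where: { $0.portType == .builtInMic }) {
        try session.setPreferredInput(builtInMic)
    }
} catch {
    print("Audio session configuration failed: \(error)")
}
```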

Resources