UIEvents sent from Apple TV while using AVPlayer external playback mode - iOS

I have an app that displays videos, and it's very important to us that we intercept all pause events and prevent users from seeking in videos.
Doing it on the device is pretty simple: we just don't expose any 'regular' controls to the user, and in -remoteControlReceivedWithEvent: we handle all the events we're actually interested in.
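For reference, a minimal sketch (in Swift, though the question uses Objective-C) of that on-device interception; the class name and the filtering logic are illustrative, not from the original app:

import UIKit

// Intercepts remote-control events, assuming this view controller
// becomes first responder. Which events to swallow is up to the app.
class PlayerViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        UIApplication.shared.beginReceivingRemoteControlEvents()
        becomeFirstResponder()
    }

    override var canBecomeFirstResponder: Bool { true }

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlPause, .remoteControlTogglePlayPause:
            break   // intercepted: deliberately don't stop playback
        default:
            break   // ignore seeks, skips, etc.
        }
    }
}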
But we're struggling with supporting Apple TV. It's our understanding that it should forward all events sent from Apple Remote to our app, as per [0]:
When AirPlay is in use, your media might be playing in another room from your host device. The AirPlay output device might have its own controls or respond to an Apple remote control. For the best user experience, your app should listen for and respond to remote events, such as play, pause, and fast-forward requests. Enabling remote events also allows your app to respond to the controls on headphones or earbuds that are plugged into the host device.
However, as far as I can see from my debugging (and pulled hair), this doesn't apply when you let AVPlayer handle displaying your video. We actually don't do anything at all to make videos play on the TV, since AVPlayer's allowsExternalPlayback property is YES by default.
If I'm understanding the docs correctly, in that mode only the URL/data from the device is sent to the Apple TV, and the Apple TV does the decoding and rendering on its own, as per [1]:
External playback mode is when video data is sent to an external device such as Apple TV via AirPlay and the mini-connector-based HDMI/VGA adapters for full screen playback at its original fidelity. AirPlay Video playback is also considered as an "external playback" mode.
which could potentially explain why I don't receive any events on the device (e.g. someone at Apple thought that since the Apple TV does the heavy lifting of decoding and rendering, apps on the device shouldn't receive those events).
So, my question is basically this: is there any obvious tree I'm missing in the forest, or do I have no recourse other than either:
ugly hacks using KVO on playback position and playback rate, and punishing users for 'cheating' (see the sketch after this list)
reimplementing the whole video rendering on my own, treating the TV screen as a second display
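For what it's worth, here is roughly what the first option could look like, as a minimal Swift sketch; the class name, the 2-second jump threshold, and the snap-back behavior are my assumptions, not anything AVFoundation prescribes:

import AVFoundation

// Watches an existing AVPlayer for seeks (via a periodic time observer)
// and for pauses (via KVO on rate). The 2 s jump threshold is a guess.
final class PlaybackPolicyEnforcer {
    private let player: AVPlayer
    private var timeObserver: Any?
    private var rateObservation: NSKeyValueObservation?
    private var lastObservedTime = CMTime.zero

    init(player: AVPlayer) {
        self.player = player

        // A playback-time jump much larger than the observation interval
        // means somebody scrubbed, possibly from the Apple TV side.
        let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
        timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                                      queue: .main) { [weak self] time in
            guard let self = self else { return }
            if abs(CMTimeSubtract(time, self.lastObservedTime).seconds) > 2.0 {
                self.player.seek(to: self.lastObservedTime)   // punish the 'cheater'
            } else {
                self.lastObservedTime = time
            }
        }

        // Pauses show up as the rate dropping to 0, whoever caused them.
        rateObservation = player.observe(\.rate, options: [.new]) { player, _ in
            if player.rate == 0 {
                // React to the pause here.
            }
        }
    }

    deinit {
        if let observer = timeObserver { player.removeTimeObserver(observer) }
    }
}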
Any pointers will be greatly appreciated.
[0] https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AirPlayGuide/EnrichYourAppforAirPlay/EnrichYourAppforAirPlay.html
[1] https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/Chapters/Reference.html#//apple_ref/occ/cl/AVPlayer

Related

AirPlay: volume control is disabled when connected to Apple TV

I'm implementing AirPlay support in a podcast app. I added an AVRoutePickerView, the AirPlay devices load fine, and I can connect to a device successfully.
I'm testing on an Apple TV and the audio plays well, but it always plays at max volume and I can't change it. The volume slider is disabled, and I can't understand why that's happening, because it works in other apps.
For example, I can change the volume in Overcast as expected, and the audio doesn't start at max volume.
What am I doing wrong? Am I missing an option?
UPDATE:
I'm using an AVPlayer and allowsExternalPlayback property is true.
UPDATE2:
The same issue happens with MPVolumeView.
Some Reddit users told me "It assumes a person would use the volume control of the output device (TV, sound system, etc.) to control the volume" and "Like when you plug a MacBook into a TV via HDMI". That makes sense, but how can I force it not to delegate volume control to the output device? It works as I expect in other podcast apps.
After discussion with a DTS engineer, he found a workaround (rdar://42881405, "Volume control is disabled when connected to Apple TV using AirPlay"):
"According to engineering, the disabling of the volume control is correct behavior for certain Apple TV configurations, where the audio is being sent to the actual TV via HDMI. In that case, volume is controlled by the TV itself.
An alteration to this standard behavior is made for audio-only apps (such as Podcasts and Overcast). In those cases, the volume control is enabled anyway, and it provides a software volume adjustment of the audio in addition to the hardware volume control. The reason you weren’t getting this is that you used AVQueuePlayer, which is regarded as a video player, not a pure audio player. I modified your sample project to use AVAudioPlayer instead, and the volume control was enabled for AirPlay output as expected.
However, AVAudioPlayer cannot play streamed assets, so it may not be a viable solution in your use case. I’m still researching whether the audio-only behavior can be obtained for other playback techniques."
Solution:
Basically, setting the allowsExternalPlayback property of an AVPlayer/AVQueuePlayer to false disallows the routing of video playback to AirPlay and (as a side effect) allows the pure audio playback.
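In code, the workaround is tiny (a minimal sketch; the episode URL is a placeholder):

import AVFoundation

// Opting the player out of external *video* playback makes AirPlay treat
// the app as audio-only, which keeps the volume slider enabled.
let item = AVPlayerItem(url: URL(string: "https://example.com/episode.mp3")!)
let player = AVQueuePlayer(items: [item])
player.allowsExternalPlayback = false
player.play()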
Final note:
Even so, I think that using the new AVSampleBufferAudioRenderer and AVSampleBufferRenderSynchronizer classes would also work, but they are way more complex to set up.
rdar://42966681 has been created: Provide an API for designating an app using AVPlayer/AVQueuePlayer as "audio-only".

How to prevent recording of the iOS screen using QuickTime

So with iOS 8, we can now record the screen of iOS devices. I've searched extensively and cannot find a way to detect, let alone prevent, this recording. The app I'm working on deals with some potentially sensitive information and images and would like to prevent this if at all possible.
Thank you in advance for your responses and insights!
Anthony
Apparently, there is some way to detect whether a display or QuickTime streaming is connected, because the Netflix app shows an error when that is the case (which also means you can't just use an iOS device and stream to your computer to watch it on a big screen). The app works perfectly if QuickTime streaming is off while the cable is plugged in.
Maybe it just detects whether an external display is connected, and screen recording behaves like one, so you might have some success with the external-display APIs and notifications (see the sketch below).
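A minimal Swift sketch of that heuristic; note it treats any second UIScreen as suspicious, so it will also fire for legitimate AirPlay mirroring and cables, and it is not a guaranteed recording detector:

import UIKit

// Calls back whenever the number of attached screens changes, so the app
// can hide sensitive content while more than one screen is present.
final class ExternalScreenMonitor {
    var onChange: ((Bool) -> Void)?   // true = external screen present

    init() {
        let center = NotificationCenter.default
        for name in [UIScreen.didConnectNotification,
                     UIScreen.didDisconnectNotification] {
            center.addObserver(forName: name, object: nil, queue: .main) { [weak self] _ in
                self?.onChange?(UIScreen.screens.count > 1)
            }
        }
    }
}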
Also, you could use an encrypted HTTP Live Stream, which according to Apple would be blacked out in the stream/recording.

Can information be stolen that gets transferred through the headphone jack by a backgrounded app?

Can information that gets transferred through the headphone jack be stolen by a backgrounded app?
Square makes a card reader that plugs into the headphone jack of the iPhone and transfers your credit card info to your phone.
There are many devices that transfer data through the headphone jack.
Since you can run background processes now on iOS, can that information that is being transferred be intercepted or monitored by an app running in the background?
Could a random app potentially be listening and looking for credit card numbers and steal the information?
If yes, then is there a way to cut off all other apps' listening connections to the headphone jack while I transfer my own data from a device?
I have looked at the aurioTouch example for how to transfer data, but all I need to know right now is whether my data is in jeopardy of being stolen and how I can prevent that.
See the last paragraph of "Playing and Recording Background Audio" here.
To sum it up, you can stop your app from playing or recording audio if another app wants to play/record audio. Look into AVAudioSession.
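A minimal sketch of that idea, assuming a non-mixable session is enough to push other apps off the audio hardware while you transfer data (error handling omitted):

import AVFoundation

// Activating a non-mixable session interrupts other apps' audio sessions,
// including a backgrounded recorder, for as long as it stays active.
func transferDataExclusively() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])  // no .mixWithOthers
    try session.setActive(true)   // other apps are interrupted here
    // ... push data through the headphone jack ...
    try session.setActive(false, options: .notifyOthersOnDeactivation)
}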
You should be equally worried about the possibility of the audio being intercepted outside the device as on it. Such man-in-the-middle attacks have been reported against chip-and-PIN terminals, and usually involve corrupt employees of the retailer using the device, or social engineering.
The only solution is to encrypt the data. This makes the question of interception on the device irrelevant.

iPhone headphone audio jack re-routing

We created an external iOS notification light that uses the device’s audio for power.
When you get a phone call on the iPhone and the light is plugged in, you still get the ringtone, but when you pick up, the audio is rerouted to the headphones (the iPhone thinks our light is a headphone set), and the user has to pull the myLED out at least 2 mm to get the audio from the front receiver of the phone.
We have been exploring alternative solutions to this challenge; recently we made a prototype with a particular jack shape so that the user can rotate it when getting a call to "reroute" the audio to the iPhone speaker/mic.
Although it may sound like a clever option, this hardware solution is far from neat: it leads to positions where the myLED does not work or is not reliable, plus other complications.
I know of the existence of kAudioSessionOverrideAudioRoute_Speaker; however, I suspect that this will only direct the app's audio to the rear speaker (the "loud" one) and not to the front receiver (because the "receiver" for the iPhone is the headphone set if one is detected).
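(For reference, the modern AVAudioSession equivalent of that constant is below; as far as I know it only overrides this app's own output route, never phone-call routing, which matches the suspicion above. Error handling omitted:)

import AVFoundation

// Route this app's audio to the bottom ("loud") speaker even while the
// jack reports headphones. Phone-call audio is unaffected.
func routeAppAudioToSpeaker() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.overrideOutputAudioPort(.speaker)
}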
What would you suggest?
Super appreciated!
I think you're in a tough spot:
It's highly unlikely Apple will ever release the option to override audio routing for phone calls. As a key functionality of the phone, they tend to keep the call aspect under lock and key.
The headphone jack (probably - this is how most of them do it) uses the impedance between ground and one or both speakers or the remote control to determine if the plug is in. Other than breaking the circuit, there is no good way to simulate this.
The only options I think you have are these:
Require the user to remove the device when a call comes in.
Provide a microcontroller on the jack to drive a transistor; this transistor can electronically break the circuit to provide the same sort of impedance signature as an unplugged jack.
How, when, and whether you can tell the jack that a phone call is in progress is beyond my knowledge. Is there an API for an "incoming but not yet answered call" you can hook into (see the sketch below)? Will you have to do a watchdog thing to ensure communication with your app? Would it be possible for you to use the dock connector instead? Not a complete answer, but those are my thoughts.
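On the "incoming but not yet answered call" question: today CallKit's CXCallObserver (iOS 10+, so it postdates this question) reports exactly that state. A minimal sketch, with the accessory signalling left as a placeholder:

import CallKit

// A ringing, unanswered incoming call is one that is not outgoing,
// not yet connected, and not ended.
final class CallWatcher: NSObject, CXCallObserverDelegate {
    private let observer = CXCallObserver()

    override init() {
        super.init()
        observer.setDelegate(self, queue: nil)
    }

    func callObserver(_ callObserver: CXCallObserver, callChanged call: CXCall) {
        let isRinging = !call.isOutgoing && !call.hasConnected && !call.hasEnded
        if isRinging {
            // Tell the microcontroller on the jack to break the circuit.
        }
    }
}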

HTML5 Audio callback fails on safari/iOS

I have built an app designed to play each sound as the previous one finishes, using the 'ended' event.
In my initial version, each sound has its own audio element, resulting in something like:
// Assumes `sounds` is an array of audio element ids defined elsewhere.
var i = 0;
function play_next_audio() {
    var speaker = $('audio#' + sounds[i++]).get(0);
    speaker.addEventListener('ended', play_next_audio);
    speaker.play();
}
This works great on all desktop browsers, including Safari, but does not go beyond the very first letter on iOS.
I have also tried a different approach: a single audio element that loads each sound in turn. There I experimented with binding the 'ended' event, as well as loading first and binding the 'canplaythrough' event instead. In both cases it fails even on desktop Safari, this time successfully playing only the first two letters.
Here is the isolated test:
http://dev.connectfu.com:42001/app/test-sounds.html
Note that audio.load() is commented out in several places - having it in or out seems to make no difference.
What am I doing wrong? How can I play series of sounds on iOS? Thank you so much for the help!
Update: As of 2017, the ended event doesn't fire on Safari (or Chrome) on iOS under several conditions. More information can be found here: It's almost 2017, and HTML5 audio is still broken on iOS.
I've built an HTML5 audio player (Chavah Messianic Radio) that "works" on Safari on iOS.
By "works", I mean, it plays the best it can on Apple's crippled iOS <audio> implementation. At the time of this writing, this includes iOS 5. I've tested this on iPhone 3 and up, and iPad original, iPad 2, and iPad 3.
Here are my findings:
Apple does not allow you to call .play() on any audio until user interaction. For me, this means detecting iOS, then showing the music as paused until the user clicks play. Their reasoning is that autoplay would consume data and battery; in practice, though, it just cripples web apps and stifles the evolution of the web.
If you want to play successive sounds (one after another), use a single element. When it's time to play the next sound, set existingAudio.src, then call existingAudio.load(), then call existingAudio.play(). This will allow you to play successive sounds.
Audio events don't fire if Safari is in the background. While audio will continue playing if the user switches to a different app, the .ended event won't fire. This means it's practically impossible to build a music player app.
Hope this helps.
<rant>
In the meantime: Apple, please, please, please give us better support for HTML5 in iOS Safari. Here are your action items, Apple:
Let HTML5 audio work in the background.
Support OGG.
Support pre-loading audio.
Support concurrent audio.
Let us play audio without user interaction.
Do those things, Apple, you'll be the industry leader in mobile HTML5 audio, everyone will emulate you, you'll once again be leading the way, and web apps will work perfectly on your platforms, while being crippled on other mobile platforms. Yes, these features will use data and the battery, but native apps already do this. Stop crippling web apps and be the leader. Make HTML5 <audio> a first-class citizen on iOS.
</rant>
I don't believe the .play() method is supported for audio or video on iOS without user interaction. Apple does not like the idea of videos or audio automatically playing upon visiting a page.
Here is a helpful reading about the state of HTML5 audio support across platforms: http://www.phoboslab.org/log/2011/03/the-state-of-html5-audio
