AirPlay on an iPad app - ipad

Is it possible to enable AirPlay in an iPad app I've made that streams audio (no video)? If so, is there any sample code out there?
Thanks for any help.

This may be of interest:
AirPlay Overview: Encryption and Authentication
This part indicates that content your app streams may be playable over AirPlay:
Bear in mind that if the video is playing over AirPlay, the viewer is watching on a television with no way to enter a password. In such cases, the viewer must authenticate on the device that is relaying the video via AirPlay, which may be in another room. This situation has other implications, including how long cookies or other authentication tokens should last before expiration. You should configure your server accordingly.

NO
The RAOP protocol is private and the stream is encrypted.
Source: http://en.wikipedia.org/wiki/Remote_Audio_Output_Protocol
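The "NO" here is about implementing RAOP yourself. To simply let users route your app's audio to an AirPlay receiver, the system does the work once the user picks the route; your app just plays audio normally and exposes the standard route picker. A minimal sketch using MPVolumeView (modern Swift names assumed, frame values illustrative):

    import UIKit
    import AVFoundation
    import MediaPlayer

    final class AudioPlayerViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            // Playback category so audio keeps playing and can be
            // routed to AirPlay receivers.
            let session = AVAudioSession.sharedInstance()
            try? session.setCategory(.playback, mode: .default)
            try? session.setActive(true)

            // MPVolumeView draws the system route button (the AirPlay
            // picker) when a receiver is visible on the network.
            let volumeView = MPVolumeView(frame: CGRect(x: 20, y: 80, width: 280, height: 44))
            view.addSubview(volumeView)
        }
    }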

Related

iOS requirements for live streaming

I'm working on a live streaming app like Periscope and am researching the requirements and restrictions on iOS.
I found out that Apple only allows HLS (HTTP Live Streaming) under certain conditions. I found those conditions, below, on Apple's site.
If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming.(https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW5)
But I'm not sure whether HLS has to be used for both publishing and watching video, or whether using it only for watching is acceptable, because I'm thinking of using RTMP for publishing and HLS for watching.
I wrote an app similar to Periscope that is on the App Store now; it can use 2 Mbps and connects via the RTMP protocol to send the data. So my guess is they no longer enforce it. I also believe that at the time that guideline was written, cellular network load was possibly too high, and they were hoping HLS would help with that. Now with 4G LTE the networks can handle the load a little better. Again, that is just a guess. My app went up with no problem or mention of that rule, and the review team was well aware of what the app did.
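For what it's worth, the watching side of HLS is the easy part: AVPlayer plays an .m3u8 playlist directly. A minimal sketch (the URL is a placeholder):

    import AVFoundation
    import AVKit

    // Placeholder playlist URL; substitute your real HLS endpoint.
    let url = URL(string: "https://example.com/live/stream.m3u8")!
    let player = AVPlayer(url: url)

    let controller = AVPlayerViewController()
    controller.player = player
    // Present `controller` from a view controller, then call player.play()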

How to prevent recording of the iOS screen using QuickTime

So with iOS 8, we can now record the screen of iOS devices. I've searched extensively and cannot find a way to detect, let alone prevent, this recording. The app I'm working on deals with some potentially sensitive information and images, and I would like to prevent this if at all possible.
Thank you in advance for your responses and insights!
Anthony
Apparently, there is some way to detect whether a display or QuickTime streaming is connected, because the Netflix app will show an error when that is the case (which also means you can't just use an iOS device and stream to your computer to watch on a big screen). The app works perfectly if QuickTime streaming is off while the cable is plugged in.
Maybe it just detects whether an external display is connected, and screen recording behaves like one, so you might have some success with the UIScreen APIs and notifications.
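A rough sketch of that idea, using the UIScreen connect/disconnect notifications; whether QuickTime recording actually surfaces as an extra screen is an assumption you would need to verify per iOS version:

    import UIKit

    // Watches for external/mirrored displays appearing or disappearing.
    final class CaptureDetector {
        var onExternalDisplayChange: ((Bool) -> Void)?

        init() {
            let center = NotificationCenter.default
            center.addObserver(forName: UIScreen.didConnectNotification,
                               object: nil, queue: .main) { [weak self] _ in self?.check() }
            center.addObserver(forName: UIScreen.didDisconnectNotification,
                               object: nil, queue: .main) { [weak self] _ in self?.check() }
        }

        private func check() {
            // More than one screen suggests the display is being sent
            // somewhere else (TV, adapter, or possibly a recording host).
            onExternalDisplayChange?(UIScreen.screens.count > 1)
        }
    }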
Also, according to Apple, you could use an encrypted HTTP Live Stream, which would be blacked out in the stream / the recording.

UIEvents sent from Apple TV while using AVPlayer external playback mode

I have an app that displays videos, and it's very important to us that we intercept all pause events, and prevent users from seeking in videos.
Doing it on device is pretty simple: we just don't expose any 'regular' controls to the user, and in -remoteControlReceivedWithEvent: we intercept all the events we're actually interested in.
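For reference, that on-device setup is roughly the following (a simplified sketch in Swift, not our exact code):

    import UIKit

    final class PlayerViewController: UIViewController {
        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            // Ask the system to deliver remote-control events to us.
            UIApplication.shared.beginReceivingRemoteControlEvents()
            becomeFirstResponder()
        }

        override var canBecomeFirstResponder: Bool { true }

        override func remoteControlReceived(with event: UIEvent?) {
            guard let event = event, event.type == .remoteControl else { return }
            switch event.subtype {
            case .remoteControlPlay, .remoteControlPause, .remoteControlTogglePlayPause:
                // We decide ourselves whether to honor the request.
                break
            default:
                break
            }
        }
    }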
But we're struggling with supporting Apple TV. It's our understanding that it should forward all events sent from Apple Remote to our app, as per [0]:
When AirPlay is in use, your media might be playing in another room from your host device. The AirPlay output device might have its own controls or respond to an Apple remote control. For the best user experience, your app should listen for and respond to remote events, such as play, pause, and fast-forward requests. Enabling remote events also allows your app to respond to the controls on headphones or earbuds that are plugged into the host device.
However, as far as I can see from my debugging and pulled hair, it doesn't apply to cases where you let AVPlayer handle displaying your video. We actually don't do anything at all to make videos play on TV, since AVPlayer's allowsExternalPlayback property is YES by default.
If I'm understanding the docs correctly, while using that mode with Apple TV, only the URL/data from the device is sent to the Apple TV, and the aTV does the decoding and rendering on its own, as per [1]:
External playback mode is when video data is sent to an external device such as Apple TV via AirPlay and the mini-connector-based HDMI/VGA adapters for full screen playback at its original fidelity. AirPlay Video playback is also considered as an "external playback" mode.
which could potentially explain why I don't receive any events on the device (e.g. someone at Apple thought that since the aTV does the heavy lifting of decoding and rendering, apps on the device shouldn't receive those events).
So, my question is basically this: is there any obvious tree I'm missing from the forest, or do I have no option other than either:
ugly hacks using KVO on playback position and playback rate, punishing users for 'cheating' (sketched below)
reimplementing the whole video rendering on my own, treating the TV screen as a second display
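To illustrate the first option, the KVO hack would look something like this (PlaybackPolicer, lastSanctionedTime, and the 2-second jump threshold are all made-up names/values):

    import AVFoundation

    final class PlaybackPolicer {
        private let player: AVPlayer
        private var rateObservation: NSKeyValueObservation?
        private var timeObserver: Any?
        private var lastSanctionedTime = CMTime.zero

        init(player: AVPlayer) {
            self.player = player
            // A pause triggered from the Apple TV side shows up as rate == 0.
            rateObservation = player.observe(\.rate, options: [.new]) { player, _ in
                if player.rate == 0 {
                    // Decide whether this pause is allowed; resume if not.
                }
            }
            // Detect seeks we didn't initiate by watching for time jumps.
            let interval = CMTime(seconds: 1, preferredTimescale: 600)
            timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                                          queue: .main) { [weak self] time in
                guard let self = self else { return }
                if abs(time.seconds - self.lastSanctionedTime.seconds) > 2 {
                    // Jumped further than playback could advance: undo the seek.
                    self.player.seek(to: self.lastSanctionedTime)
                } else {
                    self.lastSanctionedTime = time
                }
            }
        }
    }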
Any pointers will be greatly appreciated.
[0] https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AirPlayGuide/EnrichYourAppforAirPlay/EnrichYourAppforAirPlay.html
[1] https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/Chapters/Reference.html#//apple_ref/occ/cl/AVPlayer

How does Audiobus for iOS work?

What SDKs does Audiobus use to provide inter-app audio routing? I am not aware of any Apple SDK that could facilitate inter-app communication on iOS, and I was under the impression that apps were sandboxed from each other, so I'm really intrigued to hear how they pulled this off.
iOS allows inter-app communication via MIDI SysEx messages. Audiobus works by sending audio as MIDI SysEx messages. You can read the details from the developer himself:
http://atastypixel.com/blog/thirteen-months-of-audiobus/
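To make the mechanism concrete, here is a bare-bones example of sending arbitrary bytes as a MIDI SysEx message with CoreMIDI. This is only a sketch of the transport idea, not Audiobus's actual implementation; 0x7D is the "non-commercial" manufacturer ID and the payload bytes are placeholders:

    import CoreMIDI

    var client = MIDIClientRef()
    MIDIClientCreate("SysexSketch" as CFString, nil, nil, &client)

    var outPort = MIDIPortRef()
    MIDIOutputPortCreate(client, "SysexOut" as CFString, &outPort)

    // Frame the payload as SysEx: 0xF0 <manufacturer ID> <data...> 0xF7.
    let payload: [UInt8] = [0xF0, 0x7D, 0x01, 0x02, 0x03, 0xF7]

    var packetList = MIDIPacketList()
    var packet = MIDIPacketListInit(&packetList)
    packet = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                               packet, 0, payload.count, payload)

    // Send to the first destination found (real code would pick carefully).
    if MIDIGetNumberOfDestinations() > 0 {
        let destination = MIDIGetDestination(0)
        MIDISend(outPort, destination, &packetList)
    }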
My guess is that they use some sort of audio over the network, because I've seen log statements when our app gets started even on a different device.
I don't really know the details of the implementation, but this could be a way of staying within the "sandbox" constraint.
The Audiobus SDK (probably) uses the audio session rules to "organize" all the sound output from the apps using their SDK; as you can see in their videos (at the bottom of the page), the apps have a lateral menu for switching back and forth between apps.
The audio session category documentation states:
Allows mixing: if yes, audio from other applications (such as the iPod) can continue playing when your application plays sound.
This way Audiobus can "control" the sound and keep the session persistent between the apps.
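The "allows mixing" behavior quoted above corresponds to the mixWithOthers option on the audio session; a minimal sketch:

    import AVFoundation

    // With .mixWithOthers set, this app's audio does not silence
    // audio coming from other apps.
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
    try? session.setActive(true)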

Sending video via a hand-made protocol

I'm going to develop an application whose main task would be to send video frames captured from the device camera to a server. The server uses a protocol over TCP. I heard that Apple restricts developers from using any video streaming protocols except HTTP Live Streaming. Is this information correct? Will there be any problems getting my app approved in the App Store?
After some digging into the topic, I found some info; here is another topic on it. And for video broadcasting from the device, we cannot use HLS.
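As a sketch of the transport side (App Store policy aside), pushing encoded frames to a server over raw TCP could look like this with the Network framework; the host, port, and length-prefix framing are assumptions, not a known protocol:

    import Foundation
    import Network

    // Placeholder endpoint; replace with your server.
    let connection = NWConnection(host: "example.com", port: 9000, using: .tcp)
    connection.start(queue: .global())

    func send(frameData: Data) {
        // Length-prefix each frame so the server can re-segment the byte stream.
        var length = UInt32(frameData.count).bigEndian
        var packet = Data(bytes: &length, count: 4)
        packet.append(frameData)
        connection.send(content: packet, completion: .contentProcessed { error in
            if let error = error { print("send failed: \(error)") }
        })
    }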
