So with iOS 8, we can now record the screen of iOS devices. I've searched extensively and cannot find a way to detect, let alone prevent, this recording. The app I'm working on deals with some potentially sensitive information and images, and I'd like to prevent recording if at all possible.
Thank you in advance for your responses and insights!
Anthony
Apparently, there is some way to detect whether a display or QuickTime streaming is connected, because the Netflix app shows an error when that is the case (which also means you can't just use an iOS device and stream to your computer to watch on a big screen). The app works fine when QuickTime streaming is off, even with the cable plugged in.
Maybe it just detects whether an external display is connected and screen recording registers as one, so you might have some success with the UIScreen APIs and their connect/disconnect notifications.
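A sketch of what that detection could look like (UIScreen and its notifications are documented API; whether QuickTime recording actually shows up as an extra screen is my assumption):

```objc
#import <UIKit/UIKit.h>

// Check whether a second screen (external display / mirroring target)
// is attached right now.
BOOL externalDisplayConnected = [UIScreen screens].count > 1;

// Watch for displays being attached or detached at runtime.
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIScreenDidConnectNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // A screen appeared: hide sensitive content here.
            }];
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIScreenDidDisconnectNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // The screen went away: restore your content here.
            }];
```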
Also, according to Apple, you could use an encrypted HTTP Live Stream, which would be blacked out in the mirrored stream / the recording.
I have an app that displays videos, and it's very important to us that we intercept all pause events, and prevent users from seeking in videos.
Doing it on the device is pretty simple: we just don't expose any 'regular' controls to the user, and in -remoteControlReceivedWithEvent: we intercept all the events we're actually interested in.
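Roughly what that handler looks like, as a simplified sketch (we become first responder and call beginReceivingRemoteControlEvents elsewhere):

```objc
// Simplified sketch of our remote-control handler.
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPause:
        case UIEventSubtypeRemoteControlTogglePlayPause:
            // Intercept pause: we decide whether to honor it.
            break;
        case UIEventSubtypeRemoteControlBeginSeekingForward:
        case UIEventSubtypeRemoteControlBeginSeekingBackward:
            // Swallow seek requests entirely.
            break;
        default:
            break;
    }
}
```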
But we're struggling to support Apple TV. Our understanding is that iOS should forward all events sent from the Apple Remote to our app, as per [0]:
When AirPlay is in use, your media might be playing in another room from your host device. The AirPlay output device might have its own controls or respond to an Apple remote control. For the best user experience, your app should listen for and respond to remote events, such as play, pause, and fast-forward requests. Enabling remote events also allows your app to respond to the controls on headphones or earbuds that are plugged into the host device.
However, as far as I can see from my debugging and pulled hair, it doesn't apply to cases where you let AVPlayer handle displaying your video. We actually don't do anything at all to make videos play on TV, since AVPlayer's allowsExternalPlayback property is YES by default.
If I'm understanding the docs correctly, in that mode only the URL/data is sent from the device to the Apple TV, and the Apple TV does the decoding and rendering on its own, as per [1]:
External playback mode is when video data is sent to an external device such as Apple TV via AirPlay and the mini-connector-based HDMI/VGA adapters for full screen playback at its original fidelity. AirPlay Video playback is also considered as an "external playback" mode.
which could potentially explain why I don't receive any events on the device (presumably someone at Apple decided that since the Apple TV does the heavy lifting of actually decoding and rendering, apps on the device shouldn't receive those events).
So, my question is basically this: is there an obvious tree I'm missing for the forest, or am I left with no options other than:
ugly hacks using KVO on playback position and playback rate, and punishing users for 'cheating' (see the sketch after this list)
reimplementing the whole video rendering pipeline on my own, treating the TV screen as a second display
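For option 1, the hack would look something like this (a sketch; AVPlayer's rate is KVO-observable, and pauseWasAllowed is a hypothetical flag our app would maintain for pauses we initiated ourselves):

```objc
static void *PlayerRateContext = &PlayerRateContext;

// Somewhere during setup:
[self.player addObserver:self
              forKeyPath:@"rate"
                 options:NSKeyValueObservingOptionNew
                 context:PlayerRateContext];

// Fight back when the rate changes without our consent.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == PlayerRateContext) {
        float rate = [change[NSKeyValueChangeNewKey] floatValue];
        // pauseWasAllowed is a hypothetical flag tracking sanctioned pauses.
        if (rate == 0.0f && !self.pauseWasAllowed) {
            [self.player play]; // undo an unauthorized pause
        }
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object
                           change:change context:context];
}
```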
Any pointers will be greatly appreciated.
[0] https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AirPlayGuide/EnrichYourAppforAirPlay/EnrichYourAppforAirPlay.html
[1] https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/Chapters/Reference.html#//apple_ref/occ/cl/AVPlayer
Hey, I have a video app that uses AVPlayer. I received an email from a user saying he couldn't use his car's Bluetooth anymore since I updated the app some time ago.
Now, I didn't try to support Bluetooth back then, but apparently this functionality came for free with MPMoviePlayerController, which I used before.
Is there a way to check whether a Bluetooth device is connected, and to send audio to it?
I've seen this question, but it didn't help: the volume control shows, but there's no button letting me choose the output.
Do I have to do something else in order to use Bluetooth as the output?
I've only tested with my car's Bluetooth, which works with phone calls and also with other apps like Viber.
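For reference, this is roughly how I'd expect to detect a Bluetooth output, based on the AVAudioSession docs (a sketch; I haven't confirmed it solves the routing itself):

```objc
#import <AVFoundation/AVFoundation.h>

// Inspect the current audio route for a Bluetooth output port.
AVAudioSessionRouteDescription *route =
    [[AVAudioSession sharedInstance] currentRoute];
BOOL bluetoothConnected = NO;
for (AVAudioSessionPortDescription *output in route.outputs) {
    if ([output.portType isEqualToString:AVAudioSessionPortBluetoothA2DP] ||
        [output.portType isEqualToString:AVAudioSessionPortBluetoothHFP]) {
        bluetoothConnected = YES;
    }
}
```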
We created an external iOS notification light that uses the device’s audio for power.
When you get a phone call on the iPhone and the light is plugged in, you still get the ringtone, but when you pick up, the audio is rerouted to the headphones (the iPhone thinks our light is a headphone set), and the user has to pull myLED out by at least 2 mm to get the audio from the phone's front receiver.
We have been exploring alternative solutions to this challenge. Recently we made a prototype with a particular jack shape, so that the user can rotate it when getting a call to "reroute" the audio to the iPhone speaker/mic.
Although it may sound like a clever option, this hardware solution is far from neat: there are positions where myLED does not work or is unreliable, plus other complications.
I know about kAudioSessionOverrideAudioRoute_Speaker; however, I suspect it will only direct the app's audio to the rear speaker (the "loud" one) and not to the front receiver (because, as far as the iPhone is concerned, the "receiver" is the headphone set whenever one is detected).
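The AVAudioSession equivalent I've been experimenting with looks like this (a sketch; as suspected, it only affects our own app's audio session, not the call audio):

```objc
#import <AVFoundation/AVFoundation.h>

// AVAudioSession successor to kAudioSessionOverrideAudioRoute_Speaker.
// The override requires the PlayAndRecord category, and it only routes
// this app's audio to the loud rear speaker.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker
                           error:&error];
[session setActive:YES error:&error];
```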
What would you suggest?
Super appreciated!
I think you're in a tough spot:
It's highly unlikely Apple will ever release the option to override audio routing for phone calls. As a key functionality of the phone, they tend to keep the call aspect under lock and key.
The headphone jack (probably - this is how most of them do it) uses the impedance between ground and one or both speakers or the remote control to determine if the plug is in. Other than breaking the circuit, there is no good way to simulate this.
The only options I think you have are these:
Require the user to remove the device when a call comes in.
Provide a microcontroller on the jack to drive a transistor; this transistor can electronically break the circuit to provide the same sort of impedance signature as an unplugged jack.
How, when, and whether you can tell the jack that a phone call is in progress is beyond my knowledge: is there an API for an "incoming but not yet answered" call that you can hook into? Will you have to do a watchdog thing to ensure communication with your app? Would it be possible for you to use the dock connector instead? Not a complete answer, but those are my thoughts.
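On the "incoming but not yet answered" question: Core Telephony does report call-state changes, so something like this might serve as the trigger (a sketch; note the handler runs on a background queue and only while your app is alive):

```objc
#import <CoreTelephony/CTCallCenter.h>
#import <CoreTelephony/CTCall.h>

// Keep a strong reference to the call center, e.g. on your app delegate,
// or the handler will never fire.
self.callCenter = [[CTCallCenter alloc] init];
self.callCenter.callEventHandler = ^(CTCall *call) {
    if ([call.callState isEqualToString:CTCallStateIncoming]) {
        // Ringing: tell the microcontroller to open the circuit.
    } else if ([call.callState isEqualToString:CTCallStateDisconnected]) {
        // Call ended: close the circuit again.
    }
};
```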
What SDKs does Audiobus use to provide inter-app audio routing? I'm not aware of any Apple SDK that could facilitate inter-app communication on iOS, and I was under the impression that apps were sandboxed from each other, so I'm really intrigued to hear how they pulled this off.
iOS allows inter-app communication via MIDI SysEx messages. Audiobus works by sending audio as MIDI SysEx messages. You can read the details from the developer himself:
http://atastypixel.com/blog/thirteen-months-of-audiobus/
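I don't know Audiobus's exact wire format, but sending arbitrary bytes wrapped in a SysEx message through CoreMIDI looks roughly like this (a sketch; 0x7D is the "non-commercial" manufacturer ID placeholder):

```objc
#import <CoreMIDI/CoreMIDI.h>

// Minimal sketch: wrap a few bytes in a MIDI SysEx message and send it.
MIDIClientRef client;
MIDIClientCreate(CFSTR("SysexDemo"), NULL, NULL, &client);

MIDIPortRef outPort;
MIDIOutputPortCreate(client, CFSTR("SysexDemo Out"), &outPort);

// 0xF0 ... 0xF7 frame a SysEx message.
Byte sysex[] = { 0xF0, 0x7D, 'h', 'i', 0xF7 };

Byte buffer[128];
MIDIPacketList *packetList = (MIDIPacketList *)buffer;
MIDIPacket *packet = MIDIPacketListInit(packetList);
MIDIPacketListAdd(packetList, sizeof(buffer), packet, 0,
                  sizeof(sysex), sysex);

// Deliver to the first available destination, if any.
if (MIDIGetNumberOfDestinations() > 0) {
    MIDISend(outPort, MIDIGetDestination(0), packetList);
}
```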
My guess is that they use some sort of audio over the network, because I've seen log statements when our app gets started, even on a different device.
I don't really know the details of the implementation, but this could be a way of staying within the "sandbox" constraint.
The Audiobus SDK (probably) uses the Audio Session rules to "organize" the sound output from all the apps using their SDK; as you can see in their videos (at the bottom of the page), the apps have a side menu to switch back and forth between apps.
The Audio Session Category documentation states:
Allows mixing: if yes, audio from other applications (such as the iPod) can continue playing when your application plays sound.
This way Audiobus can "control" the sound and keep the session persistent between the apps.
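For what it's worth, the mixable session setup that rule describes looks roughly like this with the AVAudioSession API (a sketch; error handling kept minimal):

```objc
#import <AVFoundation/AVFoundation.h>

// MixWithOthers lets this app's audio coexist with audio from
// other apps (such as the iPod) instead of interrupting it.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback
         withOptions:AVAudioSessionCategoryOptionMixWithOthers
               error:&error];
[session setActive:YES error:&error];
```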
I'd like to stream video from the camera on an iOS device to a receiver via Wi-Fi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures the video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out the example at the end, since that's where you get a CMSampleBufferRef data object back for each frame.
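A condensed sketch of that capture setup (standard AVFoundation API; error handling and session configuration omitted):

```objc
#import <AVFoundation/AVFoundation.h>

// Receives one CMSampleBufferRef per captured frame.
@interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation FrameGrabber

- (void)start {
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("video.frames", NULL)];
    [self.session addOutput:output];

    [self.session startRunning];
}

// Called once per frame on the queue above.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Hand sampleBuffer to your encoder/streamer here.
}

@end
```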
For the second and third parts, you should check out the CFNetwork framework and especially CFFTPStream for streaming using FTP.
If you are only building this for yourself, then you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute this to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
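A minimal sketch of a variant of that trick, looping a silent sound indefinitely instead of replaying it on a timer ("silence.caf" is a placeholder for a short silent file you'd bundle yourself); note that for a foreground-only app, setting UIApplication's idleTimerDisabled to YES is the simpler, documented way to prevent auto-lock:

```objc
#import <AVFoundation/AVFoundation.h>

// "silence.caf" is a hypothetical short, silent audio file in the bundle.
NSURL *url = [[NSBundle mainBundle] URLForResource:@"silence"
                                     withExtension:@"caf"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url
                                                               error:nil];
player.numberOfLoops = -1; // loop forever
player.volume = 0.0;       // inaudible
[player play];
```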
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use with HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available (see the sketch after this list).
Connect to the IP address of the phone and voilà! You've got live streaming video.
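For step 3, one way to serve the playlist and segments is the third-party GCDWebServer library (a sketch; the directory, file names, and port are placeholders, not necessarily what my code uses):

```objc
#import <GCDWebServer/GCDWebServer.h>

// Serve the directory holding the .m3u8 playlist and .ts segments.
NSString *segmentsPath =
    [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                         NSUserDomainMask, YES) firstObject];

GCDWebServer *server = [[GCDWebServer alloc] init];
[server addGETHandlerForBasePath:@"/"
                   directoryPath:segmentsPath
                   indexFilename:nil
                        cacheAge:0
              allowRangeRequests:YES];
[server startWithPort:8080 bonjourName:nil];
// Clients then open http://<phone-ip>:8080/stream.m3u8
```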
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.