According to the first comment at the end of this article:
http://www.imore.com/keynote-iphone-ipad-review
And here:
http://help.apple.com/keynote/ipad/2.2/#/tand1a4ee7c
It seems that you can configure Keynote for iPad so that speaker notes are shown on the iPad when you're using a dongle to plug into a projector or big screen, but not on the big screen itself.
Is this functionality only afforded to Keynote through some private API at the OS level, or does anyone know of a way of achieving this programmatically? My use case doesn't need to make it into the App Store, so a private-API hack could work for me.
No need for private APIs. You can observe UIScreenDidConnectNotification to find out when a second screen is connected, whether it's AirPlay or HDMI.
You then provide a view/view controller for that screen.
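For illustration, here's a minimal sketch of that approach using the classic (pre-scene-lifecycle) UIScreen/UIWindow API; PresentationViewController is a hypothetical placeholder for whatever you want to show on the second screen:

```swift
import UIKit

final class ExternalDisplayManager {
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    func startObserving() {
        let center = NotificationCenter.default
        observers.append(center.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                // Create a window on the external screen with its own
                // root view controller for the second-screen content.
                let window = UIWindow(frame: screen.bounds)
                window.screen = screen
                window.rootViewController = PresentationViewController() // hypothetical
                window.isHidden = false
                self?.externalWindow = window
            })
        observers.append(center.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
                // Tear down the window when the screen goes away.
                self?.externalWindow?.isHidden = true
                self?.externalWindow = nil
            })
    }
}
```

Because the external window is not mirrored, whatever its root view controller draws appears only on the big screen while your main window stays on the device, which is exactly the Keynote-style speaker-notes split.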
iOS 15 introduces the ability for people to share their screen via FaceTime (https://www.apple.com/uk/newsroom/2021/06/ios-15-brings-powerful-new-features-to-stay-connected-focus-explore-and-more/) - from a developer POV, is there a way to disable this for my app?
I can't find any information on how this might be possible and, similar to screenshot functionality, I fully expect that the answer is no.
However, I've been asked to investigate - hence this question.
Many thanks for the help!
You can't opt out as such, but you can detect that screen sharing is active and adapt your UI in response. For example, you could overlay a view stating "Screen sharing not available" or similar.
Apple describes this technique in this technote but, in summary:
You can listen for the capturedDidChange notification, and when you receive it, check the isCaptured property of UIScreen. If it is true, the screen is being shared to some external destination; this could be a screen recording, a broadcast extension, AirPlay, QuickTime capture via cable, or the new SharePlay feature, but you probably want to handle all of these in the same way.
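As a rough sketch of that technique (the overlay label and its styling here are my own illustrative choices, not from the technote):

```swift
import UIKit

final class CaptureMonitor {
    private let overlay = UILabel()

    func install(in window: UIWindow) {
        // Hypothetical overlay shown while the screen is being captured.
        overlay.text = "Screen sharing not available"
        overlay.textAlignment = .center
        overlay.backgroundColor = .black
        overlay.textColor = .white
        overlay.frame = window.bounds
        window.addSubview(overlay)

        NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.updateOverlay()
        }
        updateOverlay() // apply the initial state as well
    }

    private func updateOverlay() {
        // isCaptured covers recording, broadcast extensions, AirPlay,
        // QuickTime cable capture, and SharePlay screen sharing alike.
        overlay.isHidden = !UIScreen.main.isCaptured
    }
}
```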
I have an app that uses ARKit to detect faces and sends the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data over the network, while I use another app (almost) full screen.
The 'Enable multiple windows' option is activated in Info.plist, but as soon as I launch my other app, the ARKit app stops sending information (the app probably stops entirely).
Is there a simple way to do this, and is it even feasible? Thanks!
This is not possible at this point. Camera and AR stuff is disabled at a system level in apps when they are displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split View is being used, saying that the app should be used in full-screen mode. See this answer under a different question for details.
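As far as I know there is no official "am I in Split View?" API, but one common heuristic is to compare the window's bounds to the screen's bounds; a sketch under that assumption:

```swift
import UIKit

extension UIViewController {
    /// Heuristic, not an official API: in Slide Over / Split View the
    /// app's window no longer fills the whole screen.
    var isRunningFullScreen: Bool {
        guard let window = viewIfLoaded?.window else { return true }
        return window.bounds == window.screen.bounds
    }
}
```

You could check this in viewWillLayoutSubviews() and show or hide a warning overlay accordingly.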
Would anyone know the whereabouts of the documentation on how to implement AirPlay dual-screen functionality into an app?
e.g. http://www.apple.com/uk/appletv/airplay/
This link is very brief but, from what I can see, contains totally sufficient information if you scroll down to "Make the Most of a Second Display".
In short, you register for notifications on the connection of an external display to get a handle to it, and you switch between drawing on the two displays by using setScreen(). Besides that, everything drawing-related should be "the usual".
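One detail worth adding: a display may already be attached when your app launches, and the connect notification only fires for screens attached afterwards, so it's worth checking UIScreen.screens up front. A small sketch (the window setup mirrors the earlier example):

```swift
// At launch, handle a display that is already connected. Retain the
// window (e.g. in a property) for as long as the screen is attached.
if UIScreen.screens.count > 1 {
    let external = UIScreen.screens[1]
    let window = UIWindow(frame: external.bounds)
    window.screen = external // the Swift equivalent of setScreen:
    window.rootViewController = UIViewController() // your second-screen content
    window.isHidden = false
}
```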
Overview
The user can connect additional screens to an iOS device at any time using AirPlay or a physical cable. Each additional screen represents new space on which to display your app's content and is managed by a UIScreen object. For example, a game might show its content on a connected display and show game controls on the iPhone screen, as illustrated in Figure 1 ("Displaying Content on a Connected Screen").
Can you trigger AirPlay mirroring with iOS 5 code?
I see in the API how you can stream. I just wonder if anyone knows how to turn mirroring on and off programmatically in my app on an iPad 2 running iOS 5, or if that is even possible.
I DO see how to physically turn it on via the dock at the bottom of the screen,
so no need to post a link to the Apple demo of that.
Thanks in advance
It's currently not possible. With a bit of luck this may change tomorrow, but it isn't something I can see Apple doing (for the same reasons they don't allow toggling Bluetooth/Airplane Mode/Wi-Fi, etc.).
I was wondering if there is a specific method that gets called when the two iPhone buttons (Home and Power) are pressed to take a screenshot. If so, I'd like to know its name so I can use it in my code.
There used to be a UIGetScreenImage() function that you could use to capture the screen. Apple no longer allows use of that function in App Store apps, so you have a few other options. CALayer has a -renderInContext: method—Google it—that you can use to copy a view’s contents to a graphics context; this does not, however, work for OpenGL content, video, or live imagery from a device’s camera. I’m not sure about solutions for the first two, but for the latter—getting images from the camera—you’ll need to use the AVFoundation framework.
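For the CALayer route, here's a minimal Swift sketch (with the same caveats: OpenGL, video, and live camera content won't render through this path):

```swift
import UIKit

/// Snapshots a view hierarchy by rendering its layer into a bitmap context.
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    view.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```

On newer SDKs, UIGraphicsImageRenderer is the more idiomatic replacement for the begin/end context pair.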
It is a system-level service for which the app never receives any notification or method call.
I believe that would be a native method, not accessible from the iPhone SDK. In what context are you going to be using this? You might be looking for this - Take screenshot from code