How can I use ARKit while using Slide Over/Split Screen on iPadOS?

I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending data over the network, while I use another app (almost) full screen.
The option 'Enable multiple windows' is activated in the Info.plist, but as soon as I launch the other app, the ARKit app stops sending information (the app itself probably stops running).
Is there a simple way to do this? Is it even feasible? Thanks!

This is not possible at this point. The camera and ARKit are disabled at a system level when an app is displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split Screen is in use, telling the user to run the app in full screen mode; a sketch of one way to do that follows. See this answer under a different question for details.
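As a sketch of that warning: there is no documented "am I in Split View?" API, so a common community heuristic is to compare the window's bounds to the screen's bounds; ARKit also reports the interruption itself via sessionWasInterrupted(_:). The class name and alert text below are illustrative, and the view controller is assumed to be the ARSession's delegate:

    import ARKit
    import UIKit

    extension UIViewController {
        // Community heuristic, not a documented API: in Slide Over/Split View
        // the hosting window no longer spans the full screen.
        var isFullScreen: Bool {
            guard let window = view.window else { return true }
            return window.bounds == window.screen.bounds
        }
    }

    final class FaceTrackingViewController: UIViewController, ARSessionDelegate {
        // ARKit calls this when the system takes the camera away,
        // e.g. when multitasking starts.
        func sessionWasInterrupted(_ session: ARSession) {
            warnAboutMultitasking()
        }

        override func viewWillLayoutSubviews() {
            super.viewWillLayoutSubviews()
            if !isFullScreen { warnAboutMultitasking() }
        }

        private func warnAboutMultitasking() {
            guard presentedViewController == nil else { return } // don't stack alerts
            let alert = UIAlertController(
                title: "Full Screen Required",
                message: "Face tracking pauses in Slide Over/Split View. Please use the app in full screen.",
                preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "OK", style: .default))
            present(alert, animated: true)
        }
    }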

Related

iOS Request permission dialog not showing on screen recording or quicktime video

Does anyone know when the App Store started requiring app previews to be recorded on a physical device and to show this pop-up?
Even when I mirror my iPhone XS to my Mac, the pop-up does not show, and if the pop-up is not in the video, the App Store rejects the app preview.
Does anyone know a better way to get around this issue?
Here is another image that shows the issue:
This pop-up NEVER shows up on screen recording or mirroring.
We have the same problem. You have two options:
You can record the screen with another phone or a camera.
You can use a previous iOS version; we have verified that the permission dialog can be screen recorded on iOS 12.
What was your solution?
Yes, Apple has been steadily working, over the past three-ish years or so, on removing certain sensitive system controls from that feed. My guess is that it's for security reasons, though it doesn't completely make sense to me what attack vector it is meant to prevent.
The location dialog is not the only case I know of that gets hidden. It originally started with them hiding even the ••• in password text fields, and hiding the keyboard for such input as well. Apparently it has now expanded to location and likely other system dialogs. 🤷‍♂️
Surprisingly few people seem to be discussing this.
I published an app last year (June 2021) on iOS 14.? which required the popup, and it was accepted. I did this again with another app just after 15.? and I cannot get it past the App Store. Once they allowed me to post a video recording and I got it through, but since then ... no chance.
How are people getting their apps through? I assume Apple uses the screen capture system to do their testing.
I had the same problem, but my app was accepted.
Screen recording does not capture the dialog, but screenshots do. I embedded a screenshot of the permission dialog into a video and submitted it for app review.

How to take a screenshot programmatically if the app is running in background

I am writing a piece of code in Swift for iOS.
It uses location services, and I need to take a screenshot of the current screen every time the location gets updated.
The problem is that when the app is running in the background and updating the location, the screenshots taken are of my app, not the app that is currently on screen.
Suppose the user is playing a game on their phone while my app runs in the background updating the location. When the user's location changes significantly, a screenshot should be taken, and it should show the game, not my app's screen.
Can someone tell me how to achieve the screenshot capability I want?
Short answer: you can't.
Apps can only take screenshots of their own content. Starting with (I think) iOS 7, Apple blocked the ability of apps to take screenshots of anything but their own content. This is for security reasons.
Imagine your app echoes each character of your password as you type it, and then replaces it with a bullet as you type the next character. Now imagine an app that runs in the background and takes a screenshot every time your app displays a new character of the user's password.
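For completeness, capturing your own app's content is straightforward. Here is a minimal sketch using UIGraphicsImageRenderer and drawHierarchy(in:afterScreenUpdates:); the window lookup is illustrative (and UIApplication.shared.windows has since been deprecated in favor of scene APIs):

    import UIKit

    // Captures this app's own view hierarchy only; other apps' content is
    // never visible to it. Called from the background, it would merely
    // render this app's (hidden) UI.
    func snapshotOwnWindow() -> UIImage? {
        guard let window = UIApplication.shared.windows.first(where: { $0.isKeyWindow }) else {
            return nil
        }
        let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
        return renderer.image { _ in
            window.drawHierarchy(in: window.bounds, afterScreenUpdates: true)
        }
    }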

ARKit Resuming Session

I have an ARKit app that allows the user to add a cube to the scene. This works fine and I can see the cube. But when I push the app to the background and then move the device to another location (by walking to a different room), the ARKit session is unable to determine the correct positions of my old nodes.
Is there any way to work around this problem, so that when the app comes back to the foreground from the background it still remembers the positions of the nodes?
UPDATE: I am looking into saving the latitude and longitude for the user and then somehow converting them into an SCNVector3 to place the node.
You probably won't be able to keep the AR session running in the background. Apple does not recommend pausing the session or interrupting it and trying to resume:
Avoid interrupting the AR experience. If the user transitions to another fullscreen UI in your app, the AR view might not be in an expected state when coming back.
Use the popover presentation (even on iPhone) for auxiliary view controllers to keep the user in the AR experience while adjusting settings or making a modal selection. In this example, the SettingsViewController and VirtualObjectSelectionViewController classes use popover presentation.
The issue is that once the session gets interrupted, the device stops using its mechanisms that keep track of AR nodes as well as your location. You might have to set up a mechanism that keeps the app running constantly in the background and runs the ARSession through that; you might be able to find projects on GitHub that allow running in the background. Another issue you might face is Apple's limit on running apps in the background, which is apparently 3 minutes.
If you're at all interested in restarting the AR session once you come back in, you can see my answer on this thread.
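If the device stays in roughly the same physical space, one workaround not mentioned above is ARWorldMap persistence (iOS 12+): save the world map before the app is backgrounded, then seed a new session with it on resume, and ARKit restores your anchors (and the nodes attached to them) once it relocalizes. A minimal sketch, with error handling kept deliberately thin:

    import ARKit

    // Persist the current world map so anchors (and the nodes you attach to
    // them) can be restored after the session is torn down.
    func saveWorldMap(from session: ARSession, to url: URL) {
        session.getCurrentWorldMap { worldMap, _ in
            guard let worldMap = worldMap else { return } // map may not be ready yet
            if let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                            requiringSecureCoding: true) {
                try? data.write(to: url)
            }
        }
    }

    // Rerun the session seeded with the saved map; ARKit restores the anchors
    // once it relocalizes against the previously mapped surroundings.
    func restoreSession(_ session: ARSession, from url: URL) {
        guard let data = try? Data(contentsOf: url),
              let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                     from: data)
        else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.initialWorldMap = worldMap
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }

Note this still won't help in the exact scenario from the question: relocalization requires ARKit to re-recognize the previously mapped surroundings, so walking to a different room defeats it.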

GPS based VS Beacon based ranging? Which governs Lock screen left corner app icon

There are two approaches for showing an app / app suggestion (in case it is not installed) on the iPhone lock screen / app switcher. One is GPS based, in which iOS decides which app to show as a suggestion. The other is beacon based, in which a particular beacon is identified.
If location services are enabled for multiple apps, and all of these apps also use the beacon-based approach to show their icons in the corner of the lock screen, which app icon will iOS show?
Since location services are enabled for these apps, and say there is another relevant app which is NOT using the beacon-based approach (just the GPS-based one), can iOS give preference to the beacon-based apps over this new GPS-based app?
For instance, Estimote's NYC office is on the same block as an Equinox gym, and our phones intelligently and automatically suggest that app. It's super easy and intuitive to open the app while walking into the gym, and in the process streamline the check-in flow with the gym's front desk. However, because it solely uses GPS geofences, the accuracy is poor: we actually get the Equinox icon over a block away, and brands or stores (in this case Equinox) have no control over how this appears.
Apple's suggestion of apps not installed on the phone based on proximity uses an undocumented technique. While I have verified it uses GPS as an input, I have never been able to confirm that beacons are used at all.
Regardless of whether beacons are used, because this is an undocumented feature, it is unlikely you will find a way to customize the behavior.
AFAIK, Apple has never shared the implementation details of how the lock screen icon AKA "suggested apps" feature works.
However, we did some experiments at Estimote and noticed that being inside a CLRegion (both the "GPS" CLCircularRegion, and CLBeaconRegion work) that an app monitors for via Core Location, consistently makes the app's icon show up on the lock screen. So it seems that both beacons and GPS location fall into the same mechanism that governs the location-based suggestions. (Note that in iOS 9, that's not just the lock screen icon, but also a bar at the bottom of the app switcher.)
Unfortunately, we weren't able to establish what happens if you're inside multiple qualifying CLRegions, belonging to different apps. We suspect it might have something to do with the order in which the apps register regions for monitoring, but were never able to get consistent results.
Furthermore, since this whole behavior is undocumented, Apple can change it at any time. Just something to be aware of.
Side note: handoff always trumps suggested apps.
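To make the experiment above concrete, here is a minimal sketch of monitoring both region types via Core Location; the coordinate, radius, beacon UUID, and identifiers are placeholders:

    import CoreLocation

    final class SuggestedAppExperiment: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func start() {
            manager.delegate = self
            manager.requestAlwaysAuthorization() // region monitoring needs Always

            // "GPS" geofence around a placeholder coordinate.
            let circle = CLCircularRegion(
                center: CLLocationCoordinate2D(latitude: 40.7410, longitude: -73.9897),
                radius: 100, // meters
                identifier: "storefront-geofence")

            // Beacon region with a placeholder UUID (iOS 13+ initializer).
            let beacon = CLBeaconRegion(
                uuid: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!,
                identifier: "storefront-beacons")

            manager.startMonitoring(for: circle)
            manager.startMonitoring(for: beacon)
        }

        func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
            // In our tests, being inside either region type made the app's icon
            // show up on the lock screen; the exact rules are undocumented.
        }
    }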

Where can I find iOS dual display for Apple TV documentation?

Would anyone know the whereabouts of the documentation on how to implement AirPlay dual-screen functionality into an app?
e.g. http://www.apple.com/uk/appletv/airplay/
This link has very brief, but from what I can see totally sufficient, information if you scroll down to "Make the Most of a Second Display".
In short, you register for notifications about a connection to an external display to get a handle to it, and you switch between drawing on the two displays by setting the window's screen (setScreen: in Objective-C); there is a sketch after the quoted overview below. Besides that, everything drawing-related should be "the usual".
Overview
The user can connect additional screens to an iOS device at any time using AirPlay or a physical cable. Each additional screen represents new space on which to display your app's content, and is managed by a UIScreen object. For example, a game might show its content on a connected display and show game controls on the iPhone screen, as illustrated in Figure 1 ("Displaying Content on a Connected Screen").
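A minimal sketch of the register-for-notifications approach described above, using UIScreen's connect/disconnect notifications (this matches the pre-scene API the answer refers to; on modern iOS you would use UIWindowScene instead). ExternalViewController is a placeholder for whatever you want to draw on the second display:

    import UIKit

    final class ExternalViewController: UIViewController {} // placeholder content

    var externalWindow: UIWindow?

    // Call once, e.g. at app launch.
    func observeExternalScreens() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification, object: nil, queue: .main) { note in
            guard let screen = note.object as? UIScreen else { return }
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen                               // the "setScreen()" step
            window.rootViewController = ExternalViewController()
            window.isHidden = false                              // show on the external display
            externalWindow = window
        }

        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification, object: nil, queue: .main) { _ in
            externalWindow = nil                                 // tear down with the display
        }
    }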
