iOS ARKit saving map across multiple sessions

I am developing an app using iOS ARKit, which launched recently. My question is whether the map and the data added to it can be extended across multiple sessions.
I am adding UI elements on top of the map data, but if I close the app and open it again, I have to run the entire map detection and add the UI elements all over again. If there is a way to save the map, please suggest how this can be achieved.

Nope.
There is no API for accessing the data ARKit uses internally for position/orientation tracking, nor for telling ARKit to save/restore such data itself.

Related

How can I use ARKit while using Slide Over/Split Screen on iPadOS?

I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data over the network, while I use another app (almost) full screen.
The option 'Enable multiple windows' is activated in Info.plist, but as soon as I launch my other app, the ARKit app stops sending information (the app probably actually stops).
Is there a simple way to do this? Is it feasible at all? Thanks!
This is not possible at this point. Camera and AR stuff is disabled at a system level in apps when they are displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split View is in use, telling the user to switch the app to full screen. See this answer under a different question for details.
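Since the system reports the lost camera to the session delegate, one way to implement that warning is a minimal sketch like the following (assuming an ARSessionDelegate-conforming view controller; the warningLabel is a hypothetical UI element, and sessionWasInterrupted(_:) fires whenever the system takes the camera away):

    import ARKit
    import UIKit

    final class FaceTrackingViewController: UIViewController, ARSessionDelegate {
        // Hypothetical label used to tell the user to go full screen.
        private let warningLabel = UILabel()

        override func viewDidLoad() {
            super.viewDidLoad()
            warningLabel.isHidden = true
            view.addSubview(warningLabel)
        }

        // Called when the system interrupts the session, e.g. because the
        // camera was disabled while the app is in Slide Over/Split View.
        func sessionWasInterrupted(_ session: ARSession) {
            warningLabel.text = "Camera unavailable. Please use the app in full screen."
            warningLabel.isHidden = false
        }

        // Called when the camera becomes available again.
        func sessionInterruptionEnded(_ session: ARSession) {
            warningLabel.isHidden = true
        }
    }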

Passing Data From native iOS app to Unity App

I have an iOS app and I want to add some animation to it, so I am thinking of creating a Unity app for that animation. I then want to open the Unity app from my iOS app with a button tap.
The animation generates a dynamic number of balls, so I have to pass parameters (total number of balls, number of balls to generate, ball colour code) from the iOS app to the Unity app.
Is there any way to pass data from the iOS app to Unity? I am thinking of using a URL scheme; is there any other method available?
I am assuming you want to communicate between two apps. For that there are several ways.
To open another app from within an app, please check this answer here.
Solution 1:
Don't use two apps for a single animation. That is not good practice and not really feasible. Do the animation inside your iOS app; iOS is powerful enough to handle it, and Apple already offers a suite of game technologies to support game development natively. You can check this link here.
Solution 2:
The first part of your question is confusing, though. You want to show the animation in your iOS app but create it in a Unity app? That is not going to work, because when you open the Unity app, the current app is suspended by default and you will only see the Unity app at that time.
As for the second part of your question, passing arguments from iOS so the animation plays in the Unity app: you would have to save the data locally (in some kind of database, whatever you like), such as the total number of balls, the number of balls to generate, and the ball colour code. Then write code on the Unity side to read those parameters from the database and run the animation accordingly.
Solution 3:
If you can build the app in Unity, build the whole app in Unity.
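If you do go the URL-scheme route from the question, the iOS side could look something like this sketch (the ballanim scheme and the parameter names are hypothetical; the Unity app would have to register the scheme and parse the query, and on iOS 9+ the calling app must list the scheme under LSApplicationQueriesSchemes for canOpenURL to succeed):

    import UIKit

    // Sketch: hand the ball parameters to the Unity app via a custom URL scheme.
    // "ballanim" and the query parameter names are hypothetical.
    func openUnityApp(totalBalls: Int, ballsToGenerate: Int, colorHex: String) {
        var components = URLComponents()
        components.scheme = "ballanim"
        components.host = "play"
        components.queryItems = [
            URLQueryItem(name: "total", value: String(totalBalls)),
            URLQueryItem(name: "generate", value: String(ballsToGenerate)),
            URLQueryItem(name: "color", value: colorHex),
        ]
        guard let url = components.url,
              UIApplication.shared.canOpenURL(url) else { return }
        UIApplication.shared.open(url)  // iOS 10+
    }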

ARKit Resuming Session

I have an ARKit app which allows the user to add a cube to the scene. This works fine and I can see the cube. But when I push the app to the background and then move the device to another location (by walking to a different room), the ARKit session is unable to determine the correct position of my old nodes.
Is there any way to work around this problem, so that when the app comes back to the foreground from the background it still remembers the positions of the nodes?
UPDATE: I am looking into saving the user's latitude and longitude and then somehow converting them to an SCNVector3 to place the node.
You probably won't be able to keep the AR running in the background. Apple does not recommend pausing the session or interrupting it and trying to resume:
Avoid interrupting the AR experience. If the user transitions to another fullscreen UI in your app, the AR view might not be in an expected state when coming back.
Use the popover presentation (even on iPhone) for auxiliary view controllers to keep the user in the AR experience while adjusting settings or making a modal selection. In this example, the SettingsViewController and VirtualObjectSelectionViewController classes use popover presentation.
The issue is that once the session gets interrupted, the device stops using its mechanisms that keep track of AR nodes as well as your location. You might have to set up a mechanism that keeps the app running constantly in the background and run the ARSession through that; you might be able to find projects on GitHub that allow running in the background. Another issue you might face is Apple's limitation on background execution time, which is apparently about 3 minutes.
If you're at all interested in restarting the AR session once you come back in, you can see my answer on this thread.
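For completeness, restarting the session after returning to the foreground usually looks something like this sketch (assuming an ARSCNView-based setup; the resetTracking and removeExistingAnchors run options discard the stale tracking state and the old anchors):

    import ARKit

    // Sketch: restart tracking from scratch when the app comes back,
    // since the old world origin is no longer reliable.
    func restartSession(on sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration,
                              options: [.resetTracking, .removeExistingAnchors])
    }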

CoreMotion - way to determine if motion is disabled in iOS Settings?

On iOS 7 the user can choose to reduce motion effects in Settings -> General -> Accessibility -> Reduce Motion.
I am creating a UI effect based on UITableView scrolling, so I am not using CMMotionManager or the CoreMotion framework to create any motion effects.
However, I would like to respect the user's settings and not create the motion effect if the user has turned on Reduce Motion in Settings.
CMMotionManager includes an instance property, deviceMotionActive, to check whether it is active (I'm assuming this is the correct check). However, I'd prefer not to initialize the manager just to do this check, and sadly I cannot find any documentation of a class method that returns a similar boolean, along the lines of the class methods on MFMessageComposeViewController that check for iMessage/SMS availability (+(BOOL)canSendText) and so on.
Thanks!
You are confusing two separate things called "motion". CMMotionManager is used to access the sensors, such as the gyroscope and accelerometer, that report how a user physically moves a device. It has nothing to do with the motion effects, like UIMotionEffect objects, that are used in animating views.
The deviceMotionActive method merely indicates whether your app is currently registered for receiving motion updates from CoreMotion. This will only be true if your app has called one of the CMMotionManager startXXXUpdate methods. Again, it has nothing to do with user settings or UIMotionEffect objects.
UPDATE: As John mentions in the comments, there appears to be an API for this in iOS 8: See stackoverflow.com/a/25453082/2904769 .
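For reference, that iOS 8 API reduces the check to one line (shown here with modern Swift naming; the original call was the C-style function UIAccessibilityIsReduceMotionEnabled()):

    import UIKit

    // One-line check of the Reduce Motion accessibility setting (iOS 8+).
    let reduceMotion = UIAccessibility.isReduceMotionEnabled

    // Optionally react when the user toggles the setting while the app runs.
    let observer = NotificationCenter.default.addObserver(
        forName: UIAccessibility.reduceMotionStatusDidChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        // Re-check UIAccessibility.isReduceMotionEnabled here and
        // enable or disable the scroll-based effect accordingly.
    }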

What is the name of the method for taking a screenshot on iOS?

I was wondering if there is a specific method called when we press the two buttons of the iPhone (Home button & power on/off) to take a screenshot. If so, I would like to know its name so I can use it in my code.
There used to be a UIGetScreenImage() function that you could use to capture the screen. Apple no longer allows use of that function in App Store apps, so you have a few other options. CALayer has a -renderInContext: method—Google it—that you can use to copy a view’s contents to a graphics context; this does not, however, work for OpenGL content, video, or live imagery from a device’s camera. I’m not sure about solutions for the first two, but for the latter—getting images from the camera—you’ll need to use the AVFoundation framework.
It is a system level service for which the app never receives any notification or method call.
I believe that would be a native method, not accessible from the iPhone SDK. In what context are you going to be using this? You might be looking for this: Take screenshot from code
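As a concrete example of the -renderInContext: approach mentioned above, a minimal modern Swift sketch (UIGraphicsImageRenderer is iOS 10+; the same idea works with UIGraphicsBeginImageContextWithOptions on older systems, and it still won't capture OpenGL, video, or live camera content):

    import UIKit

    // Sketch: snapshot a view hierarchy from inside the app by rendering
    // its layer into an image context.
    func snapshot(of view: UIView) -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
        return renderer.image { context in
            view.layer.render(in: context.cgContext)
        }
    }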
