ARKit: How to reset world orientation after interruption

I'm trying to get the ARWorldTracking session to re-orient to north after a session interruption. I've gone over the documentation a few times, but I'm finding it confusing.
Current Behavior:
When I lock the device and reopen the app, triggering sessionWasInterrupted, the SCNNodes all shift counterclockwise on the compass by roughly 90 degrees.
"When you call the run(_:options:) method with a configuration of a different type than the session's current configuration, the session always resets tracking."
I interpreted that as saying that when I run a new configuration different from the one in viewWillAppear, the session will "reset". I don't fully understand what is actually happening, but the orientation after the interruption is off (and removeExistingAnchors does nothing).
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    configuration.worldAlignment = .gravityAndHeading
    sceneView.session.run(configuration)
}
func sessionWasInterrupted(_ session: ARSession) {
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    configuration.worldAlignment = .gravityAndHeading
    self.sceneView.session.run(configuration, options: [.removeExistingAnchors, .resetTracking])
}
Desired Behavior:
When the app detects a session interruption, I'd like it to re-orient itself back to true north.

This issue was killing me, too. You helped me out with half the solution: adding the 'Reset tracking / Remove existing anchors' flags was the magic key for me. I think the other half is the guidance from this post, where you have to pause your session, remove all the nodes from the scene, and then re-position them. The combination of both of these things got the compass to reset back to true north after a session interruption for me.
func resetARSession() {
    // Called by sessionInterruptionDidEnd
    sceneView.session.pause()
    sceneView.scene.rootNode.enumerateChildNodes { (node, stop) in
        node.removeFromParentNode()
    }
    setupARSession()
    setupSceneView()
}
func setupARSession() {
    let configuration = ARWorldTrackingConfiguration()
    configuration.worldAlignment = .gravityAndHeading
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
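For completeness, here is a minimal sketch of the wiring implied by the comment above, assuming your view controller is the session's delegate:

func sessionInterruptionDidEnd(_ session: ARSession) {
    // Rebuild the session and scene once the interruption ends,
    // so the world re-aligns to gravity and true north.
    resetARSession()
}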

Related

ARKit does not recognize reference images

I'm trying to place a 3D model on top of a recognized image with ARKit and RealityKit, all programmatically. Before I start the ARView, I download the model I want to show when the reference image is detected.
This is my current setup:
override func viewDidLoad() {
    super.viewDidLoad()
    arView.session.delegate = self

    // Check if the device supports the AR experience
    if !ARConfiguration.isSupported {
        TLogger.shared.error_objc("Device does not support Augmented Reality")
        return
    }

    guard let qrCodeReferenceImage = UIImage(named: "QRCode") else { return }
    let detectionImages: Set<ARReferenceImage> = convertToReferenceImages([qrCodeReferenceImage])

    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionImages = detectionImages
    arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
I use the ARSessionDelegate to get notified when a new image anchor is added, which means the reference image got detected:
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    print("Hello")
    for anchor in anchors {
        // `continue` rather than `return`, so one non-image anchor
        // doesn't skip the rest of the batch
        guard let imageAnchor = anchor as? ARImageAnchor else { continue }
        let referenceImage = imageAnchor.referenceImage
        addEntity(self.localModelPath!)
    }
}
However, the delegate method never gets called, while other delegate functions like func session(ARSession, didUpdate: ARFrame) are getting called, so I assume the session just doesn't detect the image. The image resolution is good and the printed image is big, so it should definitely get recognized by the ARSession. I also checked that the image was found before adding it to the configuration.
Can anyone lead me in the right direction here?
It looks like you have your configuration set up correctly. Your delegate function should be called when the reference image is recognized. Make sure your configuration isn't overwritten at any point in your code.
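In case it helps, here is a minimal sketch of what a helper like your convertToReferenceImages could look like; the 0.1 m physical width is an assumption you would replace with the real printed size, since a wrong physicalWidth can hurt detection:

func convertToReferenceImages(_ images: [UIImage]) -> Set<ARReferenceImage> {
    var referenceImages = Set<ARReferenceImage>()
    for image in images {
        guard let cgImage = image.cgImage else { continue }
        // physicalWidth is the real-world width of the printed image, in meters
        let referenceImage = ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.1)
        referenceImages.insert(referenceImage)
    }
    return referenceImages
}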

CLFloor returns level 2146959360?

I’m using CLLocationManager, looking at the location property and examining the level of its floor, if any. The documentation suggests that if the floor couldn’t be determined, it would just be nil. In practice, I am getting a CLFloor instance, but its level is 2146959360. Converting that to hex yields 0x7FF80000, which looks suspiciously like some cryptic sentinel value.
lazy var locationManager: CLLocationManager = {
    let locationManager = CLLocationManager()
    locationManager.delegate = self
    locationManager.desiredAccuracy = kCLLocationAccuracyBest
    return locationManager
}()
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    switch CLLocationManager.authorizationStatus() {
    case .notDetermined: locationManager.requestWhenInUseAuthorization()
    case .denied: redirectToSettings()
    default: break
    }
}
@IBAction func didTapGetLocation(_ sender: Any) {
    updateLabels(for: locationManager.location)
}
func updateLabels(for location: CLLocation?) {
    guard let location = location else {
        floorLabel.text = "No location."
        return
    }

    if let floor = location.floor {
        let hexString = "0x" + String(format: "%08x", floor.level)
        floorLabel.text = "\(floor.level) (\(hexString))"
    } else {
        floorLabel.text = "No floor."
    }
}
I’m seeing this behavior on physical iOS 13.3.1 devices only. FWIW, older iOS versions (I’ve only got an iOS 10 device sitting here) appear to return nil, as expected, as does the simulator.
What’s going on?
This problem goes away if you call startUpdatingLocation. If you do that, then the floor property will be nil. This CLFloor instance with a level value of 2146959360 (0x7FF80000) only appears if you query the location of the CLLocationManager without having first called startUpdatingLocation.
The documentation suggests that this location property is populated with the last known value. Regardless, the floor should be nil (for my particular location, at least) but isn’t. The level is invalid.
See this repo for an example of the bug and a demonstration of how calling startUpdatingLocation avoids it.
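As a minimal sketch of the workaround, assuming the lazy locationManager from the question:

// Workaround: start updates before reading the location property.
locationManager.startUpdatingLocation()

// Now location.floor comes back nil (as documented)
// rather than a CLFloor with level 2146959360.
updateLabels(for: locationManager.location)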
I’ve filed a bug report (FB7638281).

ARKit switch between ARWorldTrackingConfiguration and ARFaceTrackingConfiguration - Rear and Front Camera

In my project I want to switch between ARWorldTrackingConfiguration and ARFaceTrackingConfiguration.
I use two different types of view: an ARSCNView for the rear camera and an ARView for the face tracking.
First I start the ARSCNView; afterwards, if the user wants, they can switch to face tracking.
I set up my view controller like this:
sceneView.delegate = self
sceneView.session.delegate = self

// Set up scene content.
setupCamera()
sceneView.scene.rootNode.addChildNode(focusSquare)

let configurationBack = ARWorldTrackingConfiguration()
configurationBack.isAutoFocusEnabled = true
configurationBack.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configurationBack, options: [.resetTracking, .removeExistingAnchors])
Then I load my object (.scn).
When I want to switch to the front camera and pass to the ARView, I do this:
let configurationFront = ARFaceTrackingConfiguration()

// here I stop my ARSCNView session
self.sceneView.session.pause()

self.myArView = ARView(frame: self.sceneView.frame)
self.myArView!.session.run(configurationFront)
self.myArView!.session.delegate = self
self.view.insertSubview(self.myArView!, aboveSubview: self.sceneView)
And then I load my .rcproject.
My problem begins here, when I try to return to the back camera and world tracking again.
This is my method:
// remove my ARView with face tracking
self.myArView?.session.pause()
UIView.animate(withDuration: 0.2, animations: {
    self.myArView?.alpha = 0
}) { _ in
    self.myArView?.removeFromSuperview()
    self.myArView = nil
}

// here I restart the initial ARSCNView
let configurationBack = ARWorldTrackingConfiguration()
configurationBack.isAutoFocusEnabled = true
configurationBack.planeDetection = [.horizontal, .vertical]
session.run(configurationBack, options: [.resetTracking, .removeExistingAnchors])
When I switch back to the rear camera, the session doesn't track the planes correctly.
How can I fix that? How do I switch correctly between ARWorldTrackingConfiguration and ARFaceTrackingConfiguration?
Thanks in advance.
Make sure to also remove all nodes added to the scene when you pause the session. Add the code below right after you pause the session with self.sceneView.session.pause():
self.sceneView.scene.rootNode.enumerateChildNodes { (childNode, _) in
    childNode.removeFromParentNode()
}
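Putting that together with the code from the question, the switch to the front camera might look like this sketch (same names as above, untested):

func switchToFaceTracking() {
    // Pause world tracking and clear its scene content first.
    sceneView.session.pause()
    sceneView.scene.rootNode.enumerateChildNodes { (childNode, _) in
        childNode.removeFromParentNode()
    }

    // Then bring up the face-tracking ARView as in the question.
    let arView = ARView(frame: sceneView.frame)
    arView.session.delegate = self
    arView.session.run(ARFaceTrackingConfiguration())
    view.insertSubview(arView, aboveSubview: sceneView)
    myArView = arView
}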

Run and Pause an ARSession in a specified period of time

I'm developing an ARKit/Vision iOS app with gesture recognition. My app has a simple UI containing a single UIView. There's no ARSCNView/ARSKView at all. I'm taking the CVPixelBuffer from a sequence of captured ARFrames, which I then use for VNRecognizedObjectObservation.
I don't need any tracking data from the session. I just need currentFrame.capturedImage for the CVPixelBuffer, and I need to capture ARFrames at 30 fps; 60 fps is an excessive frame rate.
The preferredFramesPerSecond instance property is useless in my case, because it controls the rendering frame rate of an ARSCNView/ARSKView. I have no AR views, and it doesn't affect the session's frame rate.
So, I decided to use the run() and pause() methods to decrease the session's frame rate.
Question
I'd like to know how to automatically run and pause an ARSession in a specified period of time. The duration of each run and pause must be 16 ms (or 0.016 sec). I suppose it might be possible through DispatchQueue, but I don't know how to implement it.
How to do it?
Here's a pseudo-code:
session.run(configuration)
/* run lasts 16 ms */
session.pause()
/* pause lasts 16 ms */
session.run(session.configuration!)
/* etc... */
P.S. I can use neither CocoaPods nor Carthage in my app.
Update: It's about how ARSession's currentFrame.capturedImage is retrieved and used.
let session = ARSession()

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    session.delegate = self

    let configuration = ARImageTrackingConfiguration() // 6DOF
    configuration.providesAudioData = false
    configuration.isAutoFocusEnabled = true
    configuration.isLightEstimationEnabled = false
    configuration.maximumNumberOfTrackedImages = 0
    session.run(configuration)

    spawnCoreMLUpdate()
}

func spawnCoreMLUpdate() { // Spawning new async tasks
    dispatchQueue.async {
        self.spawnCoreMLUpdate()
        self.updateCoreML()
    }
}

func updateCoreML() {
    guard let pixelBuffer = session.currentFrame?.capturedImage else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let imageRequestHandler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}
If what you want is to reduce the frame rate from 60 to 30, you should use the preferredFramesPerSecond property of SCNView. I'm assuming you're using an ARSCNView, which is a subclass of SCNView.
Property documentation.
I don't think the run() and pause() strategy is the way to go, because the DispatchQueue API is not designed for realtime accuracy: there is no guarantee that the pause will be 16 ms every time. On top of that, restarting a session might not be immediate and could add further delay.
Also, the code you shared will capture at most one image, and since session.run(configuration) is asynchronous it will probably capture no frame at all.
As you're not using ARSCNView/ARSKView, the only way is to implement the ARSession delegate to be notified of every captured frame.
Of course the delegate will most likely be called every 16 ms, because that's how the camera works, but you can decide which frames to process. Using the frame's timestamp, you can process one frame every 32 ms and drop the others, which is equivalent to processing at 30 fps.
Here is some code to get you started. Make sure that dispatchQueue is not concurrent, so your buffers are processed sequentially:
var lastProcessedFrame: ARFrame?

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    dispatchQueue.async {
        self.updateCoreML(with: frame)
    }
}

private func shouldProcessFrame(_ frame: ARFrame) -> Bool {
    guard let lastProcessedFrame = lastProcessedFrame else {
        // Always process the first frame
        return true
    }
    return frame.timestamp - lastProcessedFrame.timestamp >= 0.032 // 32ms for 30fps
}

func updateCoreML(with frame: ARFrame) {
    guard shouldProcessFrame(frame) else {
        // Less than 32ms since the previous frame
        return
    }
    lastProcessedFrame = frame
    let pixelBuffer = frame.capturedImage
    let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}
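For the serial-queue requirement, note that a DispatchQueue is serial unless you pass the .concurrent attribute, so a plain label-only queue is enough (the label here is arbitrary):

// Serial by default, so buffers are processed one at a time, in order.
let dispatchQueue = DispatchQueue(label: "com.example.coreml-frames")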
If I understand it correctly, you can achieve this via DispatchQueue. If you run the code below, it prints HHH first, then waits for 1 second, then prints ABC. You can plug in your own functions to make it work for you, and of course change the time interval from 1 to your desired value.
let syncConc = DispatchQueue(label: "con", attributes: .concurrent)

DispatchQueue.global(qos: .utility).async {
    syncConc.async {
        for _ in 0...10 {
            print("HHH - \(Thread.current)")
            Thread.sleep(forTimeInterval: 1)
            print("ABC - \(Thread.current)")
        }
    }
}
PS: I'm still not sure if Thread.sleep will block your process; if it does, I'll edit my answer.
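If Thread.sleep does turn out to block, a repeating Timer is one alternative way to sketch the run/pause cycle from the question. Note it carries the same accuracy caveat the other answer raises (neither API is realtime), and the type and names here are made up for illustration:

import ARKit

/// Sketch only: toggles an ARSession between run and pause every 16 ms, best effort.
final class SessionToggler {
    private let session: ARSession
    private let configuration: ARConfiguration
    private var isRunning = false
    private var timer: Timer?

    init(session: ARSession, configuration: ARConfiguration) {
        self.session = session
        self.configuration = configuration
    }

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 0.016, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            if self.isRunning {
                self.session.pause()
            } else {
                self.session.run(self.configuration)
            }
            self.isRunning.toggle()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
        session.pause()
        isRunning = false
    }
}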

Persistence of SCNAudioPlayer on SCNNodes in Xcode Instruments

I have created a subclass of SCNNode. It is made up of a few child nodes.
I have declared a method, soundCasual(), which adds an SCNAudioPlayer to an instance of this class. Everything works as expected and audio plays when this method is called. The method is called whenever that node is tapped (gesture).
Code:
class MyNode: SCNNode {
    let wrapperNode = SCNNode()
    let audioSource5 = SCNAudioSource(fileNamed: "audiofile.mp3")

    override init() {
        super.init()
        if let virtualScene = SCNScene(named: "MyNode.scn", inDirectory: "Assets.scnassets/Shapes/MyNode") {
            for child in virtualScene.rootNode.childNodes {
                wrapperNode.addChildNode(child)
            }
        }
    }

    // Required by SCNNode; not used here
    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func soundCasual() {
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            if let audioSource = self?.audioSource5 {
                let audioPlayer = SCNAudioPlayer(source: audioSource)
                self?.wrapperNode.removeAllAudioPlayers()
                self?.wrapperNode.addAudioPlayer(audioPlayer)
            }
        }
    }
}
Issue within Instruments (Allocations)
When I analyse my whole codebase (which does several other things), I see that whenever I tap on that node, the allocation count of SCNAudioPlayer increases by one while profiling within Instruments, and all of the increase is persistent. From the definition of SCNAudioPlayer, I assumed the player is removed after playback, so the increment should appear under Transient allocations, but that is not what happens. That is why I tried removeAllAudioPlayers() before adding an SCNAudioPlayer to the node, as you can see in the code for soundCasual(). But the issue remains.
By the time this snapshot was taken, I had tapped on that node about 17 times, and it also shows 17 against Persistent allocations for SCNAudioPlayer.
Note: SCNAudioSource is 10, as it should be, since there are 10 audio sources I am using in the app.
And this is happening for all other SCNNodes in my application without fail.
Kindly help, as I am not able to understand what exactly I am missing.
EDIT
As recommended, I changed my init() to:
let path = Bundle.main.path(forResource: "Keemo", ofType: "scn", inDirectory: "Assets.scnassets/Shapes/Keemo")
if let path = path, let keemo = SCNReferenceNode(url: URL(fileURLWithPath: path)) {
    keemo.load()
}
func soundPlay() {
    DispatchQueue.global(qos: .userInteractive).async { [weak self] in
        if let audioSource = self?.audioSourcePlay {
            audioSource.volume = 0.1
            let audioPlayer = SCNAudioPlayer(source: audioSource)
            self?.removeAllAudioPlayers()
            self?.addAudioPlayer(audioPlayer)
        }
    }
}
Despite this, allocations in Instruments show the audio players as persistent, though checking node.audioPlayers shows that at any one point there is only one audio player attached.
EDIT
This issue appears even in a simple case when I use the boilerplate code of a SceneKit app created by default in Xcode. Hence, this issue has been raised as a bug with Apple: https://bugreport.apple.com/web/?problemID=43482539
WORKAROUND
I am using AVAudioPlayer instead of SCNAudioPlayer. It is not exactly the same thing, but at least this way memory growth will not cause a crash.
I am not familiar with SceneKit, but from my experience with UIKit and SpriteKit I suspect that your use of wrapperNode and virtualScene is preventing the players from being deallocated.
I would try removing wrapperNode and adding everything to self (since self is a SCNNode).
Which node is actually used in your scene, self or wrapperNode? In your sample code, wrapperNode is never added to self, so it may or may not actually be part of the scene.
Also, you should probably be using SCNReferenceNode instead of the virtual scene thing you're using.
!!! this code has not been tested !!!
class MyNode: SCNReferenceNode {
    let audioSource5 = SCNAudioSource(fileNamed: "audiofile.mp3")

    func soundCasual() {
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            if let audioSource = self?.audioSource5 {
                let audioPlayer = SCNAudioPlayer(source: audioSource)
                self?.removeAllAudioPlayers()
                self?.addAudioPlayer(audioPlayer)
            }
        }
    }
}
// If you programmatically create this node, you'll have to call .load() on it
if let referenceNode = SCNReferenceNode(url: referenceURL) {
    referenceNode.load()
}
HtH!
If you haven't found the answer already, you need to remove the SCNAudioPlayer from the node once it has completed playing:
if let audioSource = self?.audioSourcePlay {
    audioSource.volume = 0.1
    let audioPlayer = SCNAudioPlayer(source: audioSource)
    self?.removeAllAudioPlayers()
    self?.addAudioPlayer(audioPlayer)
    audioPlayer.didFinishPlayback = {
        self?.removeAudioPlayer(audioPlayer)
    }
}
