Node's position appears constant during physics simulation - iOS

I'm running some physics on a node. The node moves in all directions, but when tracking its position it appears to be stationary.
func renderer(renderer: SCNSceneRenderer, didSimulatePhysicsAtTime time: NSTimeInterval) {
    print("position.y: \(starNode.position.y)")
}
The message in the debugger is "position.y: 5.578" (which is the position I assigned in the Scene Editor) for every call to the renderer(_:didSimulatePhysicsAtTime:) method.
What's going on?

Take a look at the presentationNode property. The node's own position only reflects the values you set; the physics simulation moves the copy returned by presentationNode, which is what you actually see on screen.
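For example, a minimal sketch of reading the simulated position (keeping the question's Swift 2-era delegate signature):

func renderer(renderer: SCNSceneRenderer, didSimulatePhysicsAtTime time: NSTimeInterval) {
    // The model node keeps the position you assigned; the physics simulation
    // moves the presentation node, which is what's rendered on screen.
    print("presentation position.y: \(starNode.presentationNode.position.y)")
}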

Related

MapKit Coordinate Projection Gives Erroneous Results from Scene Renderer

I am working on an app that overlays a MapKit Map View with a SceneKit SceneView (similar to https://blog.classycode.com/how-to-write-a-pok%C3%A9mon-go-clone-for-ios-edf1cf1cf5ce). In my scene I have a 3D node whose position is defined such that it appears on top of a specific map coordinate.
In the scene renderer I update the node's position by converting the map coordinate to a point and projecting that point into the 3D scene. This method allows the user to rotate the map around while the 3D node appears to stay in place.
This is working. However, the node is "flickering" as the map changes. After investigating, I discovered that the culprit is the map coordinate conversion. For some reason, when the map is rotating, sometimes the converted screen coordinate is very wrong (somewhere off the screen).
I think the issue has something to do with threading. The map rotation angle is defined in the touchesMoved function and applied in the scene renderer. I have tried moving the map rotation and node position update into the touchesMoved function, but that made the converted points wrong every time.
Here is what my structure looks like:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Grab the map rotation angle at the start of the gesture
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Calculate the new map rotation angle
}

extension GameViewController: SCNSceneRendererDelegate {
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        // Set the map rotation angle
        // Update the 3D node's position
    }
}
I expect the map coordinate conversion to always give me a screen point that makes sense (I know the coordinate is on the screen), but the result is intermittently wrong during rotation.
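For reference, the conversion step described above is roughly the following. This is only a sketch of the approach from the linked article; mapView, sceneView, coordinate, and node are assumed names:

// Convert the map coordinate to a screen point, then move the node to that
// point while keeping its current depth in the scene.
let screenPoint = mapView.convert(coordinate, toPointTo: sceneView)
let projected = sceneView.projectPoint(node.position)
node.position = sceneView.unprojectPoint(SCNVector3(Float(screenPoint.x),
                                                    Float(screenPoint.y),
                                                    projected.z))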
UPDATE:
Two interesting new discoveries:
1) The problem only exists when map type = satelliteFlyover (unfortunately the one I need)
2) The problem goes away when using Apple's default user-interactive map rotation (also unfortunate because I need to control the rotation differently)
It really seems like some sort of threading issue where there is a "dead time" after the map camera has changed during which coordinate projection just gives the wrong value. But what could Apple be doing differently from me in their rotation method that eliminates this problem?

SCNNode rotates inside anchor coordinate system

I want to place a figure straight on the floor. I see two options where to put it:
inside anchor's SCNNode with anchor's coordinates
inside rootNode, in global coordinates, with height == anchor.transform[3][1]
I don't turn off tracking because I can see that tracking stability improves in the first 10-20 seconds.
In the first case, my figure rotates randomly (because the anchor tends to grow its extent and fit the extent's rectangle to the tracked area). In the second case, the figure may sit above or below the actual floor (I can see this by adding an extra "floor" inside the anchor's SCNNode).
I could use the first case and apply transformations to compensate for the rotation, but that does not look like the right solution.
What is the right way to place a figure on the floor?
I guess you get the anchor from a plane-detection callback, i.e. you have set arConfiguration.planeDetection = .horizontal, where arConfiguration is defined as let arConfiguration = ARWorldTrackingConfiguration().
When ARKit calls func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor), you should add your node to the ARKit scene.rootNode. For the same plane, ARKit will then keep calling func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) as it refines the plane. So, to keep the same object in the same position while exploring the ARKit scene, you should act in the "add" callback.
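For instance, a minimal sketch of placing the figure once in the "add" callback, following the question's second option (rootNode, at the anchor's height); figureNode and sceneView are assumed names:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARPlaneAnchor else { return }
    // Place the figure once, in world coordinates, at the detected plane's position.
    let t = anchor.transform
    figureNode.position = SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)
    sceneView.scene.rootNode.addChildNode(figureNode)
}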
Did I get it right?
Hope this helps

How to keep ARKit SCNNode in place

Hey, I'm trying to figure out how to keep a simple node in place as I walk around it in ARKit.
Code:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        if planeDetected == false { // Bool only allows 1 plane to be added
            planeDetected = true
            self.addPlane(node: node, anchor: planeAnchor)
        }
    }
}
This adds the SCNNode
func addPlane(node: SCNNode, anchor: ARPlaneAnchor) {
    // We add the anchor plane here
    let showDebugVisuals = Bool()
    let plane = Plane(anchor, showDebugVisuals)
    planes[anchor] = plane
    node.addChildNode(plane)

    // We add our custom SCNNode here
    let scene = SCNScene(named: "art.scnassets/PlayerModel.scn")!
    let Body = scene.rootNode.childNode(withName: "Body", recursively: true)!
    Body.position = SCNVector3.positionFromTransform(anchor.transform)
    Body.movabilityHint = .movable
    wrapperNode.position = SCNVector3.positionFromTransform(anchor.transform)
    wrapperNode.addChildNode(Body)
    scnView.scene.rootNode.addChildNode(wrapperNode)
}
I've tried adding a Plane/Anchor node and putting the "Body" node in that, but it still moves. I thought maybe it has something to do with the update function.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
}
Or most likely the position setting
wrapperNode.position = SCNVector3.positionFromTransform(anchor.transform)
I've looked through every source/project file/video on the internet, and nobody has a simple solution to this simple problem.
There are two kinds of "moving around" that could be happening here.
One is that ARKit is continuously refining its estimate of how the device's position in the real world maps to the abstract coordinate space you're placing virtual content in. For example, suppose you put a virtual object at (0, 0, -0.5), and then move your device to the left by exactly 10 cm. The virtual object will appear to be anchored in physical space only if ARKit tracks the move precisely. But visual-inertial odometry isn't an exact science, so it's possible that ARKit thinks you moved to the left by 10.5 cm — in that case, your virtual object will appear to "slip" to the right by 5 mm, even though its position in the ARKit/SceneKit coordinate space remains constant.
You can't really do much about this, other than hope Apple makes devices with better sensors, better cameras, or better CPUs/GPUs and improves the science of world tracking. (In the fullness of time, that's probably a safe bet, though that probably doesn't help with your current project.)
Since you're also dealing with plane detection, there's another wrinkle. ARKit is continuously refining its estimates of where a detected plane is. So, even though the real-world position of the plane isn't changing, its position in ARKit/SceneKit coordinate space is.
This kind of movement is generally a good thing: if you want your virtual object to appear anchored to the real-world surface, you want ARKit to be sure of where that surface is. You'll see some movement while plane detection becomes more sure of the surface's position, but after a short time you should see less "slip" as you move the camera around for plane-anchored virtual objects than for those just floating in world space.
In your code, though, you're not taking advantage of plane detection to make your custom content (from "PlayerModel.scn") stick to the plane anchor:
wrapperNode.position = SCNVector3.positionFromTransform(anchor.transform)
wrapperNode.addChildNode(Body)
scnView.scene.rootNode.addChildNode(wrapperNode)
This code uses the initial position of the plane anchor to position wrapperNode in world space (because you're making it a child of the root node). If you instead make wrapperNode a child of the plane anchor's node (the one you received in renderer(_:didAdd:for:)), it'll stay attached to the plane as ARKit refines its estimate of the plane's position. You'll get a little bit more movement initially, but as plane detection "settles", your virtual object will "slip" less.
(When you make the node a child of the plane, you don't need to set its position: a position of zero means it's right where the plane is. If anything, you need to set its position only relative to the plane, i.e. how far above/below/along it.)
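Concretely, a sketch of that change to the question's addPlane(node:anchor:), keeping the original scene and node names, might look like this:

func addPlane(node: SCNNode, anchor: ARPlaneAnchor) {
    let plane = Plane(anchor, false)
    planes[anchor] = plane
    node.addChildNode(plane)

    let scene = SCNScene(named: "art.scnassets/PlayerModel.scn")!
    let body = scene.rootNode.childNode(withName: "Body", recursively: true)!
    // No explicit position: (0, 0, 0) relative to the plane anchor's node means
    // "right where ARKit thinks the plane is", and the content stays attached
    // as ARKit refines that estimate.
    node.addChildNode(body)
}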
To keep an SCNNode in place, you can disable the scene view's plane detection once you get the result you want.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = []
self.sceneView.session.run(configuration)
The reason is that ARKit constantly re-estimates the position of the detected plane, which results in your SCNNode moving around.

ARSCNView's root node's "heading" doesn't match the device heading

I want a node that I add to my scene to point north. I get the heading data from Core Location; that represents the direction the device was facing at the point my scene was created (and thus the direction my root node faces). I then add the heading to my new sceneNode's eulerAngles.y to rotate it so that it faces north.
func renderer(_ renderer: SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {
    if sceneNode == nil,
        let heading = self.locationManager.heading {
        sceneNode = SCNNode()
        sceneNode.eulerAngles.y += Float(heading).degreesToRadians
        sceneView.scene.rootNode.addChildNode(sceneNode)
    }
}
The heading information is correct, and rotating by that much does rotate it by the required amount, presuming the heading is the same direction my root node is facing. But I'm finding that my root node's direction is not equivalent to the device's heading, and can sometimes be wildly off. So the assumption that the heading matches the scene node's "heading" is incorrect, and I need to know how far off from the heading it is so I can correct it properly within my sceneNode.
Change your session configuration's worldAlignment to .gravityAndHeading.
With the default .gravity alignment, there's no absolute reference for where the x and z axes of the AR world coordinate system point — their directions are based on the initial orientation of your device when the session starts.
With the .gravityAndHeading option, the x and z axes are aligned to compass directions, so you can safely orient content relative to compass directions.
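A minimal sketch of that configuration change (assuming an ARSCNView named sceneView):

let configuration = ARWorldTrackingConfiguration()
// Align the session's world coordinate system to gravity and compass heading,
// so content can be oriented relative to compass directions.
configuration.worldAlignment = .gravityAndHeading
sceneView.session.run(configuration)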

SpriteKit : enumerateBodiesAtPoint not showing correct bodies

I have a few incomplete circles rotating constantly, and a user going from circle to circle.
I removed all gravity, forces, etc. from my scene.
[Image A] [Image B]
Problem: I am trying to do hit detection where I just check where the user is and which SKNodes' physics bodies are at that point in my scene's physics world. If the point hits the shape, the user can continue (image A), but fails if he is outside it (image B).
Although the shapes are pretty complex, the showsPhysics debug overlay seems to match my shapes precisely (see images A and B).
let updatedOrigin = user.calculateAccumulatedFrame().origin
user.scene?.physicsWorld.enumerateBodiesAtPoint(updatedOrigin, usingBlock: { (body, stop) in
    print("🍄 Shape contains \(body.node!.name)")
})
which prints
🍄 Shape contains Optional("User")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("Scene")
It prints the user and scene correctly, but it also prints all the surrounding circles' shapes, when there should be only one at this point, or none. The nodes are there, but their physics bodies should not register a hit.
Any ideas why it shows a hit for all those circles when it should only match 1 or none? Thanks!
Edit : additional info
- I had similar results when using user.physicsBody?.allContactedBodies()
- I am using a CGPath to create the PhysicsBody of my rotating node
I created a simple test project with a scene containing 3 arcs with physics bodies and 3 rectangle shape-nodes that identify the bounding box for each arc. I then added a touch handler that places a small circle at the tap point and a label that identifies the nodes returned by enumerateBodiesAtPoint with the touch location as the parameter. The figure below shows the results of tapping at various locations in the scene.
From the test results, it's not obvious how enumerateBodiesAtPoint determines if a physics body contains the specified point or not. It is clear, though, that the method is not consistent with its documentation. I suggest that you avoid using it in your app.
Alternatively, you can use SpriteKit's built-in contact detection:
class GameScene: SKScene, SKPhysicsContactDelegate {
    override func didMoveToView(view: SKView) {
        self.physicsWorld.contactDelegate = self
    }

    func didBeginContact(contact: SKPhysicsContact) {
        // Handle contacts between physics bodies here
    }
}
You can also test whether a point is within a CGPath using CGPathContainsPoint. Here, you will need to iterate over the paths you used to create the arc-shaped physics bodies. The figure below shows the result of my test project using CGPathContainsPoint instead of enumerateBodiesAtPoint. You may need to convert the test point, with convertPoint, to the appropriate coordinate space prior to passing it to CGPathContainsPoint.
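A sketch of that path-based test, keeping the question's Swift 2-era API names; arcNodes and arcPaths are assumed names for the arc nodes and the CGPaths used to build their physics bodies:

let userPoint = user.calculateAccumulatedFrame().origin
for (arcNode, arcPath) in zip(arcNodes, arcPaths) {
    // Convert the test point into the arc node's coordinate space before testing.
    let localPoint = arcNode.convertPoint(userPoint, fromNode: user.scene!)
    if CGPathContainsPoint(arcPath, nil, localPoint, false) {
        print("🍄 User is inside \(arcNode.name)")
    }
}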
