MapKit Coordinate Projection Gives Erroneous Results from Scene Renderer - iOS

I am working on an app that overlays a MapKit Map View with a SceneKit SceneView (similar to https://blog.classycode.com/how-to-write-a-pok%C3%A9mon-go-clone-for-ios-edf1cf1cf5ce). In my scene I have a 3D node whose position is defined such that it appears on top of a specific map coordinate.
In the scene renderer I update the node's position by converting the map coordinate to a screen point and projecting that point into the 3D scene. This approach lets the user rotate the map while the 3D node appears to stay in place.
This is working. However, the node is "flickering" as the map changes. After investigating, I discovered that the culprit is the map coordinate conversion. For some reason, when the map is rotating, sometimes the converted screen coordinate is very wrong (somewhere off the screen).
I think the issue has something to do with threading. The map rotation angle is set in the touchesMoved function and applied in the scene renderer. I have tried moving both the map rotation and the node position update into touchesMoved, but that made the converted points wrong every time.
Here is what my structure looks like:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    // grab map rotation angle start
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    // calculate new map rotation angle
}

extension GameViewController: SCNSceneRendererDelegate {
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        // set map rotation angle
        // update 3D node position
    }
}
I expect the map coordinate conversion to always give me a screen point that makes sense (I know the coordinate is on the screen), but the result is intermittently wrong during rotation.
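For reference, the per-frame update described above amounts to something like the following sketch. The property names (`mapView`, `sceneView`, `overlayNode`, `targetCoordinate`) and the unproject depth value are assumptions, not code from the question:

```swift
import UIKit
import MapKit
import SceneKit

// Sketch of the per-frame update described in the question.
// All property names here are assumed placeholders.
class GameViewController: UIViewController, SCNSceneRendererDelegate {
    var mapView: MKMapView!
    var sceneView: SCNView!
    var overlayNode: SCNNode!
    var targetCoordinate = CLLocationCoordinate2D(latitude: 0, longitude: 0)

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        // Convert the map coordinate to a point in the scene view's space...
        let screenPoint = mapView.convert(targetCoordinate, toPointTo: sceneView)
        // ...then unproject that point onto a plane inside the 3D scene.
        // The z value (0...1) selects a depth between the clip planes.
        let position = sceneView.unprojectPoint(
            SCNVector3(Float(screenPoint.x), Float(screenPoint.y), 0.9))
        overlayNode.position = position
    }
}
```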
UPDATE:
Two interesting new discoveries:
1) The problem only exists when the map type is satelliteFlyover (unfortunately the one I need)
2) The problem goes away when using Apple's default user-interactive map rotation (also unfortunate because I need to control the rotation differently)
It really seems like some sort of thread processing issue where there is a "dead time" after the map camera has been changed that coordinate projection just gives the wrong value. But what could Apple be doing differently from me in their rotation method that eliminates this problem?

Related

Detecting nodes on any side of a touched node

I have a game built using SceneKit and Swift.
I have been struggling to figure out how to solve my problem.
I am trying to figure out how to detect nodes touching in my specific scenario. The image below demonstrates the issue I am facing... If a user touched any of the yellow cubes it would highlight that whole chain of yellow cubes. Same for the three red cubes on the bottom and the two red cubes on the top.
The way the game works is a user is given a shape of cubes. The shape can change position by the user swiping it various ways. Cubes may appear or get removed from the scene, so the position of the cubes can change easily. Finally a gravity function will pull the cubes down to the ground when the user swipes the shape, so if they twisted the image below to the right then it would end up as a brand new shape with most of the cubes in a new position.
Here is what I have tried:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touch = touches.first!
    let location = touch.location(in: gameView)
    let hitList = gameView.hitTest(location, options: nil)
    if let hitObject = hitList.first {
        let node = hitObject.node
        // This is where I'm trying to detect the nodes and remove them.
        // (The node's name encodes its color.)
        gameScene.rootNode.childNodes.filter({ $0.name == node.name }).forEach({ $0.removeFromParentNode() })
    }
}
The problem with my code is that it removes all of the cubes that are the same color as the hit cube.
I would not use SceneKit APIs to solve this problem.
You have a game with cubes that can be arranged according to specific constraints. The application should have a model (abstract representation) of where each cube is, and the drawing of the cube is only a view of that model. Everything that involves your gameplay, including resolving which cubes are part of a chain of the same color, should be done on that abstract representation, and then any update to the state of the cubes should be propagated to the SceneKit node hierarchy.
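As a sketch of that idea (the `Cube` type and grid layout are assumptions: each cube is assumed to sit on an integer grid with a color string), a flood fill over same-colored, face-adjacent neighbors finds the chain to remove:

```swift
// Minimal model-level sketch (assumed types): cubes live on an integer grid,
// and a "chain" is the set of same-colored cubes connected face-to-face.
struct Cube: Hashable {
    let x: Int, y: Int, z: Int
    let color: String
}

// Flood fill: starting from one cube, collect every same-colored cube
// reachable through the six face-adjacent directions.
func chain(from start: Cube, in cubes: Set<Cube>) -> Set<Cube> {
    var found: Set<Cube> = [start]
    var frontier = [start]
    let directions = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while let cube = frontier.popLast() {
        for (dx, dy, dz) in directions {
            // A neighbor counts only if a cube of the same color exists there.
            let candidate = Cube(x: cube.x + dx, y: cube.y + dy, z: cube.z + dz,
                                 color: start.color)
            if cubes.contains(candidate) && !found.contains(candidate) {
                found.insert(candidate)
                frontier.append(candidate)
            }
        }
    }
    return found
}
```

Once the chain is resolved in the model, you remove the SceneKit nodes that correspond to those cubes, rather than querying the scene graph directly.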

ARSCNView's root node's "heading" doesn't match the device heading

I want a node which I add to my scene to point north. I get the heading data from Core Location, so that represents the direction the device is currently facing at the point my scene was created (and thus the direction my root node faces), and then I add the heading to my new sceneNode's eulerAngles.y, to rotate it so it faces north.
func renderer(_ renderer: SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {
    if sceneNode == nil,
        let heading = self.locationManager.heading {
        sceneNode = SCNNode()
        sceneNode.eulerAngles.y += Float(heading).degreesToRadians
        sceneView.scene.rootNode.addChildNode(sceneNode)
    }
}
The heading information is correct, so rotating by that much does rotate the node by the required amount, presuming that the heading matches the direction my root node is facing. But I'm finding that my root node's direction is not equivalent to the device's heading, and can sometimes be wildly off. So the assumption that the heading is the same as the scene node's "heading" is incorrect, and I need to know how far off from the heading it is, so I can correct for it within my sceneNode.
Change your session configuration's worldAlignment to .gravityAndHeading.
With the default .gravity alignment, there's no absolute reference for where the x and z axes of the AR world coordinate system point — their directions are based on the initial orientation of your device when the session starts.
With the .gravityAndHeading option, the x and z axes are aligned to compass directions, so you can safely orient content relative to compass directions.
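In code, that is a one-line change on the configuration before running the session (a sketch; `sceneView` is assumed to be your ARSCNView):

```swift
import ARKit

// Sketch: align the session's coordinate system to gravity and compass heading.
let configuration = ARWorldTrackingConfiguration()
configuration.worldAlignment = .gravityAndHeading
sceneView.session.run(configuration)

// With .gravityAndHeading, -z points true north and +x points east, so
// content can be oriented toward north without consulting Core Location.
```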

SpriteKit : enumerateBodiesAtPoint not showing correct bodies

I have a few incomplete circles (arcs) rotating constantly and a user going from circle to circle.
I have removed all gravity, forces, etc. from my scene.
Image A
Image B
Problem : I am trying to do a hit detection where I just check where the user is, and if there are SKNode's bodies at this point in the physics world of my scene. If it's a hit with the shape, the user can continue (image A), but fails if he is outside (image B)
Although the shapes are pretty complex, the scene.showPhysics seem to match my shapes precisely. (see image A and B)
let updatedOrigin = user.calculateAccumulatedFrame().origin
user.scene?.physicsWorld.enumerateBodiesAtPoint(updatedOrigin, usingBlock: { (body, stop) in
print("🍄 Shape contains \(body.node!.name)")
})
which prints
🍄 Shape contains Optional("User")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("circle")
🍄 Shape contains Optional("Scene")
It prints the user and scene correctly, but it also prints all of the surrounding circle shapes, when there should be only one hit at this point, or none. The nodes are there, but their physics bodies should not register a hit.
Any ideas why it shows a hit for all those circles when it should only match 1 or none? Thanks!
Edit : additional info
- I had similar results when using user.physicsBody?.allContactedBodies()
- I am using a CGPath to create the PhysicsBody of my rotating node
I created a simple test project with a scene containing 3 arcs with physics bodies and 3 rectangle shape-nodes that identify the bounding box for each arc. I then added a touch handler that places a small circle at the tap point and a label that identifies the nodes returned by enumerateBodiesAtPoint with the touch location as the parameter. The figure below shows the results of tapping at various locations in the scene.
From the test results, it's not obvious how enumerateBodiesAtPoint determines if a physics body contains the specified point or not. It is clear, though, that the method is not consistent with its documentation. I suggest that you avoid using it in your app.
Alternatively, you can use SpriteKit's built-in contact detection:
class GameScene: SKScene, SKPhysicsContactDelegate {
    override func didMove(to view: SKView) {
        self.physicsWorld.contactDelegate = self
    }

    func didBegin(_ contact: SKPhysicsContact) {
        // Handle contacts between physics bodies here
    }
}
You can also test if a point is within a CGPath using CGPathContainsPoint. Here, you will need to iterate over the paths you used to create the arc-shaped physics bodies. The figure below shows the result of my test project using CGPathContainsPoint instead of enumerateBodiesAtPoint. You may need to convert the test point, with convertPoint, to the appropriate coordinate space prior to passing it to CGPathContainsPoint.
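A sketch of that path-containment check (the `arcNode` helper and the "circle" node name are assumptions from the question's printout; `path.contains` is the later Swift spelling of CGPathContainsPoint):

```swift
import SpriteKit

// Sketch: test the user's position against the CGPath used to build each
// arc's physics body, instead of relying on enumerateBodiesAtPoint.
func arcNode(containing point: CGPoint, in scene: SKScene) -> SKNode? {
    for node in scene.children where node.name == "circle" {
        guard let shape = node as? SKShapeNode, let path = shape.path else { continue }
        // Convert the point into the arc node's own coordinate space first.
        let localPoint = node.convert(point, from: scene)
        if path.contains(localPoint) { return node }
    }
    return nil
}
```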

When should NodeAtPoint method return SKShapeNode?

I have an iOS app, written with SpriteKit.
It uses 4 SKShapeNode elements to determine which part of the screen the user taps in.
All these elements are triangles and they split the screen like a big X.
They are created using CGPath, which draws a triangle.
To determine if the user tapped in the node I use scene method - NodeAtPoint.
In iOS 8.4 everything worked as expected: when I tap inside the triangle, NodeAtPoint returns the SKShapeNode I just tapped.
In iOS 9.0 NodeAtPoint returns the SKShapeNode even if I tap outside its bounds, but inside the bounds of rectangle in which my SKShapeNode fits into.
So my question is: How should NodeAtPoint method work for SKShapeNode?
It returns the SKShapeNode if the Point is inside its shape.
It returns the SKShapeNode if the Point is inside its frame.
Please refer to https://developer.apple.com/library/ios/documentation/SpriteKit/Reference/SKNode_Ref/#//apple_ref/occ/instm/SKNode/nodeAtPoint:
SKNode's nodeAtPoint function always returns an SKNode. You can then cast the SKNode to SKShapeNode. To answer your other question, the frame is calculated using the calculateAccumulatedFrame function. You can see that here https://developer.apple.com/library/ios/documentation/SpriteKit/Reference/SKNode_Ref/#//apple_ref/occ/instm/SKNode/calculateAccumulatedFrame
So you might do something like this:
if let node = scene.nodeAtPoint(CGPointMake(0, 0)) as? SKShapeNode {
    // do whatever
}
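If you need shape-accurate behavior (option 1) rather than the frame-based behavior, one workaround is to follow the node lookup with a test against the shape's actual path. This is a sketch, assuming your triangles are SKShapeNodes with non-nil paths (`atPoint` is the later Swift spelling of nodeAtPoint):

```swift
import SpriteKit

// Sketch: nodeAtPoint is frame-based, so verify the hit against the shape's
// actual CGPath before treating it as a tap inside the triangle.
func shapeNode(at point: CGPoint, in scene: SKScene) -> SKShapeNode? {
    guard let shape = scene.atPoint(point) as? SKShapeNode,
          let path = shape.path else { return nil }
    // The path is defined in the shape's own coordinate space.
    let localPoint = shape.convert(point, from: scene)
    return path.contains(localPoint) ? shape : nil
}
```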

node's position appears constant during physics simulation

I'm running some physics on a node. The node moves in all directions, but when tracking its position it appears to be stationary.
func renderer(_ renderer: SCNSceneRenderer, didSimulatePhysicsAtTime time: TimeInterval) {
    print("position.y: \(starNode.position.y)")
}
The message in the debugger is "position.y: 5.578" (which is the position I assigned in Scene Editor) for every call to renderer:didSimulatePhysicsAtTime method
What's going on?
Take a look at the presentationNode property. A node's position property holds the model value you assigned; during a physics simulation, the animated values are carried by the presentation node.
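A sketch of reading the physics-driven value inside the renderer delegate (`starNode` is the node from the question; `presentation` is the later Swift spelling of presentationNode):

```swift
func renderer(_ renderer: SCNSceneRenderer, didSimulatePhysicsAtTime time: TimeInterval) {
    // node.position still holds the value assigned in the Scene Editor...
    print("model position.y: \(starNode.position.y)")
    // ...while the physics-driven, per-frame value lives on the presentation node.
    print("presentation position.y: \(starNode.presentation.position.y)")
}
```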
