iOS ARKit - move objects along path

I am trying to implement bouncing balls using ARKit. I want the balls to come in from one edge of the screen, bounce, and move off the other side.
Can anyone recommend the best approach or point me to sample code for this?
Can I use UIBezierPath to create a path and move an SCNNode along it? If so, how do I move the node along the path?

Create a scene and a ball node, give the ball node a physics body, and apply a force to it so physics animates the ball. This is just an example:
// Create a new scene
guard let scene = SCNScene(named: "BouncingBalls.scn", inDirectory: "art.scnassets") else { return }
sceneView.scene = scene

// Add physics bodies and launch each ball with an impulse
guard let ballsNode = scene.rootNode.childNode(withName: "balls", recursively: true) else { return }
let forceDirection = SCNVector3Make(0, 3, 0)

for ballNode in ballsNode.childNodes {
    let physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
    physicsBody.applyForce(forceDirection, asImpulse: true)
    ballNode.physicsBody = physicsBody
}
Try this in viewDidLoad. Hope this works.
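The question also asks about driving an SCNNode along a UIBezierPath instead of using physics. Here is a minimal sketch of that idea; the node name, the waypoints, and the durations are assumptions for illustration. In practice you would sample the waypoints from your flattened UIBezierPath and map the 2D points into 3D.
import SceneKit

func moveAlongPath(_ ballNode: SCNNode) {
    // Hypothetical waypoints approximating a bouncing arc.
    let waypoints: [SCNVector3] = [
        SCNVector3(-0.5, 0.0, -1.0),
        SCNVector3(-0.25, 0.3, -1.0),
        SCNVector3(0.0, 0.0, -1.0),
        SCNVector3(0.25, 0.2, -1.0),
        SCNVector3(0.5, 0.0, -1.0)
    ]
    // Chain one move action per segment and run them as a sequence.
    let moves = waypoints.map { SCNAction.move(to: $0, duration: 0.4) }
    ballNode.runAction(SCNAction.sequence(moves))
}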

The best way to get what you want is to use a predefined animation (or dynamics simulation) authored in a 3D app such as Autodesk Maya, Autodesk 3ds Max, or Maxon Cinema 4D.
After producing the ball animation (or dynamics, it's up to you), bake it (baking generates a keyframe for every frame instead of relying on an interpolated curve) and export the scene as a .dae file. ARKit/SceneKit supports not only .dae animations but also the newer .usdz format.
The SceneKit API (like the other Apple frameworks' APIs) is not designed for authoring 3D animation.
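Once the baked file is in the app bundle, loading it into SceneKit is only a few lines. A minimal sketch, assuming a file named "bouncingBall.usdz" in art.scnassets with a root node called "ball" (both names are made up for illustration); the baked keyframes play automatically once the node is in the scene graph.
import SceneKit

func loadBakedBallAnimation(into sceneView: SCNView) {
    // Load the baked scene and pull out the animated node (names are assumptions).
    guard let ballScene = SCNScene(named: "art.scnassets/bouncingBall.usdz"),
          let ballNode = ballScene.rootNode.childNode(withName: "ball", recursively: true)
    else { return }
    sceneView.scene?.rootNode.addChildNode(ballNode)
}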

Related

SCNAnimationPlayer() seems to be playing only a small part of an animation when using SceneKit

I made an animation in Blender and moved it to Xcode, where the .dae file is converted to .scn by Xcode.
I can play the animation in Xcode's scene graph editor, just as it was designed in Blender.
I load the geometry and the animation and create a node for the animated object.
I use SCNAnimationPlayer() to load the animation. The code is below.
let scene = SCNScene(named: "scene.scn")!
let geometry = scene.rootNode.childNode(withName: "animatedGeo", recursively: true)!.geometry!
let animationNode = SCNNode(geometry: geometry)

let armature = scene.rootNode.childNode(withName: "Armature", recursively: true)!
let animationPlayer = armature.animationPlayer(forKey: "action_container-Armature")!
animationNode.addAnimationPlayer(animationPlayer, forKey: "action_container-Armature")

rootNode.addChildNode(animationNode)
I did not configure the animationPlayer programmatically because all of the settings look OK in Xcode's scene graph window.
However, when the scene loads I only see a small movement of the animated object on the iPhone screen.
It looks as if only the first (parent) bone animation is partly playing.
I could not find out why the whole animation does not play the way it does in the scene graph.
Yes, I found out how to do it while I was trying to manipulate my Blender-designed and Blender-animated object programmatically instead of using the animation I made in Blender.
All you have to do is add the "Armature" node to your scene's node.
That's all. You don't have to use SCNAnimationPlayer().
let scene = SCNScene(named: "scene.scn")!
let myObjectArmaturNode = scene.rootNode.childNode(withName: "Armature", recursively: true)!
rootNode.addChildNode(myObjectArmaturNode)
That's it.
The animation runs just as it was designed in Blender.
The animation consists of 4 bones, by the way.

Movement of object between planes in SceneKit (iOS)

I am trying to make a Ludo game in SceneKit but I don't know how to make an object move from one plane (the boxes that form the path in a Ludo game) to another plane. Please help me.
It's simple: just set the position on your SCNNode object.
let boxNode1 = SCNNode(geometry: cubeGeometry1)
boxNode1.position = SCNVector3(X,Y,Z)
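If you want the token to animate between squares rather than jump, one option is SCNAction. A small sketch, assuming you already have the token node and the destination square's position (the hop height and durations are arbitrary):
import SceneKit

func move(token: SCNNode, to squarePosition: SCNVector3) {
    // Hop up, glide over the destination square, and drop back down.
    let lift  = SCNAction.moveBy(x: 0, y: 0.05, z: 0, duration: 0.1)
    let glide = SCNAction.move(to: SCNVector3(squarePosition.x,
                                              squarePosition.y + 0.05,
                                              squarePosition.z),
                               duration: 0.3)
    let drop  = SCNAction.moveBy(x: 0, y: -0.05, z: 0, duration: 0.1)
    token.runAction(SCNAction.sequence([lift, glide, drop]))
}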

Align 3D object parallel to vertical plane detected by estimatedVerticalPlane

I have this book, but I'm currently remixing the furniture app from the video tutorial that was free during AR/VR week.
I would like to have a 3D wall canvas aligned with the detected wall/vertical plane.
This is proving to be harder than I thought. Positioning isn't an issue: much like the furniture placement app, you can take column 3 of the hitTest.worldTransform and give the new geometry that vector as its position.
But I do not know what I have to do to rotate my 3D object so it faces forward on the detected plane. Since I have a canvas object, the photo is on one side of the canvas, and on placement the photo is ALWAYS facing away.
I thought about applying an arbitrary rotation to the canvas to face forward, but that was only correct if I was looking north and placed a canvas on a wall to my right.
I've tried quite a few solutions online, but all except one always use .existingPlaneUsingExtent for vertical plane detection. That allows you to get the ARPlaneAnchor from hitTest.anchor as? ARPlaneAnchor.
If you try this when using .estimatedVerticalPlane, the anchor is nil.
I also didn't continue down this route because my horizontal 3D objects started getting placed in the air. That may be down to control-flow logic, but I am ignoring it until the vertical canvas placement is working.
My current train of thought is to get the front vector of the canvas and rotate it towards the front-facing vector of the detected vertical plane's grid UIImage, or of the hit-test point.
How would I get a forward vector from a 3D point? Or how would I get the front vector of the grid image, the UIImage that is placed as an overlay when ARKit detects a vertical wall?
Here is an example: the canvas is showing its back and is not parallel with the detected vertical plane (the column). But there is a "Place Poster Here" grid, which is what I want the canvas to align with so that I can see the photo.
Things I have tried:
using .estimatedVerticalPlane
ARKit estimatedVerticalPlane hit test get plane rotation
I don't know how to correctly apply the matrix and Euler angle results from that SO answer.
My addPicture function:
func addPicture(hitTestResult: ARHitTestResult) {
    // I would like to convert the estimated hit test to an anchor point;
    // it is easier to rotate a node to an anchor point than to calculate Euler angles.
    // We have all detected anchors in the _Renderer SCNNode.

    // Get the current furniture item, correct its position if necessary,
    // and add it to the scene.
    let picture = pictureSettings.currentPicturePiece()

    // Look for the vertical node geometry in verticalAnchors
    if let hitPlaneAnchor = hitTestResult.anchor as? ARPlaneAnchor {
        if let anchoredNode = verticalAnchors[hitPlaneAnchor] {
            // code removed, as a .estimatedVerticalPlane hitTestResult doesn't get here
        }
    } else {
        // Transform hit result to world coordinates
        let worldTransform = hitTestResult.worldTransform
        let anchoredNodeOrientation = worldTransform.eulerAngles
        picture.rotation.y = -.pi * anchoredNodeOrientation.y

        // Set the transform matrix
        let positionMatrix = worldTransform.columns.3
        let position = SCNVector3(
            positionMatrix.x,
            positionMatrix.y,
            positionMatrix.z
        )
        picture.position = position + pictureSettings.currentPictureOffset()
    }

    // Parented to the rootNode of the scene
    sceneView.scene.rootNode.addChildNode(picture)
}
Thanks for any help available.
Edited:
I have noticed the 'handedness' of the 3D model isn't correct / is opposite?
Positive Z is pointing to the left and positive X is facing the camera, where I would expect the front of the model to be. Is this an issue?
You should try to avoid adding nodes directly into the scene using world coordinates. Rather, you should notify the ARSession of an area of interest by adding an ARAnchor, then use the session callback to vend an SCNNode for the added anchor.
For example, your hit test might look something like this:
@objc func tapped(_ sender: UITapGestureRecognizer) {
    let location = sender.location(in: sender.view)
    guard let hitTestResult = sceneView.hitTest(location, types: [.existingPlaneUsingGeometry, .estimatedVerticalPlane]).first,
          let planeAnchor = hitTestResult.anchor as? ARPlaneAnchor,
          planeAnchor.alignment == .vertical else { return }
    let anchor = ARAnchor(transform: hitTestResult.worldTransform)
    sceneView.session.add(anchor: anchor)
}
Here a tap gesture recognizer is used to detect taps within an ARSCNView. When a tap is detected, a hit test is performed looking for existing and estimated planes. If the plane is vertical, an ARAnchor is created with the worldTransform of the hit test result and added to the ARSession. This registers that point as an area of interest for the ARSession, so we'll receive better tracking and less drift after our content is added there.
Next, we need to vend our SCNNode for the newly added ARAnchor. For example:
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    if anchor is ARPlaneAnchor {
        let anchorNode = SCNNode()
        anchorNode.name = "anchor"
        return anchorNode
    } else {
        let plane = SCNPlane(width: 0.67, height: 1.0)
        plane.firstMaterial?.diffuse.contents = UIImage(named: "monaLisa")
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles = SCNVector3(CGFloat.pi * -0.5, 0.0, 0.0)

        let node = SCNNode()
        node.addChildNode(planeNode)
        return node
    }
}
Here we first check whether the anchor is an ARPlaneAnchor. If it is, we vend an empty node for debugging purposes. If it is not, then it is an anchor that was added as the result of a hit test, so we create a geometry and node for the most recent tap. Because it is a vertical plane and our content lies flat, we need to rotate the content about the x axis, so we adjust its eulerAngles to make it upright. If we were to return planeNode directly, the adjustment to its eulerAngles would be discarded, so we add it as a child node of an empty node and return that.
This should result in the poster standing upright, parallel to the detected vertical plane.

SCNNode Disobeying Position after Placement

I'm using Apple's SceneKit and have a custom .dae asset. I've converted the asset to a .scn file and I'm grabbing the SCNNode by name from the .scn file. After placing the SCNNode in my SCNView scene as a child node and setting its position to SCNVector3(0, 0, -1), it ignores this position and instead follows my phone's location. The asset renders right on top of me and when I walk away, it follows me. It's getting to be very annoying and I can't find a solution.
However, if I replace the SCNNode taken from the .scn file with an SCNBox instead, everything works just fine. The cube stays in its set position.
Here is the relevant code:
func addCoins(at position: SCNVector3) {
    let donorScene = SCNScene(named: "art.scnassets/nodes.scn")
    if let coin = donorScene?.rootNode.childNode(withName: "coin", recursively: true) {
        coin.position = position // 0, 0, -1 (right in front of me)
        scene.scene.rootNode.addChildNode(coin)
    }
}
Attached is a screenshot showing what I see whenever I move.
So the problem was that my 3D asset was too large. The asset was created at 200 cm (2 m) per side in the 3D modeling program. After the asset was imported into Xcode, SceneKit interpreted that 200 as 200 meters, which is apparently far too large for SceneKit.
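A hedged sketch of one way to compensate without re-exporting the asset: scale the imported node down so that what SceneKit reads as 200 units behaves like 2 meters. The scene path and node name follow the snippet above; the 0.01 factor is just the cm-to-m conversion.
import SceneKit

func addScaledCoin(to rootNode: SCNNode, at position: SCNVector3) {
    guard let donorScene = SCNScene(named: "art.scnassets/nodes.scn"),
          let coin = donorScene.rootNode.childNode(withName: "coin", recursively: true)
    else { return }
    coin.scale = SCNVector3(0.01, 0.01, 0.01) // treat 1 imported unit as 1 cm
    coin.position = position
    rootNode.addChildNode(coin)
}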

Add shape to sphere surface in SceneKit

I'd like to be able to add shapes to the surface of a sphere using SceneKit. I started with a simple example where I'm just trying to color a portion of the sphere's surface another color. I'd like this to be an object that can be tapped, selected, etc... so my thought was to add shapes as SCNNodes using custom SCNShape objects for the geometry.
What I have now is a blue square that I'm drawing from a series of points and adding to the scene containing a red sphere. It basically ends up tangent to a point on the sphere, but the real goal is to draw it on the surface. Is there anything in SceneKit that will allow me to do this? Do I need to do some math/geometry to make it the same shape as the sphere or map to a sphere's coordinates? Is what I'm trying to do outside the scope of SceneKit?
If this question is way too broad I'd be glad if anyone could point me towards books or resources to learn what I'm missing. I'm totally new to SceneKit and 3D in general, just having fun playing around with some ideas.
Here's some playground code for what I have now:
import UIKit
import SceneKit
import XCPlayground

class SceneViewController: UIViewController {

    let sceneView = SCNView()

    private lazy var sphere: SCNSphere = {
        let sphere = SCNSphere(radius: 100.0)
        sphere.materials = [self.surfaceMaterial]
        return sphere
    }()

    private lazy var testScene: SCNScene = {
        let scene = SCNScene()
        let sphereNode: SCNNode = SCNNode(geometry: self.sphere)
        sphereNode.addChildNode(self.blueChildNode)
        scene.rootNode.addChildNode(sphereNode)
        //scene.rootNode.addChildNode(self.blueChildNode)
        return scene
    }()

    private lazy var surfaceMaterial: SCNMaterial = {
        let material = SCNMaterial()
        material.diffuse.contents = UIColor.redColor()
        material.specular.contents = UIColor(white: 0.6, alpha: 1.0)
        material.shininess = 0.3
        return material
    }()

    private lazy var blueChildNode: SCNNode = {
        let node: SCNNode = SCNNode(geometry: self.blueGeometry)
        node.position = SCNVector3(0, 0, 100)
        return node
    }()

    private lazy var blueGeometry: SCNShape = {
        let points: [CGPoint] = [
            CGPointMake(0, 0),
            CGPointMake(50, 0),
            CGPointMake(50, 50),
            CGPointMake(0, 50),
            CGPointMake(0, 0)]
        var pathRef: CGMutablePathRef = CGPathCreateMutable()
        CGPathAddLines(pathRef, nil, points, points.count)
        let bezierPath: UIBezierPath = UIBezierPath(CGPath: pathRef)
        let shape = SCNShape(path: bezierPath, extrusionDepth: 1)
        shape.materials = [self.blueNodeMaterial]
        return shape
    }()

    private lazy var blueNodeMaterial: SCNMaterial = {
        let material = SCNMaterial()
        material.diffuse.contents = UIColor.blueColor()
        return material
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = self.view.bounds
        sceneView.backgroundColor = UIColor.blackColor()
        self.view.addSubview(sceneView)
        sceneView.autoenablesDefaultLighting = true
        sceneView.allowsCameraControl = true
        sceneView.scene = testScene
    }
}

XCPShowView("SceneKit", view: SceneViewController().view)
If you want to map 2D content into the surface of a 3D SceneKit object, and have the 2D content be dynamic/interactive, one of the easiest solutions is to use SpriteKit for the 2D content. You can set your sphere's diffuse contents to an SKScene, and create/position/decorate SpriteKit nodes in that scene to arrange them on the face of the sphere.
If you want this content to respond to tap events: using hitTest in your SceneKit view gets you an SCNHitTestResult, and from that you can get texture coordinates for the hit point on the sphere. From the texture coordinates you can convert to SKScene coordinates and spawn nodes, run actions, or whatever.
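A rough sketch of that round trip, assuming a sphere node is already in the scene; the scene size, names (overlayScene, handleTap), and marker styling are made up for illustration:
import SceneKit
import SpriteKit

let overlayScene: SKScene = {
    let scene = SKScene(size: CGSize(width: 1024, height: 512))
    scene.backgroundColor = .red
    return scene
}()

// Point the sphere's diffuse material at the SpriteKit scene.
func attachOverlay(to sphere: SCNSphere) {
    sphere.firstMaterial?.diffuse.contents = overlayScene
}

// On tap: hit-test the sphere, read the texture coordinates (0...1), and map
// them into the SKScene to drop a marker where the tap landed.
func handleTap(at point: CGPoint, in sceneView: SCNView) {
    guard let hit = sceneView.hitTest(point, options: nil).first else { return }
    let uv = hit.textureCoordinates(withMappingChannel: 0)
    let marker = SKShapeNode(circleOfRadius: 10)
    marker.fillColor = .blue
    // SpriteKit's y axis points up, so flip the v coordinate.
    marker.position = CGPoint(x: uv.x * overlayScene.size.width,
                              y: (1 - uv.y) * overlayScene.size.height)
    overlayScene.addChild(marker)
}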
For further details, your best bet is probably Apple's SceneKitReel sample code project. This is the demo that introduced SceneKit for iOS at WWDC14. There's a "slide" in that demo where paint globs fly from the camera at a spinning torus and leave paint splashes where they hit it — the torus has a SpriteKit scene as its material, and the trick for leaving splashes on collisions is basically the same hit test -> texture coordinate -> SpriteKit coordinate approach outlined above.
David Rönnqvist's SceneKit book (available as an iBook) has an example (the EarthView example, a talking globe, chapter 5) that is worth looking at. That example constructs a 3D pushpin, which is then attached to the surface of a globe at the location of a tap.
Your problem is more complicated because you're constructing a shape that covers a segment of the sphere. Your "square" is really a spherical trapezium, a segment of the sphere bounded by four great circle arcs. I can see three possible approaches, depending on what you're ultimately looking for.
The simplest way to do it is to use an image as the material for the sphere's surface. That approach is well illustrated in the Rönnqvist EarthView example, which uses several images to show the earth's surface. Instead of drawing continents, you'd draw your square. This approach isn't suitable for interactivity, though. Look at SCNMaterial.
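A minimal sketch of that approach, assuming an equirectangular image asset named "sphereOverlay" (a made-up name) that already has the square drawn in the right place:
import SceneKit
import UIKit

func applyOverlayTexture(to sphere: SCNSphere) {
    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "sphereOverlay")
    sphere.materials = [material]
}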
Another approach would be to use hit test results. That's documented on SCNSceneRenderer (which SCNView conforms to) and SCNHitTest. Using the hit test results, you could pull out the face that was tapped, and then its geometry elements. This won't get you all the way home, though, because SceneKit uses triangles for SCNSphere, and you're looking for quads. You will also be limited to squares that line up with SceneKit's underlying wireframe representation.
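A hedged sketch of the starting point for that route: the hit test result exposes the face index and the geometry element it belongs to.
import SceneKit

func inspectTap(at point: CGPoint, in sceneView: SCNView) {
    guard let hit = sceneView.hitTest(point, options: nil).first else { return }
    // Which triangle was tapped, and which geometry element it belongs to.
    let element = hit.node.geometry?.element(at: hit.geometryIndex)
    print("tapped face \(hit.faceIndex) of element \(String(describing: element))")
}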
If you want full control of where the "square" is drawn, including varying its angle relative to the equator, I think you'll have to build your own geometry from scratch. That means calculating the latitude/longitude of each corner point, then generating arcs between those points, then calculating a bunch of intermediate points along the arcs. You'll have to add a fudge factor, to raise the intermediate points slightly above the sphere's surface, and build up your own quads or triangle strips. Classes here are SCNGeometry, SCNGeometryElement, and SCNGeometrySource.
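As a starting point for that route, here is a small sketch (not from the answer) that converts a latitude/longitude pair in radians into a point slightly above the sphere's surface; you would evaluate it for the four corners and the intermediate points along each arc, then pack the results into an SCNGeometrySource and a triangle-strip SCNGeometryElement.
import Foundation
import SceneKit

func surfacePoint(latitude: Float, longitude: Float,
                  radius: Float, lift: Float = 0.5) -> SCNVector3 {
    let r = radius + lift                      // fudge factor above the surface
    let x = r * cos(latitude) * sin(longitude)
    let y = r * sin(latitude)
    let z = r * cos(latitude) * cos(longitude)
    return SCNVector3(x, y, z)
}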
