I am using the following code to create a simple 3D object in a Single View application. In the default configuration, it places a 10-centimeter cube 20 centimeters in front of the camera's initial position:
// A 10 cm cube (SceneKit units are meters)
let cubeNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
// 20 cm in front of the camera (negative z is forward in camera space)
cubeNode.position = SCNVector3(0, 0, -0.2)
sceneView.scene.rootNode.addChildNode(cubeNode)
When plane detection is enabled, ARKit adds and updates anchors for each detected plane. To add visual content for these anchors, implement ARSCNViewDelegate methods such as the following:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x), height: CGFloat(planeAnchor.extent.z))
    let planeNode = SCNNode(geometry: plane)
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z)
    // Rotate the plane to lie flat. Setting eulerAngles preserves the position
    // set above; assigning a whole new transform would overwrite it.
    planeNode.eulerAngles.x = -.pi / 2
    node.addChildNode(planeNode)
}
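ARKit keeps refining plane anchors after they are first added, so the visualization above goes stale unless it is updated as well. A matching `didUpdate` implementation might look like this (a sketch following the same conventions as the snippet above, not code from the original project):

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let plane = planeNode.geometry as? SCNPlane else { return }
    // Resize the plane geometry to the anchor's refined extent
    plane.width = CGFloat(planeAnchor.extent.x)
    plane.height = CGFloat(planeAnchor.extent.z)
    // Re-center the plane on the anchor's updated center
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z)
}
```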
How can I render the 3D object?
So, I was able to put a box node at the anchor's position.
Now, how do I rotate the SCNNode in the scene?
I am trying to modify the node's transform and eulerAngles, but they have no effect:
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    print("nodeFor anchor \(anchor)")
    guard anchor.name == "card" else { return nil }
    let colors = [
        UIColor.yellow, // front
        UIColor.red,    // right
        UIColor.blue,   // back
        UIColor.green,  // left
        UIColor.purple, // top
        UIColor.gray]   // bottom
    let sideMaterials = colors.map { color -> SCNMaterial in
        let material = SCNMaterial()
        material.diffuse.contents = color
        material.locksAmbientWithDiffuse = true
        return material
    }
    let boxGeometry = SCNBox(width: 0.12, height: 0.01, length: 0.07, chamferRadius: 0)
    boxGeometry.materials = sideMaterials
    let node = SCNNode(geometry: boxGeometry)
    node.transform = SCNMatrix4MakeRotation(-Float.pi / 2, 1, 0, 0)
    node.eulerAngles.x = .pi / 2
    return node
}
I also tried to do the rotation in func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor):
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    node.eulerAngles.x = .pi / 2
}
But that didn't help either.
I was successful in animating a rotation with the code below:
let spin = CABasicAnimation(keyPath: "rotation")
// Use from-to to explicitly make a full rotation around z
spin.fromValue = NSValue(scnVector4: SCNVector4(x: 0, y: 0, z: 1, w: 0))
spin.toValue = NSValue(scnVector4: SCNVector4(x: 0, y: 0, z: 1, w: Float(2 * Double.pi)))
spin.duration = 3
spin.repeatCount = 1
node.addAnimation(spin, forKey: "rotation")
Based on that success, I also tried the following, but it had no effect:
node.rotation = SCNVector4(x: 5, y: 4, z: 3, w: 0)
Does anybody have a clue how I can rotate my node in the ARSCNViewDelegate delegate methods?
The ARSession positions and orients the node it receives from you, so transforms set directly on that node are overwritten. It's best to perform rotations on a child node of the vended one. So in func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode?, prior to returning, you could say:
let node = SCNNode(geometry: boxGeometry)
node.eulerAngles = SCNVector3(x: -Float.pi / 2, y: 0, z: 0)
let rootNode = SCNNode()
rootNode.addChildNode(node)
return rootNode
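Putting the answer together, the complete delegate method could look like this (a sketch reusing the box geometry from the question; ARKit only overwrites the transform of the node it is handed, so the child's rotation survives):

```swift
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard anchor.name == "card" else { return nil }
    let boxGeometry = SCNBox(width: 0.12, height: 0.01, length: 0.07, chamferRadius: 0)
    // Rotate a child node; the vended parent's transform is managed by ARKit
    let boxNode = SCNNode(geometry: boxGeometry)
    boxNode.eulerAngles = SCNVector3(x: -Float.pi / 2, y: 0, z: 0)
    let rootNode = SCNNode()
    rootNode.addChildNode(boxNode)
    return rootNode
}
```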
I'm trying to detect an image using ARImageTrackingConfiguration and, when the image is detected, show a UIWebView in place of the scanned image. I have read about a lot of problems regarding the newer WKWebView, but it is not working either.
The problem seems to be that the UIWebView is not being updated from the main thread, even though I'm using DispatchQueue.main.
On a related topic (the code I use is pretty much the same), if I put an SKVideoNode in place of the UIWebView, the video plays back very laggy, with some big pixels on top, if I set plane.cornerRadius = 0.25.
The code is the following:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    updateQueue.async {
        let physicalWidth = imageAnchor.referenceImage.physicalSize.width
        let physicalHeight = imageAnchor.referenceImage.physicalSize.height
        // Create a plane geometry to visualize the initial position of the detected image
        let mainPlane = SCNPlane(width: physicalWidth, height: physicalHeight)
        mainPlane.firstMaterial?.colorBufferWriteMask = .alpha
        // Create a SceneKit root node with the plane geometry to attach to the scene graph
        // This node will hold the virtual UI in place
        let mainNode = SCNNode(geometry: mainPlane)
        mainNode.eulerAngles.x = -.pi / 2
        mainNode.renderingOrder = -1
        mainNode.opacity = 1
        // Add the plane visualization to the scene
        node.addChildNode(mainNode)
        // Perform a quick animation to visualize the plane on which the image was detected.
        // We want to let our users know that the app is responding to the tracked image.
        self.highlightDetection(on: mainNode, width: physicalWidth, height: physicalHeight, completionHandler: {
            DispatchQueue.main.async {
                let request = URLRequest(url: URL(string: "https://www.facebook.com/")!)
                let webView = UIWebView(frame: CGRect(x: 0, y: 0, width: 400, height: 672))
                webView.loadRequest(request)
                let webViewPlane = SCNPlane(width: xOffset, height: xOffset * 1.4)
                webViewPlane.cornerRadius = 0.25
                let webViewNode = SCNNode(geometry: webViewPlane)
                webViewNode.geometry?.firstMaterial?.diffuse.contents = webView
                webViewNode.position.z -= 0.5
                webViewNode.opacity = 0
                rootNode.addChildNode(webViewNode)
                webViewNode.runAction(.sequence([
                    .wait(duration: 3.0),
                    .fadeOpacity(to: 1.0, duration: 1.5),
                    .moveBy(x: xOffset * 1.1, y: 0, z: -0.05, duration: 1.5),
                    .moveBy(x: 0, y: 0, z: -0.05, duration: 0.2)
                ]))
            }
        })
    }
}
VIDEO VERSION
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    updateQueue.async {
        let physicalWidth = imageAnchor.referenceImage.physicalSize.width
        let physicalHeight = imageAnchor.referenceImage.physicalSize.height
        // Create a plane geometry to visualize the initial position of the detected image
        let mainPlane = SCNPlane(width: physicalWidth, height: physicalHeight)
        mainPlane.firstMaterial?.colorBufferWriteMask = .alpha
        // Create a SceneKit root node with the plane geometry to attach to the scene graph
        // This node will hold the virtual UI in place
        let mainNode = SCNNode(geometry: mainPlane)
        mainNode.eulerAngles.x = -.pi / 2
        mainNode.renderingOrder = -1
        mainNode.opacity = 1
        // Add the plane visualization to the scene
        node.addChildNode(mainNode)
        // Perform a quick animation to visualize the plane on which the image was detected.
        // We want to let our users know that the app is responding to the tracked image.
        self.highlightDetection(on: mainNode, width: physicalWidth, height: physicalHeight, completionHandler: {
            let size = imageAnchor.referenceImage.physicalSize
            var videoNode = SKVideoNode()
            switch imageAnchor.name {
            case "Elephant-PostCard":
                videoNode = SKVideoNode(fileNamed: "Movies/cat.mp4")
            case "puppy":
                videoNode = SKVideoNode(fileNamed: "puppy.mov")
            default:
                break
            }
            // Invert our video so it does not look upside down
            videoNode.yScale = -1.0
            videoNode.play()
            let videoScene = SKScene(size: CGSize(width: 1280, height: 720))
            videoScene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
            videoScene.addChild(videoNode)
            let plane = SCNPlane(width: size.width, height: size.height)
            plane.cornerRadius = 0.25
            plane.firstMaterial?.diffuse.contents = videoScene
            let planeNode = SCNNode(geometry: plane)
            plane.firstMaterial?.isDoubleSided = true
            mainNode.addChildNode(planeNode)
        })
    }
}
UIWebView from the UIKit module is now deprecated.
Use WKWebView from the WebKit module instead.
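A sketch of the WKWebView equivalent of the snippet above (the frame and URL are carried over from the question, and `webViewNode` is the plane node built there; note that SceneKit's support for using a live UIView as material contents is undocumented behavior, and WKWebView in particular often renders blank, so snapshotting the web view into a UIImage is a common workaround):

```swift
import WebKit

DispatchQueue.main.async {
    let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 400, height: 672))
    webView.load(URLRequest(url: URL(string: "https://www.facebook.com/")!))
    // Once the page has actually finished loading (e.g. in the
    // WKNavigationDelegate's didFinish callback), snapshot it into an
    // image and use that as the plane's material contents.
    webView.takeSnapshot(with: nil) { image, _ in
        webViewNode.geometry?.firstMaterial?.diffuse.contents = image
    }
}
```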
So I am using ARKit and ARAnchors in an ARImageTrackingConfiguration(). I am using multiple versions of the same reference image with different colored backgrounds, and I would like to get the background color of the anchor being recognized (if I register them all as separate reference images, ARKit mixes them up even though they have different backgrounds). The way I was thinking of doing this was to get the position of the ARAnchor in 2D space and then obtain the color of a pixel within the frame at that point.
The problem is I can't find a way to get the ARAnchor's position in 2D space. I am using a setup like the one below. If you have any idea how I can translate the coordinates, or a better way to do this, please let me know.
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let node = SCNNode()
    if let imageAnchor = anchor as? ARImageAnchor {
        if anchor.name == "EXAMPLE MARKER" {
            let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
            plane.firstMaterial?.diffuse.contents = UIImage(named: "EXAMPLE")
            let planeNode = SCNNode(geometry: plane)
            planeNode.eulerAngles.x = -.pi / 2
            node.addChildNode(planeNode)
            // HOW TO CONVERT ANY OF THESE COORDINATE SYSTEMS TO CGRECT? CGPOINT?
        }
    }
    return node
}
I've tried to get the position as follows, but it always comes out as all zeros:
print(anchor.accessibilityFrame)
print(imageAnchor.accessibilityFrame)
print(node.frame)
print(node.boundingBox.max,node.boundingBox.min)
Results:
(0.0, 0.0, 0.0, 0.0)
(0.0, 0.0, 0.0, 0.0)
(0.0, 0.0, 0.0, 0.0)
SCNVector3(x: 0.0, y: 0.0, z: 0.0) SCNVector3(x: 0.0, y: 0.0, z: 0.0)
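One possible approach (a sketch, assuming `sceneView` is the ARSCNView driving the session): take the anchor's world position from the translation column of its transform and project it into screen space with `projectPoint(_:)`, which yields a CGPoint you can sample a pixel color at:

```swift
// The anchor's world position is the translation column of its transform
let worldPosition = SCNVector3(anchor.transform.columns.3.x,
                               anchor.transform.columns.3.y,
                               anchor.transform.columns.3.z)
// Project into screen space: x/y are view coordinates in points,
// z is normalized depth (useful to check the point is in front of the camera)
let projected = sceneView.projectPoint(worldPosition)
let screenPoint = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
```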
I had the following code, which produced an error:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if anchor is ARImageAnchor {
        let phoneScene = SCNScene(named: "Phone_01.scn")!
        let phoneNode = phoneScene.rootNode.childNode(withName: "parentNode", recursively: true)!
        // Rotate the phone node
        let rotationAction = SCNAction.rotateBy(x: 0, y: 0.5, z: 0, duration: 1)
        let infiniteAction = SCNAction.repeatForever(rotationAction)
        phoneNode.runAction(infiniteAction)
        phoneNode.position = SCNVector3(anchor.transform.columns.3.x, anchor.transform.columns.3.y + 0.1, anchor.transform.columns.3.z)
        node.addChildNode(phoneNode)
    }
}
The error: "Scene is modified in a rendering callback of another scene".
So I replaced it with the following:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if anchor is ARImageAnchor {
        DispatchQueue.global().async {
            let phoneScene = SCNScene(named: "Phone_01.scn")!
            let phoneNode = phoneScene.rootNode.childNode(withName: "parentNode", recursively: true)!
            DispatchQueue.main.async {
                // Rotate the phone node
                let rotationAction = SCNAction.rotateBy(x: 0, y: 0.5, z: 0, duration: 1)
                let infiniteAction = SCNAction.repeatForever(rotationAction)
                phoneNode.runAction(infiniteAction)
                phoneNode.position = SCNVector3(anchor.transform.columns.3.x, anchor.transform.columns.3.y + 0.1, anchor.transform.columns.3.z)
                node.addChildNode(phoneNode)
            }
        }
    }
}
And now the error is gone and everything works OK. My question is: is that the correct solution? Should I switch to a background thread to load the scene and then to the main thread to add the nodes? Are nodes even added on the main thread?
Try something like this in the delegate method. This is an example from an old project.
DispatchQueue.main.async {
    if let imageAnchor = anchor as? ARImageAnchor {
        let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
        plane.firstMaterial?.diffuse.contents = UIColor(white: 1.0, alpha: 0.5)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi
        node.addChildNode(planeNode)
        ...
    }
}
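As a further sketch (an assumption, not a verified fix for this project), SceneKit's documented SCNSceneRenderer.prepare(_:completionHandler:) can stage the heavy resource loading off the render callback before the node is attached:

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARImageAnchor else { return }
    DispatchQueue.global().async {
        let phoneScene = SCNScene(named: "Phone_01.scn")!
        let phoneNode = phoneScene.rootNode.childNode(withName: "parentNode", recursively: true)!
        // prepare(_:) uploads geometry and textures ahead of time so the
        // first frame that shows the node does not stall the renderer
        renderer.prepare([phoneNode]) { _ in
            DispatchQueue.main.async {
                node.addChildNode(phoneNode)
            }
        }
    }
}
```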