Does anybody else experience a non-working ARKit scene on an iPhone 8?
When I download Apple's ARKit example and run it on an iPhone 8, it stays on "Initializing". When I check the ARSCNViewDelegate implementation:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    DispatchQueue.main.async {
        self.statusViewController.cancelScheduledMessage(for: .planeEstimation)
        self.statusViewController.showMessage("SURFACE DETECTED")
        [..]
    }
    [..]
}
It seems as if it never gets past the guard, so an ARPlaneAnchor is never added to the scene...
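For reference, plane anchors are only delivered when plane detection is enabled on the configuration; the session setup looks roughly like this (paraphrased, not the exact sample code):

// Without planeDetection, renderer(_:didAdd:for:) never receives an ARPlaneAnchor.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)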
The same project runs just fine on an iPhone 6s / iPhone 7 though...
Does anyone else know how to fix this?
I'm using ARKit to detect whether the mouth is open or not.
Mouth Open:
This value is controlled by how much you open your mouth.
Range: 0.0 to 1.0
However, when you yawn, the value says the mouth is closed (wtf).
I'm receiving the values from faceAnchor.blendShapes:
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor,
          let faceGeometry = node.geometry as? ARSCNFaceGeometry else {
        return
    }
    // read faceAnchor.blendShapes[.mouthClose] here
}
I saw the equivalent on Android in this post:
Android Mobile Vision API detect mouth is open
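A hedged sketch: ARKit's documented jawOpen coefficient tracks how far the jaw is dropped, so reading it directly may behave better during a yawn than a derived "mouth open" value (the threshold below is a guess, not a calibrated number):

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // jawOpen is a documented ARFaceAnchor.BlendShapeLocation key (0.0...1.0).
    let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0

    // Assumption: a yawn drops the jaw well past this guessed threshold.
    let isMouthOpen = jawOpen > 0.3
    print("mouth open:", isMouthOpen)
}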
I'm building an AR Scanner application where users can scan different images and receive rewards for doing so.
When they point the camera at a specific image, I place an SCNNode on top of that image, and after they move the camera away from that image, the SCNNode gets dismissed.
But when the image disappears while the camera stays in the same position, the SCNNode doesn't get dismissed.
How can I make it disappear together with the reference image?
I have studied lots of other answers here on SO, but they didn't help me.
Here's my code for adding and removing SCNNodes:
extension ARScannerScreenViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        DispatchQueue.main.async { self.instructionLabel.isHidden = true }
        if let imageAnchor = anchor as? ARImageAnchor {
            handleFoundImage(imageAnchor, node)
            imageAncors.append(imageAnchor)
            trackedImages.append(node)
        } else if let objectAnchor = anchor as? ARObjectAnchor {
            handleFoundObject(objectAnchor, node)
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let pointOfView = sceneView.pointOfView else { return }
        for (index, item) in trackedImages.enumerated() {
            if !sceneView.isNode(item, insideFrustumOf: pointOfView) {
                self.sceneView.session.remove(anchor: imageAncors[index])
            }
        }
    }

    private func handleFoundImage(_ imageAnchor: ARImageAnchor, _ node: SCNNode) {
        let name = imageAnchor.referenceImage.name!
        print("you found a \(name) image")
        let size = imageAnchor.referenceImage.physicalSize
        if let imageNode = showImage(size: size) {
            node.addChildNode(imageNode)
            node.opacity = 1
        }
    }

    private func showImage(size: CGSize) -> SCNNode? {
        let image = UIImage(named: "InfoImage")
        let imageMaterial = SCNMaterial()
        imageMaterial.diffuse.contents = image
        let imagePlane = SCNPlane(width: size.width, height: size.height)
        imagePlane.materials = [imageMaterial]
        let imageNode = SCNNode(geometry: imagePlane)
        imageNode.eulerAngles.x = -.pi / 2
        return imageNode
    }

    private func handleFoundObject(_ objectAnchor: ARObjectAnchor, _ node: SCNNode) {
        let name = objectAnchor.referenceObject.name!
        print("You found a \(name) object")
    }
}
I also tried to make it work via the ARSessionDelegate, but I couldn't even get the prints to fire:
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors {
        for myAnchor in imageAncors {
            if let imageAnchor = anchor as? ARImageAnchor, imageAnchor == myAnchor {
                if !imageAnchor.isTracked {
                    print("Not tracked")
                } else {
                    print("tracked")
                }
            }
        }
    }
}
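(For what it's worth, session(_:didUpdate:) only fires once session.delegate is set, e.g. sceneView.session.delegate = self.) A sketch of the same isTracked check done through the ARSCNViewDelegate instead, assuming a configuration in which isTracked is updated, such as world tracking with maximumNumberOfTrackedImages > 0:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    // isTracked flips to false as soon as ARKit loses sight of the image.
    node.isHidden = !imageAnchor.isTracked
}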
You have to use ARWorldTrackingConfiguration instead of ARImageTrackingConfiguration. It's quite a bad idea to use both configurations in one app, because each time you switch between them the tracking state is reset and you have to track from scratch.
Let's see what Apple documentation says about ARImageTrackingConfiguration:
With ARImageTrackingConfiguration, ARKit establishes a 3D space not by tracking the motion of the device relative to the world, but solely by detecting and tracking the motion of known 2D images in view of the camera.
The basic difference between these two configurations is how ARAnchors behave:
ARImageTrackingConfiguration gives you ARImageAnchors only while a reference image is in the camera view. So if you can't see a reference image, there's no ARImageAnchor and thus no 3D model (it resets every time the image leaves the view and comes back). You can simultaneously detect up to 100 images.
ARWorldTrackingConfiguration lets you track the surrounding environment in 6DoF and get ARImageAnchor, ARObjectAnchor, or AREnvironmentProbeAnchor. If you can't see a reference image, its ARImageAnchor isn't removed, so when the image comes back into view the anchor is still there. There's no reset.
Conclusion:
ARWorldTrackingConfiguration's computational cost is much higher. However, this configuration lets you perform not only image tracking but also hit-testing and ray-casting against detected planes, object detection, and restoration of world maps.
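A minimal sketch of image detection under world tracking (the resource group name and the tracked-image count are assumptions):

// Runs image detection inside a world-tracking session, so ARImageAnchors
// persist even when the reference image leaves the camera view.
let configuration = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: .main) {
    configuration.detectionImages = referenceImages
    // iOS 12+: also live-track up to this many of the detected images.
    configuration.maximumNumberOfTrackedImages = 1
}
sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])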
Use renderer(_:nodeFor:) to load your nodes, so when their anchors are removed, the nodes go away as well.
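A sketch of that approach (makeRewardNode is a hypothetical helper):

// When a node is vended from nodeFor, SceneKit ties its lifetime to the
// anchor: removing the anchor from the session removes the node too.
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let imageAnchor = anchor as? ARImageAnchor else { return nil }
    return makeRewardNode(for: imageAnchor.referenceImage)  // hypothetical helper
}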
I'm pretty new to ARKit, but recently I found that the image tracking feature is quite awesome. I found it's as simple as:
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: Bundle.main) {
    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 1
    sceneView.session.run(configuration)
}
which works beautifully! However, I want to further the experience by identifying which image has been tracked and displaying different AR objects/nodes based on that image. Is there a way to get more information about the specific image that is currently being tracked?
In your AR Reference Group in the asset catalog, when you click a reference image you can open the Attributes inspector and enter a "Name."
This name is then reflected in the referenceImage.name property of the ARImageAnchor that is created when the AR session begins to track that specific image.
Then in
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode?
you can inspect the anchor and respond accordingly. For example:
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let anchor = anchor as? ARImageAnchor else { return nil }
    if anchor.referenceImage.name == "calculator" {
        print("tracking calculator image")
        return SCNNode.makeMySpecialCalculatorNode()
    }
    return nil
}
I am trying to return a previously created node from my ARSCNViewDelegate:
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return SCNNode() }
    let anchorNode = sceneView.anchor(for: earthNode)
    if anchorNode == nil || anchorNode == planeAnchor {
        return earthNode
    }
    return nil
}
I am trying to see whether there is already an anchor assigned to my node, and if not, return earthNode.
My problem is that let anchorNode = sceneView.anchor(for: earthNode) either freezes or loops forever.
My working theory is that this happens because earthNode isn't yet placed in the scene, but that seems like a wonky explanation. I also, of course, presume that my usage of ARKit reeks of ignorance.
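A hedged sketch of one way around the hang (an assumption, not a verified diagnosis): keep your own record of the assignment instead of calling sceneView.anchor(for:) from inside the view's own delegate callback:

// Assumed properties on the view controller:
//   earthNode: SCNNode    – the node to attach exactly once
//   earthAnchorID: UUID?  – identifier of the anchor earthNode is bound to
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard anchor is ARPlaneAnchor else { return SCNNode() }
    // Bind earthNode to the first plane anchor we see; the bookkeeping is
    // local, so we never query the view from inside its own callback.
    if earthAnchorID == nil {
        earthAnchorID = anchor.identifier
        return earthNode
    }
    return anchor.identifier == earthAnchorID ? earthNode : nil
}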
I understand how to extract feature points for a single ARFrame (ARFrame.rawFeaturePoints). Is there any way to extract all the feature points that have been detected over the session? Is this something I have to aggregate myself? If so, how should I handle point matching?
There are a couple of ways to capture the feature points:
1. You can implement ARSCNViewDelegate; you get a callback whenever ARKit finds a plane, and there you can capture the current frame's feature points:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        // rawFeaturePoints is an ARPointCloud: positions plus stable identifiers.
        let cloudPoints = sceneView.session.currentFrame?.rawFeaturePoints
    }
}
2. You can implement SCNSceneRendererDelegate and sample every frame from
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) { }
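As for the aggregation part: yes, you have to accumulate them yourself; each ARFrame only carries the points found for that frame. A minimal sketch that merges frames by ARPointCloud's per-point identifiers (the dictionary-based merging is my assumption, not an ARKit facility):

import ARKit

// Accumulates rawFeaturePoints across frames, keyed by ARKit's stable
// per-point identifiers so re-observed points overwrite rather than duplicate.
final class FeaturePointAggregator {
    private(set) var points: [UInt64: simd_float3] = [:]

    func ingest(_ frame: ARFrame) {
        guard let cloud = frame.rawFeaturePoints else { return }
        for (i, id) in cloud.identifiers.enumerated() {
            points[id] = cloud.points[i]  // latest estimate wins
        }
    }
}

// Usage, e.g. from renderer(_:updateAtTime:) or session(_:didUpdate:):
//   if let frame = sceneView.session.currentFrame { aggregator.ingest(frame) }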