iOS 11 Beta ARKit can't scale scene object

I created a basic scene and added a .dae file.
First, every time I run or save the project I get this popup:
The document “billboard.dae” could not be saved.
The project still runs, but it's annoying. The real issue, though, is that I can't scale the object.
I have tried different values, 0.5 and also values greater than 1, but nothing seems to work. Here is my code:
override func viewDidLoad() {
    super.viewDidLoad()

    sceneView.delegate = self
    sceneView.showsStatistics = true

    let scene = SCNScene(named: "art.scnassets/billboard.dae")!
    let billboardNode = scene.rootNode.childNode(withName: "billboard", recursively: true)

    // billboardNode?.position = SCNVector3Make(0, 0, 1)
    billboardNode?.position.z = 10
    billboardNode?.scale.z = 0.5
    // billboardNode?.scale = SCNVector3Make(0.4, 0.4, 0.4)

    sceneView.scene = scene
}
Any ideas?
Thanks

Have you verified that billboardNode is not nil? You're sending position and scale changes to an optional (the result of looking up a child node by name), but if it's nil (because the lookup failed) they won't have any effect.
The save error suggests there was some problem converting the .dae file, which might explain why the scene can't locate the asset by name. Or it might be as simple as "billboard" vs. "Billboard".
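As a quick check, here is a minimal sketch (assuming the node inside billboard.dae really is named "billboard") that fails loudly if the lookup returns nil and only then applies the transform:
let scene = SCNScene(named: "art.scnassets/billboard.dae")!

// If this guard trips, the name doesn't match what is in the asset; open it in
// Xcode's scene editor and check the node name in the scene graph.
guard let billboardNode = scene.rootNode.childNode(withName: "billboard", recursively: true) else {
    fatalError("No node named 'billboard' in billboard.dae")
}

billboardNode.scale = SCNVector3(0.5, 0.5, 0.5)   // uniform scale on a non-optional node
billboardNode.position = SCNVector3(0, 0, -1)     // at session start the ARKit camera looks down -z, so negative z is in front of it

sceneView.scene = scene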

Related

Normal mapping in SceneKit

I am trying to add a normal map to a 3D model in Swift using SCNMaterial properties. The diffuse property works, but no other property, including the normal property, is visible on screen. When I debug to check whether the node's material contains the normal property, it shows that the property exists with the image I added.
I have also checked that the normal image I am using is correct in the SceneKit editor, where it works fine.
Here is the code I am using:
let node = SCNNode()
node.geometry = SCNSphere(radius: 0.1)
node.geometry!.firstMaterial!.diffuse.contents = UIColor.lightGray
node.geometry!.firstMaterial!.normal.contents = UIImage(named: "normal")
node.position = SCNVector3(0,0,0)
sceneView.scene.rootNode.addChildNode(node)
This is the output I am getting (screenshot omitted; the sphere renders flat and unshaded).
I am expecting something like this (screenshot omitted).
I found the solution. Since I had not enabled default lighting, there was no lighting in the scene. I added this to the code:
sceneView.autoenablesDefaultLighting = true
Given the screenshot, it seems like there is no lighting in the scene, or the material does not respond to lighting, since the sphere is not shaded. For a normal map to work, lighting has to be taken into account, because the map responds to lighting direction. Have you tried creating an entirely new SCNMaterial and playing with its properties? (e.g. https://developer.apple.com/documentation/scenekit/scnmaterial/lightingmodel looks interesting.)
I would try setting
node.geometry!.firstMaterial!.lightingModel = .physicallyBased
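Putting those suggestions together, a minimal sketch (assuming an image named "normal" in the asset catalog) that gives the normal map some lighting to respond to might look like this:
let node = SCNNode(geometry: SCNSphere(radius: 0.1))
let material = node.geometry!.firstMaterial!

material.diffuse.contents = UIColor.lightGray
material.normal.contents = UIImage(named: "normal")   // only visible once the surface is actually lit
material.lightingModel = .physicallyBased             // or .blinn / .phong; .constant ignores lighting entirely

sceneView.autoenablesDefaultLighting = true           // or add your own SCNLight to the scene
sceneView.scene.rootNode.addChildNode(node)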
Try this.
let scene = SCNScene()
let sphere = SCNSphere(radius: 0.1)
let sphereMaterial = SCNMaterial()
sphereMaterial.diffuse.contents = UIImage(named: "normal.png")
let sphereNode = SCNNode()
sphereNode.geometry = sphere
sphereNode.geometry?.materials = [sphereMaterial]
sphereNode.position = SCNVector3(0.5,0.1,-1)
scene.rootNode.addChildNode(sphereNode)
sceneView.scene = scene

How to add a 3D object as SCNNode in ARKit2?

I am new to ARKit 2 and have followed a couple of tutorials and the official documentation.
Problem
How do I add a 3D object as an SCNNode?
Code
let artFrame = SCNBox(width: CGFloat(w), height: CGFloat(h), length: 0.002, chamferRadius: 0.02)
artFrame.firstMaterial?.diffuse.contents = imageToDisplay
let artFrameNode = SCNNode(geometry: artFrame)
artFrameNode.physicsBody = SCNPhysicsBody(type: .static, shape: SCNPhysicsShape(geometry: artFrame, options: nil))
artFrameNode.position = SCNVector3Make(Float(x), Float(y), Float(z*3))
sceneView.scene.rootNode.addChildNode(artFrameNode)
As the code above shows, I am using the predefined SCNBox geometry for the SCNNode. Is there any way I can use a 3D object such as a .dae or .obj file instead of the SCNBox and wrap it with an image?
I checked the documentation and it says you can attach mesh geometry to an SCNNode:
https://developer.apple.com/documentation/scenekit/scnnode/1419841-init
Edit
Whenever I add a new .dae file and convert it to .scn, Xcode throws the following error:
/Users/paly/Library/Developer/Xcode/DerivedData/sample-bohiilnhthuwfkechtmhscucgccc/Build/Products/Debug-iphoneos/sample.app: resource fork, Finder information, or similar detritus not allowed
Command CodeSign failed with a nonzero exit code
To add a .dae or .obj object to the scene, you need to get the child node of that asset and simply add it to your scene. I recommend converting your .dae file to .scn using Xcode's Editor menu.
You also need the node's name, which you can find in the Scene Graph View: open your .scn or .obj file in the Xcode scene editor and click the Scene Graph View button in the bottom-left corner. Here, my node's name is "objectNode" (screenshot omitted).
Now you can add the 3D object as an SCNNode to your scene:
override func viewDidLoad() {
    super.viewDidLoad()
    ...
    // Load your converted .scn (or .dae/.obj) file from the bundle
    let objectScene = SCNScene(named: "object.scn")!
    // Get the node name from the Scene Graph View; for me it is "objectNode"
    let objectNode = objectScene.rootNode.childNode(withName: "YourObjectName", recursively: true)!
    objectNode.position = SCNVector3(0, 0, -4)

    let scene = SCNScene() // main scene of the app
    scene.rootNode.addChildNode(objectNode)
    sceneView.scene = scene
    ...
}
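In an ARKit view you can also keep the view's existing scene and add the loaded node to its root node instead of replacing the whole scene. A minimal sketch, assuming the same hypothetical "object.scn" asset and node name:
if let objectScene = SCNScene(named: "object.scn"),
   let objectNode = objectScene.rootNode.childNode(withName: "YourObjectName", recursively: true) {
    // SceneKit units are meters in ARKit, so this is 4 m in front of the initial camera position
    objectNode.position = SCNVector3(0, 0, -4)
    sceneView.scene.rootNode.addChildNode(objectNode)
}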

SceneKit: too much memory persisting

I’m out of ideas here: SceneKit is piling on the memory and I’m only getting started. I’m displaying SCNNodes which are stored in arrays so I can separate components of the molecule for animation. These node trees model molecules, of which I will ultimately have maybe 50 to display, say one per “chapter”. The issue is that when I move to another chapter, the molecules from previous chapters persist in memory.
The molecule nodes are trees of child nodes. About half of the nodes are empty containers used for orientation. Otherwise, the geometries are SceneKit primitives (spheres, capsules, and cylinders). Each geometry has a specular and a diffuse material consisting of a UIColor; no textures are used.
When the app first boots, these molecules are constructed from code and archived into a dictionary. Then, and on subsequent boots, the archived dictionary is read into a local dictionary for use by the view controller. (I’m removing safety checks in this post for brevity.)
moleculeDictionary = Molecules.readFile() as! [String: [SCNNode]]
When a chapter wants to display a molecule it calls a particular function that loads the needed components for a given molecule from the local dictionary into local SCNNode properties.
// node stores (reusable)
var atomsNode_1 = SCNNode()
var atomsNode_2 = SCNNode()
. . .

func lysozyme() { // called by a chapter to display this molecule
    . . .
    components = moleculeDictionary["lysozyme"]

    atomsNode_1 = components[0] // protein w/ CPK coloring
    baseNode.addChildNode(atomsNode_1)

    atomsNode_2 = components[2] // NAG
    baseNode.addChildNode(atomsNode_2)
    . . .
}
Before the next molecule is to be displayed, I call a “clean up” function:
atomsNode_1.removeFromParentNode()
atomsNode_2.removeFromParentNode()
. . .
When I investigate in Instruments, most of the bloated memory is in 32 kB chunks allocated by C3DMeshCreateFromProfile and 80 kB chunks from C3DMeshCreateCopyWithInterleavedSources.
I also have leaks to track down, which trace back to the NSKeyedUnarchiver decoding of the archive. I need to deal with those as well, but they are a fraction of the memory that accumulates with each molecule call.
If I return to a previously viewed molecule there is no further increase in memory usage; it all just accumulates and persists.
I’ve tried declaring atomsNode_1 and its kin as optionals and setting them to nil at cleanup time. No help. I’ve also tried, in the cleanup function:
atomsNode_1.enumerateChildNodesUsingBlock({ node, stop in
    node.removeFromParentNode()
})
Well, the memory goes back down, but the nodes now seem to be permanently gone from the loaded dictionary. Damn reference types!
So maybe I need a way to archive the [SCNNode] arrays such that I can unarchive and retrieve them individually. In that scenario I would clear them out of memory when done and reload them from the archive when revisiting that molecule. But I don’t yet know how to do either of those things. I’d appreciate comments on this before investing more time getting frustrated.
Spheres, capsules, and cylinders all have fairly dense meshes. Do you need all that detail? Try reducing the various segment count properties (segmentCount, radialSegmentCount, etc.). As a quick test, substitute SCNPyramid for all of your primitive types (that's the primitive with the lowest vertex count). You should see a dramatic reduction in memory use if this is a factor (it will look ugly, but it will give you immediate feedback on whether you're on a usable track). Can you use a long SCNBox instead of a cylinder?
Another optimization step would be to use SCNLevelOfDetail to allow substitute, low vertex count geometry when an object is far away. That would be more work than simply reducing the segment counts uniformly, but would pay off if you sometimes need greater detail.
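As a concrete sketch of both ideas (the segment counts and screen-space radius here are just illustrative values to experiment with):
// Cheaper base geometry: far fewer segments than the defaults
let atomSphere = SCNSphere(radius: 0.4)
atomSphere.segmentCount = 12                // default is 48

let bondCylinder = SCNCylinder(radius: 0.1, height: 1.0)
bondCylinder.radialSegmentCount = 8         // default is 48
bondCylinder.heightSegmentCount = 1

// Swap in an even cheaper stand-in when the atom is small on screen
let lowDetailSphere = SCNSphere(radius: 0.4)
lowDetailSphere.segmentCount = 6
atomSphere.levelsOfDetail = [SCNLevelOfDetail(geometry: lowDetailSphere, screenSpaceRadius: 20.0)]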
Instead of managing the components yourself in arrays, use the node hierarchy to do that. Create each molecule, or animatable piece of a molecule, as a tree of SCNNodes. Give it a name. Make a flattenedClone. Now archive that. Read the node tree from archive when you need it; don't worry about arrays of nodes.
Consider writing two programs. One is your iOS program that manipulates/displays the molecules. The other is a Mac (or iOS?) program that generates your molecule node trees and archives them. That will give you a bunch of SCNNode tree archives that you can embed, as resources, in your display program, with no on-the-fly generation.
An answer to scene kit memory management using swift notes the need to nil out "textures" (materials or firstMaterial properties?) to release the node. Seems worth a look, although since you're just using UIColor I doubt it's a factor.
Here's an example of creating a compound node and archiving it. In real code you'd separate the archiving from the creation. Note also the use of a long skinny box to simulate a line. Try a chamfer radius of 0!
extension SCNNode {
    public class func gizmoNode(axisLength: CGFloat) -> SCNNode {
        let offset = CGFloat(axisLength / 2.0)
        let axisSide = CGFloat(0.1)
        let chamferRadius = CGFloat(axisSide)

        let xBox = SCNBox(width: axisLength, height: axisSide, length: axisSide, chamferRadius: chamferRadius)
        xBox.firstMaterial?.diffuse.contents = NSColor.redColor()
        let yBox = SCNBox(width: axisSide, height: axisLength, length: axisSide, chamferRadius: chamferRadius)
        yBox.firstMaterial?.diffuse.contents = NSColor.greenColor()
        let zBox = SCNBox(width: axisSide, height: axisSide, length: axisLength, chamferRadius: chamferRadius)
        zBox.firstMaterial?.diffuse.contents = NSColor.blueColor()

        let xNode = SCNNode(geometry: xBox)
        xNode.name = "X axis"
        let yNode = SCNNode(geometry: yBox)
        yNode.name = "Y axis"
        let zNode = SCNNode(geometry: zBox)
        zNode.name = "Z axis"

        let result = SCNNode()
        result.name = "Gizmo"
        result.addChildNode(xNode)
        result.addChildNode(yNode)
        result.addChildNode(zNode)
        xNode.position.x = offset
        yNode.position.y = offset
        zNode.position.z = offset

        let data = NSKeyedArchiver.archivedDataWithRootObject(result)
        let filename = "gizmo"

        // Save data to file
        let DocumentDirURL = try! NSFileManager.defaultManager().URLForDirectory(.DocumentDirectory, inDomain: .UserDomainMask, appropriateForURL: nil, create: true)
        // Made the extension "plist" so you can easily inspect it by opening it in Finder.
        // It could just as well be "scn" or "node"; ".scn" can be opened in the Xcode scene editor.
        let fileURL = DocumentDirURL.URLByAppendingPathComponent(filename).URLByAppendingPathExtension("plist")
        print("FilePath:", fileURL.path)
        if !data.writeToURL(fileURL, atomically: true) {
            print("oops")
        }
        return result
    }
}
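Reading the node tree back is the mirror image. A minimal sketch in the same Swift 2-era style (assuming the "gizmo.plist" file written above and a container node like the question's baseNode):
let DocumentDirURL = try! NSFileManager.defaultManager().URLForDirectory(.DocumentDirectory, inDomain: .UserDomainMask, appropriateForURL: nil, create: true)
let fileURL = DocumentDirURL.URLByAppendingPathComponent("gizmo").URLByAppendingPathExtension("plist")

// Unarchive the saved SCNNode tree and attach it; drop the reference (and its
// geometry, if needed) when the chapter is done with it.
if let gizmo = NSKeyedUnarchiver.unarchiveObjectWithFile(fileURL.path!) as? SCNNode {
    baseNode.addChildNode(gizmo)
}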
I also experienced a lot of memory bloat from SceneKit in my app, with memory chunks in Instruments similar to yours (C3DGenericSourceCreateDeserializedDataWithAccessors, C3DMeshSourceCreateMutable, etc.). I found that setting the geometry property to nil on the SCNNode objects before letting Swift deinitialize them solved it.
In your case, in your cleanup function, do something like:
atomsNode_1.removeFromParentNode()
atomsNode_1.geometry = nil
atomsNode_2.removeFromParentNode()
atomsNode_2.geometry = nil
Another example of how you might implement the cleanup:
class ViewController: UIViewController {
    @IBOutlet weak var sceneView: SCNView!
    var scene: SCNScene!
    // ...

    override func viewDidLoad() {
        super.viewDidLoad()
        scene = SCNScene()
        sceneView.scene = scene
        // ...
    }

    deinit {
        scene.rootNode.cleanup()
    }
    // ...
}

extension SCNNode {
    func cleanup() {
        for child in childNodes {
            child.cleanup()
        }
        geometry = nil
    }
}
If that doesn't work, you may have better success by setting its texture to nil, as reported on scene kit memory management using swift.

Affecting a Child From Another Scene Using Sprite Kit and Swift

I'm trying to teach myself Sprite Kit and Swift. What I'm trying to do is access a child node from an ArcheryScene.sks file and affect that child from the ArcheryScene.swift file. For example, in my ArcheryScene.swift file I have added this line of code: let scene = SKScene(fileNamed: "ArcheryScene").
This compiles fine, and when I call println(scene) it correctly prints the scene I want it to print; this is how I know that ArcheryScene.sks is truly in my scene variable. After this, I access a child from that scene by adding this line of code: let ballChild = scene.childNodeWithName("Ball"). When I use println(ballChild), it prints the correct child, letting me know that the variable truly contains the Ball child that is in ArcheryScene.sks. And now to my problem...
Why can't I say things in my ArcheryScene.swift file like:
ballChild?.physicsBody?.affectedByGravity = false
or
ballChild?.position.x = self.frame.size.width / 2
or
let move = SKAction.moveByX(40, y: 0, duration: 5.0)
ballChild?.runAction(move)
All of this code will compile without errors but when I run the game, the Ball is not affected at all. Also, if I run ballChild?.position.x = self.frame.size.width / 2 and then print the ballChild position, it will show up as x: 512, which is what it should be, but still when I run the game, the Ball is not affected. This is really confusing to me and I'd just like to figure out what is going on. Any suggestions would be appreciated, thank you.
If you look at the definition of the ArcheryScene class you will see it is a subclass of SKScene, so the class you are coding in is already your scene. The UIViewController subclass in your project has loaded ArcheryScene.sks and associated it with an instance of ArcheryScene.
When you subsequently say let scene = SKScene(fileNamed: "ArcheryScene") you are actually creating a new instance of the scene, one that isn't presented in your SKView. You then modify the ball in that scene, but nothing happens because that is not the ball that is visible.
Instead you should say
let ballChild = self.childNodeWithName("Ball")
if ballChild != nil {
    let move = SKAction.moveByX(50, y: 10, duration: 10.0)
    ballChild!.runAction(move)
}
When you write the following line:
let scene = SKScene(fileNamed: "ArcheryScene")
you are creating a new scene, so the ballChild you are accessing is a different one (not the one inside self, your ArcheryScene instance).
If your ArcheryScene class is properly instantiated, can't you access the ball like so?
let ballChild = self.childNodeWithName("Ball")
Let me know if it helped.
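For completeness, here is a minimal sketch of doing this from inside ArcheryScene itself, in the same Swift 1-era SpriteKit style as the question (the "Ball" name is assumed to match the node name set in ArcheryScene.sks):
class ArcheryScene: SKScene {
    override func didMoveToView(view: SKView) {
        // self IS the presented scene, so look the child up on self
        if let ballChild = self.childNodeWithName("Ball") {
            ballChild.physicsBody?.affectedByGravity = false
            let move = SKAction.moveByX(40, y: 0, duration: 5.0)
            ballChild.runAction(move)
        }
    }
}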

How to programmatically wrap png texture around cube in SceneKit

I'm new to SceneKit... trying to get some basic stuff working, without much success so far. For some reason, when I try to apply a PNG texture to an SCNBox I end up with nothing but blackness. Here is the simple code snippet I have in viewDidLoad:
let sceneView = (view as SCNView)
let scene = SCNScene()
let boxGeometry = SCNBox(width: 10.0, height: 10.0, length: 10.0, chamferRadius: 1.0)
let mat = SCNMaterial()
mat.locksAmbientWithDiffuse = true
mat.diffuse.contents = ["sofb.png","sofb.png","sofb.png","sofb.png","sofb.png", "sofb.png"]
mat.specular.contents = UIColor.whiteColor()
boxGeometry.firstMaterial = mat
let boxNode = SCNNode(geometry: boxGeometry)
scene.rootNode.addChildNode(boxNode)
sceneView.scene = scene
sceneView.autoenablesDefaultLighting = true
sceneView.allowsCameraControl = true
What it ends up looking like is a white light source reflecting off of a black cube against a black background. What am I missing? I appreciate all responses.
If you had different images, you would build a different SCNMaterial object from each like so:
let material_L = SCNMaterial()
material_L.diffuse.contents = UIImage(named: "CapL")
Here, CapL refers to a .png file stored in the project's Assets.xcassets folder. After building six such objects, you hand them to the box geometry as follows:
boxGeometry.materials = [material_L, material_green_r, material_K, material_purple_r, material_g, material_j]
Note that "boxGeometry" would be better named "box" or "cube". Also, it would be a good idea to do that work in a new class in your project, constructed like:
class BoxScene: SCNScene {
You would then use it with modern Swift in your view controller's viewDidLoad method like this:
let scnView = self.view as! SCNView
scnView.scene = BoxScene()
(For that let statement to work, go to Main.storyboard -> View Controller Scene -> View Controller -> View, open the Identity inspector, and under Custom Class change the class from UIView to SCNView. Otherwise, you receive an error message like:
Could not cast value of type 'UIView' to 'SCNView')
Passing an array of images (to create a cube map) is only supported by the reflective material property and the scene's background.
In your case all the images are the same, so you only have to assign the image itself (not an array) to the contents property to have it appear on all sides of the box.
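In other words, a minimal fix for the snippet in the question could be (still assuming "sofb.png" ships in the app bundle):
let mat = SCNMaterial()
mat.locksAmbientWithDiffuse = true
mat.diffuse.contents = UIImage(named: "sofb.png")   // one image gets wrapped onto all six faces
mat.specular.contents = UIColor.whiteColor()
boxGeometry.firstMaterial = mat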
