I am using a few DAE models in a SceneKit scene in my app. However, the textures do not show up on the models in the running app, even though they are shown in Xcode. The textures I am using are included in the app's asset catalogue.
Here is the texture I am applying to the diffuse channel of the model. I am dragging it onto the model in Xcode to apply it, and Xcode shows the model with the texture in its model view.
And the model itself:
https://www.dropbox.com/s/lzqkgoumu9yshcf/amalthea.dae?dl=0
The model was exported (in Blender) from this Blender file:
https://www.dropbox.com/s/kt3n9f2kn6w6cij/amalthea.blend?dl=0
I am loading the model into my scene as follows:
let scene = SCNScene(named: "amalthea.dae")
Anything obvious that I am overlooking?
Revised answer:
After downloading the .dae file, I believe the most likely cause is that your textures are not on a path that is reachable relative to the model's location.
The texture reference in the dae file is:
<init_from>Assets.xcassets/jupiter/amalthea/amaltheamap.imageset/amalthea.jpg</init_from>
The path should be relative to the model's location. Try changing the texture reference in the .dae (using any text editor) to just "amalthea.jpg", then copy the .jpg into the same folder as the model:
<init_from>amalthea.jpg</init_from>
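If you'd rather not edit the .dae at all, another option (not part of the original answer) is to ignore the file reference and assign the texture from the asset catalogue in code after loading the scene. A rough sketch, where the node name "amalthea" is only a guess — use whatever your mesh node is actually called in the scene graph:
import SceneKit
import UIKit

let scene = SCNScene(named: "amalthea.dae")!
// "amalthea" below is an assumed node name; check the scene graph in Xcode for the real one.
if let modelNode = scene.rootNode.childNode(withName: "amalthea", recursively: true) {
    // "amaltheamap" is the image set name from the asset catalogue path above.
    modelNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "amaltheamap")
}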
A more detailed discussion can be found here.
Related
I downloaded a 3D model (https://www.cgtrader.com/free-3d-models/industrial/machine/fuel-gas-scrubber-skid) and converted it to .dae using SketchUp.
I am not able to apply a texture to this model in the Xcode 9 scene editor. When I select any texture image (Materials > Diffuse), it turns black!
I did the same with other models before and it worked fine, so I'm not able to figure out what the issue is now.
I even tried changing the multiply, emission, reflective, etc. properties to white, but I am still not able to see the texture.
I found what is wrong with the model: the model I downloaded is not UV mapped. That is why I am not able to apply the texture to it.
Looking at the Apple sample AR app, there are many realistic-looking objects (cup, candle, etc.). However, working with the SceneKit editor in Xcode, it is clear that it only allows you to create basic objects.
My question is: what software/files can be used to create realistic .scn objects? I'm sure there is software that allows you to create 3D models and convert them to .scn files; I just don't know which software to use or which files can be converted to .scn.
Note: I understand that this question may be too vague/broad for the Stackoverflow guidelines. I just don't know where to pose my question and this seems like the best place
To get some existing models to work with, here is what I did...just the basics that I know.
I went to Turbosquid and found a 3D model that would work for me.
Make sure it has the OBJ files. Buy and download.
Next download Blender. Import the OBJ file you just bought.
Export a DAE file.
Drag the DAE files and any textures (png files) into the .scnassets folder in your project.
Click on the DAE model in the .scnassets folder. Click to select the object in the Scene graph.
Click the globe in the upper right.
I clicked Diffuse and selected one of the PNGs I dragged in to apply it to the model.
You can also skip the Blender conversion and just use one of the free online OBJ-to-DAE conversion tools. Google it. And try to buy a cheap model in the $5 range on Turbosquid that just has an OBJ file and not a lot of other piece parts; the bigger bundles create other issues anyway when you're just getting started.
Update: After watching an Apple WWDC presentation on Model I/O in Xcode, I now see that you could drop an OBJ file into your .scnassets folder, select that file, go to the Editor menu and convert it to an SCN file. That should work also, but I have not tried it. Worth trying with a copy of your OBJ file.
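If you'd rather do that conversion in code than through the editor, Model I/O can load the OBJ directly and hand it to SceneKit. A small sketch, assuming the file is called "model.obj" and sits in the app bundle next to its .mtl and texture files:
import ModelIO
import SceneKit
import SceneKit.ModelIO   // adds the SCNScene(mdlAsset:) bridge

// "model" is a placeholder file name.
if let url = Bundle.main.url(forResource: "model", withExtension: "obj") {
    let asset = MDLAsset(url: url)
    asset.loadTextures()                    // resolve the texture references from the .mtl
    let scene = SCNScene(mdlAsset: asset)   // ready to assign to an SCNView or merge into another scene
}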
Update December 2018:
I've been working more with 3D files. Here is some incremental help on this issue.
Try using MeshLab to import your 3D model and convert it to a .DAE file. Drag the .DAE file into a folder in Xcode. That is what you are going to use to display in your app.
http://www.meshlab.net/
If your source 3D model is a .OBJ file, there are two related files that should be in the same folder as the .OBJ file. These are a *.mtl file and a *.jpg or *.BMP file. The .mtl file can be opened with TextEdit.
Open it and make sure it has a line that says: map_Kd *.jpg. The .jpg is the texture image that wraps around the 3D mesh. I've found that it is best to make sure your texture file is in .jpg format. If it isn't, convert it to .jpg (in Preview, for example, by resaving it as a JPEG) and then edit the .mtl file with the new .jpg file name.
I had some texture files that were .bmp and I just converted to .jpg, edited the .mtl file and I was good.
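For reference, the relevant part of a .mtl file is just plain text and looks something like this (the material and file names here are placeholders):
newmtl MyMaterial
Kd 1.000 1.000 1.000
map_Kd texture.jpg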
The second issue is the name of the node in the .obj file. The .obj file can also be opened with TextEdit. The .obj file should reference the .mtl file in the same folder. If not, you have a problem.
And here is the tricky part. When you are adding a childNode to the rootNode in a SceneKit scene, you have to fill in the "withName:" text name. Open the converted .DAE file that you have made from your .obj + .jpg + .mtl (all three are used when importing into MeshLab, but after exporting to .DAE there is only the .DAE file) and search for "node id=". It might say "node id="node"". If so, the word "node" is the name of the childNode you enter as the text name in the "withName:" parameter of the scene.rootNode.childNode(withName: "node", recursively: true) call.
You can change your node id to "node" if it isn't already, as shown in the sketch below.
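Putting that together in code, a minimal sketch (assuming the converted file is called "model.dae", that the node id really is "node", and that sceneView is your SCNView):
let scene = SCNScene(named: "model.dae")!   // "model.dae" is a placeholder name
if let modelNode = scene.rootNode.childNode(withName: "node", recursively: true) {
    sceneView.scene?.rootNode.addChildNode(modelNode)   // sceneView is assumed to be your SCNView
}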
Hope this helps. Many hours of work and help from others to understand this next round of working with 3D models.
Xcode's SceneKit editor isn't a 3D art authoring package — just like its SpriteKit editor and Interface Builder components aren't equivalent to the likes of Photoshop, Illustrator, Pixelmator, Affinity Designer, etc. To create 3D art assets yourself, you'll need to learn Blender, Maya, 3DS Max, or one of the other major 3D authoring tools. Beware, the learning curve to becoming a 3D artist is a bit steeper than learning how to paint in 2D.
What the SceneKit editor is for is taking the output from a 3D artist and preparing or combining it for use in a SceneKit-based app or game — tweaking material definitions so they look right with SceneKit's renderer, arranging separate assets to create a game level or other scene, adding dynamic SceneKit-specific features like particle effects and physics, etc.
You bridge between these two worlds by exporting assets from your 3D art tools in one of the formats SceneKit can import. Digital Asset Exchange (.dae) is one of the best options here, but through SceneKit's lower level counterpart, Model I/O, you can also import other formats like OBJ or Pixar USD.
When you open those in Xcode, you get the SceneKit editor, so you can start making SceneKit-specific edits and save the results for use in your app as .scn files.
There are a few things you can do in the process of authoring and prepping 3D assets that makes them look more realistic in ARKit. The ARKit session from WWDC (and the ReadMe file in the sample code project attached to that session) includes a few such tips:
use physically based materials
"bake" ambient occlusion and other static lighting effects
add invisible shadow planes
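For the last point, one common approach (a sketch, not the exact code from Apple's sample) is a floor plane whose material writes no colour, so only the shadow cast onto it is visible:
import SceneKit
import UIKit

// A plane that renders nothing itself but still receives shadows.
let shadowPlane = SCNPlane(width: 10, height: 10)
shadowPlane.firstMaterial?.colorBufferWriteMask = []   // no colour output
let shadowNode = SCNNode(geometry: shadowPlane)
shadowNode.eulerAngles.x = -.pi / 2                    // lay it flat on the ground

// Deferred shadows are required for this trick to work.
let light = SCNLight()
light.type = .directional
light.castsShadow = true
light.shadowMode = .deferred
light.shadowColor = UIColor(white: 0, alpha: 0.5)
let lightNode = SCNNode()
lightNode.light = light
lightNode.eulerAngles.x = -.pi / 2                     // point the light downwards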
You can create your realistic 3D models in .DAE format, which is supported by many tools. Then, in Xcode, you can convert that .DAE file to .SCN format. For conversion, check this:
How to convert .DAE to .SCN
In SceneKit's scene graph you can import 3D assets with animation (from Blender, Maya or Houdini), create 3D primitives (SCNBox, SCNSphere, SCNPlane, SCNCylinder, etc.) and assign UV-mapped textures using SceneKit's five shading models (Blinn, Lambert, Phong, Constant and PBR).
The proper way to create life-like animated models is to use professional 3D authoring tools like Autodesk Maya, Autodesk 3dsMax, Maxon Cinema 4D, The Foundry Modo, Blender or SideFX Houdini. These apps allow you to create not only precise geometry but also realistic UV-mapped textures, with render passes for the diffuse, transparency, metalness, bump and occlusion slots of SceneKit's material editor.
When your 3D model and its set of UV-mapped textures are ready for use, all you need to do is save a .dae or .usdz file. A .dae file must then be converted to .scn. To do this, select the .dae file and apply the following menu command:
// Editor - Convert to SceneKit file format...
Then, for the converted .scn model, you can choose a Physically Based (PBR) shader in the Material inspector. Physically based shading makes .scn and .usdz models look far more believable.
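In code, the same switch to physically based shading looks roughly like this (the texture file names are placeholders):
import SceneKit
import UIKit

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents   = UIImage(named: "albedo.jpg")      // base colour
material.metalness.contents = UIImage(named: "metalness.jpg")   // or a plain number 0...1
material.roughness.contents = UIImage(named: "roughness.jpg")
material.normal.contents    = UIImage(named: "normal.jpg")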
I grabbed a simple pool table made in Blender from the web, exported it to .dae format, and added it to a scene by dragging the file and dropping it into the scene editor. The pool table shows up fine there; however, when testing on a device or the simulator, the model isn't visible and I just get a black screen. I can confirm it's something related to that model, since I created a sphere from the primitive shapes in the scene editor and that shows up fine. Also, somehow the 3D model is there, because the sphere sits on top of it: the sphere has a rigid body set up and is affected by gravity, and it doesn't fall because of the pool table.
Dragging the dae file into the .scnassets folder, converting it to .scn by using the built-in converter and then dragging the resulting file into the main scene file fixed the issue.
Check the normals of your object. Since SceneKit culls backfaces by default, it won't show faces whose normals point inwards. You can fix them in Blender by going into Edit Mode, selecting everything with A, and pressing Ctrl + N to recalculate the normals. If it still doesn't work, it is possible something is wrong with the exporter and you'll have to examine the normals of the .dae file with a different program.
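As a quick diagnostic only (not a substitute for fixing the normals), you can ask SceneKit to render both sides of every face; if the table suddenly shows up, inverted normals are confirmed. Here tableNode is assumed to be the node you loaded from the .dae:
// tableNode is a placeholder for the pool-table node from your scene.
tableNode.enumerateHierarchy { node, _ in
    node.geometry?.materials.forEach { $0.isDoubleSided = true }   // disable backface culling
}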
I had the same problem when I dragged and dropped a .obj file into Xcode (into the scene.scnassets folder). The model would appear in the scene editor but wouldn't be there at runtime (on the simulator or on a device).
Solved the problem by adding the .obj file to the scene.scnassets folder in Finder, rather than in Xcode. The file then appears in Xcode (it refreshes automatically). I then selected the .obj file within Xcode and clicked Editor > Convert to .scn file.
The texture will not be there, so you need to do the same process of copying the .png file into the scene.scnassets folder in Finder, and then dragging and dropping the .png file onto the Diffuse property.
Using SketchUp, I made a DAE file, with a basic shape which has a few textures too.
The DAE file also comes with a folder the contains those textures.
I am also using SceneKit and have an existing scene, camera, light, etc. At the moment, I render many cubes at certain positions. How can I render the DAE model instead of the cubes?
Cheers.
It's certainly possible, but please note that I am working in Objective-C, not Swift (sorry).
Here is the code (Objective-C, but it's pretty easy to translate):
SCNScene *something = [SCNScene sceneNamed:@"mySketchupScene.dae"];
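The Swift equivalent, including the part the question actually asks about — pulling the loaded model's nodes into the scene you already render (existingScene is a placeholder for whatever SCNScene you are currently using):
if let modelScene = SCNScene(named: "mySketchupScene.dae") {
    for child in modelScene.rootNode.childNodes {
        existingScene.rootNode.addChildNode(child)   // existingScene is your current SCNScene
    }
}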
Now, when you import the DAE (we'll be calling it by its real name, Collada, from here on out) into your app resources or .scnassets (perhaps by click-dragging), make sure that you first import the texture folder, then import the Collada file.
I found that if you don't do the textures first, Xcode messes up the materials and you have to go and manually re-assign all the textures.
I am able to export meshes created in Blender for use in SceneKit by using the COLLADA/.dae format - however no textures show up on iOS device.
Also, Preview on OS X won't open any COLLADA file exported from Blender - yet the sidebar preview does show the mesh. What are the options needed at export to make this work?
Create a scnassets folder first (not required but useful)
Create a folder on your desktop and give it an extension of ".scnassets"
Put your Collada (.dae) file, along with any textures you will be using, in it.
Drag the folder into your project and copy it in as usual.
Click on your .scnassets folder and you will see a checkbox (check it if it isn't already) for converting to a Y-up axis.
Assigning textures to your scene model
Click on your Collada (.dae) file inside your project in Xcode.
You should see your scene and on its left side a list of cameras, lights, materials, and such. This is your scene tree.
Open the materials tab and click on one of your materials.
On the right-hand side, in the inspector panel, click on the blue ball-shaped icon (the Material inspector) to view the diffuse, specularity, and other properties of that one material.
Click on the diffuse tab; when it opens you should have a choice of colors and the textures within your project. Select the texture you used on your model in your 3D program. As long as you UV-unwrapped the model properly, it should apply instantly within your scene view.
What if I want to change my material after loading my scene? Glad you asked!
To do this we must use the entryWithIdentifier method of the SCNSceneSource class. I am going to use Swift here because it is awesome! Here we go...
Get the URL of your scene (.dae) like so...
let url = NSBundle.mainBundle().URLForResource("YourFolder.scnassets/yourScene", withExtension: "dae")!
Now lets put that url to use...
let source = SCNSceneSource(URL: url, options: nil)!
Click on your .dae and under Scene Graph is a list of items, one of which is your geometry. It will have a tea-kettle icon just to the right of it signifying so. We are going to use its name here. Let's say your geometry's name is Geo. Use it like so...
let yourGeometry = source.entryWithIdentifier("Geo", withClass: SCNGeometry.self) as! SCNGeometry
Now we have the geometry from the scene source, stored in yourGeometry. Let's create a new material with a UIColor like so...
let newMaterial = SCNMaterial()
newMaterial.diffuse.contents = UIColor.redColor()
Finally we will switch out the old material with the newMaterial like so...
yourGeometry.replaceMaterialAtIndex(0, withMaterial: newMaterial)
Play around with it and with other indices if you have more than one material. You can also use the UIImage class to apply a texture instead of a color.
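For example, swapping in an image instead of a flat colour would look like this ("myTexture.jpg" is a placeholder file name):
newMaterial.diffuse.contents = UIImage(named: "myTexture.jpg")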
TIP
If you want to add to or delete something from your .scnassets folder, do it in the project folder in Finder, not in Xcode. This took me a while to figure out, so I thought I would save everyone the trouble.
Collada files don't embed textures, they only have references to them. Make sure that your textures are reachable from the collada file when you open it in Preview and make sure to include the textures in the app bundle when building an app.
Three things I had to do to make it work:
Make sure images are packed in the .blend file (this is an option in the UV editor).
When you export, the file will not automatically include the UVs or the materials; there is a checkbox in the DAE export options to include the UVs and materials. I missed this as well when doing it.
This one you only need if you're putting the model in a playground. The .dae isn't quite "Apple-ized", but I was able to use the scntool in the developer tools to export a .dae that is. Here is a sample command line that I used:
./scntool --convert ~/Documents/Art/BlenderArt/tableandappleUV.dae --format c3d --output ~/Documents/Table5.dae
Notice the c3d format. Use that and your playground can now also recognize the .dae. (One other note: if you want to use the .dae in a playground, you need to put it in the playground's Resources folder.)