In Xcode, I have a .scn file that was converted from a .dae file. I've worked with the person who made the model to set all of the Physically Based Rendering (PBR) settings, but no matter what I do, the preview is always black.
If I change the environment to Procedural Sky, the model still displays as black.
I'm aware that adding a light to the scene will "fix" this, but should I have to do that, given that it gives my models unrealistic shadows?
I changed the Illumination setting to use the _m file from my list of textures. I suppose this has something to do with how the model interacts with the light in the scene.
I just rebuilt my lights. I had 4 directional lights, so I made a new directional light to replace the first one, manually copied over all of its settings (including, of course, the rotation), and deleted the one I copied from; then I made a second directional light, copied the settings from the second original, and so on. I think the problem started because I was copying lights from other .scn files, and something between those lights and the PBR materials was getting disconnected.
EDIT: OK, I just realized that rebuilding isn't necessary. The lights were inside a group, which is how I had copied them over to this scene. I grabbed all the lights in the group and moved them out of it, and magically they all started working with the PBR materials again. It's a Christmas miracle.
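For reference, the same requirements in code look roughly like this: a .physicallyBased material needs an image-based lighting environment and/or lights that are actually attached to the scene's node hierarchy. A minimal sketch, with placeholder asset names:

```swift
import SceneKit

// Minimal sketch: .physicallyBased materials need something to light them.
// "model.scn" and "studio.hdr" are placeholder asset names.
let scene = SCNScene(named: "art.scnassets/model.scn") ?? SCNScene()

// Image-based lighting is what drives PBR reflections and diffuse light.
scene.lightingEnvironment.contents = "studio.hdr"
scene.lightingEnvironment.intensity = 1.5

// A directional light added directly to the root node (not buried in a
// group copied from another file) for shadows on top of the IBL.
let sun = SCNNode()
sun.light = SCNLight()
sun.light?.type = .directional
sun.light?.castsShadow = true
sun.eulerAngles = SCNVector3(x: -Float.pi / 3, y: 0, z: 0)
scene.rootNode.addChildNode(sun)
```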
I'm facing a problem using SceneKit in my iOS app.
I have a sceneView showing a 3D model (from a .scn file) that the user can customize by changing some details like hair color, eye color, and hair style.
What's happening is that the memory used by the app jumps to around 250 MB, exactly when the view starts showing my 3D model.
I programmatically compose the "full" 3D model by adding nodes coming from different scenes.
Just to be a bit more clear, I have one .scn containing the body, another .scn containing the hair style, and so on. When the view loads, I create a "fullModel" SCNNode(), run through each node of the various .scn files, and add them to my fullModel node. Then I add the full node to my scene.rootNode.
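A rough sketch of that composition (the .scn file names here are placeholders, not the real assets):

```swift
import SceneKit

// Rough sketch of the composition described above; the .scn names are placeholders.
let scene = SCNScene()
let fullModel = SCNNode()

for assetName in ["body.scn", "hair.scn", "eyes.scn"] {
    guard let partScene = SCNScene(named: assetName) else { continue }
    // Reparent every top-level node of the part scene under the full model.
    for child in partScene.rootNode.childNodes {
        fullModel.addChildNode(child)
    }
}

scene.rootNode.addChildNode(fullModel)
```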
I'm new to iOS programming and have spent a long time trying to understand what's going on, even using the Xcode Instruments "Allocations" and "Leaks" tools, but nothing has helped.
Any suggestion on what I could try to solve the high memory usage would be very much appreciated!
Thanks a lot!
I am working on a project that will display objects below the ground using AR Quick Look. However, the AR mode seems to bring everything above the ground based on the bounding box of the objects in the scene.
I have tried using the USDZ directly and composing a simple scene in Reality Composer with the object, or with a simple cube, with exactly the same result. The AR preview mode in Reality Composer shows the object below the ground, or below an image anchor, correctly. However, if I export the scene as a .reality file and open it using AR Quick Look, it brings the object above the ground as well.
Is there a way to achieve showing an object below the detected horizontal plane or image (horizontal) using AR Quick Look?
This is still an issue a year later. I have submitted feedback to Apple, and I suggest you do too. I have suggested adding a checkbox to keep the Y axis persistent. My assumption is that it behaves this way to prevent the object from colliding with the ground, but I don't think that's necessary; it's just a limitation right now.
I have a problem with a 3D model in SceneKit (I also use maxstAR). After updating to iOS 12, it started looking brighter than before (SCREENSHOTS WITH THE SAME ISSUE HERE).
I use 2 light nodes, without autoenablesDefaultLighting.
And one more fact about this issue: when I hide all the lights, the model (which should be black) appears grey, as if it has extra light...
Sorry for my bad English; I really need your help!
I solved the same issue by converting the .obj files to .scn files in Xcode and using those scenes as nodes: Editor -> Convert to SceneKit file format (.scn).
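A rough sketch of loading the converted .scn and pulling out the node you need (the asset path and node name are placeholders):

```swift
import SceneKit

// Rough sketch: after Editor -> Convert to SceneKit file format (.scn),
// load the converted scene and pull out the node you need.
// The asset path and node name are placeholders.
let scene = SCNScene()

if let objectScene = SCNScene(named: "art.scnassets/object.scn"),
   let objectNode = objectScene.rootNode.childNode(withName: "object",
                                                   recursively: true) {
    scene.rootNode.addChildNode(objectNode)
}
```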
A few questions for game developers. I am a complete beginner at this. I want to create a game level, for example a green plane with trees. I have played a little with Blender and SceneKit. I know that I can export a .dae from Blender and import it into Xcode. My questions:
Should I delete the camera and light nodes before export? Why?
Should I design the whole level in one .dae file or build it from separate files? For example, one .dae for the plane and four different trees in four .dae files. How do I merge them in Xcode?
Can I reuse one .dae many times to generate, for example, a forest? How?
If creating the pieces separately is the better way, how do I keep the proportions between them so I don't end up with a man bigger than a tree?
I will be very grateful if someone dedicates some time to these questions. It will cut down my time to learn the basics. Thanks in advance. :)
I'll tell you how I do it:
1) .dae files are used only for models (trees, characters, buildings, etc.).
2) The game scene (floor, models, lights, camera, obstacles) is built with the Xcode scene editor, by code, or a mix of both, depending on the scene.
3) Depending on the size of the world/level, it can be split into several scenes (visible/invisible to the player). Then you can create one blank scene and load/unload these scenes at runtime.
4) For a model, you create a reference and then build the forest using references to the tree. If you later need to change the color of the tree, all trees in all scenes will be updated (see the sketch below).
5) For each model (SCNNode) loaded from a .dae file, you can set a scale attribute (from code or in the Xcode scene editor).
Also, 3D Apple Games by Tutorials is a very good starting point.
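A rough sketch of points 4) and 5) above, reusing one tree asset via SCNReferenceNode and giving each instance its own scale; the file name, count, and positions are placeholders, not anything from the original answer:

```swift
import SceneKit

// Rough sketch: reuse one tree asset via reference nodes and give each
// instance its own position/scale. File name, count, positions are placeholders.
let scene = SCNScene()

if let treeURL = Bundle.main.url(forResource: "tree", withExtension: "scn") {
    for i in 0..<20 {
        guard let tree = SCNReferenceNode(url: treeURL) else { continue }
        tree.load()  // resolve the reference so its geometry is available
        tree.position = SCNVector3(x: Float(i % 5) * 2.0, y: 0, z: Float(i / 5) * 2.0)
        tree.scale = SCNVector3(x: 0.5, y: 0.5, z: 0.5)  // keep proportions consistent
        scene.rootNode.addChildNode(tree)
    }
}
```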
I've loaded a model into my scene (.scn), and when zeroing the model out, it appears to be at a 90 degree angle on the x-axis (even though the inspector says 0).
This is incorrect, but strangely, when running the scene in the simulator the model loads in the correct position.
Has anyone experienced this before? It's rather annoying.
Yes! This is a common thing to experience.
It depends on where you made your model: Blender, SketchUp, 3ds Max, etc. Some of those programs use a "Z-up" axis, meaning the Z axis points up, whereas SceneKit uses a "Y-up" axis.
The reason it appears differently when running your app is that in your ".scnassets" folder options (or wherever that setting lives), you have "Always convert to Y-up" checked.
I'm not quite sure how to convert it before editing in Xcode's editor, but perhaps you could use write(to:options:delegate:progressHandler:) to export the corrected version out of SceneKit for non-eyesore Y-up editing.
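A rough sketch of that export idea (the asset path, node name, and rotation here are assumptions):

```swift
import SceneKit

// Rough sketch: fix the orientation in code, then export a corrected .scn.
// The asset path, node name, and rotation are assumptions.
let scene = SCNScene(named: "art.scnassets/model.scn") ?? SCNScene()

if let model = scene.rootNode.childNode(withName: "model", recursively: true) {
    model.eulerAngles.x = -Float.pi / 2  // undo the unwanted 90-degree tilt
}

let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("model-corrected.scn")
let exported = scene.write(to: outputURL, options: nil,
                           delegate: nil, progressHandler: nil)
print("Export \(exported ? "succeeded" : "failed"): \(outputURL.path)")
```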
Hope this helps!