I'm facing a problem using SceneKit in my iOS app.
I have a sceneView showing a 3D model (from a .scn file) that the user can customize by changing details like hair color, eye color, and hair style.
What's happening is that the app's memory usage jumps to around 250 MB, and it happens exactly when the view starts showing my 3D model.
I programmatically compose the "full" 3D model by adding nodes coming from different scenes.
Just to be a bit clearer: I have one .scn file containing the body, another .scn file containing the hair style, and so on. When the view loads, I create a "fullModel", which is just an SCNNode(), by running through each node of the various .scn files and adding them to my fullModel node. Then I add fullModel to my scene.rootNode.
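Roughly, the composition step looks like this (the file names here are just placeholders for my real assets):

```swift
import SceneKit

// Sketch of the composition: each part lives in its own .scn file, and
// every child node of that scene's root is re-parented under a single
// "fullModel" node, which is then added to the main scene.
let fullModel = SCNNode()

let partFiles = ["body.scn", "hair.scn", "eyes.scn"]   // placeholder names
for file in partFiles {
    guard let partScene = SCNScene(named: file) else { continue }
    for child in partScene.rootNode.childNodes {
        fullModel.addChildNode(child)
    }
}

sceneView.scene?.rootNode.addChildNode(fullModel)
```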
I'm new to iOS programming and I've tried for a long time to understand what's going on, even using the Xcode Instruments "Allocations" and "Leaks" tools, but nothing helped.
Any suggestion on what I could try to reduce the high memory usage would be greatly appreciated!
Thanks a lot!
Related
I need to show a 3D model on a SceneView in my iOS app, after its relative path (String) has been loaded from Firebase and passed to my SCNScene(named: path).
The function that reads the path runs very quickly, but once I try to attach the scene to my SceneView, it takes quite a while (5-6 seconds) before it is actually displayed.
The .scn file I'm trying to show is about 20 MB, and if I set showsStatistics = true, it reports that my scene contains more than 200K triangles.
I need help understanding how I could significantly reduce the loading time of the scene.
I've already tried reducing the number of polygons, and also using the prepare(_:completionHandler:) method to prepare my scene on a background thread and (ideally) speed up displaying it, but nothing worked for me.
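For reference, this is roughly what my prepare(_:completionHandler:) attempt looks like (sceneView here stands for my SCNView, and the function name is just a placeholder):

```swift
import SceneKit

// Sketch of what I tried: load the scene off the main thread, ask the
// renderer to prepare it, then attach it on the main thread.
func loadScene(atPath path: String, into sceneView: SCNView) {
    DispatchQueue.global(qos: .userInitiated).async {
        guard let scene = SCNScene(named: path) else { return }
        sceneView.prepare([scene]) { success in
            DispatchQueue.main.async {
                sceneView.scene = scene   // this is where the 5-6 s delay shows up
            }
        }
    }
}
```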
Any help would be really appreciated. Thanks!
I am working on a project that will display objects below the ground using AR Quick Look. However, the AR mode seems to bring everything above the ground based on the bounding box of the objects in the scene.
I have tried using the USDZ directly and composing a simple scene in Reality Composer, with either the object or a simple cube, with exactly the same result. The AR preview mode in Reality Composer shows the object below the ground or below an image anchor correctly. However, if I export the scene as a .reality file and open it using AR Quick Look, it brings the object above the ground as well.
Is there a way to achieve showing an object below the detected horizontal plane or image (horizontal) using AR Quick Look?
This is still an issue a year later. I have submitted feedback to Apple. I suggest you do too. I have suggested adding a checkbox to keep Y axis persistent. My assumption is this behaves this way to prevent the object from colliding with the ground, but I don't think it's necessary. It's just a limitation right now.
I have a problem with a 3D model in SceneKit. I also use MaxstAR object tracking. After updating to iOS 12, the model started looking brighter than before (screenshots with the same issue here).
I use 2 light nodes, without autoenablesDefaultLighting.
And one more fact about this issue: when I hide all the lights, the model (which should be black) appears grey, as if it has extra light...
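For context, my lighting setup looks roughly like this (the light types and values here are only examples, not the exact ones from my project):

```swift
import SceneKit

// Rough sketch of the lighting setup: default lighting disabled,
// two explicit light nodes added to the scene.
sceneView.autoenablesDefaultLighting = false

let mainLight = SCNNode()
mainLight.light = SCNLight()
mainLight.light?.type = .omni
mainLight.position = SCNVector3(0, 10, 10)

let ambientLight = SCNNode()
ambientLight.light = SCNLight()
ambientLight.light?.type = .ambient
ambientLight.light?.intensity = 500

sceneView.scene?.rootNode.addChildNode(mainLight)
sceneView.scene?.rootNode.addChildNode(ambientLight)
```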
Sorry for my bad English, and I really need your help!
I solved the same issue by converting the .obj file to .scn files in Xcode and using those scenes as nodes: Editor -> Convert to SceneKit file format (.scn).
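After the conversion, using the scene as a node looks roughly like this (the file name is just a placeholder):

```swift
import SceneKit

// Minimal sketch: load the converted .scn and attach a copy of its
// contents as a node in the main scene.
if let modelScene = SCNScene(named: "model.scn") {
    let modelNode = modelScene.rootNode.clone()
    sceneView.scene?.rootNode.addChildNode(modelNode)
}
```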
Using Unity2D 2017.1.1f1, Tiled, and Tiled2Unity, I exported a tiled map into Unity and there are no problems in the editor. I also tried playing it maximized and there are no gaps present.
The problem shows up when the game is run on iOS, specifically an iPhone 6s. There are noticeable gaps showing up.
I also have the settings configured like this:
Any suggestions? Thanks..
(I'm the Tiled2Unity author)
Those gaps you are seeing are seams and they're common in Unity development when using tile or sprite sheets that "touch" each other. There are a number of ways that you can fix them described here.
However, these seams are fixed automatically with SuperTiled2Unity which is still free (or name your price) and is currently under soft release. Just be aware that all your Tiled files (TMX, TSX, textures) will need to be in Unity now (that's a good thing).
Dragging in all your files (with relative paths intact) to Unity should take care of the importing process for you.
In Xcode, I have a .SCN file that was transformed from a .DAE file. I've worked with the person who made the model to set all of the Physically Based (PBR) settings. But no matter what I do, the preview is always black.
Also, if I change the environment to Procedural Sky, the model still displays as black.
I'm aware that adding a light to the scene will "fix" this, but should I have to do that, since it gives my models unrealistic shadows?
I changed the Illumination settings to be the _m file in my list of textures. I suppose this has something to do with the way the model interacts with the light in the scene.
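For reference, here is roughly how those material and lighting settings translate to code (the texture names are placeholders I made up). As far as I understand, a physically based material only responds to SCNLight nodes or to the scene's lightingEnvironment, which would explain the black preview when neither is set up:

```swift
import SceneKit
import UIKit

// Rough sketch: a .physicallyBased material is lit by SCNLight nodes
// and/or by the scene's lightingEnvironment; with neither, it renders black.
let scene = SCNScene()

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents   = UIImage(named: "model_albedo")
material.metalness.contents = UIImage(named: "model_m")      // the _m map
material.roughness.contents = UIImage(named: "model_r")

let node = SCNNode(geometry: SCNSphere(radius: 1))
node.geometry?.firstMaterial = material
scene.rootNode.addChildNode(node)

// Image-based lighting, so the PBR material is lit without explicit lights
scene.lightingEnvironment.contents = UIImage(named: "environment_hdr")
scene.lightingEnvironment.intensity = 1.0
```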
I just rebuilt my lights. I.e., I had 4 directional lights, so I made a new directional light to replace the first one, manually copied all of its settings (including, of course, the rotation), deleted the one I copied from, made a second directional light, copied the settings from the second original one, and so on. I think the problem started for me because I was copying lights from other .scn files, and something between the lights and the PBR materials was getting disconnected.
EDIT: OK, I just realized that rebuilding isn't necessary. The lights were inside a group, which is how I copied them over to this scene. I just grabbed all the lights in the group and moved them out of that group, and magically they all started working with the PBR materials again. It's a Christmas miracle.