How do I export a Blender file to glTF?

I want to use a Blender model in three.js, but the light object and the textures don't come through.
I think my Blender export settings are wrong. How should I export the file?

Related

3D model is not rendering in ARCore Augmented Faces on iOS

I am following the ARCore Augmented Faces iOS SDK. The built-in fox_face.scn works fine for me.
We have created some 3D models in Blender and exported them in both .dae and .obj formats. In Xcode I converted these models to .scn, but when I try to render my .scn models, nothing renders on the face.
The same .scn model works fine with ARKit but not with ARCore.
If your model has any animation, check whether your 3D file follows the requirements here: https://developers.google.com/ar/develop/java/sceneform/animation/obtaining-assets
Rendering on iOS is done within the ARKit scene, not ARCore. ARCore Face Augmentation generates the 3D face assets, which are delivered to SceneKit to render with each frame callback.
I'm not sure exactly what you mean by the .scn model working fine with ARKit but not ARCore.
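To make the "delivered to SceneKit with each frame callback" part concrete, here is a minimal sketch of driving an imported model from a face transform. The `FaceModelRenderer` wrapper and the `faceCenterTransform` parameter are assumptions; substitute whatever your face-tracking session actually hands you each frame.

```swift
import SceneKit
import simd

// Hypothetical per-frame hook: however the face tracker delivers the
// face's center transform, apply it to the node holding the model.
final class FaceModelRenderer {
    let faceNode = SCNNode()

    init(scene: SCNScene, modelScene: SCNScene) {
        // Re-parent the imported geometry under one node we can drive.
        for child in modelScene.rootNode.childNodes {
            faceNode.addChildNode(child)
        }
        scene.rootNode.addChildNode(faceNode)
    }

    // Call this from the frame callback with the tracker's transform.
    func update(faceCenterTransform: simd_float4x4) {
        faceNode.simdTransform = faceCenterTransform
    }
}
```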
I have successfully exported from Blender to .dae and then converted to a SceneKit file in Xcode.
That said, I have not been able to cleanly export the default fox face and bones (along with my own geometry) from Blender directly into Xcode to reproduce what the demo ships with by default.
Instead I had to copy/paste the 3D geometry from the imported/converted Blender scene into the original fox_face scene that comes with the project, making sure all the axes were correct.
To position the asset correctly relative to the original fox face, I had to write some code to move the model around in the world (see the sketch below).
I hope that helps.
But I would be very interested if you find a way to export cleanly from Blender (including the default face, fox ears, etc.) directly as a whole scene, including your new geometry.
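The "move the model around in the world" part can be a one-time offset, rotation, and scale on the imported node. A sketch, where every number is a hypothetical placeholder to tune by eye:

```swift
import SceneKit

// One-time correction after copying geometry into the fox_face scene.
// All values are placeholders; tweak until the model sits correctly
// relative to the original fox face.
func alignImportedModel(_ node: SCNNode) {
    node.position = SCNVector3(0, 0.02, -0.05)          // nudge up and back
    node.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)  // fix Blender's Z-up axes
    node.scale = SCNVector3(0.01, 0.01, 0.01)           // Blender units to meters
}
```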

Blender Compositing Material into SceneKit

I created a 3D object in Blender with a material I made in Blender's compositing pane. I exported the object as a .dae and loaded it into my scene, but it doesn't have the clear-ish watery material I get when I hit Render in Blender; it just shows the viewport color. What am I missing?
Using the "compositing pane" means you made a node based material, possibly a cycles material, node based materials are rather specific to blender and don't export to game engines very well.
Set the render engine to Blender Render and then setup your material without nodes.
If you google blender material export scenekit you should find some more tips.
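Another option is to skip exporting the material entirely and rebuild the watery look in SceneKit at run time. A minimal sketch, where every value is a starting guess rather than the original material:

```swift
import SceneKit
import UIKit

// Rebuild a transparent, watery look programmatically instead of
// relying on Blender's node material surviving the DAE export.
func makeWateryMaterial() -> SCNMaterial {
    let material = SCNMaterial()
    material.lightingModel = .physicallyBased
    material.diffuse.contents = UIColor(red: 0.3, green: 0.5, blue: 0.7, alpha: 1)
    material.metalness.contents = 0.0
    material.roughness.contents = 0.05   // tight, glassy highlights
    material.transparency = 0.35         // mostly see-through
    material.isDoubleSided = true
    return material
}

// Usage: waterNode.geometry?.firstMaterial = makeWateryMaterial()
```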

Wrong normals when exporting an object file from Blender to WebGL

I have an object created in Blender, and I recalculated all the normals with Ctrl+N; I'm sure they are correct inside Blender. However, when I export the object to .obj format with the 'Write Normals' option and load it into WebGL, half of my mesh disappears, as if the normals were exported incorrectly.
The meshes that disappear were generated with the Mirror modifier and LoopTools Bridge.
When I disable the 'Write Normals' option, the object displays correctly, but then I can't light it.
What could the problem be? Which step am I missing when exporting from Blender?
It should be a sphere-like object, but half of the mesh is lost.
I just found that WebGL 1 index buffers are 16-bit (unsigned short indices, unless the OES_element_index_uint extension is enabled), so a single draw call can only address 65,535 vertices. Writing normals makes the exporter duplicate vertices, which pushed my mesh past that limit.
Edit: I fixed this problem by reducing the vertex count.

Workflow between Blender and Xcode (SceneKit)

I'm using Blender to create a landscape for a game built with SceneKit.
Since it's just a landscape, I won't be using any animations, so before I dive too deep into Blender I'm wondering: should I use Blender only for the geometry and then create my own materials in SceneKit?
I could still create the shadow, emission, specular, etc. maps in Blender, but is there a performance benefit or penalty to doing it this way?
Also, if this is the path I take, should I export as .dae, or is there a way to export a normal map that Xcode would be happy with?
SceneKit supports materials exported in DAE from Blender. It doesn't support every shading option Blender has, but unless you're doing something exotic it should cover most of what you're looking for.
At run time there's no difference between materials loaded from DAE and those created programmatically.
What you do want to think about at authoring/export time is things like real-time versus static lighting/shadows and high-poly geometry versus baked normal maps. In other words, material performance is more about how the materials are set up (complexity) than where they're set up (imported or created at run time). See WWDC 2014 session 610, "Building a Game with SceneKit", for some tips.
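As a concrete illustration of the "no difference at run time" point, here is a sketch that builds the landscape material in code from maps baked in Blender. All file and node names are hypothetical placeholders:

```swift
import SceneKit
import UIKit

// Assemble the landscape material from textures baked in Blender.
func makeLandscapeMaterial() -> SCNMaterial {
    let material = SCNMaterial()
    material.diffuse.contents  = UIImage(named: "landscape_diffuse.png")
    material.normal.contents   = UIImage(named: "landscape_normal.png")   // baked from high-poly
    material.specular.contents = UIImage(named: "landscape_specular.png")
    material.emission.contents = UIImage(named: "landscape_emission.png")
    return material
}

// Usage:
// let scene = SCNScene(named: "landscape.dae")
// scene?.rootNode.childNode(withName: "terrain", recursively: true)?
//     .geometry?.firstMaterial = makeLandscapeMaterial()
```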

How to create a simple 3D sphere for cocos3d using Blender and the PowerVR SDK

I am new to cocos3d. I want to create a simple project: a rotating 3D sphere. I have designed a 3D sphere in Blender, so I'd like help creating the Collada file and the POD file. What should I take care of while creating this simple 3D object using Blender and the PowerVR SDK? Thanks.
How about making the simple sphere in Blender and then exporting it using Jeff LaMarche's Blender-to-iOS script? That wouldn't even require Cocos or PowerVR, but it's a good start, and since you can easily integrate Cocos with non-Cocos classes on iOS it might be helpful. You could go further and use Apple's GLKit, which would probably be straightforward as well.
Just suggestions...
After you create the sphere in Blender, export it in .dae format, then use the free Collada-to-POD converter to turn the .dae file into a .pod file. The POD file can then easily be parsed by cocos3d.
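cocos3d itself is Objective-C, but if the goal is just to see the rotating sphere on iOS before wiring up the POD pipeline, plain SceneKit (used throughout the answers above) gets there in a few lines. A sketch:

```swift
import SceneKit

// Not cocos3d: a SceneKit stand-in for "a 3D sphere rotating",
// handy for sanity-checking the model and the rotation first.
func makeRotatingSphereScene() -> SCNScene {
    let scene = SCNScene()
    let sphere = SCNNode(geometry: SCNSphere(radius: 1.0))
    // Spin forever around the y-axis, one full turn every 4 seconds.
    sphere.runAction(.repeatForever(.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 4)))
    scene.rootNode.addChildNode(sphere)
    return scene
}
```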
