Is there a way to load/render models with KTX2 textures in SceneKit? I want to visualize a glTF model with KTX2 textures in an iOS app using SceneKit.
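One approach that may work, assuming the glTF geometry is already loaded into a SceneKit scene (e.g. via a converter library) and that MTKTextureLoader can decode KTX2 on your deployment target (an assumption worth verifying): load the texture as an MTLTexture and hand it to the material, since SCNMaterialProperty.contents accepts Metal textures. A minimal sketch:

```swift
import SceneKit
import MetalKit

// Sketch: load a KTX2 file as an MTLTexture and assign it to a SceneKit material.
// Assumption: MTKTextureLoader can decode KTX2 on your target OS version;
// if it can't, you would need to transcode the texture to a supported format first.
func applyKTX2Texture(at url: URL, to material: SCNMaterial) throws {
    guard let device = MTLCreateSystemDefaultDevice() else { return }
    let loader = MTKTextureLoader(device: device)
    let texture = try loader.newTexture(URL: url, options: [.SRGB: true as NSNumber])
    // SCNMaterialProperty.contents accepts an MTLTexture directly.
    material.diffuse.contents = texture
}
```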
I am following the ARCore Augmented Faces iOS SDK. The built-in fox_face.scn works fine for me.
Now we have created some 3D models in Blender and exported them in both .dae and .obj formats. In Xcode I converted these models to .scn, but when I try to render my .scn models, they do not render on the face.
The same .scn model works fine with ARKit but not with ARCore.
If your model has any animation, check that your 3D file follows the requirements from here: https://developers.google.com/ar/develop/java/sceneform/animation/obtaining-assets
Rendering on iOS is done within the ARKit scene, not by ARCore. ARCore Face Augmentation generates the 3D face assets, which are delivered to SceneKit to render with each frame callback.
I'm not sure exactly why you said the .scn model works fine with ARKit but not ARCore.
I have been successful in exporting from Blender to .dae and then converting to a SceneKit file in Xcode.
Having said that, I have been unsuccessful in cleanly exporting the default fox face and bones (and my geometry) from Blender directly into Xcode to recreate what the demo has by default.
Instead, I have had to copy/paste 3D geometry content from the scene imported/converted from Blender into the original fox_face scene that comes with the project, ensuring all axes are correct.
In order to correctly position the asset relative to the original fox face, I had to write some code to move the model around in the world, along the lines of the sketch below.
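A minimal sketch of that kind of repositioning code; the node name "myGeometry" and all the numeric offsets are placeholders you would tune for your own asset:

```swift
import SceneKit

// Sketch: nudge an imported node into place relative to the fox face scene.
// "myGeometry" and the values below are placeholders for your own asset.
func repositionImportedGeometry(in scene: SCNScene) {
    guard let node = scene.rootNode.childNode(withName: "myGeometry",
                                              recursively: true) else { return }
    node.position = SCNVector3(0, 0.05, -0.02)     // small offset from the face origin
    node.eulerAngles = SCNVector3(0, Float.pi, 0)  // rotate to face the camera
    node.scale = SCNVector3(0.01, 0.01, 0.01)      // e.g. Blender units to meters
}
```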
I hope that helps.
But I would be very interested if you find a way to export cleanly from Blender (including the default face, fox ears, etc.) directly as a whole scene, including your new geometry.
I am trying to develop a building information model viewer for iPad, and I am faced with a choice: should I use SpriteKit or SceneKit? I know SceneKit is meant for rendering 3D while SpriteKit is 2D. From my research so far, SceneKit seems more appropriate for Building Information Modelling, as it will represent a 3D model of a building. However, I would like to know if I can do it with SpriteKit (I read SpriteKit is easier to learn) or whether I should use SceneKit. Thanks for your input. I am new to iOS development, so any assistance would be helpful.
SceneKit and SpriteKit are very similar to each other. SceneKit is a little harder to learn, but it's pretty simple. SceneKit would be the only way to have a 3D model (of the options you provided). You can have a SpriteKit scene on top of the SceneKit scene to display labels that stay put.
You'll probably want to use both of them. SceneKit represents 3D very nicely, but can also accept SpriteKit scenes to use for backgrounds, foreground overlays, and object textures/materials.
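To make the overlay idea concrete, here is a small sketch using SCNView's overlaySKScene, which draws a SpriteKit scene on top of the 3D content; the label text and layout values are placeholders:

```swift
import SceneKit
import SpriteKit

// Sketch: a SpriteKit HUD drawn over an existing SCNView's 3D content.
func addHUD(to scnView: SCNView) {
    let hud = SKScene(size: scnView.bounds.size)
    hud.isUserInteractionEnabled = false  // let touches pass through to the 3D scene

    let label = SKLabelNode(text: "Room 101")  // placeholder text
    label.fontSize = 24
    label.position = CGPoint(x: hud.size.width / 2, y: hud.size.height - 40)
    hud.addChild(label)

    // overlaySKScene renders the SKScene on top of the SceneKit content.
    scnView.overlaySKScene = hud
}
```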
You can think of SpriteKit as sitting on top of SceneKit: with SceneKit you can add 3D models into augmented reality, while SpriteKit is used to add extra sprites onto the model.
In short, SpriteKit is a revolution in gaming.
I'm trying to learn SceneKit for iOS and get beyond basic shapes. I'm a little confused about how textures work. In the example project, the plane is a mesh and a flat PNG texture is applied to it. How do you "tell" the texture how to wrap to the object? In 3D graphics you UV unwrap, but I don't know how I would do this in SceneKit.
SceneKit doesn't have capabilities to create a mesh (other than programmatically creating vertex positions, normals, UVs, etc.). What you'd need to do is create your mesh and texture in another piece of software (I use Blender). Then export the mesh as a Collada .dae file, and export the textures your model uses as .png files too. Your exported model will have UV coordinates imported with it that will correctly wrap your imported textures onto your model.
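If you do go the programmatic route, here is a hedged sketch of where UVs live in SceneKit: a single textured quad built from SCNGeometrySource arrays. The texture name "grass.png" is a placeholder for an image in your bundle.

```swift
import SceneKit

// Sketch: build one textured quad programmatically. The UVs are supplied
// as a texture-coordinate geometry source, one CGPoint per vertex.
func makeTexturedQuad() -> SCNNode {
    let vertices: [SCNVector3] = [
        SCNVector3(-0.5, -0.5, 0), SCNVector3(0.5, -0.5, 0),
        SCNVector3(0.5, 0.5, 0),   SCNVector3(-0.5, 0.5, 0),
    ]
    let uvs: [CGPoint] = [
        CGPoint(x: 0, y: 1), CGPoint(x: 1, y: 1),
        CGPoint(x: 1, y: 0), CGPoint(x: 0, y: 0),
    ]
    let indices: [Int32] = [0, 1, 2, 0, 2, 3]  // two triangles

    let vertexSource = SCNGeometrySource(vertices: vertices)
    let uvSource = SCNGeometrySource(textureCoordinates: uvs)
    let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)

    let geometry = SCNGeometry(sources: [vertexSource, uvSource], elements: [element])
    let material = SCNMaterial()
    material.diffuse.contents = "grass.png"  // placeholder bundle image name
    geometry.materials = [material]
    return SCNNode(geometry: geometry)
}
```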
I'm using Blender to create a landscape for a game being built with SceneKit.
As it is just a landscape, I won't be using any animations, so before I dive too deep into Blender I'm wondering: should I be using Blender to create the geometry and then create my own materials in SceneKit?
I could still create the shadow, emission, specular, etc. maps in Blender, but would there be a performance benefit or penalty for doing it this way?
Also, if this is a path I could take, should I then be exporting as .dae, or is there a way to export a normal map that Xcode would be happy with?
SceneKit supports materials exported in DAE from Blender. It doesn't support every possible shading option that Blender has, but unless you're doing exotic stuff it should cover most of what you're looking for.
At run time there's no difference between materials loaded from DAE and those created programmatically.
What you do want to think about at authoring/export time is stuff like real-time versus static lighting/shadows and high-poly geometry versus baked normal maps. In other words, material performance is more about how the materials are set up (complexity) than where they're set up (imported or at run time). See WWDC 2014 session 610: Building a Game with SceneKit for some tips.
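For comparison, a minimal sketch of setting the same kinds of maps up at run time; the texture file names are placeholders for maps you would bake in Blender:

```swift
import SceneKit

// Sketch: run-time equivalent of a Blender-authored material.
// The texture file names below are placeholders for your own baked maps.
func applyBakedMaterial(to landscapeNode: SCNNode) {
    let material = SCNMaterial()
    material.diffuse.contents  = "terrain_diffuse.png"
    material.normal.contents   = "terrain_normal.png"   // baked from high-poly detail
    material.specular.contents = "terrain_specular.png"
    material.emission.contents = "terrain_emission.png"
    // Applied this way, it behaves the same as a material imported from the DAE.
    landscapeNode.geometry?.materials = [material]
}
```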
We want to make a 3D game for the Apple iPad, and we are thinking about the possibility of importing 3D models from Blender into Xcode.
Is there a way to do that?
We want to use OpenGL ES 2.0.
Xcode isn't a game engine or 3D SDK. You can't 'import' Blender files into Xcode. What you can do is take your 3D assets and use them within your apps, either directly through OpenGL (rather low-level) or using a 3D engine such as Unity (easier).
Either way, there are a number of questions already on Stack Overflow that you might find useful:
Opengl ES how to import a 3D model and map textures to it on runtime
Iphone + OpenGL ES + Blender Model: Rotation by Touch
Choosing 3D Engine for iOS in C++
...I highly recommend you take a look at what options are out there, decide on the best way to implement your 3D game (be it raw OpenGL or an engine), and go from there.