COLLADA transformations in SceneKit - iOS

We are trying to build a model viewer in Xcode and use SceneKit to render the models. If we add a COLLADA (.dae) file to Xcode, it transforms the model to the c3d format using scntool (we found this on the Internet). But we want to load models at runtime (we download them from a server to the iOS device). How can we transform a .dae file to this format without Mac OS? Our server runs Ubuntu, so we could convert the model there and send it to the iOS devices already converted.
Thanks a lot!

You can run scntool manually, but you'll need a machine running OS X to do that. The compressed format used by SceneKit on iOS is not documented, and there is no other tool you can use to perform the conversion.

Related

Load glTF model into RealityKit scene?

I'd like to load a glTF file generated by another program into RealityKit. I get the impression that the only way to load models into RealityKit is via USD or Reality files.
Anyone know a way to get some other model into RealityKit? Not necessarily as a file -- I'd be happy to be able to generate a MeshResource and array of Materials myself and load them in that way.
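For what it's worth, newer RealityKit releases (RealityKit 2, iOS 15 and later) do expose a way to build a MeshResource and materials entirely in code. A minimal sketch, with placeholder vertex data:

import RealityKit

// Build a MeshResource from raw vertex data instead of loading a file.
// The vertex values below are placeholders.
var descriptor = MeshDescriptor(name: "triangle")
descriptor.positions = MeshBuffer([
    SIMD3<Float>(0, 0, 0),
    SIMD3<Float>(0.1, 0, 0),
    SIMD3<Float>(0, 0.1, 0)
])
descriptor.primitives = .triangles([0, 1, 2])

do {
    let mesh = try MeshResource.generate(from: [descriptor])
    let material = SimpleMaterial(color: .white, isMetallic: false)
    let entity = ModelEntity(mesh: mesh, materials: [material])
    // entity can now be parented to an AnchorEntity in an ARView's scene.
} catch {
    print("Mesh generation failed: \(error)")
}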
Reality Converter
Apple discussed this in the WWDC20 video "The artist’s AR toolkit" (link).
They show how to convert FBX, OBJ, USD and GLTF files to USDZs for use in Reality Composer.
Reality Converter is still in beta and needs to be downloaded from the Apple Developer website. I used it and it is quite nice.
There are also other tools you can use on the command line if that is more your thing. At WWDC 2019, Apple announced the USDZ Tools, also called the USD Python Tools.
USDZ Tools is a pre-compiled Python library containing binaries of Pixar’s USD library for macOS (link). You will need to download and install the library.
I would give Reality Converter a try first. I think it is here to stay, since Apple probably has no intention of adding support for glTF files to Reality Composer in the future, because they love USDZ!
I ended up using GLTFKit, an open source library by Warren Moore. It does exactly what I want -- lets me load a glTF file into SceneKit/RealityKit.
https://github.com/warrenm/GLTFKit
Alas, as you said, at the moment the only way to load your .gltf model into a RealityKit scene is to first convert it into a .usdz model via the Xcode command-line tools. In RealityKit you can also use the .reality format (use it for much faster loading times) and the .rcproject format, which can be exported from the Reality Composer app. These two file formats let you store not only PBR shaders and animations but also dynamics.
Please read this post for further details.
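Once the model has been converted to .usdz, loading it into a RealityKit scene is short. A minimal sketch, assuming an existing ARView and a hypothetical converted file named chair.usdz in the app bundle:

import RealityKit

do {
    // "chair" is a hypothetical resource name for the converted chair.usdz.
    let model = try Entity.loadModel(named: "chair")
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(model)
    arView.scene.addAnchor(anchor)   // arView: an existing ARView
} catch {
    print("Failed to load model: \(error)")
}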

Convert (.obj / .fbx) to .dae at runtime with iOS SceneKit support

I am having trouble finding a way to convert .obj / .fbx files to .dae (supported by iOS SceneKit) automatically in the background.
In Python, it may be possible to convert the file from the .obj / .fbx format to the .dae format. This process should run in the background, immediately after we receive the .obj / .fbx file on the server.
Here is a sample file that we are trying to convert.
https://s3.ap-south-1.amazonaws.com/p9-platform/DAE/barware_s11624.obj
Please help me if you have any suggestions.
SceneKit on iOS doesn’t support .dae unless it was included in the app at build time. So because of the “iOS SceneKit supported” requirement there is no right answer, sort of. Although there are third-party libraries (like https://github.com/dmsurti/AssimpKit ) that read and convert many 3D model formats, that doesn’t change the fact that .dae isn’t properly supported by SceneKit on iOS.
That said, it is possible to convert OBJ to DAE in SceneKit using the following steps (in iOS 11.2 and later):
Load the .obj file into an SCNScene.
Write the scene to a file with a .dae extension using SCNScene’s writeToURL method.
That will create a .dae file SceneKit can support (though not directly, i.e. it would need to be included in Xcode or converted first) that starts with the following (a short Swift sketch of these two steps is shown after the excerpt):
<?xml version="1.0" encoding="UTF-8"?>
<COLLADA xmlns="http://www.collada.org/2005/11/COLLADASchema" version="1.4.1">
<asset>
<contributor>
<authoring_tool>SceneKit Collada Exporter v1.0</authoring_tool>
</contributor>
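In Swift, those two steps look roughly like this; the file paths are hypothetical and error handling is kept minimal:

import SceneKit

// Hypothetical locations; assumes the OBJ was already downloaded somewhere writable.
let objURL = URL(fileURLWithPath: "/tmp/model.obj")
let daeURL = URL(fileURLWithPath: "/tmp/model.dae")

do {
    // 1. Load the .obj file into an SCNScene.
    let scene = try SCNScene(url: objURL, options: nil)

    // 2. Write the scene back out with a .dae extension (iOS 11.2 and later).
    let succeeded = scene.write(to: daeURL, options: nil, delegate: nil, progressHandler: nil)
    print(succeeded ? "Exported \(daeURL.lastPathComponent)" : "Export failed")
} catch {
    print("Could not load OBJ: \(error)")
}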
I would second the recommendation for using Assimp or AssimpKit (I’ve only used the former but the latter might be an easier starting point).
I believe the DAEs on iOS aren’t DAEs at all; Apple just left the suffix the same, and the actual files are SceneKit archives. I’m not sure if the API to write them is exposed yet, but I think it might be now, since Xcode is willing to load DAEs and write out SceneKit archives (though it adds the “.scn” suffix, not “.dae”).
It’s possible that iOS SceneKit can just load “.scn” files. It won’t load true DAEs because the DAE reading/writing framework was licensed from Sony and is HUGE, and the iOS team just doesn’t want that giant, ugly framework on its system.
Another option would be to just link the iOS app against Assimp; it can load a ton of formats natively, so you could skip all the intermediate steps. It’s not NEARLY as huge as Sony’s DAE library, so it might be acceptable to ship it with your app.

How to get furniture models from 3d.io into ARKit

I'm displaying furniture from furniture.3d.io in my AR app, which works well in a web view using Google's WebARonARKit and A-Frame. However, the tracking and lighting seem to be better when using ARKit natively.
ARKit requires models to be in .scn, .dae, or .obj format. Is there any way to export the furniture from 3d.io so that I can use it in my app? A-Frame has a glTF exporter that I could use, so I might try to manually convert a few models via 3d.io -> .gltf -> .dae with Blender, but I can't figure out how to do it in a more automated way.
I would suggest you go from 3d.io -> .blend -> export .dae for any model that you wish to take to ARKit. Blender has a great COLLADA exporter.
I'm not sure if exporting furniture or any other 3D object is possible yet from 3d.io, but it should be possible from archilogic.com. You can also directly export COLLADA files from Archilogic, but texture export is not yet supported.
Edit: Archilogic now actually exports diffuse, normal, and specular maps for COLLADA.
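Whichever export route you take, once you have a .obj (or a .dae converted at build time, or an .scn), loading it into the native ARKit view goes through SceneKit. A minimal sketch using the Model I/O bridge, with a hypothetical file name and an existing ARSCNView:

import SceneKit
import SceneKit.ModelIO
import ModelIO

// "sofa.obj" is a hypothetical exported model bundled with the app.
let url = Bundle.main.url(forResource: "sofa", withExtension: "obj")!

let asset = MDLAsset(url: url)         // Model I/O reads the OBJ (and its .mtl, if present)
asset.loadTextures()                   // resolve texture references, if any
let scene = SCNScene(mdlAsset: asset)  // bridge into SceneKit
sceneView.scene = scene                // sceneView: an existing ARSCNView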

How to animate an object 360° in ARToolKit on iOS

Currently I am working on augmented reality, and it works perfectly for me, but I want to know how to animate an object in ARToolKit; my issue is that the animation is not working.
What I did: I created an .fbx file from Blender, then converted it into the .osgt format, and then converted that to the .osg format.
Can anyone help me out with how to animate the object in ARToolKit for iOS?
You need to generate an .OSG file from whatever 3D Modelling/Animation tool you are using. You can use that .OSG file together with ARToolKit5.
For export from Blender have a look here: https://github.com/cedricpinson/osgexport
Then you need to read through the C interface that ARToolKit5 uses to interact with OpenSceneGraph (OSG): http://artoolkit.github.io/artoolkit5/doc/apiref/arosg_h/index.html
There are several arOSGSetModelAnimation* functions that you can use to manipulate the animation.
Have a look at the Android example: https://github.com/artoolkit/artoolkit5/tree/master/AndroidStudioProjects/ARNativeOSGProj
The C classes show you how to load and use .OSG files with ARToolKit.
However, you need to use C on iOS and you cannot use Swift.
==Edit==
This question here gives some background regarding .FBX usage.
Import FBX to ARToolKit
In short, .FBX is not supported, and one should use the OSG command-line tool to convert .FBX to .OSG.
--- Personal note ---
In my experience it is easier to use the Unity3D plugin for ARToolKit5 or ARToolKit6, do your animations in Unity3D, and export an iOS app.

How to convert a Blender .blend (or .obj) file to a Qualcomm Vuforia .h file

I'm developing an iOS augmented reality app using Qualcomm Vuforia, and I'm having difficulty understanding how to create 3D models from Blender (or other software). All the examples use .h files with the coordinates of the vertices to generate, for example, a teapot.
I cannot find documentation that is useful for me.
Is there a tool to convert .blend or .obj files to .h (OpenGL ES)?
Thanks!
I developed a script called mtl2opengl that does exactly what you need, based on the project obj2opengl. The script works with .obj and .mtl files, which I think can be exported straight from Blender, and produces .h files with vertex data. I use it extensively in my iOS augmented reality applications (though I haven't used the Vuforia SDK yet) and the accompanying resources include a sample Xcode project too. Hope it helps!
You should check out BlenderVuforiaExport (developed by a coworker of mine) here:
https://github.com/StickyBeat/BlenderVuforiaExport
It exports objects from Blender to the same .h format used in the Vuforia example project.
I don't know much about Vuforia, but here are two answers about using Blender to get 3D models and display them on an iPhone. These may help you.
How to get proper number of vertices in OBJ file from DCC tools such as Blender for use in OpenGL ES?
Put a Cinema 4D model and Texture into an iPhone App
