How to export a SolidWorks file to use in WebGL?

Is there a way to export or convert a SolidWorks file to be used in any way for WebGL?

Not sure where you want to go with this, but http://showwebgl.com/ (see http://plopbyte.net/2011/04/showwebgl/ for a brief description) will upload and display models in a variety of formats, and uses a WebGL framework called osgjs. So you just need to convert your SolidWorks file to one of those formats.
A lot of WebGL frameworks accept the Collada format. If you can't track these down, let me know.
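In case it helps, here is a minimal headless-Blender sketch of that conversion step. It assumes the model was first saved out of SolidWorks in a neutral mesh format such as STL, uses Blender 2.8x/3.x operator names, and the file paths are placeholders:

    # convert_stl_to_dae.py -- run with:
    #   blender --background --python convert_stl_to_dae.py -- input.stl output.dae
    import sys
    import bpy

    # Everything after "--" on the command line is left for the script.
    stl_path, dae_path = sys.argv[sys.argv.index("--") + 1:]

    # Start from an empty scene so only the imported mesh gets exported.
    bpy.ops.wm.read_factory_settings(use_empty=True)

    # Import the STL saved from SolidWorks...
    bpy.ops.import_mesh.stl(filepath=stl_path)

    # ...and write the whole scene out as Collada (.dae).
    bpy.ops.wm.collada_export(filepath=dae_path)

From the same script you could just as easily export glTF (bpy.ops.export_scene.gltf) if the viewer you pick prefers that.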

Here is a SolidWorks plugin which can export SolidWorks models to a lightweight visualization format, .pyd. The site also has a WebGL-based .pyd viewer for viewing published .pyd files.

You can use the Collada export from the SolidWorks Labs page. But you should consider simplifying your models before exporting them to Collada with a tool like Swap3D; this will decrease the file size dramatically. Export to Collada works much better when exporting a part instead of an assembly, so you should always consider saving an assembly as a part.
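If you don't have a dedicated simplification tool at hand, decimation can also be scripted in headless Blender. This is only a rough sketch (not the Swap3D workflow), assuming Blender 2.9x/3.x operator names and a Collada file as input:

    # decimate_dae.py -- run with:
    #   blender --background --python decimate_dae.py -- input.dae output.dae 0.3
    import sys
    import bpy

    src, dst, ratio = sys.argv[sys.argv.index("--") + 1:]

    bpy.ops.wm.read_factory_settings(use_empty=True)
    bpy.ops.wm.collada_import(filepath=src)

    # Add a Decimate modifier to every mesh and apply it.
    for obj in bpy.data.objects:
        if obj.type != 'MESH':
            continue
        mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
        mod.ratio = float(ratio)      # e.g. 0.3 keeps roughly 30% of the faces
        bpy.context.view_layer.objects.active = obj
        bpy.ops.object.modifier_apply(modifier=mod.name)

    bpy.ops.wm.collada_export(filepath=dst)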

Related

Load glTF model into RealityKit scene?

I'd like to load a glTF file generated by another program into RealityKit. I get the impression that the only way to load models into RealityKit is via USD or Reality files.
Anyone know a way to get some other model into RealityKit? Not necessarily as a file -- I'd be happy to be able to generate a MeshResource and array of Materials myself and load them in that way.
Reality Converter
Apple discussed this in the WWDC20 video "The artist’s AR toolkit".
They show how to convert FBX, OBJ, USD and GLTF files to USDZs for use in Reality Composer.
Reality Converter is still in beta and needs to be downloaded from the Apple Developer website. I used it and it is quite nice.
There are also other tools you can use on the command line, if that is more your thing. At WWDC 2019, Apple announced the USDZ Tools, also called the USD Python Tools.
USDZ Tools is a pre-compiled Python library containing binaries of Pixar’s USD library for macOS. You will need to download and install the library from the Apple Developer website.
I would give Reality Converter a try first. I think it is here to stay, since Apple probably has no intention of adding support for glTF files to Reality Composer in the future; they love USDZ!
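If you prefer scripting the conversion, here is a small Python wrapper around Apple's usdzconvert tool from that download. It assumes you have run the setup script that ships with the USDZ Tools so that usdzconvert is on your PATH:

    # gltf_to_usdz.py -- batch-convert glTF files to USDZ with usdzconvert.
    #   python3 gltf_to_usdz.py model1.gltf model2.gltf ...
    import pathlib
    import subprocess
    import sys

    def convert(gltf_path: str) -> str:
        src = pathlib.Path(gltf_path)
        dst = src.with_suffix(".usdz")
        # usdzconvert takes the input file followed by the output file.
        subprocess.run(["usdzconvert", str(src), str(dst)], check=True)
        return str(dst)

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            print("wrote", convert(path))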
I ended up using GLTFKit, an open source library by Warren Moore. It does exactly what I want -- lets me load a glTF file into SceneKit/RealityKit.
https://github.com/warrenm/GLTFKit
Alas, as you said, at the moment the only way to load your .gltf model into a RealityKit scene is to first convert it into a .usdz model via the Xcode command line tools. In RealityKit you can also use the .reality format (use it for much faster loading times) and the .rcproject format that can be exported from the Reality Composer app. These two file formats let you store not only PBR shaders and animation but also dynamics.
Please, read this post for further details.

How to prepare a 3D model to embed in an AR App

I have a newbie question regarding a 3D model I want to use in an AR app (with Sceneform). The model itself is in .fbx format and I have 5 textures or maps (as .jpg files) for opacity, metal, roughness, base color and normal. Importing the .fbx model works, but I have no idea how to assign the textures to it. According to the documentation (https://developers.google.com/ar/develop/java/sceneform/custom-material), I need a .mat file. And that's my problem: how to create one, manually or automatically. Where do I start? Any idea/direction/good reading on the topic is helpful. Thank you in advance!
Convert the model to .sfb using the Sceneform plugin, or use the .gltf extension.
On the converted model, or on a model with the .gltf extension, you can add textures to the model programmatically; there is a sample project for doing this at the following link:
https://medium.com/temy/dynamic-textures-in-sceneform-98d7a2d35406
I implemented this in Java. I hope this helps you.
You only need a custom material (and a .mat file) if you want to create your own shader for your model. Setting your different maps is done in the *.sfa file. Just use the Android Studio Sceneform plugin and import your FBX model. It will automatically create an .sfa file, and you can set your maps there. An overview of which maps can be set for an FBX model can be found here.

How to get furniture models from 3d.io into ARKit

I'm displaying furniture from furniture.3d.io in my AR app, which works well in a web view using Google's WebARonARKit and A-Frame. However, the tracking and lighting seem to be better when using ARKit natively.
ARKit requires models to be in either .scn, .dae, or .obj format. Is there any way to export the furniture from 3d.io so that I can use it in my app? A-Frame has a glTF exporter that I could use, so I might try to manually convert a few models going 3d.io -> .gltf -> .dae with Blender, but I can't figure out how to do it in a more automated way.
I would suggest you go from 3d.io -> .blend -> export .dae for any model that you wish to take to ARKit. Blender has a great Collada exporter.
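To automate that for more than a couple of models, a headless-Blender batch script along these lines should do it. This is only a sketch, assuming Blender's bundled glTF importer (2.8+) and that you have already downloaded the .gltf files:

    # gltf_to_dae.py -- batch-convert glTF furniture models to Collada:
    #   blender --background --python gltf_to_dae.py -- chair.gltf sofa.gltf ...
    import sys
    import bpy

    for gltf_path in sys.argv[sys.argv.index("--") + 1:]:
        # Fresh, empty scene for every model so exports don't accumulate.
        bpy.ops.wm.read_factory_settings(use_empty=True)
        bpy.ops.import_scene.gltf(filepath=gltf_path)
        dae_path = gltf_path.rsplit(".", 1)[0] + ".dae"
        bpy.ops.wm.collada_export(filepath=dae_path)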
I'm not sure if exporting furniture or any other 3D object is possible yet from 3d.io, but it should be possible from archilogic.com. You can also export Collada files directly from Archilogic, but texture export is not yet supported.
Edit: Archilogic now exports diffuse, normal and specular maps for Collada.

iOS: Displaying a simple 3D model with GLEssentials sample code

My goal is to display a simple 3D model and apply a texture on it.
I've downloaded the GLEssentials iOS sample project to learn how to develop this (I'm new to the iOS OpenGL ES API).
But the example model is a .model file, which I have never heard of and which never appears on model bank websites.
What is this kind of file?
Is the sample code compatible with other common model types (.obj, .c2d, .3ds)?
Is it a good idea to start from this project?
Have a look at this question:
How to convert Blender blend (or obj) file to Qualcomm Vuforia .h file
In my answer, I describe a script and accompanying Xcode project that converts .obj/.mtl files to header files suitable for OpenGL ES on iOS.
In response to your questions:
I believe the .model file is only appropriate for the sample project and is a proprietary Apple extension. It most likely contains simple data such as vertex positions.
I think you'd struggle to fit other model types into the sample code, which is very complex for OpenGL ES beginners. You might want to have a look at .pod files in Cocos2D; I've seen and heard great things about them.
I wouldn't recommend it :)

How to convert Blender blend (or obj) file to Qualcomm Vuforia .h file

I'm developing an iOS app with augmented reality using Qualcomm Vuforia and I have difficulty understanding how to create 3D models from Blender (or other software). All the examples use .h files with the coordinates of the vertices to generate, e.g., a teapot.
I cannot find any documentation that is useful for me.
Is there a tool to convert .blend or .obj files to .h (OpenGL ES)?
thanks
I developed a script called mtl2opengl that does exactly what you need, based on the project obj2opengl. The script works with .obj and .mtl files, which I think can be exported straight from Blender, and produces .h files with vertex data. I use it extensively in my iOS augmented reality applications (though I haven't used the Vuforia SDK yet) and the accompanying resources include a sample Xcode project too. Hope it helps!
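For anyone wondering what such a conversion boils down to, here is a heavily simplified Python sketch of the idea (not the actual mtl2opengl code); it only handles vertex positions and triangulated faces, whereas the real script also deals with normals, texture coordinates and the .mtl material data:

    # obj_to_header.py -- stripped-down illustration of obj2opengl/mtl2opengl:
    # read an OBJ file and emit a C header with a flat vertex position array.
    import sys

    def obj_to_header(obj_path: str, name: str = "model") -> str:
        positions, triangles = [], []
        with open(obj_path) as f:
            for line in f:
                parts = line.split()
                if not parts:
                    continue
                if parts[0] == "v":                   # vertex position
                    positions.append(tuple(float(x) for x in parts[1:4]))
                elif parts[0] == "f":                 # face (assumes triangles)
                    idx = [int(p.split("/")[0]) - 1 for p in parts[1:4]]
                    triangles.append(idx)

        # Expand indexed faces into a flat, unindexed float array, the layout
        # glDrawArrays(GL_TRIANGLES, ...) expects.
        floats = [c for tri in triangles for i in tri for c in positions[i]]
        lines = [f"#define {name}NumVerts {len(triangles) * 3}",
                 f"static const float {name}Positions[] = {{"]
        lines += [f"    {x:.6f}f, {y:.6f}f, {z:.6f}f,"
                  for x, y, z in zip(floats[0::3], floats[1::3], floats[2::3])]
        lines.append("};")
        return "\n".join(lines)

    if __name__ == "__main__":
        sys.stdout.write(obj_to_header(sys.argv[1]) + "\n")

The generated header can then be compiled into the app and drawn with glDrawArrays(GL_TRIANGLES, 0, modelNumVerts).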
You should check out BlenderVuforiaExport (developed by a coworker of mine) here:
https://github.com/StickyBeat/BlenderVuforiaExport
It exports objects from Blender to the same .h format used in the Vuforia example project.
I don't know much about Vuforia, but here are two answers about using Blender to get 3D models and displaying them on the iPhone. These may help you.
How to get proper number of vertices in OBJ file from DCC tools such as Blender for use in OpenGL ES?
Put a Cinema 4D model and Texture into an iPhone App
