.fbx file with animations in ARKit - iOS

I have an .fbx file that contains animations.
I tried using AssimpKit to import the .fbx file, but I could not import it, even though I added the file to the project itself.
I used the following code:
let filePath = Bundle.main.path(forResource: "Gag01_08", ofType: "fbx")
let animFile = SCNAssimpScene(named: "Gag01_08", inDirectory: filePath, options: nil)
How do I import an .fbx file for ARKit?
Will the animations play automatically?

I just checked the AssimpKit project. It does not support Assimp release 4.1, which brings more stable FBX support with it.
To work around this issue you can try to upgrade AssimpKit on your own, or use Assimp 4.1 directly through its native API, without AssimpKit.
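Before blaming the importer, it can also help to confirm the file actually made it into the bundle. A minimal sanity-check sketch (not specific to any AssimpKit version; the loader call itself is left as a placeholder because its exact Swift API varies between AssimpKit releases):
import SceneKit
// Verify the .fbx is really in the app bundle; a nil path here usually means the
// file was not added to the target's "Copy Bundle Resources" build phase.
if let fbxPath = Bundle.main.path(forResource: "Gag01_08", ofType: "fbx") {
    print("FBX found at \(fbxPath)")
    // Hand fbxPath (a full file path, not a directory) to your AssimpKit loader here;
    // the exact loader API depends on the AssimpKit version in use.
} else {
    print("Gag01_08.fbx is not in the app bundle")
}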

Related

How do I search for multiple file types using Bundle.main.path

At the moment I've got
let audioPath = Bundle.main.path(forResource: songs[indexPath.row], ofType: "flac")
I want audioPath to find songs with multiple audio extensions, like .mp3, .wav, etc., but I can't figure out how to do this.
I looked in the bundle programming guide (https://developer.apple.com/library/content/documentation/CoreFoundation/Conceptual/CFBundles/AccessingaBundlesContents/AccessingaBundlesContents.html#//apple_ref/doc/uid/10000123i-CH104-SW7) but still couldn't figure it out.
Thanks heaps
Edit: I know the names of the files, but in a situation where the folder has MP3s, FLACs, WAVs, and all those other extensions, I want the code to be able to play all of those types of audio files, not just one.
Edit: maybe this bit of code is also relevant, where I tell it to find only FLAC files (I also wasn't sure how to extend this to other audio file types):
for song in songPath {
    let mySong = song.absoluteString
    if mySong.contains(".flac") {
        // play this song
    }
}
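No answer is reproduced here, but one common approach, sketched below under the assumption that the audio files are bundled resources and the list of wanted extensions is known up front, is to ask the bundle for every resource URL per extension:
import Foundation
// Sketch: collect the URLs of every bundled resource with any of the wanted extensions.
let audioExtensions = ["flac", "mp3", "wav"]
let audioURLs = audioExtensions.flatMap { ext in
    Bundle.main.urls(forResourcesWithExtension: ext, subdirectory: nil) ?? []
}
for url in audioURLs {
    print(url.lastPathComponent)   // e.g. hand these URLs to AVAudioPlayer
}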

SCNScene is nil when loading using `named:`

I am using the following code to render a scene in SceneKit, and it works perfectly when the DAE file is loaded from the art.scnassets folder.
let scene = SCNScene(named: "art.scnassets/idle.dae")
However, I want to download the asset and apply it instead, and I am getting an error.
let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
let scene = SCNScene(named: documentsURL.absoluteString+"idle.dae")
A file named idle.dae exists in the folder.
I get the error: fatal error: unexpectedly found nil while unwrapping an Optional value
How do I load the downloaded asset and apply it dynamically? What am I doing wrong? Any pointers please? I am a noob to iOS programming.
Unless something has changed for iOS 11, you won't be able to download and instantiate a DAE file at runtime on iOS. They are compressed/compiled at build time using a utility named scntool.
Can you instead use one of the file formats supported by Model I/O? See https://developer.apple.com/videos/play/wwdc2015/602/?time=320 for the original list (Alembic .abc, Polygon .ply, Triangles .stl, Wavefront .obj), and https://developer.apple.com/videos/play/wwdc2017/610/ for a quick discussion of Pixar's USD (Universal Scene Description).
If you're stuck with DAE files, Frederik Jacques has an article at https://the-nerd.be/2014/11/07/dynamically-load-collada-files-in-scenekit-at-runtime/ which outlines his experience reverse engineering the DAE processing pipeline. His technique lets you download SCN files that have been processed from DAE files on a server.
See also Load uncompressed collada file using iOS Scene Kit (with comments by an authoritative source) and https://forums.developer.apple.com/thread/38010.
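For formats that SceneKit can read at runtime, a minimal sketch of loading a downloaded file from the documents directory might look like this (idle.obj is a hypothetical file name; SCNScene(named:) only searches the app bundle, so the file URL initializer is used instead):
import SceneKit
// Sketch: load a runtime-readable format (e.g. .obj) that was downloaded to Documents.
let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
let modelURL = documentsURL.appendingPathComponent("idle.obj")   // hypothetical file name
do {
    let scene = try SCNScene(url: modelURL, options: nil)
    print("Loaded scene with \(scene.rootNode.childNodes.count) child node(s)")
} catch {
    print("Could not load scene: \(error)")
}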

Accessing a Sound-File via NSBundle - pathForResource

So I'm just trying to get into iOS programming by going through some online video courses. The problem is that those are not up to date with Swift 3.
In the project I'm working on at the moment, I need to import a sound file via NSBundle using the following:
let path = NSBundle.mainBundle().pathForResource(name, ofType:)
The problem is that I can't get this running in Swift 3, and I'm not able to piece it together from the Apple documentation. So my code looks like the following:
import AVFoundation
import Foundation // not sure if this is 100% right
let path = NSBundle.mainBundle().pathForResource("btn", ofType: "wav")
But it seems like it doesn't even know the class NSBundle, because autocompletion isn't even offering me any NSBundle functions. I can't seem to find what changed in Swift 3 to get this working. Is anybody here able to help me with this?
I believe this is the ticket:
let path = Bundle.main.path(forResource: "btn", ofType: "wav")
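For completeness, a short sketch of how that Swift 3 call fits together with AVFoundation, assuming btn.wav is bundled with the app:
import AVFoundation
// Keep a strong reference to the player so playback isn't deallocated mid-sound.
var player: AVAudioPlayer?
if let url = Bundle.main.url(forResource: "btn", withExtension: "wav") {
    player = try? AVAudioPlayer(contentsOf: url)
    player?.play()
}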

How do you convert Wavefront OBJ file to an SCNNode with Model I/O

I've imported a Wavefront OBJ file from a URL, and now I'd like to insert it into my scene (SceneKit) in my iOS 9 app (in Swift). What I've done so far is:
let asset = MDLAsset(URL: localFileUrl)
print("count = \(asset.count)") // 1
Any help converting this to an SCNNode would be appreciated. According to Apple's docs:
Model I/O can share data buffers with the MetalKit, GLKit, and SceneKit frameworks to help you load, process, and render 3D assets efficiently.
But I'm not sure how to get the buffers from an MDLAsset into an SCNNode.
Turns out this is quite easy, as many of the Model I/O classes already bridge. I was doing import ModelIO, which gave me access to all the Model I/O classes, and likewise import SceneKit, which gave me the SceneKit classes, but I was missing import SceneKit.ModelIO to bring in the SceneKit support for Model I/O.
let url = NSURL(string: "url-to-your-obj-here")
let asset = MDLAsset(URL: url!)
let object = asset.objectAtIndex(0)
let node = SCNNode(MDLObject: object)
Easy as that...
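For reference, Swift 3 and later renamed these calls slightly; a sketch with the newer spellings, assuming a bundled sample.obj (hypothetical file name):
import ModelIO
import SceneKit
import SceneKit.ModelIO   // brings in SCNNode(mdlObject:)
// Sketch with the Swift 3+ spellings of the same Model I/O bridge.
if let url = Bundle.main.url(forResource: "sample", withExtension: "obj") {
    let asset = MDLAsset(url: url)
    let object = asset.object(at: 0)
    let node = SCNNode(mdlObject: object)
    print(node)
}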

How can I use OBJ file or CTM file instead of DAE file in SceneKit?

I used to render 3D scenes with OpenGL and Metal on iOS, and the file formats I used were OBJ and CTM. These days I am trying SceneKit. It seems that SceneKit only loads DAE files. All the demos I can find on the Internet use DAE files, and I can't see the arrays of vertices and faces in their code.
How can I load an OBJ or CTM file instead of a DAE file?
Loading an OBJ file
It is as simple as passing MDLAsset a valid URL.
private func nodeForURL(url: NSURL) -> SCNNode
{
    let asset = MDLAsset(URL: url)
    let object = asset.objectAtIndex(0)
    let node = SCNNode(MDLObject: object)
    return node
}
This will not only correctly load the .obj file, it will also load any referenced .mtl files.
You can do that by writing your own importer. Take a look at SCNGeometry, SCNGeometrySource and SCNGeometryElement (there is a minimal sketch below).
Edit: starting with iOS 9.0 and OS X 10.11, SceneKit can open OBJ files or any other file format supported by Model I/O. You can use previously existing APIs to do that (such as +sceneNamed:) or the new +sceneWithMDLAsset: method.
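To give a flavour of that manual route, here is a minimal sketch that builds one triangle from raw vertex data with SCNGeometrySource and SCNGeometryElement; a real OBJ/CTM importer would fill these arrays from the parsed file (the data here is made up for illustration):
import SceneKit
// Sketch: hand-rolled geometry from raw vertex and index data.
let vertices: [SCNVector3] = [
    SCNVector3(0, 1, 0),
    SCNVector3(-1, -1, 0),
    SCNVector3(1, -1, 0)
]
let indices: [Int32] = [0, 1, 2]
let vertexSource = SCNGeometrySource(vertices: vertices)
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geometry = SCNGeometry(sources: [vertexSource], elements: [element])
let triangleNode = SCNNode(geometry: geometry)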
As of iOS 9/OS X 10.11, you can use Model I/O's MDLAsset to import OBJ files (and a few other formats). "How do you convert Wavefront OBJ file to an SCNNode with Model I/O" above has sample code.
EDIT: Model I/O can probably load OBJ files now. I haven't tried this path myself. This answer was written before iOS 9 and OS X 10.11:
SceneKit can't actually load DAE files on iOS; the DAE files are precompiled to an internal format for iOS devices at build time.
If you want to transform your OBJs to DAEs you can write a simple importer/exporter on OS X to do so — on OS X SceneKit will actually read OBJ files (it's not documented but it works) and will write DAEs.
Or you could download the "Assimp" project on github if you'd like to try to read OBJs yourself, but it's going to be a bit of work getting it into SceneKit objects.
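A sketch of what that one-off OS X importer/exporter could look like, relying on the undocumented ability of SCNScene on OS X to read OBJ files (the file paths are hypothetical):
// macOS-only sketch of an OBJ-to-DAE converter.
import SceneKit
let objURL = URL(fileURLWithPath: "/tmp/model.obj")   // hypothetical input path
let daeURL = URL(fileURLWithPath: "/tmp/model.dae")   // hypothetical output path
if let scene = try? SCNScene(url: objURL, options: nil) {
    // The export format is inferred from the destination file extension.
    let succeeded = scene.write(to: daeURL, options: nil, delegate: nil, progressHandler: nil)
    print("Export succeeded: \(succeeded)")
}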
Further information regarding the supported file formats:
The following 3D file formats are supported by SceneKit and can be imported to an .scn file using the Scene Editor in Xcode:
DAE, OBJ, Alembic, STL and PLY files.
Source: WWDC 2015 Session "Enhancements to SceneKit" at 02:24
