SCNSceneSource identifiers with usdz file are empty - iOS

We are working on a project with ARKit 2 + SceneKit, iOS 12.
We were able to retrieve CAAnimation instances with .dae files and control them at runtime.
Now we are trying .usdz, but no entries of any class can be extracted from an SCNSceneSource created from a usdz file:
let source = SCNSceneSource(url: url, options: options)
let animationIdentifiers = source?.identifiersOfEntries(withClass: CAAnimation.self)
animationIdentifiers is always empty. The same goes for all the classes listed in the documentation for this function: https://developer.apple.com/documentation/scenekit/scnscenesource/1523656-identifiersofentries
We have tested our own usdz creations as well as some usdz from https://fusionar.app.
Though the animations play nicely when viewing the file either on iOS or in Xcode, where we have access to the animations and scene graph settings.
But definitely not from code at runtime, so we are unable to control the animations.
Have you ever had this issue?
Any insight on this?

Retrieving entries from a SCNSceneSource only works for Collada files. When working with USDZ file you'll have to traverse the node hierarchy and retrieve the animation from the node that holds it using -animationPlayerForKey:.
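For illustration, a minimal sketch of that traversal (the helper name animationPlayers(in:) is ours, not an SDK API):

import SceneKit

// Collect every SCNAnimationPlayer attached anywhere in the scene.
func animationPlayers(in scene: SCNScene) -> [SCNAnimationPlayer] {
    var players: [SCNAnimationPlayer] = []
    scene.rootNode.enumerateHierarchy { node, _ in
        for key in node.animationKeys {
            if let player = node.animationPlayer(forKey: key) {
                players.append(player)
            }
        }
    }
    return players
}

// Usage: load the usdz scene, then drive the players directly.
// let scene = try SCNScene(url: usdzURL, options: nil)
// let players = animationPlayers(in: scene)
// players.forEach { $0.play() }   // or .stop(), adjust .speed, etc.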

Related

SCNScene is nil when loading using `named: `

I am using the following code to render a scene in SceneKit, and it works perfectly when the .dae file is loaded from the art.scnassets folder.
let scene = SCNScene(named: "art.scnassets/idle.dae")
However, I want to download the asset and apply it at runtime, and I am getting an error:
let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
let scene = SCNScene(named: documentsURL.absoluteString+"idle.dae")
A file named idle.dae exists in the folder.
I get the error: fatal error: unexpectedly found nil while unwrapping an Optional value
How do I load the downloaded asset and apply it dynamically? What am I doing wrong? Any pointers, please? I am a noob to iOS programming.
Unless something has changed for iOS 11, you won't be able to download and instantiate a DAE file at runtime on iOS. They are compressed/compiled at build time using a utility named scntool.
Can you instead use one of the file formats supported by Model I/O? See https://developer.apple.com/videos/play/wwdc2015/602/?time=320 for the original list (Alembic .abc, Polygon .ply, Triangle .stl, Wavefront .obj), and https://developer.apple.com/videos/play/wwdc2017/610/ for a quick discussion of Pixar's USD (Universal Scene Description).
If you're stuck with DAE files, Frederik Jacques has an article at https://the-nerd.be/2014/11/07/dynamically-load-collada-files-in-scenekit-at-runtime/ which outlines his experience reverse engineering the DAE processing pipeline. His technique allows downloaded SCN files which have been processed from DAE files on a server.
See also Load uncompressed collada file using iOS Scene Kit (with comments by an authoritative source) and https://forums.developer.apple.com/thread/38010.
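As a rough sketch of that server-side route (the file name and location are just examples), once a pre-processed .scn file has been downloaded to the Documents directory it can be loaded with the URL-based initializer:

import SceneKit

let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask).first!
let sceneURL = documentsURL.appendingPathComponent("idle.scn")
do {
    // SCNScene(url:options:) works with .scn files at runtime on iOS
    let scene = try SCNScene(url: sceneURL, options: nil)
    // ... attach the scene to your SCNView, etc.
} catch {
    print("Failed to load scene:", error)
}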

How to set the path to a picture in a device folder in an Android Appcelerator app

I have an app where I want to let users share a screenshot of their score to Facebook etc. I'm using the SocialShare widget. The documentation says to set the path to the image like this: image: fileToShare.nativePath, but I'm not really sure how to set it. Another problem is that I need to share a picture that always has a different name; screenshots are saved with names like tia7828157.png, tia107997596.png... in a folder in the device's internal memory at pictures/enigmania/. I'm new to Appcelerator, so I don't know if there is something like a wildcard I could use for this. Thanks for any help.
This is my code so far, which I know is wrong. I know the widget works, because it shares text without a problem:
function shareTextWidget(e){
    // share text status
    var socialWidget = Alloy.createWidget('com.alcoapps.socialshare');
    socialWidget.share({status:"Enigmania kvíz", androidDialogTitle:"hoho", image:test.png/pictures/enigmania});
}
You should use the Ti.Filesystem class methods/properties to get the path of any file located on internal or external storage.
Also be aware of the storage-read permissions on Android 6+: request the storage permission before accessing any file there.
Here is a simple code snippet that creates a directory at pictures/enigmania and then writes an image file of the captured view into that directory:
function shareTextWidget(e){
    // Create (if needed) the pictures/enigmania directory on external storage
    var directory = Ti.Filesystem.getFile(Ti.Filesystem.externalStorageDirectory, 'pictures/enigmania');
    !directory.exists() && directory.createDirectory();

    // Write the blob image of the captured view into a file in that directory
    var fileToShare = Ti.Filesystem.getFile(directory.resolve(), 'screen.jpg');
    fileToShare.write($.SCREENSHOT_VIEW.toImage());

    // Share the file via its native path
    var socialWidget = Alloy.createWidget('com.alcoapps.socialshare');
    socialWidget.share({status:"Enigmania kvíz", androidDialogTitle:"hoho", image:fileToShare.nativePath});
}
This code should work without any issues.
Note that $.SCREENSHOT_VIEW is the ID of the view you want to capture, so it is up to you how you arrange your views in order to capture the correct screenshot; the point is to use the Ti.UI.View toImage() method to take the screenshot of a particular view.
Let me know whether this works for you; if not, we can look into other methods based on your exact requirements. Good luck!

How can I use an OBJ file or CTM file instead of a DAE file in SceneKit?

I used to render 3D scenes with OpenGL and Metal on iOS, and the file formats I used were OBJ and CTM. These days I am trying SceneKit. It seems that SceneKit only loads DAE files: all the demos I can find on the Internet use DAE files, and I can't see the arrays of vertices and facets in their code.
How can I load an OBJ file or CTM file instead of a DAE file?
Loading an OBJ file
It is as simple as passing MDLAsset a valid URL.
import SceneKit
import ModelIO
import SceneKit.ModelIO   // provides SCNNode(mdlObject:)

private func nodeForURL(_ url: URL) -> SCNNode
{
    let asset = MDLAsset(url: url)      // loads the .obj (and any referenced .mtl)
    let object = asset.object(at: 0)    // first top-level object in the asset
    return SCNNode(mdlObject: object)
}
This will not only correctly load the .obj file, it will also load any referenced .mtl files.
You can also do this by writing your own importer. Take a look at SCNGeometry, SCNGeometrySource and SCNGeometryElement.
Edit: starting with iOS 9.0 and OS X 10.11, SceneKit can open OBJ files or any other file format supported by Model I/O. You can use the previously existing APIs to do that (such as +sceneNamed:) or the new +sceneWithMDLAsset: method.
As of iOS 9/OS X 10.11, you can use Model I/O's MDLAsset to import OBJ files (and a few other formats). How do you convert Wavefront OBJ file to an SCNNode with Model I/O has sample code.
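For example, a minimal sketch of the Model I/O route ("model.obj" is a placeholder for your own bundled file):

import SceneKit
import SceneKit.ModelIO

// Build a whole SCNScene from an OBJ via Model I/O (iOS 9+ / OS X 10.11+)
let objURL = Bundle.main.url(forResource: "model", withExtension: "obj")!
let asset = MDLAsset(url: objURL)
let scene = SCNScene(mdlAsset: asset)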
EDIT: ModelIO can probably load OBJ files now. I haven’t tried this path myself. This answer was written before iOS 9 and OS X 10.11:
SceneKit can't load DAE files at runtime on iOS; it actually precompiles the DAE files to an internal format for iOS devices.
If you want to transform your OBJs to DAEs you can write a simple importer/exporter on OS X to do so — on OS X SceneKit will actually read OBJ files (it's not documented but it works) and will write DAEs.
Or you could download the "Assimp" project on GitHub if you'd like to try to read OBJs yourself, but it's going to be a bit of work getting the result into SceneKit objects.
Further information regarding the supported file formats:
The following 3D file formats are supported by SceneKit and can be imported to an .scn file using the Scene Editor in Xcode:
DAE, OBJ, Alembic, STL and PLY files.
Source: WWDC 2015 Session "Enhancements to SceneKit" at 02:24

AVMutableMetadataItem's time & duration INVALID after reading

I have a question.
Recently I needed to add custom tags to recorded video (local video on the device, not streamed video). The task is to add some event-specific tags to the video, whose positions could be set by pressing forward/backward buttons, as in any player.
It is not important whether the movie file is in mov or mp4 format.
I searched the forum and found several samples of how to add metadata using AVAssetExportSession, and that worked.
However, when I tried to add metadata using AVAssetWriter, I wasn't able to append the attributes to the video.
What I do not understand is that after adding an attribute, the returned time & duration properties are always invalid.
For instance, let's say I have a video with a duration of 2 seconds.
I have tried different key spaces, but I am not able to write keys from the ID3 space. Is ID3 used for streamed video? (As far as I understood, ID3 is the metadata format of .mp3.) In any case, I was not able to write it into an MPEG-4 file.
I also tried QuickTimeUserData & ISOUserData, but again the results are the same.
Here is an example
AVMutableMetadataItem *item2 = [AVMutableMetadataItem new];
item2.keySpace = AVMetadataKeySpaceiTunes;
item2.key = AVMetadataiTunesMetadataKeyUserComment;
item2.value = @"One two three";
item2.duration = CMTimeMakeWithSeconds(1, 1);
item2.time = CMTimeMakeWithSeconds(0, 1);
After reading I got the following:
AVMutableMetadataItem: 0xa4301f0, keySpace=itsk, key=\U00a9cmt, commonKey=(null), locale= (null), value=One two three, time={INVALID}, duration={INVALID}, extras={\n dataType = 1;\n}
I would like to use the time & duration properties of the metadata instead of writing custom data and post-processing it.
Ideally it would be great to append an array of items with time = t1, duration = d1, ..., time = tn, duration = dn.
Does anyone know how to accomplish that?
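For reference, a hedged sketch of how timed metadata is normally written: the time/duration of items handed to AVAssetWriter.metadata are ignored; per-time items go through AVAssetWriterInputMetadataAdaptor and AVTimedMetadataGroup (available since iOS 8). The identifier and data type below are assumptions for a comment-style tag, not the only possible choices:

import AVFoundation
import CoreMedia

// Assumed identifier/dataType for a comment-style tag; adjust for your keys.
let spec: [String: Any] = [
    kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as String:
        AVMetadataIdentifier.quickTimeMetadataComment.rawValue,
    kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as String:
        kCMMetadataBaseDataType_UTF8 as String
]

// A metadata track needs a format description built from that specification.
var desc: CMMetadataFormatDescription?
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
    allocator: kCFAllocatorDefault,
    metadataType: kCMMetadataFormatType_Boxed,
    metadataSpecifications: [spec] as CFArray,
    formatDescriptionOut: &desc)

let input = AVAssetWriterInput(mediaType: .metadata,
                               outputSettings: nil,
                               sourceFormatHint: desc)
let adaptor = AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
// ... add `input` to the AVAssetWriter before starting the session ...

// Append one item with an explicit time and duration (t1, d1):
let item = AVMutableMetadataItem()
item.identifier = .quickTimeMetadataComment
item.dataType = kCMMetadataBaseDataType_UTF8 as String
item.value = "One two three" as NSString
let group = AVTimedMetadataGroup(items: [item],
                                 timeRange: CMTimeRange(start: .zero,
                                                        duration: CMTime(seconds: 1, preferredTimescale: 1)))
_ = adaptor.append(group)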
I ended up with a solution that adds chapters to the video file instead of using metadata.
I looked at the available libraries and took mpv4lib.
The library is not currently compiled for iOS, so I ported the source project into a static library for the iOS platform.
That library allows adding custom "atoms" to an mp4 file, and one of them is a QuickTime text track containing chapters.
I did something similar to that post.
The library is located here.

Flex/Flash Builder/Actionscript/AIR/Mobile iOS How to take video using the camera and/or browse for & view/access video stored in the 'Camera Roll"

My understanding currently is that:
CameraUI
I can use the CameraUI to access the built-in camera for MediaType.VIDEO; it delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and click the "Use" button, I am returned to my app, and theoretically I have a valid MediaPromise.
iOS does -not- provide a valid/usable URL/filename for the recorded video (or for photos), so I would have to use a Loader to bring in/use/access the 'recorded' video... AND iOS does not actually create a file anywhere on the device, most importantly in the Camera Roll, where one would expect it given the normal behavior of the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs but nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can then pass to a Loader class or similar, to read in the information created by the system camera and then manipulate it (upload, save to applicationStorageDirectory, and/or display in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll but the AS3/Air3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the Air3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that has been captured (no way to use the resulting MediaPromise successfully)?
I believe you can take video and access it on Android, but there's nothing in the documentation that says you cannot on iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI & CameraRoll)?
Am I wrong to think that these are rather obvious needs that one can meet via the Xcode/Objective-C route, but that the AIR Mobile framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll, and of course I can share it, view it, or whatever... If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image, when it shows me the folder where the pictures are stored, the folder icon uses the thumbnail of the last object added (in this case, the video I took), but if I then enter the folder, the video cannot be found. It's teasing us. It knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought someone else might find this answer (partially) useful.
To save the movie you just took, you need to open the promise and read the data from it.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;

public function recordVideo():void
{
    // Start the camera and ask for a video
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
    camera.launch(MediaType.VIDEO);
}

private function onCameraComplete(event:MediaEvent):void
{
    // event.data is a MediaPromise and MediaPromise.open() returns an IDataInput
    // Let's cast it to a dispatcher and check when it's complete
    dataInput = event.data.open();
    var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
    dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}

private function onDataInputComplete(event:Event):void
{
    // We can do whatever we want with the data, so we'll store it in a file
    // (the file name/location here is just an example; a File with no path
    // cannot be opened for writing)
    var file:File = File.applicationStorageDirectory.resolvePath("recording.mov");
    var bytes:ByteArray = new ByteArray();
    var stream:FileStream = new FileStream();

    // Read the data from the opened MediaPromise and write it out
    dataInput.readBytes(bytes);
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(bytes, 0, bytes.bytesAvailable);
    stream.close();
}
Also, I'm still looking for a way to put the movie into the CameraRoll.
