Spark AR: How to rotate a target tracker

Is it possible to rotate a target tracker in Spark AR? I have tried transforming it but the options are greyed out.
Also, I'm new to Spark AR; where can I get simple FBX models to play around with?

It is not possible to rotate a target tracker because it is not an object. The problem I was facing is that the texture used in the target tracker is, for some reason, rotated 90 degrees with respect to the camera orientation, so the objects appearing when the texture is detected were rotated 90 degrees as well.
The workaround that worked for me was to rotate the objects by 90 degrees in the project; as a result, everything looks rotated 90 degrees in the Spark AR viewport but works fine in the Spark AR Player app on the phone.

Related

Why are AR objects trembling on Vuforia's 5-star markers?

We are building an app for a jewelry store where an AR object (a ring) is shown when the iOS app's camera is pointed at a paper (and metal) marker we built in Vuforia. It is a 5-star marker (considered good quality by Vuforia) that we place on the finger.
Recognition is pretty fast, but the recognized ring shakes unpleasantly. The closer we bring the camera to the marker, the more shaking we see.
The paper marker has a normal cylindrical form, and the lighting is always sufficient while testing.
Any ideas on why this shaking appears?
Thanks in advance!
We have tried different markers, tested in different conditions with different lighting, played with camera settings, and used different Vuforia versions, with no luck.
If the target is cylindrical in shape, you should create a Vuforia cylinder target: https://library.vuforia.com/objects/cylinder-targets. Standard ImageTargets are expected to be planar. The detector still works because a plane can locally approximate the cylindrical geometry, but tracking will be extremely inaccurate and brittle.

ARKit Environment Depth makes 3D objects disappear

We are developing a Unity/ARKit app for outdoor use. While testing indoors, everything works fine and environment occlusion works as expected. But outdoors, once we reach a certain distance from the AR objects, they start flickering until they disappear completely. Please see the attached video.
https://www.youtube.com/watch?v=paJ92c1kNPE
I've tried to visualize the depth map, but I don't see any change in the values.
If I switch off environment depth, the models appear again.
What causes this, and why does the occlusion not follow the depth map?
For example, at 0:25 in the video you can see occlusion working, yet the depth map is completely red.
Or does the depth map only visualize what the LiDAR sees?
Or is this just a limitation of ARKit?

Using OpenCV rotation and translation in iOS Metal to draw AR cube

I am new to AR, computer vision, and graphics. I am trying to build an AR experience using OpenCV and Metal on iOS. The rotation and translation are retrieved with solvePnP and passed to Metal to draw the cube at the estimated pose.
However, there are a few problems:
The cube jumps out of view as soon as the R and T are applied.
Even when the cube is visible, its rotation does not seem to match the phone's movement.
The translation values do not seem to move the cube consistently with the phone's movement, so it fails to anchor to any surface or point in space.
I can provide more information to analyze my problem if needed. Thank you.
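One thing worth checking with this kind of pipeline is the coordinate-convention mismatch between OpenCV and Metal. Below is a minimal sketch, assuming a typical setup (the matrix names and the exact flip are not from the question), of how a solvePnP pose is commonly assembled into a model-view matrix with the usual Y/Z axis flip:

    import simd

    // Hedged sketch: `R` is the 3x3 rotation matrix (e.g. from cv::Rodrigues on
    // rvec) and `t` the translation, both in OpenCV's camera convention
    // (+X right, +Y down, +Z forward).
    func modelViewMatrix(R: simd_float3x3, t: SIMD3<Float>) -> simd_float4x4 {
        // Assemble the OpenCV pose into a column-major 4x4 matrix.
        let pose = simd_float4x4(
            SIMD4<Float>(R.columns.0, 0),
            SIMD4<Float>(R.columns.1, 0),
            SIMD4<Float>(R.columns.2, 0),
            SIMD4<Float>(t, 1)
        )
        // Flip Y and Z to move to the usual graphics convention (+Y up,
        // -Z forward). Getting this flip wrong is a common reason the object
        // jumps out of view or rotates against the phone's movement.
        let flipYZ = simd_float4x4(diagonal: SIMD4<Float>(1, -1, -1, 1))
        return flipYZ * pose
    }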

Rotate MTLTexture

I'm using Metal to display camera frames. I convert the outputted sample buffers into id<MTLTexture>s using CVMetalTextureCacheCreateTextureFromImage, and that works great... except that the frames come rotated 90 degrees clockwise.
How can I rotate the id<MTLTexture> 90 degrees counter clockwise?
I considered doing this when I display the texture (in the MTKView), but then it will still be rotated the wrong way when I record the video.
You have at least a couple of different options here.
The easiest is probably requesting "physically" rotated frames from AVFoundation. Assuming you're using an AVCaptureConnection, you can use the setVideoOrientation API (when available) to specify that you want frames to be rotated before they're delivered. Then, displaying and recording can be handled in a uniform way with no further work on your part.
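For illustration, a minimal sketch of that approach, assuming `videoOutput` is an AVCaptureVideoDataOutput already attached to your capture session:

    import AVFoundation

    // Ask AVFoundation for frames already rotated to the desired orientation so
    // the resulting Metal textures need no further correction. The target
    // orientation here is just an example.
    func requestPortraitFrames(from videoOutput: AVCaptureVideoDataOutput) {
        guard let connection = videoOutput.connection(with: .video),
              connection.isVideoOrientationSupported else { return }
        connection.videoOrientation = .portrait
    }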
Alternatively, you could apply a rotation transform both when drawing the frame into the Metal view and when recording a movie. It sounds like you already know how to do the former. The latter just amounts to setting the transform property on your AVAssetWriterInput, assuming that's what you're using. Similar APIs exist on lower-level AVFoundation classes such as AVMutableComposition.
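For the recording side, a minimal sketch assuming you feed frames through an AVAssetWriterInput (the output settings are whatever you already use):

    import AVFoundation
    import CoreGraphics

    // Tag the recorded video track with a 90-degree rotation so players display
    // it upright; the pixel data itself is left untouched. The sign of the angle
    // may need flipping depending on your capture orientation.
    func makeRotatedVideoWriterInput(outputSettings: [String: Any]) -> AVAssetWriterInput {
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
        input.transform = CGAffineTransform(rotationAngle: -.pi / 2)
        return input
    }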

Inverse zoom in/out (scaling) of AR 3D model using ARToolkit for Unity5

I have done an AR project using ARToolkit for Unity. It works fine, but the problem I'm trying to solve is inverting the scaling of the 3D model. Right now, when you move the camera further away from the marker, the 3D object gets smaller (zooms out), and if I bring the camera closer, the 3D model gets bigger. What I want is the opposite of this behaviour.
Any ideas on how I go about it?
I think this is a bad idea because it completely breaks the concept of AR, which is that the 3D objects are related to the real world, but it is definitely possible.
ARToolkit provides a transformation matrix for each marker. That matrix includes position and rotation, and on each iteration the object is updated with those values. What you need to do is find where that matrix is applied to the object, measure the distance to the camera, and update the translation so the object sits at the distance you want.
That code is in the Unity plugin, so it should be easy to find.
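For illustration, a small sketch of one possible remapping of the measured distance (the real change would live in the plugin's C# update code; `referenceDistance` is a hypothetical tuning value, not something ARToolkit provides):

    import Foundation

    // At referenceDistance the object renders at its normal distance; moving the
    // camera further away then pushes the rendered object closer, so it appears
    // larger instead of smaller. This is only one way to invert the behaviour.
    func invertedDistance(measured: Float, referenceDistance: Float) -> Float {
        return (referenceDistance * referenceDistance) / max(measured, 0.001)
    }

    // Example: a marker tracked at twice the reference distance (1.0 vs 0.5)
    // is rendered at 0.25, roughly doubling its apparent size on screen.
    let remapped = invertedDistance(measured: 1.0, referenceDistance: 0.5)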
