ARCore - Record camera motion - augmented-reality

How can I track the motion of the camera in ARCore so as to have a good estimate of the path the camera took while the user was moving their phone?
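One common approach is to read the camera pose from every ARCore frame while tracking is good and append it to a list. A minimal Kotlin sketch under that assumption (the PathSample type and recordCameraPose helper are illustrative, not part of the ARCore API) could look like this:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Illustrative container for one sample of the camera path.
data class PathSample(val timestampNs: Long, val x: Float, val y: Float, val z: Float)

val recordedPath = mutableListOf<PathSample>()

// Call once per rendered frame (e.g. from your GLSurfaceView renderer's onDrawFrame).
fun recordCameraPose(session: Session) {
    val frame: Frame = session.update()
    val camera = frame.camera

    // Only record while ARCore is actually tracking; poses are unreliable otherwise.
    if (camera.trackingState == TrackingState.TRACKING) {
        val pose = camera.pose  // physical camera pose in the world coordinate frame
        recordedPath.add(PathSample(frame.timestamp, pose.tx(), pose.ty(), pose.tz()))
    }
}
```

Orientation can be recorded in the same way via pose.qx(), qy(), qz() and qw(); plotting the x/z components of the samples then gives a rough top-down view of the path the phone took.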

Related

How do you switch the AR inputs in an iPhone to drone outputs?

I'm having a hard time finding documentation for utilizing data from a drone to perform AR functions on an iPhone. These are the tasks I would like to do with drone data:
Object capture
Object tracking
World tracking
In summary, can you replace the depth camera and motion tracking inputs on an iPhone with external data streams?
Object capture seems straightforward - I would just need to pull visual and depth data from the drone to create a USDZ with absolute measurements, as in Capturing Photographs for RealityKit Object Capture. However, I can't find any documentation on how to track objects from a drone, as in Scanning and Detecting 3D Objects, or how to create a world map from a drone (e.g. with motion tracking data, etc.), as in Saving and Loading World Data.
I do NOT want to use object detection as in Recognizing Objects in Live Capture, because still-picture models take days to train compared to 3D object detection (as in 'Scanning and Detecting 3D Objects' above).

Creating a Trajectory using a 360 camera video without use of GPS, IMU, sensor, ROS or LIDAR

The input is a video created using a 360 camera (Samsung Gear 360). I need to plot a trajectory (without the use of ground truth poses) as I move around an indoor location; that is, I need to know the camera locations and plot them accordingly.
First, camera calibration was done by capturing 21 pictures of a chessboard; using OpenCV methods, the camera matrix (a 3x3 matrix containing fx, fy, cx, cy and the skew factor) was obtained and written to a text file.
I have tried feature detection (ORB, SIFT, AKAZE, ...) and matching (FLANN and brute force) methods. This works well for a single space but fails when the video covers a multi-storey building. It was tested on this multi-storey building: https://youtu.be/6DPFcKoHiak
An example of camera motion estimation that is required: https://arxiv.org/pdf/2003.08056.pdf
Any help on how to plot camera poses using VSLAM, visual odometry, or any other method would be appreciated.
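As a starting point, one way to sketch the visual-odometry step with OpenCV's Java/Kotlin bindings (assuming the 3x3 camera matrix K from the chessboard calibration and perspective-corrected frames) is to match ORB features between consecutive frames, estimate the essential matrix with RANSAC, and recover the relative rotation and translation; chaining these relative poses frame by frame gives the trajectory. This is an illustrative helper, not a full SLAM system:

```kotlin
import org.opencv.calib3d.Calib3d
import org.opencv.core.Core
import org.opencv.core.Mat
import org.opencv.core.MatOfDMatch
import org.opencv.core.MatOfKeyPoint
import org.opencv.core.MatOfPoint2f
import org.opencv.core.Point
import org.opencv.features2d.BFMatcher
import org.opencv.features2d.ORB

// Estimate the relative camera pose (R, t) between two consecutive grayscale frames.
// K is the 3x3 camera matrix obtained from the chessboard calibration.
fun relativePose(prevGray: Mat, currGray: Mat, K: Mat): Pair<Mat, Mat> {
    val orb = ORB.create(2000)
    val kp1 = MatOfKeyPoint(); val desc1 = Mat()
    val kp2 = MatOfKeyPoint(); val desc2 = Mat()
    orb.detectAndCompute(prevGray, Mat(), kp1, desc1)
    orb.detectAndCompute(currGray, Mat(), kp2, desc2)

    // Brute-force Hamming matching with cross-check to reject weak matches.
    val matcher = BFMatcher.create(Core.NORM_HAMMING, true)
    val matches = MatOfDMatch()
    matcher.match(desc1, desc2, matches)

    // Collect the matched point coordinates in both frames.
    val pts1 = mutableListOf<Point>()
    val pts2 = mutableListOf<Point>()
    val k1 = kp1.toArray(); val k2 = kp2.toArray()
    for (m in matches.toArray()) {
        pts1.add(k1[m.queryIdx].pt)
        pts2.add(k2[m.trainIdx].pt)
    }

    // Essential matrix with RANSAC, then decompose it into rotation and translation.
    val p1 = MatOfPoint2f(); p1.fromList(pts1)
    val p2 = MatOfPoint2f(); p2.fromList(pts2)
    val E = Calib3d.findEssentialMat(p1, p2, K, Calib3d.RANSAC, 0.999, 1.0)
    val R = Mat(); val t = Mat()
    Calib3d.recoverPose(E, p1, p2, K, R, t)  // monocular: t is only known up to scale
    return Pair(R, t)
}
```

Because a single camera only recovers translation up to scale and small errors accumulate from frame to frame, drift over a long multi-storey walk is expected; a full VSLAM pipeline with loop closure is usually needed for trajectories like the one in the linked video.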

Object Detection with moving camera

I understand that with a moving object and a stationary camera, it is easy to detect objects by subtracting the previous and current camera frames. It is also possible to detect moving objects when the camera is moving freely around the scene.
But is it possible to detect stationary objects with a camera rotating around the object? The camera's movement is predefined, and it is restricted to a specified path around the object.
Try the CamShift demo, which is located in the OpenCV source code at samples/cpp/camshiftdemo.cpp, or other algorithms like MeanShift, KCF, etc. These are all object tracking algorithms.
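A rough Kotlin sketch of that CamShift approach with OpenCV's Java bindings, assuming an initial bounding box around the object and that the native OpenCV library has already been loaded (the input path and box values are placeholders), might look like this:

```kotlin
import org.opencv.core.Core
import org.opencv.core.Mat
import org.opencv.core.MatOfFloat
import org.opencv.core.MatOfInt
import org.opencv.core.Rect
import org.opencv.core.TermCriteria
import org.opencv.imgproc.Imgproc
import org.opencv.video.Video
import org.opencv.videoio.VideoCapture

fun main() {
    val cap = VideoCapture("input.mp4")       // placeholder video from the moving camera
    val window = Rect(300, 200, 100, 100)     // initial box around the (stationary) object

    // Build a hue histogram of the object region from the first frame.
    val frame = Mat()
    cap.read(frame)
    val hsv = Mat()
    Imgproc.cvtColor(frame, hsv, Imgproc.COLOR_BGR2HSV)
    val roi = Mat(hsv, window)
    val hist = Mat()
    val ranges = MatOfFloat(0f, 180f)
    Imgproc.calcHist(listOf(roi), MatOfInt(0), Mat(), hist, MatOfInt(16), ranges)
    Core.normalize(hist, hist, 0.0, 255.0, Core.NORM_MINMAX)

    val term = TermCriteria(TermCriteria.EPS or TermCriteria.COUNT, 10, 1.0)

    // Track the object in subsequent frames via back-projection + CamShift.
    while (cap.read(frame)) {
        Imgproc.cvtColor(frame, hsv, Imgproc.COLOR_BGR2HSV)
        val backProj = Mat()
        Imgproc.calcBackProject(listOf(hsv), MatOfInt(0), hist, backProj, ranges, 1.0)
        val box = Video.CamShift(backProj, window, term)  // window is carried to the next frame
        println("Object at: ${box.center}, size: ${box.size}")
    }
}
```

CamShift follows a colour-histogram region, so it works best when the object's colour stands out from the background; KCF and the other trackers mentioned above are alternatives when colour alone is not distinctive.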

Show 3d object on device live camera in Unity3d for iOS

How can I render a 3D object or a 2D image (.jpg or .png) over the device's live camera feed in Unity3D? This object shouldn't be stuck to the camera; it has to appear at a specific point, just as we can place any object at a particular location in a game scene.

Get pitch, roll and yaw relative to geographic north on iOS?

I see that I can retrieve a CMAttitude object from the device, and from it I can read the three values I need (pitch, roll and yaw).
As I understand it, this CMAttitude object is managed by Core Motion, which performs sensor fusion to calculate corrected results from the compass, gyroscope and accelerometer together (on Android this is the SensorManager class).
So my questions are:
1. Are those values (pitch, roll and yaw) relative to magnetic north and gravity?
2. If the above is correct, how can I modify it to get results relative to geographic north?
3. If a device (such as the iPhone 3GS) doesn't have a gyroscope, do I have to tell the motion manager, or can I just ask it for the device's attitude based on whatever sensors it has (accelerometer + gyro + compass, or accelerometer + compass)?
1 and 2:
iOS 5.0 simplifies this task. CMMotionManager has a new method:
- (void)startDeviceMotionUpdatesUsingReferenceFrame:(CMAttitudeReferenceFrame)referenceFrame
As the reference frame you can use these values:
CMAttitudeReferenceFrameXMagneticNorthZVertical for magnetic north,
CMAttitudeReferenceFrameXTrueNorthZVertical for true north.
If you want to do this on older iOS versions, I'm afraid you have to do the correction yourself using the current user location.
Try checking out these resources:
"What's New in Core Motion" WWDC 2011 video,
"Sensing Device Motion in iOS 4" WWDC 2010 video
3:
If the device has no gyroscope, the deviceMotionAvailable property of CMMotionManager will be NO (it is equivalent to the gyroAvailable property) and you cannot get the attitude using device motion. The only thing you can do is read the accelerometer and magnetometer data directly.
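The answer above is iOS-specific; since the question also points at the Android counterpart (the SensorManager class), here is a minimal Kotlin sketch of that Android-side equivalent, assuming latitude, longitude and altitude are available from a location provider. It reads pitch, roll and yaw from the rotation vector sensor (relative to magnetic north and gravity) and corrects the azimuth to true (geographic) north with GeomagneticField:

```kotlin
import android.hardware.GeomagneticField
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class AttitudeListener(
    private val latitude: Float,        // assumed to come from a location provider
    private val longitude: Float,
    private val altitudeMeters: Float
) : SensorEventListener {

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) return

        // Rotation matrix and Euler angles relative to magnetic north and gravity.
        val rotationMatrix = FloatArray(9)
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        val orientation = FloatArray(3)
        SensorManager.getOrientation(rotationMatrix, orientation)
        val magneticAzimuthDeg = Math.toDegrees(orientation[0].toDouble())
        val pitchDeg = Math.toDegrees(orientation[1].toDouble())
        val rollDeg = Math.toDegrees(orientation[2].toDouble())

        // Declination = difference between magnetic and true north at the user's location.
        val declination = GeomagneticField(
            latitude, longitude, altitudeMeters, System.currentTimeMillis()
        ).declination

        val trueAzimuthDeg = magneticAzimuthDeg + declination
        println("yaw(true)=$trueAzimuthDeg pitch=$pitchDeg roll=$rollDeg")
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

On iOS 5 and later, CMAttitudeReferenceFrameXTrueNorthZVertical performs the same magnetic-to-true-north correction for you, provided location services are available.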
