Metadata-Extractor: orientation and gyroscope

I would like to extract the following metadata (user name, date, time, orientation, and gyroscope) from photos taken indoors with a smartphone camera. Can Metadata-Extractor extract the gyroscope values for each photo from the EXIF data? The gyroscope values would determine the tilt of the phone, i.e. whether it was pointed at the floor, a wall, or the ceiling. Thanks.
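For what it's worth, the EXIF standard defines no gyroscope tag: date/time and the Orientation tag (rotation in 90-degree steps) are standard, while raw attitude data, if the phone records it at all, tends to live in vendor maker notes or XMP. Below is a minimal sketch using the metadata-extractor Java library that dumps every tag a photo actually carries, so you can see what your particular phone writes, and then reads the standard Orientation tag; the file name is a placeholder.

import com.drew.imaging.ImageMetadataReader;
import com.drew.metadata.Directory;
import com.drew.metadata.Metadata;
import com.drew.metadata.Tag;
import com.drew.metadata.exif.ExifIFD0Directory;
import java.io.File;

public class ExifDump {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at one of your indoor photos.
        Metadata metadata = ImageMetadataReader.readMetadata(new File("photo.jpg"));

        // Dump every directory and tag so you can see exactly what the phone records.
        for (Directory directory : metadata.getDirectories()) {
            for (Tag tag : directory.getTags()) {
                System.out.println(directory.getName() + " - "
                        + tag.getTagName() + " = " + tag.getDescription());
            }
        }

        // The standard EXIF Orientation tag is the closest thing to "tilt"
        // that plain EXIF defines (and it only covers 90-degree rotations).
        ExifIFD0Directory ifd0 = metadata.getFirstDirectoryOfType(ExifIFD0Directory.class);
        if (ifd0 != null && ifd0.containsTag(ExifIFD0Directory.TAG_ORIENTATION)) {
            System.out.println("Orientation: "
                    + ifd0.getDescription(ExifIFD0Directory.TAG_ORIENTATION));
        }
    }
}

If the dump shows no attitude-related tags, the tilt was simply never written to the file; you would have to capture it at shooting time with the motion APIs instead.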

Related

ARCore - Record camera motion

How can I track the motion of the camera in ARCore so as to get a good estimate of the path the camera took while the user was moving their phone?
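ARCore estimates the camera pose in world coordinates on every frame, so one approach is to sample the pose whenever tracking is good and keep the translations. The sketch below assumes a Session configured elsewhere and a hypothetical recordFrame helper called from your render loop; all the GL and lifecycle plumbing is omitted.

import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;
import com.google.ar.core.exceptions.CameraNotAvailableException;
import java.util.ArrayList;
import java.util.List;

public class CameraPathRecorder {
    private final List<float[]> path = new ArrayList<>();

    // Call once per rendered frame, e.g. from onDrawFrame().
    public void recordFrame(Session session) throws CameraNotAvailableException {
        Frame frame = session.update();
        Camera camera = frame.getCamera();
        if (camera.getTrackingState() == TrackingState.TRACKING) {
            Pose pose = camera.getPose(); // camera pose in ARCore's world frame
            path.add(new float[] { pose.tx(), pose.ty(), pose.tz() });
        }
    }

    public List<float[]> getPath() {
        return path;
    }
}

Connecting the recorded translations in order gives the estimated path; expect gaps while tracking is lost.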

Calculate speed of a moving object with cameras

Is it possible to find the speed or direction of passing ships using a camera mounted on another ship?
The information I know is the speed, heading (relative to true north), roll, pitch, and camera parameters of the ship where the camera is installed.
You could, of course, calculate the speed and direction of objects in terms of pixels per frame.
To get the speed of the real object, however, you would need something like calibrated stereo cameras to know the distance of the objects from the camera.
Once the distance of the objects in the images is known, the parameters of the moving camera could be included in the calculation.
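To make that last step concrete: under a pinhole model, a lateral displacement of du pixels for an object at distance Z corresponds to roughly du * Z / f metres, where f is the focal length in pixels, and multiplying by the frame rate turns that into a speed. The sketch below makes those assumptions, treats the motion as parallel to the image plane, and ignores the observing ship's own motion, which you would subtract afterwards using the known speed, heading, roll, and pitch.

public final class PixelSpeed {

    // Rough lateral speed estimate (m/s) relative to the camera, assuming a
    // pinhole camera and an object moving parallel to the image plane.
    // pixelDisplacement: object displacement between consecutive frames (px)
    // distanceMeters: distance to the object, e.g. from calibrated stereo (m)
    // focalLengthPixels: focal length expressed in pixels
    // fps: camera frame rate
    public static double lateralSpeed(double pixelDisplacement, double distanceMeters,
                                      double focalLengthPixels, double fps) {
        double metersPerFrame = pixelDisplacement * distanceMeters / focalLengthPixels;
        return metersPerFrame * fps;
    }

    public static void main(String[] args) {
        // Example: 0.5 px/frame at 800 m with f = 2400 px at 30 fps -> 5 m/s.
        System.out.println(lateralSpeed(0.5, 800.0, 2400.0, 30.0) + " m/s");
    }
}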

Can I measure the size of an object in a photo taken by iPhone?

ARKit can measure real-world objects, but can I measure the size of an object in an image if I have the EXIF data giving the focal length and camera resolution?
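Focal length and resolution alone are not enough: by similar triangles you also need the subject distance and the physical sensor width, neither of which EXIF is guaranteed to contain (ARKit's depth or hit-testing output is one way to obtain the distance). Here is a sketch of the pinhole arithmetic, with all parameter names hypothetical.

public final class ObjectSize {

    // Pinhole-model size estimate in metres.
    // objectPixels: the object's extent in the image (px)
    // imageWidthPixels: image width from EXIF (px)
    // sensorWidthMm: physical sensor width (mm), usually NOT in EXIF
    // focalLengthMm: focal length from EXIF (mm)
    // distanceMeters: camera-to-object distance (m), from ARKit or elsewhere
    public static double estimate(double objectPixels, double imageWidthPixels,
                                  double sensorWidthMm, double focalLengthMm,
                                  double distanceMeters) {
        double objectOnSensorMm = objectPixels * sensorWidthMm / imageWidthPixels;
        return objectOnSensorMm * distanceMeters / focalLengthMm;
    }

    public static void main(String[] args) {
        // Example: 500 px of a 4000 px frame, 6 mm sensor, 4.25 mm lens, 2 m away
        // -> roughly 0.35 m.
        System.out.println(estimate(500, 4000, 6.0, 4.25, 2.0) + " m");
    }
}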

Path creation using accelerometer and gyroscope

I am using TI SensorTags and getting raw data from the sensor in the form of gyroscope (x, y, z), accelerometer (x, y, z), and magnetometer (x, y, z) readings. Is there a way to find the path along which I am moving the sensor? I am moving the sensor in a circular path and want to draw that same path.
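In principle yes: estimate orientation from the gyroscope (fused with the accelerometer and magnetometer), rotate each acceleration sample into the world frame, subtract gravity, and integrate twice to get position. In practice raw double integration drifts within seconds, which is why real implementations use fusion filters (Madgwick, Mahony, Kalman) and still degrade quickly. The sketch below shows only the integration step and assumes you already have world-frame, gravity-free acceleration samples.

public final class DeadReckoning {

    // Naive double integration of world-frame, gravity-removed acceleration
    // samples (m/s^2) into positions (m), starting from rest at the origin.
    // accelWorld is N x 3; dt is the sample interval in seconds.
    public static double[][] integrate(double[][] accelWorld, double dt) {
        double[] v = new double[3];               // running velocity
        double[] p = new double[3];               // running position
        double[][] path = new double[accelWorld.length][3];
        for (int i = 0; i < accelWorld.length; i++) {
            for (int k = 0; k < 3; k++) {
                v[k] += accelWorld[i][k] * dt;    // a -> v
                p[k] += v[k] * dt;                // v -> p
                path[i][k] = p[k];
            }
        }
        return path;
    }
}

Plotting the returned positions reproduces the motion's shape for a second or two; beyond that, drift dominates unless you add corrections such as zero-velocity updates.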

Get pitch, roll and yaw relative to geographic north on iOS?

I see that I can retrieve CMAttitude from a device, and from it I can read the three values I need (pitch, roll, and yaw).
As I understand it, this CMAttitude object is managed by Core Motion, which is a sensor-fusion manager that computes corrected results from the compass, gyroscope, and accelerometer together (on Android this is the SensorManager class).
So my questions are:
1. Are those values (pitch, roll, and yaw) relative to magnetic north and gravity?
2. If the above is correct, how can I modify it to give me results relative to geographic north?
3. If a device (such as the iPhone 3GS) doesn't have a gyroscope, do I have to tell the manager, or can I just ask it for the device's attitude based on the sensors it has (accelerometer + gyroscope + compass, or accelerometer + compass)?
1 and 2:
iOS 5.0 simplifies this task. CMMotionManager has a new method:
- (void)startDeviceMotionUpdatesUsingReferenceFrame:(CMAttitudeReferenceFrame)referenceFrame
As the reference frame you can use these values:
CMAttitudeReferenceFrameXMagneticNorthZVertical for magnetic north,
CMAttitudeReferenceFrameXTrueNorthZVertical for true north.
If you want to do this with an older iOS version, I'm afraid you have to calibrate it yourself using the current user location.
Try checking out these resources:
"What's New in Core Motion" WWDC 2011 video,
"Sensing Device Motion in iOS 4" WWDC 2010 video
3:
If the device has no gyroscope, the deviceMotionAvailable property of CMMotionManager will be NO (it is equivalent to the gyroAvailable property) and you cannot get the attitude using device motion. The only thing you can do is read the accelerometer and magnetometer data directly.
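For the pre-iOS-5 case, the manual correction amounts to adding the local magnetic declination (east-positive) to the magnetic heading; the declination value itself would come from a geomagnetic model such as the WMM evaluated at the user's location. A small sketch of that arithmetic, with hypothetical names:

public final class HeadingCorrection {

    // Convert a magnetic-north heading to a true-north heading, both in
    // degrees, given the local magnetic declination (east-positive).
    public static double trueHeading(double magneticHeadingDeg, double declinationDeg) {
        double h = (magneticHeadingDeg + declinationDeg) % 360.0;
        return h < 0 ? h + 360.0 : h;             // normalise to [0, 360)
    }

    public static void main(String[] args) {
        // Example: magnetic heading 10 degrees, declination +4 degrees east -> 14.
        System.out.println(trueHeading(10.0, 4.0));
    }
}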
