Get pitch, roll and yaw relative to geographic north on iOS?

I see that I can retrieve CMAttitude from a device, and from it I can read the three values I need (pitch, roll and yaw).
As I understand it, this CMAttitude object is managed by Core Motion, which is a sensor-fusion manager that computes correct results from the compass, gyroscope and accelerometer together (on Android this is the SensorManager class).
So my questions are:
Are those values (pitch, roll and yaw) relative to magnetic north and gravity?
If the above is correct, how can I modify it to give me results relative to geographic north?
If a device (such as the iPhone 3GS) doesn't have a gyroscope, do I have to tell the manager, or can I just ask it to give me the device's attitude based on the sensors it has (accelerometer + gyro + compass OR accelerometer + compass)?

Regarding question 2:
iOS 5.0 simplifies this task. CMMotionManager has a new method:
- (void)startDeviceMotionUpdatesUsingReferenceFrame:(CMAttitudeReferenceFrame)referenceFrame
As the reference frame you can use these values:
CMAttitudeReferenceFrameXMagneticNorthZVertical for magnetic north,
CMAttitudeReferenceFrameXTrueNorthZVertical for true north.
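A minimal sketch of how this might look, assuming iOS 5+ and a CMMotionManager instance that you keep alive (using the block-based startDeviceMotionUpdatesUsingReferenceFrame:toQueue:withHandler: variant):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
if ([CMMotionManager availableAttitudeReferenceFrames] & CMAttitudeReferenceFrameXTrueNorthZVertical) {
    // The true-north frame needs the magnetometer plus a location fix (for declination),
    // so check availability before requesting it.
    [motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                        toQueue:[NSOperationQueue mainQueue]
                                                    withHandler:^(CMDeviceMotion *motion, NSError *error) {
        // pitch, roll and yaw are in radians, relative to the chosen reference frame
        NSLog(@"pitch %.3f  roll %.3f  yaw %.3f",
              motion.attitude.pitch, motion.attitude.roll, motion.attitude.yaw);
    }];
}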
If you want to do this on an older iOS version, I'm afraid you have to calibrate it yourself using the current user location.
Try checking out these resources:
"What's New in Core Motion" WWDC 2011 video,
"Sensing Device Motion in iOS 4" WWDC 2010 video
Regarding question 3:
If the device has no gyroscope, the deviceMotionAvailable property of CMMotionManager will be NO (it is equivalent to the gyroAvailable property) and you cannot get the attitude via device motion. The only thing you can do is read the accelerometer and magnetometer data directly.
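A rough sketch of that check, falling back to raw accelerometer and magnetometer readings on gyro-less hardware (the handler bodies are placeholders):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if (motionManager.deviceMotionAvailable) {
    // Gyroscope present: full sensor fusion is available.
    [motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXMagneticNorthZVertical
                                                        toQueue:[NSOperationQueue mainQueue]
                                                    withHandler:^(CMDeviceMotion *motion, NSError *error) {
        // use motion.attitude here
    }];
} else {
    // No gyroscope (e.g. iPhone 3GS): read the raw sensors and fuse them yourself.
    [motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMAccelerometerData *accel, NSError *error) {
        // accel.acceleration is gravity plus user acceleration, in g
    }];
    [motionManager startMagnetometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMMagnetometerData *mag, NSError *error) {
        // mag.magneticField is the raw, uncalibrated field in microteslas
    }];
}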

Related

The coordinate system of ARKit is unstable

I load a model in the AR environment and add an ARAnchor to stabilize it. When I place the device on the desktop and pick it up later, the model's position is unchanged at first, but it soon flies away: the ARKit coordinate system drifts and becomes unstable.
How can I avoid or deal with this situation?
The ARKit/RealityKit world-tracking system is based on a combination of five sensors:
Rear RGB Camera
LiDAR Scanner
Gyroscope
Accelerometer
Magnetometer
The latter three are known as the Inertial Measurement Unit (IMU) and operate at 1000 fps. But what your RGB camera (running at 60 fps) and LiDAR (also at 60 fps) see is very important too.
Hence, the stability of world tracking greatly depends on the camera image.
Here are some recommendations for high-quality tracking:
Track only a well-lit environment (if you don't have LiDAR)
Track only static objects (not moving)
Don't track poorly textured surfaces like white walls (if you don't have LiDAR)
Don't track surfaces with repetitive texture patterns (like polka dots)
Don't track mirrors, chrome and glass objects (reflective and refractive)
Move your iPhone slowly when tracking
Don't shake iPhone when tracking
Track as much of the environment as possible
Track high-contrast objects in the environment (if you don't have LiDAR)
If you follow these recommendations, the coordinate system in ARKit will be stable.
And look at the picture in this SO post – there is a good example of tracking and a bad one.
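If you also want to detect at runtime when tracking degrades (for example because of fast motion or a feature-poor scene), one possible approach (a sketch of my own, not something the answer above prescribes) is to watch the camera's tracking state via the ARSessionObserver callback:

#import <ARKit/ARKit.h>

// In your ARSession delegate (ARSessionObserver protocol)
- (void)session:(ARSession *)session cameraDidChangeTrackingState:(ARCamera *)camera {
    switch (camera.trackingState) {
        case ARTrackingStateNormal:
            // Tracking is reliable; anchors should stay put.
            break;
        case ARTrackingStateLimited:
            // trackingStateReason tells you why: excessive motion, insufficient
            // features, initializing, or relocalizing.
            NSLog(@"Tracking limited, reason: %ld", (long)camera.trackingStateReason);
            break;
        case ARTrackingStateNotAvailable:
            NSLog(@"Tracking not available");
            break;
    }
}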

True heading and true course of an iPhone

I am working on a project where I'm required to make an app that can tell the course of an iPhone, but without GPS. I can use GPS to get an initial fix.
I can get the true heading using the compass, but that is the orientation of the phone with respect to the north pole, not the direction in which the phone is moving.
How can I get the course using the compass?
I have also looked at the accelerometer, but in most Stack Overflow questions it is advised not to use the accelerometer for distance and speed calculation.
Any help appreciated!!
The 'course' you are referring to is actually the ground-speed vector measured by the GPS (obtained by combining two or more consecutive GPS readings and calculating the speed from the timestamps: v = dx/dt).
The compass has nothing to do with course. You can hold your iPhone however you want and walk in a certain direction; your compass is sensitive to your phone's orientation, not its movement.
The accelerometer, as its name implies, measures acceleration. It would be difficult to deduce your course solely from that information because of noise.
So your solution is found in the CLLocation class: the speed and course properties.
https://developer.apple.com/library/ios/documentation/CoreLocation/Reference/CLLocation_Class/index.html#//apple_ref/occ/instp/CLLocation/speed
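A minimal sketch of reading those two properties from a location update (assuming a CLLocationManager that has already been authorized and started):

#import <CoreLocation/CoreLocation.h>

// CLLocationManagerDelegate callback
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray<CLLocation *> *)locations {
    CLLocation *latest = locations.lastObject;
    // Negative values mean the reading is invalid (e.g. the device is not moving).
    if (latest.course >= 0 && latest.speed >= 0) {
        NSLog(@"course: %.1f degrees from true north, speed: %.2f m/s", latest.course, latest.speed);
    }
}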

Is Apple's iPhone magnetometer calibration working properly?

I'm currently developing an iPhone app (on iPhone 5, iOS 7, Xcode 5) which requires a very accurate determination of the current attitude. The "attitude" of CMDeviceMotion does not fulfil these requirements because Apple's sensor-fusion algorithm seems to rely too much on the gyroscope, which drifts away rather fast (in my experience). That's why I decided to read out the bare sensor data and later combine it in a sensor-fusion algorithm of my own.
When asking for magnetometer data one has two possibilities:
via CMMagnetometerData in CMMotionManager
via CMCalibratedMagneticField in CMDeviceMotion, about which Apple says:
The CMCalibratedMagneticField returned by this property gives you the total magnetic field in the device’s vicinity without device bias. Unlike the magneticField property of the CMMagnetometer class, these values reflect the earth’s magnetic field plus surrounding fields, minus device bias.
In principle, (2) is exactly what I want.
There is a very simple test of whether magnetometer data is calibrated properly. For simplicity one can restrict oneself to two dimensions. When the device lies on its back, the combination B_x^2 + B_y^2 must be constant, independent of the direction the device is pointing; it must simply equal the square of the horizontal component of the Earth's magnetic field (assuming no other fields in the vicinity of the device). Thus, when performing a 360-degree turn of the device lying on its back, the measured data B_y over B_x should display a circle. See here for details.
Now the point: the data of CMCalibratedMagneticField does NOT result in a circle!
Does anyone have an explanation for that? Or does anyone know how the CMCalibratedMagneticField comes about? Is the magnetometer calibrated in the sense of the link above when performing the "eight-shaped" movement of the device, or what is that movement good for?
Btw, why the "eight-shaped" movement and not flipping the device around its three axes, which would allow a calibration as described in the link above?
I would be very glad for any clarification of this issue... Thanks!
There is a problem with the magnetometer in iOS 7: it has an error of ±7°. Try using the 7.1 beta version.
EDIT
The magnetometer has zero drift over time, but it is pretty inaccurate for sudden changes in position. The accelerometer and gyroscope, on the other hand, adjust quickly to sudden changes but, being inertial sensors, lose accuracy over time.
So when CMCalibratedMagneticField tries to compensate for your rotational motion, it uses data from the gyroscope and accelerometer. This is when the accelerometer and gyroscope's ±7° error creeps in and throws your circle off track. Check this answer and this Wikipedia article for more info.
As regards the figure of eight:
Both do the same thing: they orient the "north" of your device in each direction in the hope of cancelling out magnetic interference. Flipping your device along all three axes works better, but it is harder to perform and not as easily understood by the user.
Hope this helps.
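For reference, a rough sketch of the flat-on-the-back circle test described in the question, reading the calibrated field from CMDeviceMotion (it assumes a CMMotionManager you keep alive and a reference frame that references magnetic north, so that calibrated magnetometer data is delivered):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];   // keep a strong reference
manager.deviceMotionUpdateInterval = 1.0 / 30.0;
[manager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXMagneticNorthZVertical
                                             toQueue:[NSOperationQueue mainQueue]
                                         withHandler:^(CMDeviceMotion *motion, NSError *error) {
    CMCalibratedMagneticField calibrated = motion.magneticField;
    if (calibrated.accuracy == CMMagneticFieldCalibrationAccuracyUncalibrated) {
        return;   // no point plotting values before calibration is available
    }
    // With the device flat on its back and rotated slowly through 360 degrees,
    // (field.x, field.y) should trace a circle and 'horizontal' should stay constant.
    double horizontal = sqrt(calibrated.field.x * calibrated.field.x +
                             calibrated.field.y * calibrated.field.y);
    NSLog(@"Bx: %.1f  By: %.1f  horizontal magnitude: %.1f microtesla",
          calibrated.field.x, calibrated.field.y, horizontal);
}];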

CMDeviceMotion userAcceleration is upside down?

I'm seeing some unexpected readings from the userAcceleration field in CMDeviceMotion. When I look at the raw accelerometer data from CMAccelerometerData, I see that if the iPhone is flat on a table the reading is 1G straight down (1G on the -Z axis), and if I drop the iPhone (on a soft surface, of course) then the accelerometer reading goes to zero as expected. That's all fine. When I instead use the CMDeviceMotion class, the userAcceleration reading is zero as expected when the iPhone is flat on the table. Again, this is fine. But when I drop the iPhone and read CMDeviceMotion's userAcceleration, the values are 1G straight up (+Z), not down (-Z) as expected. It appears that the userAcceleration readings are actually the exact opposite of the acceleration the device is really experiencing. Has anyone else observed this? Can I just invert (multiply by -1) all the userAcceleration values before I try to integrate for velocity and position, or am I misunderstanding what userAcceleration is reading?
There are some conceptual differences between CMAccelerometerData.acceleration and CMDeviceMotion.userAcceleration
Raw accelerometer data is just the sum of all accelerations measured, i.e. a combination of gravity and the current acceleration of the device.
Device-motion data is the result of sensor fusion of all three sensors, i.e. accelerometer, gyroscope and magnetometer. Thus bias and errors are eliminated (in theory) and the remaining acceleration data is separated into gravity and user acceleration so it can be used conveniently.
So if you want to compare the two, you have to check CMAccelerometerData.acceleration against CMDeviceMotion.userAcceleration + CMDeviceMotion.gravity to compare like with like.
In general, CMDeviceMotion is your first choice in most cases when you want precise values and hardware independence.
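A quick sketch of that comparison (my own illustration, not from the answer above): log the per-axis sum of userAcceleration and gravity and compare it by eye with the raw CMAccelerometerData values.

#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];   // keep a strong reference
manager.deviceMotionUpdateInterval = 1.0 / 60.0;
[manager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                             withHandler:^(CMDeviceMotion *motion, NSError *error) {
    CMAcceleration user = motion.userAcceleration;
    CMAcceleration gravity = motion.gravity;
    // Per axis, user + gravity should roughly match CMAccelerometerData.acceleration.
    NSLog(@"x: %.3f  y: %.3f  z: %.3f (in g)",
          user.x + gravity.x, user.y + gravity.y, user.z + gravity.z);
}];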
Another thing to consider is the CMAttitudeReferenceFrame you provide when starting device-motion updates via startDeviceMotionUpdatesUsingReferenceFrame. I am not sure what the default is when using the basic version, startDeviceMotionUpdates.
You stated that you want to integrate the values to get velocity and position. There are several discussions about this, and the bottom line is that it's impossible to get reasonable results. See:
Finding distance using accelerometer in iPhone
Getting displacement from accelerometer data with Core Motion
How can I find distance traveled with a gyroscope and accelerometer?
If your app concept forces you to rely on precise results for more than half a second, try to change it.
It turns out that CMAcceleration does not obey the right-hand rule: x points to the left and y points to the bottom of the screen, in which case, in a typical right-handed system, the z axis should point upward, but it does not.
It makes me uncomfortable when dealing with motion sensors!

Astronomical altitude from iOS device

How can I retrieve the astronomical altitude that an iOS device is pointed towards? The goal is to be able to point the device's camera at the sky and have it display the altitude.
Astronomical altitude is the angle between an object and the observer's local horizon. This is different from the altitude returned by the location manager. More info: http://en.wikipedia.org/wiki/Altitude_(astronomy)
There is no altitude or anything similar returned by a CLLocationManager.
What you need is CMMotionManager.
Start a timer with the readout frequency of your choice and read the attitude of your CMMotionManager. Remember to create only one instance of CMMotionManager.
The CMAttitude object has a property called pitch, which gives you the rotation around a lateral axis that passes through the device from side to side. So basically it is the same as your desired altitude.
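A minimal sketch along those lines, using the block-based update handler rather than a separate timer (and assuming, as the answer does, that pitch can be read off directly as the altitude angle while the device is tilted up toward the sky in portrait orientation):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];   // create only one instance
manager.deviceMotionUpdateInterval = 1.0 / 30.0;
[manager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                             toQueue:[NSOperationQueue mainQueue]
                                         withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // pitch is in radians; convert to degrees above the horizon
    double altitudeDegrees = motion.attitude.pitch * 180.0 / M_PI;
    NSLog(@"altitude above horizon: %.1f degrees", altitudeDegrees);
}];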
