Detecting phone tilt with CoreMotion - iOS

I'm trying to use CoreMotion to detect whether the phone is slightly tilted, in order to alert the user when the phone is not in an upright position.
Let's say the device is in an upright position when it's rotated left or right by less than 40 degrees and rotated backward or forward by less than 40 degrees.
So far I've used CMMotionManager like this:
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.showsDeviceMovementDisplay = true
motionManager.startDeviceMotionUpdates(to: OperationQueue()) { motion, error in
    if let data = motion {
        let pitch = data.attitude.pitch.toDegrees()
        let roll = data.attitude.roll.toDegrees()
        let yaw = data.attitude.yaw.toDegrees()
        print("Pitch: \(pitch), roll: \(roll), yaw: \(yaw).")
    }
}

extension Double {
    func toDegrees() -> Double {
        self * 180.0 / .pi
    }
}
If I understand correctly, the pitch value will tell me that the phone is rotated left or right, and the yaw value will tell me that the phone is rotated forward or backward. I've looked at the printed values, and pitch seems fine: it's equal to 90 degrees when the phone is upright. But the yaw value doesn't change as I would expect while rotating the phone.
The question is: am I doing something wrong? Which values should I use to check whether the phone is rotated forward/backward or left/right?
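
One approach (not from the original thread) is to skip attitude and yaw entirely and read the gravity vector that CMDeviceMotion reports. With the phone upright in portrait, gravity is roughly (0, -1, 0); tilting it left or right shifts gravity onto the x axis, and tilting it forward or backward shifts it onto the z axis. A minimal sketch, reusing the toDegrees() helper above:

import CoreMotion

let motionManager = CMMotionManager()

func startTiltMonitoring() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let gravity = motion?.gravity else { return }
        // Left/right lean: how far gravity has shifted onto the x axis.
        let leftRight = atan2(gravity.x, -gravity.y).toDegrees()
        // Forward/backward lean: how far gravity has shifted onto the z axis.
        let forwardBackward = atan2(gravity.z, -gravity.y).toDegrees()
        let isUpright = abs(leftRight) < 40 && abs(forwardBackward) < 40
        print("L/R: \(leftRight), F/B: \(forwardBackward), upright: \(isUpright)")
    }
}

Both angles are 0 when the phone is perfectly upright in portrait, so the 40-degree check falls out directly.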

Related

Calculate Image Rotation (0°-360°)

I'd like to calculate the rotation of an image in degrees (from its affine transformation).
This is what I got so far:
extension Double {
    var toDegrees: Double { return self * 180 / .pi }
}

let rotationAngle = Double(atan2f(Float(self.myView.transform.b), Float(self.myView.transform.a))).toDegrees
print(rotationAngle)
The "rotationAngle" I'm getting looks quite good but tbh I't is not:
But if I rotate the image about 0° or about 180° I get the same values (same for rotating the image about 90° && -90° --> values are both 45°):
EDIT: This is how I apply the rotation to my image view:
self.ImageView.transform = CGAffineTransform(rotationAngle: .pi)
How can I fix this to get a unique value for every rotation angle (0°-360°)?
EDIT: See some more rotations below (the first value is the angle I rotated the image by, the second is the calculated rotationAngle):
EDIT 2:
0° --> rotationAngle: 0.249977666546474, transform.b: 0.004362960593833, transform.a: 0.999990482242134
-90° --> rotationAngle: -44.999557289943, transform.b: -0.999984511428548, transform.a: 0.00556568980525373
180° --> rotationAngle: 0.0352338243182017, transform.b: 0.00061494629500543, transform.a: -0.999999810920509
90° --> rotationAngle: 44.9995128937134, transform.b: 0.99998300906796, transform.a: -0.00582937178330692
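
For what it's worth, beyond the original thread: atan2 already distinguishes all four quadrants when given both transform components, so its output in (-180°, 180°] only needs to be wrapped into [0°, 360°) to get a unique value per angle. A sketch, assuming the view carries a plain rotation transform:

import UIKit

// Read a rotation-only transform back as an angle in [0°, 360°).
func rotationDegrees(of view: UIView) -> Double {
    let radians = atan2(Double(view.transform.b), Double(view.transform.a))
    let degrees = radians * 180 / .pi
    return degrees < 0 ? degrees + 360 : degrees
}

Incidentally, the logged numbers above (about 45° for a 90° rotation) don't match atan2(b, a) of the listed components, which comes out near 90°, so the values may have been produced by a different expression than the one shown.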

How to calculate how much ARKit SCNNode is rotated around the y axis

I am trying to find out how much an SCNNode is rotated around the y axis. Every time the ARKit world is created it has a different rotation. My idea is to scan a QR code, recognize whether it's rotated 0, 90, 180, or 270 degrees (already done), and always place the object facing exactly the same direction, so that a QR code that isn't rotated (0 degrees) points, for example, to North.
I was trying to get eulerAngles from pointOfView, but I don't know how to interpret them to work out the rotation relative to (0, 0, 0). Any ideas?
ANSWER:
if let hitTestResult = hitTestResults.first, let camera = self.pointOfView {
    let yaw = atan2f(camera.simdWorldTransform.columns.0.x, camera.simdWorldTransform.columns.1.x)
    // convert yaw from [-180; 180] to [0; 360]
    let yawEuler = ARKitMathNormalizedDegressToCircleDegrees(Double(yaw))
    // calculate angle between qrCode north rotation and current point of view
    let angleToRotateTransform = ARKitMathDistanceAngleBetween(firstAngleInDegrees: yawEuler, secondAngleInDegrees: qrCodeRotation)
    // make rotation matrix and apply to hit results
    let rotationMatrix = self.getRotationAroundY(Float(angleToRotateTransform.degreesToRadians))
    let transform = simd_mul(hitTestResult.worldTransform, rotationMatrix)
}
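The answer references helpers it never defines. Hypothetical implementations that match how they're called above (note that the yaw produced by atan2f is in radians, so it would presumably be converted to degrees before being handed to the degree-based normalizer):

import Foundation
import simd

extension Double {
    var degreesToRadians: Double { return self * .pi / 180 }
}

// Map degrees from atan2's (-180, 180] range onto [0, 360).
func ARKitMathNormalizedDegressToCircleDegrees(_ degrees: Double) -> Double {
    let wrapped = degrees.truncatingRemainder(dividingBy: 360)
    return wrapped < 0 ? wrapped + 360 : wrapped
}

// Signed shortest angular distance between two headings, in degrees.
func ARKitMathDistanceAngleBetween(firstAngleInDegrees first: Double, secondAngleInDegrees second: Double) -> Double {
    var delta = (second - first).truncatingRemainder(dividingBy: 360)
    if delta > 180 { delta -= 360 }
    if delta < -180 { delta += 360 }
    return delta
}

// 4x4 rotation matrix around the y axis (column-major).
func getRotationAroundY(_ radians: Float) -> simd_float4x4 {
    var matrix = matrix_identity_float4x4
    matrix.columns.0.x = cosf(radians)
    matrix.columns.0.z = -sinf(radians)
    matrix.columns.2.x = sinf(radians)
    matrix.columns.2.z = cosf(radians)
    return matrix
}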

Using CMQuaternion to calculate yaw

I'm currently working on an app that has to track yaw (rotation around z-axis).
The user will have the iPhone in landscape orientation pointing to an object like a car in front of him. As the user starts walking around the object I have to track the yaw for a complete 360 rotation.
I'm currently using CoreMotion with CMQuaternion to track the rotation:
private func startDeviceMotionUpdates() {
    if motionManager.isDeviceMotionAvailable {
        let queue = OperationQueue()
        motionManager.deviceMotionUpdateInterval = 1 / 60
        motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: queue, withHandler: { (motion, error) in
            guard let motion = motion else { return }
            let quat = motion.attitude.quaternion
            let yaw = 2 * (quat.x * quat.y + quat.w * quat.z)
            let yawDegrees = self.degreesFromRadians(yaw)
            print(yawDegrees)
        })
    }
}

private func stopDeviceMotionUpdates() {
    motionManager.stopDeviceMotionUpdates()
}

private func degreesFromRadians(_ radians: Double) -> Double {
    return radians * 180 / .pi
}
This currently works fine as long as the roll or pitch of the device doesn't change.
What I would like to know is:
Why does the yaw change when the roll or pitch of the device changes?
How can I accurately track the yaw around the z-axis without it being affected by change on the x and y-axis?
I've been banging my head for two weeks against this problem. I would really appreciate if someone could help me understand why this is happening and how to get around it.
Try the code below.
let yaw = atan2(motion.gravity.x, motion.gravity.y) - Double.pi
let yawDegrees = self.degreesFromRadians(yaw)
print(yawDegrees)
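
A note beyond the posted answer: the expression in the question is only the sine term of the standard quaternion-to-yaw conversion, which is one reason it degrades as pitch and roll change. The usual extraction runs both the sine and cosine terms through atan2. A sketch:

import CoreMotion

// Standard quaternion-to-Euler yaw extraction: atan2 over the sine
// and cosine terms keeps the angle well-defined as pitch and roll vary.
func yawDegrees(from quat: CMQuaternion) -> Double {
    let sinYaw = 2 * (quat.w * quat.z + quat.x * quat.y)
    let cosYaw = 1 - 2 * (quat.y * quat.y + quat.z * quat.z)
    return atan2(sinYaw, cosYaw) * 180 / .pi
}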

Core Motion: how to tell which way is "up"?

I'm trying to duplicate the functionality in the Compass app - and I'm stuck on a particular bit: how do I figure out which way is "up" in the interface?
I've got a label onscreen, and I've got the following code that orients it to remain horizontal as the device moves around:
self.motionManager = CMMotionManager()
self.motionManager?.deviceMotionUpdateInterval = 1/100
self.motionManager?.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue(), withHandler: { (deviceMotion, error) -> Void in
    guard let deviceMotion = deviceMotion else { return }
    let roll = -deviceMotion.attitude.roll
    self.tiltLabel?.transform = CGAffineTransformRotate(CGAffineTransformIdentity, CGFloat(roll))
})
This effect is pretty good, but it's got a few states where it's wrong - for example, the label flips erratically when the iPhone's Lightning connector is pointed up.
How do I consistently tell which direction is up using CoreMotion?
UPDATE: Apparently, roll/pitch/yaw are Euler angles, which suffer from gimbal lock - so I think the correct solution might involve using quaternions, which don't suffer from this issue, or perhaps the rotationMatrix on CMAttitude might help: https://developer.apple.com/library/ios/documentation/CoreMotion/Reference/CMAttitude_Class/index.html
It doesn't need to be quite so complicated for the 2D case. "Up" means "opposite gravity", so:
motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue()) { (motion, error) in
    // Gravity as a counterclockwise angle from the horizontal.
    let gravityAngle = atan2(Double(motion.gravity.y), Double(motion.gravity.x))
    // Negate and subtract π/2, because we want -π/2 ↦ 0 (home button down) and 0 ↦ -π/2 (home button left).
    self.tiltLabel.transform = CGAffineTransformMakeRotation(CGFloat(-gravityAngle - M_PI_2))
}
But simply "opposite gravity" has less meaning if you're trying to do this in all 3 dimensions: the direction of gravity doesn't tell you anything about the phone's angle around the gravity vector (if your phone is face-up, this is the yaw angle). To correct in three dimensions, we can use the roll, pitch, and yaw measurements instead:
// Add some perspective so the label looks (roughly) the same,
// no matter what angle the device is held at.
var t = self.view.layer.sublayerTransform
t.m34 = 1/300
self.view.layer.sublayerTransform = t

motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue()) { (motion, error) in
    let a = motion.attitude
    self.tiltLabel.layer.transform =
        CATransform3DRotate(
            CATransform3DRotate(
                CATransform3DRotate(
                    CATransform3DMakeRotation(CGFloat(a.roll), 0, -1, 0),
                    CGFloat(a.pitch), 1, 0, 0),
                CGFloat(a.yaw), 0, 0, 1),
            CGFloat(-M_PI_2), 1, 0, 0) // Extra pitch to make the label point "up" away from gravity
}

Map device tilt to physicsWorld gravity?

I'm building a "marble" labyrinth game in order to learn spritekit basics.
I want to map the gravity of the game to the tilt of the device. I've been trying to figure out how to do it but I've only been able to map the y axis successfully:
class func obtainGravity(motionManager: CMMotionManager) {
    var vec = CGVectorMake(0, 0)
    if let attitude = motionManager.deviceMotion?.attitude {
        let y = CGFloat(-attitude.pitch * 2 / M_PI) // This works, it returns 1/-1 when the device is vertical (1 when the home button is upside down)
        let x = CGFloat(attitude.roll * 2 / M_PI) // This doesn't work
        physicsWorld.gravity = CGVectorMake(x, y)
    }
}
I could map the Y axis, which makes the ball go "up" or "down" (relative to portrait mode), but I don't understand how to map the X axis (the pull from the side of the device).
For example, when the device is flat on the table (x, y) should be (0, 0), and when it's on the table screen-down it should also be (0, 0); however, attitude.roll returns -179. Also, if I hold the device vertical (in portrait mode) and turn on my feet while keeping the device still, gravity should remain (x: 0, y: 1); however, x keeps changing, since it's based on attitude.roll.
How can this be done?
The most straightforward way would be to take accelerometer updates, not device motion updates, and read the gravity vector directly from there; that's exactly what the accelerometer captures: a measure of gravity in the device's coordinate system.
Sadly I'm a Swift thickie so I can't provide example code, but you're looking to provide a block of type CMAccelerometerHandler, which will receive a CMAccelerometerData from which you can obtain the CMAcceleration struct, and that is, directly, the gravity vector to apply.
if let data = motionManager.accelerometerData {
    vec = CGVectorMake(CGFloat(data.acceleration.x), CGFloat(data.acceleration.y))
}
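
A sketch of the block-based variant the answer describes, using startAccelerometerUpdates(to:withHandler:); the scene parameter and the scaling factor here are illustrative assumptions:

import CoreMotion
import SpriteKit

let motionManager = CMMotionManager()

func startGravityUpdates(for scene: SKScene) {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 1.0 / 60.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Raw accelerometer values are in g (about -1...1 per axis at rest),
        // so scale them up to something that feels like gravity in the scene.
        scene.physicsWorld.gravity = CGVector(dx: CGFloat(a.x) * 9.8, dy: CGFloat(a.y) * 9.8)
    }
}

Note that the raw accelerometer reading includes shakes and user motion as well as gravity; deviceMotion.gravity is the filtered alternative if that becomes a problem.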
