How to convert yaw, pitch and roll values to CGPoint in iOS?

I want to convert yaw, pitch and roll values to a CGPoint in order to draw shapes. But a CGPoint consists of only two float values, and I don't know how to fold the yaw value into it. My code:
float ratio = 125 / 25.0f;
CMQuaternion quat = self.motionManager.deviceMotion.attitude.quaternion;
CGFloat roll  = radiansToDegrees(atan2(2 * (quat.y * quat.w - quat.x * quat.z), 1 - 2 * quat.y * quat.y - 2 * quat.z * quat.z));
CGFloat pitch = radiansToDegrees(atan2(2 * (quat.x * quat.w + quat.y * quat.z), 1 - 2 * quat.x * quat.x - 2 * quat.z * quat.z));
CGFloat yaw   = radiansToDegrees(2 * (quat.x * quat.y + quat.w * quat.z));
CGPoint point = CGPointMake(roll * ratio, pitch * ratio);
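For reference, radiansToDegrees is assumed here to be the usual conversion macro:
#define radiansToDegrees(x) ((x) * 180.0 / M_PI)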
Can anyone please provide me some information regarding this?
Thanks in advance.

Related

Objective C compare two CGPoint to see if they are close?

So I currently get the location of a touch by using
CGPoint location = [touch locationInView:self.view];
Now what I want to do is check the location on the next touch to see if the locations are close, say 25 points on x or y axis.
There are a few posts that show how to check whether two touches are equivalent, but is there a way to calculate the distance between two points? Any info would be awesome.
To compute the distance between two CGPoints, you can use the simple Pythagorean formula:
CGFloat dX = (p2.x - p1.x);
CGFloat dY = (p2.y - p1.y);
CGFloat distance = sqrt((dX * dX) + (dY * dY));
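Equivalently, the C library's hypot function computes the same thing. Applied to the 25-point proximity check from the question, a minimal sketch (the threshold is just the value the question mentions):
CGFloat distance = hypot(p2.x - p1.x, p2.y - p1.y);
BOOL isClose = (distance <= 25.0); // within 25 points of the previous touch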

Pan to seek AVPlayer

I am trying to pan to seek forwards and backwards in my AVPlayer. It is kind of working, but the basic math that translates the pan position to a time within the asset's duration is wrong. Can anyone offer assistance?
- (void)handlePanGesture:(UIPanGestureRecognizer *)pan {
    CGPoint translate = [pan translationInView:self.view];
    CGFloat xCoord = translate.x;
    double diff = xCoord;
    //NSLog(@"%f", diff);
    CMTime duration = self.avPlayer.currentItem.asset.duration;
    float seconds = CMTimeGetSeconds(duration);
    NSLog(@"duration: %.2f", seconds);
    CGFloat gh = 0;
    if (diff >= 0) {
        // If the difference is positive
        NSLog(@"%f", diff);
        gh = diff;
    } else {
        // If the difference is negative
        NSLog(@"%f", diff * -1);
        gh = diff * -1;
    }
    float minValue = 0;
    float maxValue = 1024;
    float value = gh;
    double time = seconds * (value - minValue) / (maxValue - minValue);
    [_avPlayer seekToTime:CMTimeMakeWithSeconds(time, 10) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
    //[_avPlayer seekToTime:CMTimeMakeWithSeconds(seconds*(Float64)diff, 1024) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}
You are not normalizing the touch location to the asset's time range; there is no 1:1 relationship between pan distance in points and playback time in seconds, so you can't use the translation value directly.
Take the minimum and maximum touch location values of the pan gesture and the minimum and maximum values of the asset's duration (obviously, from zero to the length of the video), and then apply the following formula to translate the touch location to the seek time:
// Map
#define map(x, in_min, in_max, out_min, out_max) ((x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min)
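For example, a hypothetical application of that macro to the question's pan handler, assuming the pan is measured across the full width of the view:
// Map the pan's x-translation (0 through the view's width) onto the
// asset's duration (0 through seconds).
double time = map(translate.x, 0.0, self.view.bounds.size.width, 0.0, seconds);
[_avPlayer seekToTime:CMTimeMakeWithSeconds(time, 600) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];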
Here's the code I wrote that uses that formula:
- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateChanged) {
        CGPoint location = [sender locationInView:self];
        // Normalize the x location to the range -1.0 ... 1.0 around the view's center.
        float nlx = ((location.x / ((CGRectGetMidX(self.frame) / (self.frame.size.width / 2.0)))) / (self.frame.size.width / 2.0)) - 1.0;
        //float nly = ((location.y / ((CGRectGetMidY(self.view.frame) / (self.view.frame.size.width / 2.0)))) / (self.view.frame.size.width / 2.0)) - 1.0;
        nlx = nlx * 2.0;
        [self.delegate setRate:nlx];
    }
}
I culled the label that displays the rate, and the Play icon that appears while you're scrubbing and changes size depending on how fast or slow you pan the video. You didn't ask for that, but if you want it, just ask.
Oh, the "times-two" factor is intended to add an acceleration curve to the pan gesture value sent to the delegate's setRate method. You can use any formula for that, even an actual curve, like pow(nlx, 2.0) or whatever...
If you want to make it more precise and useful, you should implement different "levels of sensitivity".
Apple does that with their slider: if you drag away from the slider and then to the sides, the pace at which the video scrubs changes. The farther you are from the slider, the more precise the scrubbing gets and the less of the timeline you can reach.

How to curve CGMutablePath?

With the following shape:
I was wondering how do you get it to curve like this:
Also similarly:
I'm assuming that all of the circles/lines are packed into one CGMutablePath, and then some kind of curve, arc, or quad curve is applied to it, though I'm having trouble coming even close to replicating it. Does anyone know how to do this?
In your first example, you start with a path that has several closed subpaths. Apparently you want to warp the centers of the subpaths, but leave the individual subpaths unwarped relative to their (new) centers. I'm going to ignore that, because the solution even without that is already terribly complex.
So, let's consider how to define the “warp field”. We'll use three control points:
The warp leaves fixedPoint unchanged. It moves startPoint to endPoint by rotation and scaling, not by simply interpolating the coordinates.
Furthermore, it applies the rotation and scaling based on distance from fixedPoint, and not just based on the simple Euclidean distance. Notice that we don't want to apply any rotation or scaling to the top endpoints of the “V” shape in the picture, even though those endpoints are a measurable Euclidean distance from fixedPoint. We want to measure distance along the fixedPoint->startPoint vector, and apply more rotation/scaling as that distance increases.
This all requires some pretty heavy trigonometry. I'm not going to try to explain the details. I'm just going to dump code on you, as a category on UIBezierPath:
UIBezierPath+Rob_warp.h
#import <UIKit/UIKit.h>
@interface UIBezierPath (Rob_warp)
- (UIBezierPath *)Rob_warpedWithFixedPoint:(CGPoint)fixedPoint startPoint:(CGPoint)startPoint endPoint:(CGPoint)endPoint;
@end
UIBezierPath+Rob_warp.m
Note that you'll need the Rob_forEach category from this answer.
#import "UIBezierPath+Rob_warp.h"
#import "UIBezierPath+Rob_forEach.h"
#import <tgmath.h>
static CGPoint minus(CGPoint a, CGPoint b) {
    return CGPointMake(a.x - b.x, a.y - b.y);
}

static CGFloat length(CGPoint vector) {
    return hypot(vector.x, vector.y);
}

static CGFloat dotProduct(CGPoint a, CGPoint b) {
    return a.x * b.x + a.y * b.y;
}

static CGFloat crossProductMagnitude(CGPoint a, CGPoint b) {
    return a.x * b.y - a.y * b.x;
}
@implementation UIBezierPath (Rob_warp)
- (UIBezierPath *)Rob_warpedWithFixedPoint:(CGPoint)fixedPoint startPoint:(CGPoint)startPoint endPoint:(CGPoint)endPoint {
    CGPoint startVector = minus(startPoint, fixedPoint);
    CGFloat startLength = length(startVector);
    CGPoint endVector = minus(endPoint, fixedPoint);
    CGFloat endLength = length(endVector);
    CGFloat scale = endLength / startLength;
    CGFloat dx = dotProduct(startVector, endVector);
    CGFloat dy = crossProductMagnitude(startVector, endVector);
    CGFloat radians = atan2(dy, dx);
    CGPoint (^warp)(CGPoint) = ^(CGPoint input){
        CGAffineTransform t = CGAffineTransformMakeTranslation(-fixedPoint.x, -fixedPoint.y);
        CGPoint inputVector = minus(input, fixedPoint);
        CGFloat factor = dotProduct(inputVector, startVector) / (startLength * startLength);
        CGAffineTransform w = CGAffineTransformMakeRotation(radians * factor);
        t = CGAffineTransformConcat(t, w);
        CGFloat factoredScale = pow(scale, factor);
        t = CGAffineTransformConcat(t, CGAffineTransformMakeScale(factoredScale, factoredScale));
        // Note: next line is not the same as CGAffineTransformTranslate!
        t = CGAffineTransformConcat(t, CGAffineTransformMakeTranslation(fixedPoint.x, fixedPoint.y));
        return CGPointApplyAffineTransform(input, t);
    };
    UIBezierPath *copy = [self.class bezierPath];
    [self Rob_forEachMove:^(CGPoint destination) {
        [copy moveToPoint:warp(destination)];
    } line:^(CGPoint destination) {
        [copy addLineToPoint:warp(destination)];
    } quad:^(CGPoint control, CGPoint destination) {
        [copy addQuadCurveToPoint:warp(destination) controlPoint:warp(control)];
    } cubic:^(CGPoint control0, CGPoint control1, CGPoint destination) {
        [copy addCurveToPoint:warp(destination) controlPoint1:warp(control0) controlPoint2:warp(control1)];
    } close:^{
        [copy closePath];
    }];
    return copy;
}
@end
Ok, so how do you use this crazy thing? In the case of a path like the “V” in the example, you could do it like this:
CGRect rect = path.bounds;
CGPoint fixedPoint = CGPointMake(CGRectGetMidX(rect), CGRectGetMinY(rect));
CGPoint startPoint = CGPointMake(fixedPoint.x, CGRectGetMaxY(rect));
path = [path Rob_warpedWithFixedPoint:fixedPoint startPoint:startPoint endPoint:endAnchor];
I'm computing fixedPoint as the center of the top edge of the path's bounding box, and startPoint as the center of the bottom edge. The endAnchor is under user control in my test program. It looks like this in the simulator:
A bubble-type path looks like this:
You can find my test project here: https://github.com/mayoff/path-warp

Rotate clock's hour hand along with minute hand

I am using CALayers to make an analog clock. The real-time clock works perfectly and all the hands move smoothly, just like the default Clock app icon in iOS. However, when I try to move the clock hands by touch/dragging, the minute hand moves perfectly but the hour hand doesn't. I am trying to move the clock hands just as they move in a real gear-based clock. Any solutions on how to calculate the angle for the hour hand with respect to the movement of the minute hand, both clockwise and counter-clockwise? Actually, I want functionality somewhat similar to this app: https://itunes.apple.com/us/app/bb-teaching-clock/id612261763?mt=8
Here is the code for the minute-hand rotation (the first snippet presumably runs when the touch begins, the second as it moves):
float dx = touchPoint.x - minHand.position.x;
float dy = touchPoint.y - minHand.position.y;
deltaAngle = atan2f(dy, dx);
startTransform = minHand.transform;

float angleDifference;
float dx = pt.x - minHand.position.x;
float dy = pt.y - minHand.position.y;
float ang = atan2f(dy, dx);
angleDifference = deltaAngle - ang;
minHand.transform = CATransform3DRotate(startTransform, -angleDifference, 0, 0, 1);
NSLog(@"angleDifference: %f", angleDifference);
CGFloat hourAngle = (1/12)*(angleDifference);
// hour hand should move in the same direction as minute hand but at a much slower rate
hourHand.transform = CATransform3DRotate(hourHand.transform, -hourAngle, 0, 0, 1);
Try making the following changes, where centerPoint is the center of the clock...
float dx = touchPoint.x - centerPoint.x;
float dy = touchPoint.y - centerPoint.y;
deltaAngle = atan2f(dy,dx);
startTransform = minHand.transform;
float angleDifference;
dx = minHand.position.x - centerPoint.x;
dy = minHand.position.y - centerPoint.y;
float ang = atan2f(dy,dx);
angleDifference = ang - deltaAngle;
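One further pitfall in the question's code: (1/12) is integer division in C and evaluates to 0, so hourAngle is always zero and the hour hand never moves. Use a floating-point divisor so the hour hand turns at one-twelfth the minute hand's rate:
CGFloat hourAngle = angleDifference / 12.0f; // 12.0f forces floating-point division
hourHand.transform = CATransform3DRotate(hourHand.transform, -hourAngle, 0, 0, 1);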

Measuring tilt angle with CMMotionManager

Suppose you are holding an iphone/ipad vertically in front of you with the screen facing you, in portrait orientation. You tilt the device to one side, keeping the screen facing you. How do you measure that static tilt angle using CMMotionManager? It seems a simple question which should have a simple answer, yet I cannot find any method that does not disappear into quaternions and rotation matrices.
Can anyone point me to a worked example?
Look at gravity:
self.deviceQueue = [[NSOperationQueue alloc] init];
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 5.0 / 60.0;
// UIDevice *device = [UIDevice currentDevice];
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                                         toQueue:self.deviceQueue
                                                     withHandler:^(CMDeviceMotion *motion, NSError *error)
{
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        CGFloat x = motion.gravity.x;
        CGFloat y = motion.gravity.y;
        CGFloat z = motion.gravity.z;
    }];
}];
With this reference frame (CMAttitudeReferenceFrameXArbitraryZVertical), if z is near zero, you're holding the device on a plane perpendicular to the ground (e.g. as if you were holding it against a wall), and as you rotate it on that plane, the x and y values change. Vertical is where x is near zero and y is near -1.
Looking at this post, I notice that if you want to convert this vector into angles, you can use the following algorithms.
If you want to calculate how many degrees from vertical the device is rotated (where positive is clockwise, negative is counter-clockwise), you can calculate this as:
// how much is it rotated around the z axis
CGFloat angle = atan2(y, x) + M_PI_2; // in radians
CGFloat angleDegrees = angle * 180.0f / M_PI; // in degrees
You can use this to figure out how much to rotate the view via the layer's Core Animation transform property:
self.view.layer.transform = CATransform3DRotate(CATransform3DIdentity, -angle, 0, 0, 1);
(Personally, I update the rotation angle in the startDeviceMotionUpdates method, and update this transform in a CADisplayLink, which decouples the screen updates from the angle updates.)
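A minimal sketch of that arrangement, assuming a rotationRadians property that the motion handler keeps up to date (the property and selector names here are illustrative):
// Set up once, e.g. in viewDidLoad:
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(updateRotation:)];
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

// Called once per screen refresh:
- (void)updateRotation:(CADisplayLink *)link {
    self.view.layer.transform = CATransform3DRotate(CATransform3DIdentity, -self.rotationRadians, 0, 0, 1);
}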
You can see how far you've tilted it backward/forward via:
// how far is it tilted forward and backward
CGFloat r = sqrtf(x*x + y*y + z*z);
CGFloat tiltForwardBackward = acosf(z/r) * 180.0f / M_PI - 90.0f;
It is kind of a late answer, but you can find a working example on GitHub along with the blog article that accompanies it.
To summarize the article mentioned above: you can use quaternions to avoid the gimbal lock problem that you are probably facing when holding the iPhone vertically.
Here is the code that computes the tilt (or yaw):
CMQuaternion quat = self.motionManager.deviceMotion.attitude.quaternion;
double yaw = asin(2*(quat.x*quat.z - quat.w*quat.y));
// use the yaw value
// ...
You can even add a simple Kalman filter to smooth the yaw:
CMQuaternion quat = self.motionManager.deviceMotion.attitude.quaternion;
double yaw = asin(2*(quat.x*quat.z - quat.w*quat.y));
if (self.motionLastYaw == 0) {
    self.motionLastYaw = yaw;
}
// kalman filtering
static float q = 0.1; // process noise
static float r = 0.1; // sensor noise
static float p = 0.1; // estimated error
static float k = 0.5; // kalman filter gain
float x = self.motionLastYaw;
p = p + q;
k = p / (p + r);
x = x + k*(yaw - x);
p = (1 - k)*p;
self.motionLastYaw = x;
// use the x value as the "updated and smooth" yaw
// ...
Here is an example that rotates a UIView self.horizon to keep it level with the horizon as you tilt the device.
- (void)startDeviceMotionUpdates
{
    CMMotionManager *coreMotionManager = [[CMMotionManager alloc] init];
    NSOperationQueue *motionQueue = [[NSOperationQueue alloc] init];
    CGFloat updateInterval = 1 / 60.0;
    CMAttitudeReferenceFrame frame = CMAttitudeReferenceFrameXArbitraryCorrectedZVertical;
    [coreMotionManager setDeviceMotionUpdateInterval:updateInterval];
    [coreMotionManager startDeviceMotionUpdatesUsingReferenceFrame:frame
                                                            toQueue:motionQueue
                                                        withHandler:
     ^(CMDeviceMotion *motion, NSError *error) {
         CGFloat angle = atan2(motion.gravity.x, motion.gravity.y);
         CGAffineTransform transform = CGAffineTransformMakeRotation(angle);
         // The handler runs on motionQueue, so hop to the main queue before touching UIKit.
         dispatch_async(dispatch_get_main_queue(), ^{
             self.horizon.transform = transform;
         });
     }];
}
This is a little oversimplified; you should make sure to have only one instance of CMMotionManager in your app, so you'd want to pre-initialise it and access it via a property.
CoreMotion's CMDeviceMotion also exposes a CMAttitude object, which contains pitch, roll and yaw properties as well as the quaternion. Using this means you don't have to do the manual maths to convert acceleration to orientation.
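For example, reading the ready-made angles is as simple as:
CMAttitude *attitude = self.motionManager.deviceMotion.attitude;
double roll  = attitude.roll;  // radians
double pitch = attitude.pitch; // radians
double yaw   = attitude.yaw;   // radians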
