Unit of measurement between two CGPoints - ios

How do I determine the unit of measurement between two CGPoints? I basically want to convert the distance between two CGPoints into centimeters and millimeters, but I cannot find anything in the docs that shows how to implement this correctly.
CGFloat xDist = point2.x - point1.x;
CGFloat yDist = point2.y - point1.y;
CGFloat distance = sqrt((xDist * xDist) + (yDist * yDist));

There is no API to correlate a device's physical screen size to the number of points on the screen.
You also have to realize that pixels (and points) aren't necessarily square, so you may need separate horizontal and vertical scale factors.
Your only (bad) option is to hardcode values for every known iOS device and update your app every time a new device comes out.
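If you do go down that road, here is a minimal sketch of the conversion. The points-per-inch figure is an assumption you would have to hard-code per device model (163 points per inch is typical of many @2x iPhones, but verify it for every device you support):

// Assumed, per-device value; a real app would need a table keyed by device model.
static const CGFloat kPointsPerInch = 163.0;
static const CGFloat kMillimetersPerInch = 25.4;

CGFloat xDist = point2.x - point1.x;
CGFloat yDist = point2.y - point1.y;
CGFloat distanceInPoints = sqrt((xDist * xDist) + (yDist * yDist));

CGFloat distanceInInches = distanceInPoints / kPointsPerInch;
CGFloat distanceInMillimeters = distanceInInches * kMillimetersPerInch;
CGFloat distanceInCentimeters = distanceInMillimeters / 10.0;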

Related

Using CoreMotion / CoreLocation to compute north heading from back of phone

I want to get the azimuth from the back of the phone (-Z axis) for an augmented reality app. My application only runs in Landscape Right. Testing this on iPhone 5S.
Currently, I'm using the following approach:
CoreLocation heading base on back camera (Augmented reality)
I have 2 problems with this approach:
If I point the back of the device towards north so that I'm at 0 degrees and then rotate it clockwise (yaw) a full 360 degrees, I end up at -20 degrees. Counterclockwise rotations add 20 degrees instead. The pattern repeats, so rotating 720 degrees from 0 yields -40 degrees, and so on. Even when I don't perform such clean rotations but instead move the phone chaotically (spinning, shaking, etc.) and end up back where I started, I can't predict what value it will show.
The other problem is what I think is called gyro drift. If I don't move the device at all, I can clearly see the value slowly changing over time, by roughly 0.1 degrees every few seconds, sometimes in one direction, sometimes the other, until at some point it settles.
The problem is, I don't have the mathematical background to know how to account for these changes. It's especially problematic that I can't seem to compute the rotation matrix from yaw/pitch/roll from deviceMotion.attitude. I tried:
// Build the matrix by hand from the Euler angles; this is the Z-Y-X
// (yaw-pitch-roll) composition R = Rz(w) * Ry(v) * Rx(u).
// r is assumed to be a CMRotationMatrix (or any 3x3 struct with m11...m33).
float w = motion.attitude.yaw;
float v = motion.attitude.pitch;
float u = motion.attitude.roll;
r.m11 = cos(v) * cos(w);
r.m12 = sin(u) * sin(v) * cos(w) - cos(u) * sin(w);
r.m13 = sin(u) * sin(w) + cos(u) * sin(v) * cos(w);
r.m21 = cos(v) * sin(w);
r.m22 = cos(u) * cos(w) + sin(u) * sin(v) * sin(w);
r.m23 = cos(u) * sin(v) * sin(w) - sin(u) * cos(w);
r.m31 = -sin(v);
r.m32 = sin(u) * cos(v);
r.m33 = cos(u) * cos(v);
I've tried every Tait–Bryan combination (u-v-w, u-w-v, v-u-w, v-w-u, w-v-u, w-u-v); some of them came close, but still not close enough.
From my observations, it seems like the magneticHeading from CLLocationManager is much more accurate than the heading computed from CMMotionManager, but again, even if I had the correct angle, I wouldn't know where to start to get the equivalent angle in a different reference frame. Any help would be greatly appreciated.
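For reference, CMDeviceMotion can report attitude relative to magnetic north and exposes the corresponding rotation matrix directly, which can serve as a cross-check against a hand-built matrix. A minimal sketch, assuming a CMMotionManager named motionManager and that the column convention noted in the comments holds (worth verifying empirically on your device):

// Request attitude relative to magnetic north (needs the magnetometer; check
// [CMMotionManager availableAttitudeReferenceFrames] in real code).
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:
                   CMAttitudeReferenceFrameXMagneticNorthZVertical];

CMDeviceMotion *motion = motionManager.deviceMotion;
if (motion != nil) {
    CMRotationMatrix r = motion.attitude.rotationMatrix;
    // Assumption: the third column (m13, m23) is the device's +Z axis expressed
    // in the reference frame; the back camera looks along -Z.
    double backX = -r.m13;
    double backY = -r.m23;
    double headingRadians = atan2(backY, backX);   // angle of the back-camera axis from magnetic north
}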

Not enough precision for points in CGPoint type

I'm developing a map-based game using cocos2d v3.
I have a map with a size of 2^19 points. On that map I have an object that should move a short distance over time, about 60-70 points.
CGPoint offset = [_trajectoryPath offsetForNextPosition];
CGFloat x = self.position.x + offset.x;
CGFloat y = self.position.y + offset.y;
self.position = CGPointMake(x, y);
At that map size, a position can be something like {300000, 40000} points.
When I try to add a small step, let's say about {0.002f, 0.004f}, to animate the object's position, I still end up with the same {300000, 40000} points...
I understand that this happens because of the precision of float. Normalising the values by the map size, so they lie between 0 and 1.0, doesn't work either.
Is it somehow possible to increase the precision of the float type on iOS? Or maybe someone can give a hint about a possible workaround for this problem?
Thanks.
mightee.cactus, I remember we had a similar issue when adding very small numbers to very large ones with float in C.
The solution was as follows: we changed the types to double to preserve accuracy. In your case you can do all the arithmetic with doubles and convert to CGFloat only just before using CGPointMake.
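A minimal sketch of that idea, assuming the sprite keeps its own double-precision position in two instance variables (illustrative names _preciseX/_preciseY, and a hypothetical applyOffset: method) and converts to CGFloat only when handing the value to cocos2d:

// Instance variables (e.g. in the class extension), holding the exact position:
//   double _preciseX;
//   double _preciseY;

- (void)applyOffset:(CGPoint)offset
{
    // Accumulate in double precision so tiny steps are not rounded away
    // against the large map coordinates.
    _preciseX += (double)offset.x;
    _preciseY += (double)offset.y;

    // Convert only at the last moment, when setting the node's position.
    self.position = CGPointMake((CGFloat)_preciseX, (CGFloat)_preciseY);
}

The on-screen position is still rounded to CGFloat, but the accumulated position no longer loses the small steps, so the object keeps moving.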

convertCoordinate toPointToView returning bad results with tilted maps

I have a UIView overlayed on a map, and I'm drawing some graphics in screen space between two of the coordinates using
- (CGPoint)convertCoordinate:(CLLocationCoordinate2D)coordinate toPointToView:(UIView *)view
The problem is that when the map is very zoomed in and tilted (3D-like), the pixel position of a coordinate that is far off-screen stops being consistent. Sometimes the function returns NaN, sometimes it returns the right value, and other times it jumps to the other side of the screen.
I'm not sure how to explain it better. Has anyone run into this?
During my research I found several solutions. Any of them might work for you.
Solution 1:
int x = (int) ((MAP_WIDTH/360.0) * (180 + lon));
int y = (int) ((MAP_HEIGHT/180.0) * (90 - lat));
Solution 2:
func addLocation(coordinate: CLLocationCoordinate2D)
{
    // max MKMapPoint values
    let maxY = Double(267995781)
    let maxX = Double(268435456)
    let mapPoint = MKMapPointForCoordinate(coordinate)
    let normalizatePointX = CGFloat(mapPoint.x / maxX)
    let normalizatePointY = CGFloat(mapPoint.y / maxY)
    print(normalizatePointX)
    print(normalizatePointY)
}
Solution 3:
x = (total width of image in px) * (180 + longitude) / 360
y = (total height of image in px) * (90 - latitude) / 180
Note: when using a negative longitude or latitude, make sure you still add or subtract the negative number, i.e. +(-92) or -(-35), which actually gives -92 and +35.
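Taken together, solutions 1 and 3 describe the same flat equirectangular mapping. A minimal sketch of it in Objective-C (mapSize is an assumed image size in pixels; note this is a plain lat/long projection and does not reproduce MapKit's tilted 3D camera):

#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>

// Map a coordinate onto a flat, equirectangular world image:
// longitude spans -180...180 across the width, latitude 90...-90 down the height.
static CGPoint PointForCoordinate(CLLocationCoordinate2D coordinate, CGSize mapSize)
{
    CGFloat x = mapSize.width  * (CGFloat)((180.0 + coordinate.longitude) / 360.0);
    CGFloat y = mapSize.height * (CGFloat)(( 90.0 - coordinate.latitude)  / 180.0);
    return CGPointMake(x, y);
}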

How to draw line given a center point and angle in iOS?

This isn't so much an iOS question as it is a question about my current inability to do coordinate geometry. Given a CGPoint that the line should pass through and an angle in radians, how do I draw a line that extends to the bounds of the screen (an effectively infinite line)?
I am using Quartz 2D to do this, and the API for creating a line takes only two points as input. So how do I convert a point and an angle into two points on the bounds of the iOS device?
This begins with simple trigonometry. You need to calculate the x and y coordinates of the second point. With an origin of (0, 0), treating a line that goes straight to the right as 0 degrees and angles increasing counterclockwise (anticlockwise for some of you), you do:
double angle = ... // angle in radians
double newX = cos(angle);
double newY = sin(angle);
This assumes a radius of 1. Multiply each by the desired radius; pick a number bigger than the screen, such as 480 for an iPhone or 1024 for an iPad (assuming you want points and not pixels).
Then add the original point to get the final point.
Assuming you have CGPoint start, double angle, and a length, your final point is:
double endX = cos(angle) * length + start.x;
double endY = sin(angle) * length + start.y;
CGPoint end = CGPointMake(endX, endY);
It's OK if the end point is off the screen.
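Putting it together, a minimal drawRect: sketch that extends in both directions from the given point so the segment crosses the whole screen (linePoint and lineAngle are assumed properties; 1200 is just a length larger than the screen diagonal):

- (void)drawRect:(CGRect)rect
{
    CGPoint start = self.linePoint;   // assumed: the point the line passes through
    double angle = self.lineAngle;    // assumed: angle in radians
    double length = 1200.0;           // anything longer than the screen diagonal

    // Walk out from the point in both directions along the angle.
    CGPoint p1 = CGPointMake(start.x - cos(angle) * length,
                             start.y - sin(angle) * length);
    CGPoint p2 = CGPointMake(start.x + cos(angle) * length,
                             start.y + sin(angle) * length);

    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 1.0);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextMoveToPoint(ctx, p1.x, p1.y);
    CGContextAddLineToPoint(ctx, p2.x, p2.y);
    CGContextStrokePath(ctx);
}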

Algorithm for creating a circular path around a center mass?

I am simply attempting to make objects orbit around a center point.
In my example image, the green and blue objects represent objects which should keep their distance to the center point while rotating, based on an angle which I pass into the method.
I have attempted to create a function in Objective-C, but it doesn't work correctly without a static number (it rotates around the center, but not from the object's true starting point or its distance from the center).
-(void) rotateGear: (UIImageView*) view heading:(int)heading
{
    // int distanceX = 160 - view.frame.origin.x;
    // int distanceY = 240 - view.frame.origin.y;
    float x = 160 - view.image.size.width / 2 + (50 * cos(heading * (M_PI / 180)));
    float y = 240 - view.image.size.height / 2 + (50 * sin(heading * (M_PI / 180)));
    view.frame = CGRectMake(x, y, view.image.size.width, view.image.size.height);
}
My magic numbers 160 and 240 are the center of the canvas onto which I'm drawing the images. 50 is a static number (and the problem): it lets the function work partially, but without maintaining the object's starting position or its correct distance. Unfortunately I don't know what to put there.
heading is a parameter that passes in a degree, from 0 to 359. It is calculated by a timer and increments outside of this class.
Essentially I would like to be able to drop any image onto my canvas and, based on the image's starting point, have it rotate around the center of my circle. That means if I were to drop an image at point (10,10), the distance to the center of the circle would persist, using (10,10) as the starting point. The object would rotate 360 degrees around the center and return to its original starting point.
The expected result would be to pass, for instance, (10,10) into the method at zero degrees and get back something like (15,25) (not a real value) at 5 degrees.
I know this is very simple (and this problem description is entirely overkill), but I'm going cross-eyed trying to figure out where I'm hosing things up. I don't care what language your examples use, if any; I'll be able to decipher your meaning.
Failure Update
I've gotten farther, but I still cannot get the right calculation. My new code looks like the following:
heading is set to 1 degree.
-(void) rotateGear: (UIImageView*) view heading:(int)heading
{
    float y1 = view.frame.origin.y + (view.frame.size.height/2); // 152
    float x1 = view.frame.origin.x + (view.frame.size.width/2);  // 140.5
    float radius = sqrtf(powf(160 - x1, 2.0f) + powf(240 - y1, 2.0f)); // 90.13

    // I know that I need to calculate 90.13 pixels from my center, at 1 degree.
    float x = 160 + radius * (cos(heading * (M_PI / 180.0f))); // 250.12
    float y = 240 + radius * (sin(heading * (M_PI / 180.0f))); // 241.57

    // The numbers are very skewed.
    view.frame = CGRectMake(x, y, view.image.size.width, view.image.size.height);
}
I'm getting results that are nowhere close to where the point should be. The problem is with the assignment of x and y. Where am I going wrong?
You can find the distance of the point from the centre pretty easily:
radius = sqrt((160 - x)^2 + (240 - y)^2)
where (x, y) is the initial position of the centre of your object. Then just replace 50 by the radius.
http://en.wikipedia.org/wiki/Pythagorean_theorem
You can then figure out the initial angle using trigonometry (tan = opposite / adjacent, so draw a right-angled triangle using the centre mass and the centre of your orbiting object to visualize this):
angle = arctan((y - 240) / (x - 160))
if x > 160, or:
angle = arctan((y - 240) / (x - 160)) + 180
if x < 160
http://en.wikipedia.org/wiki/Inverse_trigonometric_functions
Edit: bear in mind I don't actually know any Objective-C but this is basically what I think you should do (you should be able to translate this to correct Obj-C pretty easily, this is just for demonstration):
// Your object gets created here somewhere
float x1 = view.frame.origin.x + (view.frame.size.width/2); // 140.5
float y1 = view.frame.origin.y + (view.frame.size.height/2); // 152
float radius = sqrtf(powf(160 - x1 ,2.0f) + powf(240 - y1, 2.0f)); // 90.13
// Calculate the initial angle here, as per the first part of my answer
float initialAngle = atan((y1 - 240) / (x1 - 160)) * 180.0f / M_PI;
if (x1 < 160)
    initialAngle += 180;
// Calculate the adjustment we need to add to heading
int adjustment = (int)(initialAngle - heading);
So we only execute the code above once (when the object gets created). We need to remember radius and adjustment for later. Then we alter rotateGear to take an angle and a radius as inputs instead of heading (this is much more flexible anyway):
-(void) rotateGear: (UIImageView*) view radius:(float)radius angle:(int)angle
{
    float x = 160 + radius * (cos(angle * (M_PI / 180.0f)));
    float y = 240 + radius * (sin(angle * (M_PI / 180.0f)));
    view.frame = CGRectMake(x, y, view.image.size.width, view.image.size.height);
}
And each time we want to update the position we make a call like this:
[objectName rotateGear:view radius:radius angle:(adjustment + heading)];
Btw, once you manage to get this working, I'd strongly recommend converting all your angles so you're using radians all the way through; it makes everything much neater and easier to follow!
The formula for x and y coordinates of a point on a circle, based on radians, radius, and center point:
x = cos(angle) * radius + center_x
y = sin(angle) * radius + center_y
You can find the radius with HappyPixel's formula.
Once you figure out the radius and the center point, you can simply vary the angle to get all the points on the circle that you'd want.
If I understand correctly, you want to do InitObject(x, y) followed by UpdateObject(angle), where angle sweeps from 0 to 360 (but use radians instead of degrees for the math).
So you need to track the angle and radius for each object:
InitObject(x, y)
    relative_x = x - center.x
    relative_y = y - center.y
    object.radius = sqrt((relative_x)^2 + (relative_y)^2)
    object.initial_angle = atan2(relative_y, relative_x)
And
UpdateObject(angle)
    newangle = (object.initial_angle + angle) % (2*PI)
    object.x = cos(newangle) * object.radius + center.x
    object.y = sin(newangle) * object.radius + center.y
dx = dropx - centerx;           // target - source
dy = -(dropy - centery);        // minus = invert screen coords to cartesian coords
radius = sqrt(dy*dy + dx*dx);   // faster if your compiler optimizer is bad
if dx = 0 then dx = 0.000001;   // hack/patch/fudge/nudge*
angle = atan(dy/dx);            // set this as the start angle for the angle-incrementer
Then go with the code you have and you'll be fine. You seem to be calculating radius from current position each time though? This, like the angle, should only be done once, when the object is dropped, or else the radius might not be constant.
*The fudge avoids handling the three special cases for dx = 0; if you need better than 1/100-degree precision for the start angle, handle those cases properly instead (google "Polar Arctan").
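Pulling the answers together, a minimal Objective-C sketch: compute the radius and start angle once (with atan2, which avoids the dx = 0 and quadrant special cases), then only vary the angle on each tick. The _orbitRadius/_startAngle ivars and the fixed 160/240 center are illustrative, matching the numbers used in the question:

static const CGPoint kCenter = {160.0f, 240.0f};   // center of the canvas

// Call once, when the image is dropped onto the canvas.
- (void)beginOrbitForView:(UIImageView *)view
{
    float dx = (view.frame.origin.x + view.frame.size.width  / 2.0f) - kCenter.x;
    float dy = (view.frame.origin.y + view.frame.size.height / 2.0f) - kCenter.y;
    _orbitRadius = sqrtf(dx * dx + dy * dy);   // distance to the center stays constant
    _startAngle  = atan2f(dy, dx);             // atan2 handles all quadrants
}

// Call every tick; headingDegrees is how far the object has rotated so far.
- (void)updateOrbitForView:(UIImageView *)view heading:(float)headingDegrees
{
    float angle = _startAngle + headingDegrees * (float)M_PI / 180.0f;
    float cx = kCenter.x + _orbitRadius * cosf(angle);
    float cy = kCenter.y + _orbitRadius * sinf(angle);
    view.center = CGPointMake(cx, cy);         // moving the center avoids the width/height offsets
}

Moving the view by its center property also sidesteps the half-width/half-height adjustments needed in the frame-based version above.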
