I have a CLLocation object containing the user's current location, and four lat/long pairs, one for each corner of a rectangle that may be rotated. I want to check whether the CLLocation coordinates fall within that rectangle.
The corner coordinates of the rectangle are:
#define NorthEast_LAT 51.514894
#define NorthEast_LNG -0.135306
#define SouthEast_LAT 51.514831
#define SouthEast_LNG -0.135153
#define NorthWest_LAT 51.514719
#define NorthWest_LNG -0.135858
#define SouthWest_LAT 51.514556
#define SouthWest_LNG -0.135714
I have tried the following code, but I think it only works when the rectangle's rotation angle is 0 degrees.
BOOL withinRect = [delegate.CLController latlngWithInBox:location
                                                  point1:CLLocationCoordinate2DMake(NorthEast_LAT, NorthEast_LNG)
                                                  point2:CLLocationCoordinate2DMake(SouthEast_LAT, SouthEast_LNG)
                                                  point3:CLLocationCoordinate2DMake(NorthWest_LAT, NorthWest_LNG)
                                                  point4:CLLocationCoordinate2DMake(SouthWest_LAT, SouthWest_LNG)];
- (BOOL)latlngWithInBox:(CLLocation *)position point1:(CLLocationCoordinate2D)point1 point2:(CLLocationCoordinate2D)point2 point3:(CLLocationCoordinate2D)point3 point4:(CLLocationCoordinate2D)point4 {
    if (position.coordinate.latitude >= point3.latitude && position.coordinate.latitude <= point2.latitude
        && position.coordinate.longitude >= point3.longitude && position.coordinate.longitude <= point2.longitude) {
        return YES;
    }
    return NO;
}
Another way to determine whether a point is within a map rectangle is to use MapKit's functions:
MKMapPointForCoordinate - converts a coordinate to a map point
MKMapRectMake - creates a rect from those points
MKMapRectContainsPoint - determines whether the specified map point lies within the rectangle
The advantage is that MapKit's map points (MKMapPoint, MKMapRect) use a Mercator projection of the map, so you do not need to do spheroid calculations yourself. Note that some of these functions are only available in iOS 4.0 and later.
UPDATE:
// 1 ------- 2
// | |
// | x |
// | |
// 3 ------- 4
// 1 = topLeftCorner
// 4 = bottomRightCorner
// x = targetCoordinate
CLLocationCoordinate2D topLeftCorner = /* some coordinate */, bottomRightCorner = /* some coordinate */;
CLLocationCoordinate2D targetCoordinate = /* some coordinate */;
MKMapPoint topLeftPoint = MKMapPointForCoordinate(topLeftCorner);
MKMapPoint bottomRightPoint = MKMapPointForCoordinate(bottomRightCorner);
MKMapRect mapRect = MKMapRectMake(topLeftPoint.x, topLeftPoint.y, bottomRightPoint.x - topLeftPoint.x, bottomRightPoint.y - topLeftPoint.y);
MKMapPoint targetPoint = MKMapPointForCoordinate(targetCoordinate);
BOOL isInside = MKMapRectContainsPoint(mapRect, targetPoint);
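One caveat worth noting: MKMapRectMake expects a non-negative width and height, so the snippet above only works if topLeftCorner actually maps to the smaller x and y map-point values. If the corners might arrive in either order, it is safer to normalize first; a minimal sketch reusing the names from above:
MKMapPoint p1 = MKMapPointForCoordinate(topLeftCorner);
MKMapPoint p2 = MKMapPointForCoordinate(bottomRightCorner);
// Use the minimum corner as the origin and absolute extents, so the
// rect is valid regardless of the order the corners were supplied in.
MKMapRect mapRect = MKMapRectMake(MIN(p1.x, p2.x),
                                  MIN(p1.y, p2.y),
                                  fabs(p2.x - p1.x),
                                  fabs(p2.y - p1.y));
BOOL isInside = MKMapRectContainsPoint(mapRect, MKMapPointForCoordinate(targetCoordinate));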
Let's ignore that a rectangle in geo-coordinates actually lies on a spheroid (the math is difficult there). So you want to find out whether a point is within a quadrangle.
The easiest way is to first add a restriction: the points have to be given in a certain order (NE, NW, SE, SW). Then treat them as ordinary 2D coordinates (longitude = x, latitude = y).
The next step is to reduce the problem: split the quadrangle into two triangles, for example NE-NW-SE and NW-SE-SW, and check whether your point is within one of those two triangles.
- (BOOL)latlngWithInBox:(CLLocation *)position point1:(CLLocationCoordinate2D)point1 point2:(CLLocationCoordinate2D)point2 point3:(CLLocationCoordinate2D)point3 point4:(CLLocationCoordinate2D)point4 {
    if (PointInTriangle(position.coordinate, point1, point2, point3) ||
        PointInTriangle(position.coordinate, point2, point3, point4)) {
        return YES;
    }
    return NO;
}
float sign(CLLocationCoordinate2D p1, CLLocationCoordinate2D p2, CLLocationCoordinate2D p3)
{
    // 2D cross product of (p1 - p3) and (p2 - p3): its sign says on which
    // side of the edge from p3 to p2 the point p1 lies.
    return (p1.longitude - p3.longitude) * (p2.latitude - p3.latitude) - (p2.longitude - p3.longitude) * (p1.latitude - p3.latitude);
}

bool PointInTriangle(CLLocationCoordinate2D pt, CLLocationCoordinate2D v1, CLLocationCoordinate2D v2, CLLocationCoordinate2D v3)
{
    // The point is inside the triangle iff it lies on the same side of all three edges.
    bool b1 = sign(pt, v1, v2) < 0.0f;
    bool b2 = sign(pt, v2, v3) < 0.0f;
    bool b3 = sign(pt, v3, v1) < 0.0f;
    // NSLog(@"b1-%@", [NSNumber numberWithBool:b1]);
    // NSLog(@"b2-%@", [NSNumber numberWithBool:b2]);
    // NSLog(@"b3-%@", [NSNumber numberWithBool:b3]);
    return ((b1 == b2) && (b2 == b3));
}
Related
I have 4 line segments, A, B, C and D. Each segment is represented by two points; e.g., segment A is represented by points A1 and A2.
What I want is:
- point X, the point where the ray along A intersects line segment B
- the distance between X and A1 (the ray's origin)
When testing for intersection, the ray along A should not:
- intersect line segment D
- intersect line segment C
How do I do this?
I finally got it working in OpenCV C++, based on this answer: https://stackoverflow.com/a/32146853/457030.
// Returns the distance from the ray origin to the intersection point, or -1 if none.
double GetRayToLineSegmentIntersection(Point2f rayOrigin, Point2f rayDirection, Point2f point1, Point2f point2)
{
    Point2f v1 = rayOrigin - point1;
    Point2f v2 = point2 - point1;
    Point2f v3 = Point2f(-rayDirection.y, rayDirection.x); // perpendicular to the ray
    float dot = v2.dot(v3);
    if (abs(dot) < 0.000001) // ray and segment are (nearly) parallel
        return -1.0f;
    float t1 = v2.cross(v1) / dot; // distance along the ray
    float t2 = v1.dot(v3) / dot;   // position along the segment, in [0, 1] on a hit
    if (t1 >= 0.0 && (t2 >= 0.0 && t2 <= 1.0))
        return t1;
    return -1.0f;
}

// Use this to normalize rayDirection.
Point2f NormalizeVector(Point2f pt)
{
    float length = sqrt(pt.x * pt.x + pt.y * pt.y);
    pt = pt / length;
    return pt;
}

// Gets the intersection point from the origin, direction and distance.
Point2f GetRayIntersectionPoint(Point2f origin, Point2f vector, double distance)
{
    Point2f pt;
    pt.x = origin.x + vector.x * distance;
    pt.y = origin.y + vector.y * distance;
    return pt;
}
Should be self-explanatory: normalize the direction from A1 towards A2, call GetRayToLineSegmentIntersection once per segment, and accept the hit on B only when the calls for C and D both return -1.
I'm using Google Maps iOS to set up geofencing around a building complex. I've created a polyline around the complex; if the user taps outside of the polyline, the marker is moved to the closest point on the polyline, otherwise the marker is simply placed. This seems to work relatively well using the method below.
However, I've noticed that it only seems to work when the tapped point is perpendicular to some point on the line segment; otherwise it comes up with strange results. I've posted my code and some screenshots below.
-(CLLocationCoordinate2D)findClosestPointWithinFence:(CLLocationCoordinate2D)pointToTest {
    CLLocationDistance smallestDistance = 0;
    CLLocationCoordinate2D closestPoint = pointToTest;
    for (int i = 0; i < [geoFencePoints count] - 1; i++) {
        CGPoint point = [[geoFencePoints objectAtIndex:i] CGPointValue];
        CGPoint point2 = [[geoFencePoints objectAtIndex:i + 1] CGPointValue];
        CLLocationCoordinate2D locationA = CLLocationCoordinate2DMake(point.x, point.y);
        CLLocationCoordinate2D locationB = CLLocationCoordinate2DMake(point2.x, point2.y);
        CLLocationCoordinate2D myLoc = [self findClosestPointOnLine:locationA secondPoint:locationB fromPoint:pointToTest];
        if (GMSGeometryIsLocationOnPath(myLoc, dealershipParameters.path, YES)) {
            if (smallestDistance == 0) {
                smallestDistance = GMSGeometryDistance(myLoc, pointToTest);
                closestPoint = myLoc;
            } else if (smallestDistance > GMSGeometryDistance(myLoc, pointToTest)) {
                smallestDistance = GMSGeometryDistance(myLoc, pointToTest);
                closestPoint = myLoc;
            }
        }
    }
    return closestPoint;
}
-(CLLocationCoordinate2D)findClosestPointOnLine:(CLLocationCoordinate2D)locationA secondPoint:(CLLocationCoordinate2D)locationB fromPoint:(CLLocationCoordinate2D)pointToTest {
    CGPoint aToP = CGPointMake(pointToTest.latitude - locationA.latitude, pointToTest.longitude - locationA.longitude);
    CGPoint aToB = CGPointMake(locationB.latitude - locationA.latitude, locationB.longitude - locationA.longitude);
    float atb2 = (aToB.x * aToB.x) + (aToB.y * aToB.y);
    float atp_dot_atb = (aToP.x * aToB.x) + (aToP.y * aToB.y);
    float t = atp_dot_atb / atb2;
    CLLocationCoordinate2D myLoc = CLLocationCoordinate2DMake(locationA.latitude + aToB.x * t, locationA.longitude + aToB.y * t);
    return myLoc;
}
-(BOOL)testIfInsideGeoFence:(CLLocationCoordinate2D)pointToTest {
    return GMSGeometryContainsLocation(pointToTest, dealershipParameters.path, YES) || GMSGeometryIsLocationOnPath(pointToTest, dealershipParameters.path, YES);
}
The first screenshot below shows the marker successfully finding the closest point: the marker off the blue line is where I initially tapped, and the marker on the blue line is the point it found. The second shows the marker failing to find the closest point: the marker on the screen is where I initially tapped, and since no proper solution was found, no second marker was placed.
Screenshot 1
Screenshot 2
I ran into a similar issue. I think what is happening is that you are treating the line segment as an infinite line. When the segment does not extend to a point that would be perpendicular to the tapped point, the closest point on the segment is one of its endpoints, not an extension of the segment.
Here is a method I am using. It takes the endpoints of the segment and returns a struct containing the nearest point on the segment and the distance from the given point. The key difference is the if-else chain that checks whether the projection actually lands on the segment. You may need to rework a few things for your purposes.
The other thing to note is that I have had more accurate results performing the math on MKMapPoints rather than on CLLocationCoordinate2D values. I think it has something to do with the earth being round or some such nonsense.
+ (struct TGShortestDistanceAndNearestCoordinate)distanceFromPoint:(CLLocationCoordinate2D)p
                                              toLineSegmentBetween:(CLLocationCoordinate2D)l1
                                                               and:(CLLocationCoordinate2D)l2 {
    return [[self class] distanceFromMapPoint:MKMapPointForCoordinate(p)
                         toLineSegmentBetween:MKMapPointForCoordinate(l1)
                                          and:MKMapPointForCoordinate(l2)];
}

+ (struct TGShortestDistanceAndNearestCoordinate)distanceFromMapPoint:(MKMapPoint)p
                                                 toLineSegmentBetween:(MKMapPoint)l1
                                                                  and:(MKMapPoint)l2 {
    double A = p.x - l1.x;
    double B = p.y - l1.y;
    double C = l2.x - l1.x;
    double D = l2.y - l1.y;

    double dot = A * C + B * D;
    double len_sq = C * C + D * D;
    // param is the projection of p onto the line through l1 and l2,
    // expressed as a fraction of the segment's length.
    double param = dot / len_sq;

    double xx, yy;
    if (param < 0 || (l1.x == l2.x && l1.y == l2.y)) {
        // Projection falls before l1, or the segment is degenerate (l1 == l2).
        xx = l1.x;
        yy = l1.y;
    }
    else if (param > 1) {
        // Projection falls beyond l2.
        xx = l2.x;
        yy = l2.y;
    }
    else {
        // Projection lands on the segment.
        xx = l1.x + param * C;
        yy = l1.y + param * D;
    }

    struct TGShortestDistanceAndNearestCoordinate result;
    MKMapPoint nearestPoint = MKMapPointMake(xx, yy);
    result.shortestDistance = MKMetersBetweenMapPoints(p, nearestPoint);
    result.nearestCoordinate = MKCoordinateForMapPoint(nearestPoint);
    return result;
}
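The TGShortestDistanceAndNearestCoordinate struct itself isn't shown in the answer; a minimal definition matching the two fields used above would be:
struct TGShortestDistanceAndNearestCoordinate {
    CLLocationDistance shortestDistance;       // meters from the query point to the segment
    CLLocationCoordinate2D nearestCoordinate;  // closest point on the segment
};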
A very elegant solution. But I'm not sure about the test in the line "if (param < 0 ...)": l1.x == l2.x iff the segment is vertical, and l1.y == l2.y iff it is horizontal. So how can the conjunction ever be true, except when l1 and l2 are identical?
I have an image of an arrow that behaves like a compass pointing to a specific location. Sometimes it works, and other times it's mirrored: if I'm facing east and the location is directly east of me, the arrow should point up, but sometimes it points down.
#define RADIANS_TO_DEGREES(radians) ((radians) * (180.0 / M_PI))
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)heading
{
    // Update the direction of the arrow.
    CGFloat degrees = [self p_calculateAngleBetween:_myLocation
                                                and:_otherLocation];
    CGFloat rads = (degrees - heading.trueHeading) * M_PI / 180;
    CGAffineTransform tr = CGAffineTransformIdentity;
    tr = CGAffineTransformConcat(tr, CGAffineTransformMakeRotation(rads));
    [_directionArrowView setTransform:tr];
}
-(CGFloat)p_calculateAngleBetween:(CLLocationCoordinate2D)coords0 and:(CLLocationCoordinate2D)coords1 {
    double x = 0, y = 0, deg = 0, deltaLon = 0;
    deltaLon = coords1.longitude - coords0.longitude;
    y = sin(deltaLon) * cos(coords1.latitude);
    x = cos(coords0.latitude) * sin(coords1.latitude) - sin(coords0.latitude) * cos(coords1.latitude) * cos(deltaLon);
    deg = RADIANS_TO_DEGREES(atan2(y, x));
    if (deg < 0) {
        deg = -deg;
    } else {
        deg = 360 - deg;
    }
    return deg;
}
Is this the correct way to calculate the angle to another location, or am I missing a step? Since the arrow sometimes points in exactly the opposite direction, my assumption is that it's an issue with my math.
To calculate radians from x and y:
double r = atan(y / x);
if (x < 0)
    r = M_PI + r;
else if (x > 0 && y < 0)
    r = 2 * M_PI + r;
There is no issue with dividing by 0 when x is zero, because the atan function handles it correctly:
If the argument is positive infinity (negative infinity), +pi/2 (-pi/2) is returned.
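As an aside, atan2 performs this same quadrant resolution internally, so the manual corrections above can arguably be replaced with a single call:
// atan2 inspects the signs of both arguments, so no quadrant fix-up is needed.
double r = atan2(y, x);
if (r < 0)
    r += 2 * M_PI; // map from (-pi, pi] into [0, 2*pi)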
How do you get the area of an MKPolygon or MKOverlay in iOS?
I have been able to break the polygon up into triangles and do some math to get the area, but that doesn't work well with irregular polygons.
I was thinking about doing something like the "A more complex case" here: http://www.mathopenref.com/coordpolygonarea2.html
I was hoping there is a simpler solution with MapKit.
Thanks,
Tim
Here's the implementation I'm using.
#define kEarthRadius 6378137
@implementation MKPolygon (AreaCalculation)
- (double)area {
    double area = 0;
    NSMutableArray *coords = [[self coordinates] mutableCopy];
    // Close the ring by repeating the first coordinate at the end.
    [coords addObject:[coords firstObject]];
    if (coords.count > 2) {
        CLLocationCoordinate2D p1, p2;
        // Spherical shoelace formula: sum the signed contribution of each edge.
        for (int i = 0; i < coords.count - 1; i++) {
            p1 = [coords[i] MKCoordinateValue];
            p2 = [coords[i + 1] MKCoordinateValue];
            area += degreesToRadians(p2.longitude - p1.longitude) * (2 + sinf(degreesToRadians(p1.latitude)) + sinf(degreesToRadians(p2.latitude)));
        }
        area = -(area * kEarthRadius * kEarthRadius / 2);
    }
    return area;
}
- (NSArray *)coordinates {
    NSMutableArray *points = [NSMutableArray arrayWithCapacity:self.pointCount];
    for (int i = 0; i < self.pointCount; i++) {
        MKMapPoint *point = &self.points[i];
        [points addObject:[NSValue valueWithMKCoordinate:MKCoordinateForMapPoint(*point)]];
    }
    return points.copy;
}
double degreesToRadians(double degrees) {
    return degrees * M_PI / 180;
}

@end
In Swift 3:
let kEarthRadius = 6378137.0

extension MKPolygon {

    func degreesToRadians(_ degrees: Double) -> Double {
        return degrees * .pi / 180.0
    }

    func area() -> Double {
        var area: Double = 0
        var coords = self.coordinates()
        // Close the ring by repeating the first coordinate at the end.
        coords.append(coords.first!)
        if coords.count > 2 {
            var p1: CLLocationCoordinate2D, p2: CLLocationCoordinate2D
            for i in 0..<coords.count - 1 {
                p1 = coords[i]
                p2 = coords[i + 1]
                area += degreesToRadians(p2.longitude - p1.longitude) * (2 + sin(degreesToRadians(p1.latitude)) + sin(degreesToRadians(p2.latitude)))
            }
            area = abs(area * kEarthRadius * kEarthRadius / 2)
        }
        return area
    }

    func coordinates() -> [CLLocationCoordinate2D] {
        var points: [CLLocationCoordinate2D] = []
        for i in 0..<self.pointCount {
            let point = self.points()[i]
            points.append(MKCoordinateForMapPoint(point))
        }
        return points
    }
}
I figured this out by looping through the points in the polygon. For every 3 consecutive points, I check whether the center of that triangle is inside the polygon. If it is, I continue; if not, I reconnect the polygon so that there are no concave dips in it. Once done, I get the triangles in the polygon and do the math to get the area, then subtract the triangles that were removed.
Hope this helps someone.
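For what it's worth, a minimal sketch of the centroid test described above; the point-in-polygon check itself is assumed to exist elsewhere (for example, a ray-casting test or CGPathContainsPoint on the polygon's path):
// Centroid of the triangle formed by three consecutive polygon vertices.
// If it lies outside the polygon, the triangle spans a concave "dip" and
// its area should be subtracted rather than added.
static CLLocationCoordinate2D TriangleCentroid(CLLocationCoordinate2D a,
                                               CLLocationCoordinate2D b,
                                               CLLocationCoordinate2D c) {
    return CLLocationCoordinate2DMake((a.latitude + b.latitude + c.latitude) / 3.0,
                                      (a.longitude + b.longitude + c.longitude) / 3.0);
}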
I am currently trying to implement the Cox-de Boor algorithm for drawing B-spline curves. I've managed to produce something acceptable with a fixed degree, number of control points and a predefined knot vector, but I want to adapt my code so that it works with any number of control points and any degree. I'm 90% certain that the problem I am currently encountering, namely that the path goes wandering off towards (0, 0), is due to me not properly calculating the knot vector. If anyone can give me a hint or two I'd be grateful.
Note that I am presently calculating each dimension (in this case just x and y) individually; I will eventually adapt this code to use the same precalculations for all dimensions. I may also adjust it to use C arrays rather than NSArrays, but from what I've seen there's no real speed advantage to doing so.
I am currently producing a degree 3 curve using 5 control points with a knot vector of {0, 0, 0, 0, 1, 2, 2, 2, 2}.
- (double)coxDeBoorForDegree:(NSUInteger)degree span:(NSUInteger)span travel:(double)travel knotVector:(NSArray *)vector
{
    double k1 = [[vector objectAtIndex:span] doubleValue];
    double k2 = [[vector objectAtIndex:span + 1] doubleValue];
    // Base case: the degree-1 basis function is 1 inside its knot span, 0 outside.
    if (degree == 1) {
        if (k1 <= travel && travel <= k2) return 1.0;
        return 0.0;
    }
    double k3 = [[vector objectAtIndex:span + degree - 1] doubleValue];
    double k4 = [[vector objectAtIndex:span + degree] doubleValue];
    double density1 = k3 - k1;
    double density2 = k4 - k2;
    double equation1 = 0.0, equation2 = 0.0;
    // Recurrence: blend the two lower-degree basis functions, skipping
    // zero-width spans to avoid division by zero.
    if (density1 > 0.0) equation1 = ((travel - k1) / density1) * [self coxDeBoorForDegree:degree - 1 span:span travel:travel knotVector:vector];
    if (density2 > 0.0) equation2 = ((k4 - travel) / density2) * [self coxDeBoorForDegree:degree - 1 span:span + 1 travel:travel knotVector:vector];
    return equation1 + equation2;
}
- (double)valueAtTravel:(double)travel degree:(NSUInteger)degree points:(NSArray *)points knotVector:(NSArray *)vector
{
    double total = 0.0;
    for (NSUInteger i = 0; i < points.count; i++) {
        float weight = [self coxDeBoorForDegree:degree + 1 span:i travel:travel knotVector:vector];
        if (weight > 0.001) total += weight * [[points objectAtIndex:i] doubleValue];
    }
    return total;
}
Never mind, I found this very useful webpage:
http://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/INT-APP/PARA-knot-generation.html
So anyone with the same problem can use the following method to generate a suitable knot vector, where 'controls' is the number of control points affecting the line segment, and 'degree' is... well, the degree of the curve! Don't forget that the degree cannot equal or exceed the number of control points of the curve:
- (NSArray *)nodeVectorForControlCount:(NSUInteger)controls degree:(NSUInteger)degree
{
    NSUInteger knotIncrement = 0;
    NSUInteger knotsRequired = controls + degree + 1;
    NSMutableArray *constructor = [[NSMutableArray alloc] initWithCapacity:knotsRequired];
    for (NSUInteger i = 0; i < knotsRequired; i++) {
        [constructor addObject:[NSNumber numberWithDouble:(double)knotIncrement]];
        // Clamp the ends: the knot value only advances for the interior knots.
        if (i >= degree && i < controls) knotIncrement++;
    }
    NSArray *returnArray = [NSArray arrayWithArray:constructor];
    [constructor release];
    return returnArray;
}
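As a sanity check, calling this with the values from the question above (5 control points, degree 3) reproduces the knot vector quoted earlier:
NSArray *knots = [self nodeVectorForControlCount:5 degree:3];
// knotsRequired = 5 + 3 + 1 = 9, and knotIncrement only advances while
// degree <= i < controls, giving {0, 0, 0, 0, 1, 2, 2, 2, 2}.
NSLog(@"%@", knots);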