Understanding MKCoordinateForMapPoint behaviour - iOS

In a location-based app we use MKMapPoints to store locations, for example the current user location.
When we try to use such a location on an MKMapView, to set the region that is initially displayed (zoomed in on the user), we convert it to a CLLocationCoordinate2D.
There's a convenience method for that, namely MKCoordinateForMapPoint, but during testing it gives strange results.
MKMapPoint mapPoint = MKMapPointMake(51.96, 6.3); // My area ;)
CLLocationCoordinate2D automagicCoordinate = MKCoordinateForMapPoint(mapPoint);
CLLocationCoordinate2D manualCoordinate = CLLocationCoordinate2DMake(mapPoint.x, mapPoint.y);
I would expect both automagicCoordinate and manualCoordinate to be exactly the same, but when I inspect them in the debugger I get the following result:
automagicCoordinate.latitude = (CLLocationDegrees) 85.05
automagicCoordinate.longitude = (CLLocationDegrees) -179.99
manualCoordinate.latitude = (CLLocationDegrees) 51.96
manualCoordinate.longitude = (CLLocationDegrees) 6.3
How come the coordinate created with the method is incorrect?

An MKMapPoint is not a latitude and longitude. If it was, you wouldn't need a function to "convert" it to coordinates.
As the Location Awareness Programming Guide explains in the Understanding Map Geometry section:
A map point is an x and y value on the Mercator map projection. Map points are used for many map-related calculations instead of map coordinates because they simplify the mathematics involved in the calculations.
The documentation for MKMapPoint is clearer:
If you project the curved surface of the globe onto a flat surface,
what you get is a two-dimensional version of a map where longitude
lines appear to be parallel. ...
The actual units of a map point are tied to the underlying units used
to draw the contents of an MKMapView, but you should never need to
worry about these units directly. ...
When saving map-related data to a file, you should always save
coordinate values (latitude and longitude) and not map points.
The map point 51.96, 6.3 corresponds to a coordinate at the top-left of the map projection. If you want to work with coordinates (latitude, longitude), use a CLLocationCoordinate2D to avoid confusion.
(You can technically use an MKMapPoint struct to store your coordinate values but then they don't need to be converted to coordinates and the wrong type usage will just lead to confusion.)
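If you do need to move between the two representations, the round trip is just a projection there and back. A minimal Swift sketch using the current MapKit overlay (the Objective-C equivalents are MKMapPointForCoordinate and MKCoordinateForMapPoint):

import MapKit

// Start from a real latitude/longitude pair, not raw map-point values.
let coordinate = CLLocationCoordinate2D(latitude: 51.96, longitude: 6.3)

// Project it into map points only when you need planar math or drawing.
let mapPoint = MKMapPoint(coordinate)

// Converting back recovers (approximately) the original coordinate.
let roundTripped = mapPoint.coordinate
print(roundTripped.latitude, roundTripped.longitude) // ≈ 51.96, 6.3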

Related

How To Detect A User Is Within Range Of An Annotation

So I am working on a project that involves many users placing annotations all around a map. The annotation (which is a custom image with a much larger circular range) appears on the screen and, ideally, I would like a user to:
Be notified if they are within the range of an annotation
and
Not be allowed to place another annotation within the range of another one if the circular pins overlap by, say, more than 25%
I think this is a pretty unique question and should be fun for somebody to help out with, so have fun! Thanks everybody!
You can check the distance from each annotation using
- (CLLocationDistance)distanceFromLocation:(const CLLocation *)location
This method measures the distance between the two locations by tracing
a line between them that follows the curvature of the Earth. The
resulting arc is a smooth curve and does not take into account
specific altitude changes between the two locations.
For more details, refer to the Link.
Try this:
let location = CLLocation(latitude: 1, longitude: 1) // or the user's location
let distance = location.distance(from: anotherLocation)
Edit:
As mentioned in the comments, you wanted to create an equidistant point. I suggest doing that manually:
Subtract the user's location from the annotation's location, then add that difference to the annotation's location. For example:
The user's location = (1, 1)
The annotation's location = (3, 2)
Vertical difference would be 2
Horizontal difference would be 1
Then:
(3 + 2, 2 + 1)
Your result: (5, 3)
Now you would have two points (the one you just created and the user's location) at each end with a center point (original annotation)
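Putting the two requirements together, a rough Swift sketch could look like the following. The fixed rangeRadius and the 1.5 × radius stand-in for "more than 25% overlap" are assumptions you would replace with your real range and overlap rule:

import CoreLocation

// Assumption: every annotation has the same circular range.
let rangeRadius: CLLocationDistance = 100 // meters

func isUser(_ user: CLLocation, inRangeOf annotation: CLLocation) -> Bool {
    // distance(from:) follows the curvature of the Earth, as quoted above.
    return user.distance(from: annotation) <= rangeRadius
}

func canPlaceAnnotation(at candidate: CLLocation, existing: [CLLocation]) -> Bool {
    // Rough stand-in for the overlap rule: reject the new pin if its circle's
    // centre comes closer than 1.5 × radius to any existing annotation.
    return existing.allSatisfy { candidate.distance(from: $0) >= rangeRadius * 1.5 }
}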

iOS: (Swift) How to show distance from current location to an existing annotation in the annotation subtitle

I am currently working on a map application for iOS using the Swift language. I would like a suggestion, because after I plot all the pins on the map view
(I receive the data from my server using a JSON framework called Alamofire)
I would like the subtitle of all annotations on the map to show the distance from the user's current location.
Right now it can add annotations onto the map view, but it can only show a title and subtitle based on the information received from my server.
Thank You.
If you have two CLLocation instances, you can calculate distance with the following code:
var one, two: CLLocation
// assign one and two
let distance = two.distance(from: one)
CLLocationDistance is just a Double, and the distance is calculated in meters.
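To get that distance into the subtitle, one option is to compute it while building each annotation. A minimal sketch, assuming you already have the user's CLLocation and a pin location parsed from your JSON (the coordinates below are placeholders):

import MapKit

let userLocation = CLLocation(latitude: 13.75, longitude: 100.50)
let pinLocation = CLLocation(latitude: 13.76, longitude: 100.51)

let annotation = MKPointAnnotation()
annotation.coordinate = pinLocation.coordinate
annotation.title = "Some place"

// distance(from:) returns meters; format it however suits your UI.
let meters = userLocation.distance(from: pinLocation)
annotation.subtitle = String(format: "%.1f km away", meters / 1000)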

iOS MapKit get actual visible area of MKMapView when mapView is 3D

I've tried the following to get the actual visible area of my MKMapView after a region change. None produce the desired result after the user rotates the map.
1. Use mapView.bounds and mapView.convertPoint to get NE and SW CLLocationCoordinate2D.
2. Use mapView.visibleMapRect to create NE and SW MKMapPoints and convert those points to NE and SW CLLocationCoordinate2D.
3. Use mapView.centerCoordinate and the mapView.region.span latitude and longitude deltas to calculate NE and SW latitude and longitude, which are then used for new NE and SW CLLocationCoordinate2D.
#1 and #2 come from this post, and all three work well enough until the user rotates the map, which brings the mapView.camera into play by changing its heading. Once this happens, the mapView.visibleMapRect does not match the actual visible area. I'm sure changing altitude and pitch will have similar issues. I understand why properties on MKMapView don't make sense once it goes 3D, but I don't know how to account for the mapView.camera. There is mention of this in a comment on one of the proposed answers in this post, but no solution is provided.
My question is, how can I get the area that's actually visible to the user, through the mapView.camera, accounting for heading, altitude and pitch?
I was looking for an answer to a similar situation and found this. For my project I resolved it like the following. Hope this helps.
let northWestCoordinate = self.mapView.convert(CGPoint(x: 0, y: 0), toCoordinateFrom: self.mapView)
let southEastCoordinate = self.mapView.convert(CGPoint(x: self.mapView.frame.size.width, y: self.mapView.frame.size.height), toCoordinateFrom: self.mapView)
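Note that with a rotated camera the top-left screen point is no longer the north-west-most coordinate, so if you need a latitude/longitude box that covers everything on screen, one option is to convert all four corners and take the extremes. A rough sketch (it ignores wrap-around at the antimeridian):

import MapKit

func visibleBoundingRegion(of mapView: MKMapView) -> MKCoordinateRegion {
    // Convert all four screen corners; under rotation no single corner is
    // guaranteed to be the geographic north-west or south-east extreme.
    let corners = [
        CGPoint(x: 0, y: 0),
        CGPoint(x: mapView.bounds.width, y: 0),
        CGPoint(x: 0, y: mapView.bounds.height),
        CGPoint(x: mapView.bounds.width, y: mapView.bounds.height)
    ].map { mapView.convert($0, toCoordinateFrom: mapView) }

    let lats = corners.map { $0.latitude }
    let lons = corners.map { $0.longitude }
    let center = CLLocationCoordinate2D(latitude: (lats.min()! + lats.max()!) / 2,
                                        longitude: (lons.min()! + lons.max()!) / 2)
    let span = MKCoordinateSpan(latitudeDelta: lats.max()! - lats.min()!,
                                longitudeDelta: lons.max()! - lons.min()!)
    return MKCoordinateRegion(center: center, span: span)
}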

Sort the nearest locations using coordinates?

I have a list of places with coordinates.
If a user selects a particular location, I want to suggest a list of nearby locations using coordinates.
I can achieve that by putting a loop and searching all the places within the distance with the following code:
for (int i = 0; i < locations.count; i++) {
    CLLocation *locA = [[CLLocation alloc] initWithLatitude:lat1 longitude:long1];
    CLLocation *locB = [[CLLocation alloc] initWithLatitude:lat2 longitude:long2];
    CLLocationDistance distance = [locA distanceFromLocation:locB];
    // if (distance < 10) {
    //     show the pin
    // }
}
But I guess it's not an efficient way if we have more locations in the database.
What can I do here?
Is there an alternative way?
You should filter the content to reduce the amount you extract from the database. You need to define what 'nearby' means in your case, calculate the lat/long bounding box where everything outside that box is not 'nearby' and use the box min and max lat/long values to exclude things that don't match. This is done as part of the SQL query so it's very efficient.
Now the loop logic from your question comes into play. The approach is fine, though using your own loop likely isn't best. Instead, look at using sortedArrayUsingComparator: or sortedArrayUsingFunction:context:.
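A Swift sketch of both steps, using a made-up Place model in place of your database rows (the bounding-box prefilter is what you would ideally push down into the SQL query):

import CoreLocation
import Foundation

// Hypothetical model standing in for rows from your database.
struct Place {
    let name: String
    let location: CLLocation
}

// Step 1 (ideally done in the query): keep only places inside a crude
// lat/long box around the selected location. 1 degree of latitude ≈ 111 km.
func roughPrefilter(_ places: [Place], around center: CLLocation,
                    radiusMeters: CLLocationDistance) -> [Place] {
    let latDelta = radiusMeters / 111_000
    let lonDelta = latDelta / cos(center.coordinate.latitude * .pi / 180)
    return places.filter {
        abs($0.location.coordinate.latitude - center.coordinate.latitude) <= latDelta &&
        abs($0.location.coordinate.longitude - center.coordinate.longitude) <= lonDelta
    }
}

// Step 2: sort the survivors by real great-circle distance.
func nearest(_ places: [Place], to center: CLLocation) -> [Place] {
    return places.sorted {
        $0.location.distance(from: center) < $1.location.distance(from: center)
    }
}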

iOS Compare GPS locations

I'm creating an application which needs to determine whether my current location (received in didUpdateLocation:) is present among a set of geo-coordinates I have. Now I understand that comparing double/float values for equality can be error-prone, and the precision involved in geo-coordinates is very high. A slight inaccuracy in the GPS can throw my algorithm off track, hence I need to compare with some margin of error.
How do I compare 2 CLLocations with a margin of error? Or determine the error in the location reading (I don't think this is possible, since CLLocationManager would have rectified it).
Thanks in advance :)
In each CLLocation instance there is a property declared as @property(readonly, nonatomic) CLLocationAccuracy horizontalAccuracy;. What you can do is write a routine like:
- (BOOL)isLocation:(CLLocation *)location inRangeWith:(CLLocation *)otherLocation {
    CLLocationDistance delta = [location distanceFromLocation:otherLocation];
    return (delta < location.horizontalAccuracy);
}
You can build even more complex logic, since both locations have that property.
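For example, a Swift take on the routine that accounts for the accuracy of both readings might look like this (summing the two error radii is an assumption, not the only reasonable choice):

import CoreLocation

func isLocation(_ location: CLLocation, inRangeWith other: CLLocation) -> Bool {
    let delta = location.distance(from: other)
    return delta < location.horizontalAccuracy + other.horizontalAccuracy
}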
Cheers.
You can use the CoreLocation method distanceFromLocation: to find the distance between the user's location and another point. If the distance is less than whatever threshold you decide, then you can assume the locations are the same.
CLLocationDistance threshold = 5.0; // threshold distance in meters
// note: userLocation and otherLocation are CLLocation objects
if ([userLocation distanceFromLocation:otherLocation] <= threshold) {
// same location
}
