Get the MKMapView boundary in meters

How can I find out the size of the currently displayed area of an MKMapView, ideally in meters?
MKMapView has a visibleMapRect property, which can be used to obtain an MKMapSize, for which the docs say:
The units of this value are map points.
What is a "map point"?

This might help:
iphone -- convert MKMapPoint distances to meters
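For reference: a map point is a coordinate in MapKit's flat, Mercator-projected copy of the world, which is 2^28 map points on a side (MKMapRectWorld). The conversion the linked answer relies on, MKMetersPerMapPointAtLatitude, can be sketched roughly like this (Python used as pseudocode; the constants are the usual Web Mercator values, not pulled from MapKit's source):

```python
import math

EARTH_RADIUS_M = 6378137.0            # WGS-84 equatorial radius used by Web Mercator
WORLD_SIZE_MAP_POINTS = 268435456.0   # MapKit's world is 2^28 map points on a side

def meters_per_map_point(latitude_deg):
    """Approximates MKMetersPerMapPointAtLatitude: the Mercator
    projection shrinks the ground size of one map point by
    cos(latitude) as you move away from the equator."""
    equator_meters_per_point = 2 * math.pi * EARTH_RADIUS_M / WORLD_SIZE_MAP_POINTS
    return equator_meters_per_point * math.cos(math.radians(latitude_deg))

def visible_width_meters(visible_rect_width_map_points, latitude_deg):
    """Width of the visible region in meters, given
    mapView.visibleMapRect.size.width (which is in map points)."""
    return visible_rect_width_map_points * meters_per_map_point(latitude_deg)
```

Multiplying visibleMapRect.size.width by this scale at the map's center latitude gives the on-screen width in meters.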


How is the altitude of MKMapCamera calculated?

I've been logging the altitude of an MKMapView's camera as the view is scrolled, and I've discovered a strange behavior: the altitude of the viewpoint fluctuates as the user scrolls, although the apparent zoom level of the map stays the same.
After closer inspection, it seems that the altitude decreases as the user scrolls south (toward the equator) and then increases again once the user passes the equator. I thought it might have something to do with the curvature of the Earth and possibly involve some trigonometry or cartography knowledge that I don't have.
I am trying to emulate this function. I have a CLLocationCoordinate2D and the current altitude and zoom level and I want to calculate the proper altitude for the MKMapCamera.
Thanks!
I found your post when asking the same question. I then found this post:
How to determine the correct altitude for an MKMapCamera focusing on an MKPolygon
Condensing this into the answer to your question (and mine):
double distance = MKMetersBetweenMapPoints(MKMapPointForCoordinate(pinCenter.coordinate),
                                           MKMapPointForCoordinate(pinEye.coordinate));
double altitude = distance / tan(M_PI * (15 / 180.0));
MKMapCamera *camera = [MKMapCamera cameraLookingAtCenterCoordinate:pinCenter.coordinate
                                                 fromEyeCoordinate:pinEye.coordinate
                                                       eyeAltitude:altitude];
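The distance-to-altitude step above is plain trigonometry: the eye altitude at which a given ground distance spans a fixed viewing angle. A language-agnostic sketch (the 15-degree angle is an assumption inherited from the linked post, not a MapKit constant):

```python
import math

def camera_altitude(ground_distance_m, view_angle_deg=15.0):
    """Mirrors `distance / tan(M_PI * (15/180.0))` from the snippet:
    the smaller the viewing angle, the higher the camera must sit
    for the same ground distance to fill it."""
    return ground_distance_m / math.tan(math.radians(view_angle_deg))
```

Because the ground distance between two fixed map points varies with latitude under Mercator, an altitude computed this way will also vary as the camera's center latitude changes, which is consistent with the fluctuation described in the question.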

Converting real world location to screen coordinates

I want to ask this question without thinking about a specific technology. Suppose I pull a map tile from any maps provider using my real world location. How can I mark my location on this map tile? What is the calculation used here to convert longitude and latitude to pixels?
I have worked on OpenGL methods to view data on the earth, and I'd summarize the positioning process as follows. This is by no means the only way to do it, but hopefully it helps you think about the problem.
Treat the earth's core as the origin of a sphere and convert the polar coordinates (latitude, longitude, radius) into (x, y, z) for every map point. Do the same for any particular mark you are interested in.
At this point, you would need to pick a view origin. Say this is your location.
Rotate everything by the view origin's negative longitude about the z-axis.
Rotate everything by the view origin's negative latitude about the y-axis.
At this point, the Cartesian coordinates of all the points should have the view location as the origin. Essentially, you are looking straight down at the view origin.
Finally, scale it down and translate so that (x,y) fits in your coordinate system.
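The steps above can be sketched as follows (Python used as pseudocode; the rotation signs depend on your axis convention, as noted in the comments):

```python
import math

def to_cartesian(lat_deg, lon_deg, radius=1.0):
    """Spherical (lat, lon, r) -> Cartesian (x, y, z), with the
    sphere's center (the earth's core) at the origin."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (radius * math.cos(lat) * math.cos(lon),
            radius * math.cos(lat) * math.sin(lon),
            radius * math.sin(lat))

def rotate_z(p, angle):
    x, y, z = p
    return (x * math.cos(angle) - y * math.sin(angle),
            x * math.sin(angle) + y * math.cos(angle), z)

def rotate_y(p, angle):
    x, y, z = p
    return (x * math.cos(angle) + z * math.sin(angle), y,
            -x * math.sin(angle) + z * math.cos(angle))

def project(mark, view_origin, scale=100.0):
    """Rotate the sphere so the view origin sits on the +x axis
    (i.e. we look straight down at it), then take (y, z) as screen
    (x, y) and scale to fit the target coordinate system."""
    lat0, lon0 = view_origin
    p = to_cartesian(*mark)
    p = rotate_z(p, -math.radians(lon0))  # undo the view origin's longitude
    p = rotate_y(p, math.radians(lat0))   # undo its latitude (sign follows this convention)
    _, y, z = p
    return (scale * y, scale * z)
```

With this setup the view origin projects to screen (0, 0), points east of it land at positive screen x, and points north of it at positive screen y.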

GPS ground coverage

Here is my idea to track my sprayer coverage on the farm with an android app.
Use get??Location to provide GPS Coordinates
Use Coordinates to plug into polyline with Maps API v2
Set polyline width according to boom width (this will require a pixel-to-distance conversion at different zoom levels).
How would I display ground coverage with a polyline if the footage on the map will change with zoom level? Correct me if I'm wrong, but the polyline uses pixels for its defined width. My idea would require the user to input the width of the sprayer in feet, and then the program would have to calculate a polyline width based on the zoom/pixel ratio.
You should not draw a polyline, because your spray path forms a closed polygon.
Instead, draw a polygon with line width = 0 (or the minimum line width) and fill it.
For such precision farming, better GPS devices with centimeter accuracy (using RTK) are usually used.
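For the zoom/pixel ratio mentioned in the question, Web Mercator tiles (256-pixel tiles, as used by Maps API v2) have a well-known scale formula; a sketch, with the boom width as the user's input:

```python
import math

TILE_SIZE = 256                        # standard Web Mercator tile size in pixels
EQUATOR_CIRCUMFERENCE_M = 40075016.686

def meters_per_pixel(latitude_deg, zoom):
    """Ground meters covered by one screen pixel at the given Web
    Mercator zoom level (zoom 0 = the whole equator on one tile;
    each zoom step halves the scale)."""
    return (EQUATOR_CIRCUMFERENCE_M * math.cos(math.radians(latitude_deg))
            / (TILE_SIZE * 2 ** zoom))

def boom_width_pixels(boom_width_feet, latitude_deg, zoom):
    """Polyline/stroke width in pixels for a sprayer boom given in feet."""
    boom_width_m = boom_width_feet * 0.3048
    return boom_width_m / meters_per_pixel(latitude_deg, zoom)
```

The width would need to be recomputed whenever the camera's zoom (and, strictly, its latitude) changes, which is why the answer's suggestion of a filled polygon in ground coordinates is simpler: the map scales it for you.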

CLLocationCoordinate2D distance between two points considering zoom level

I'm trying to create a solid clustering mechanism using a subclass of MKMapView. I came across a task that I've been banging my head against the wall over for quite some time now: grouping annotations into a single cluster when they overlap with one another. I can get the distance in meters between two annotations, but how can I get that distance relative to a zoom level (latitudeDelta)? Ideally I would like to know when two annotations overlap, given that their width and height are, for example, 40x40.
You can use convertCoordinate:toPointToView: to get the location of the actual screen point for an annotation:
CGPoint annotationPoint = [self.mapView convertCoordinate:annotation.coordinate
                                            toPointToView:self.mapView];
After that, use your trigonometry skills to find the distance between two points.
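Once both annotations are converted to screen points, the overlap test itself is simple; a sketch in Python, with the 40x40 size taken from the question as an assumption:

```python
ANNOTATION_SIZE = 40.0  # assumed width/height of an annotation view, in points

def annotations_overlap(p1, p2, size=ANNOTATION_SIZE):
    """True when two annotation views centered on the given screen
    points would overlap. For axis-aligned square views, comparing
    each axis separately is exact; a hypot()-based distance check
    would instead treat the views as circles."""
    dx = abs(p1[0] - p2[0])
    dy = abs(p1[1] - p2[1])
    return dx < size and dy < size
```

Because the comparison happens in screen points, the zoom level is already accounted for: the same two annotations stop overlapping as the user zooms in.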

Given a latitude, longitude and heading, how can I determine the lat/lon that is x meters from that point?

I have a series of lat/lon which represents the center of some object. I need to draw a line through this point that is x meters on either side of the center and it needs to be perpendicular to the heading (imagine a capital T)
Ultimately I want to get the lat/lon of this line's endpoints.
Thanks!
The basic calculation is in this similar question's answer: Calculate second point knowing the starting point and distance. Calculate the points for the two headings perpendicular to the main heading, the desired distance away.
Have a look at: Core Location extensions for bearing and distance
With those extensions and two points on the initial line you should be able to get the bearing, add/subtract pi/2 and find points to either side like this:
double bearing = [bottomOfT bearingInRadiansTowardsLocation:topOfT];
CLLocation *left = [topOfT newLocationAtDistance:meters
                             alongBearingradians:bearing + M_PI/2];
CLLocation *right = [topOfT newLocationAtDistance:meters
                             alongBearingradians:bearing - M_PI/2];
