I've been logging the altitude of an MKMapView's camera as the view is scrolled, and I've discovered a strange behavior: the altitude of the viewpoint fluctuates as the user scrolls, although the apparent zoom level of the map stays the same.
After closer inspection, it seems that the altitude decreases as the user scrolls south (toward the equator) and then increases again once the user passes the equator. I suspect it has something to do with the curvature of the Earth and may involve some trigonometry or cartography knowledge that I don't have.
I am trying to emulate this behavior. I have a CLLocationCoordinate2D, the current altitude, and the zoom level, and I want to calculate the proper altitude for the MKMapCamera.
Thanks!
I found your post when asking the same question. I then found this post:
How to determine the correct altitude for an MKMapCamera focusing on an MKPolygon
Condensing this into the answer to your question (and mine):
// Ground distance between the point the camera looks at and the eye position.
double distance = MKMetersBetweenMapPoints(MKMapPointForCoordinate(pinCenter.coordinate),
                                           MKMapPointForCoordinate(pinEye.coordinate));

// 15 here is the camera's tilt from straight down, in degrees; dividing the
// ground distance by its tangent gives the required eye altitude.
double altitude = distance / tan(M_PI * (15 / 180.0));

MKMapCamera *camera = [MKMapCamera cameraLookingAtCenterCoordinate:pinCenter.coordinate
                                                  fromEyeCoordinate:pinEye.coordinate
                                                        eyeAltitude:altitude];
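If it helps, once the camera is built the last step is just handing it to the map view (assuming you have a self.mapView outlet, which isn't shown above):

[self.mapView setCamera:camera animated:YES];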
Basically, I am finding answers that use only the two GPS coordinates, sometimes with bearing or altitude added. But how do I combine all three together to get the distance between two points?
Edit:
All my points are pretty close; distances lie between a few meters and 10 km at most. Can I optimize by dropping some of the above parameters for points this close?
Some similar questions, although not exact matches:
Calculate distance between 2 GPS coordinates: there is an answer that takes bearing into consideration, farther down the page.
Taking altitude into account when calculating geodesic distance: covers altitude in the calculation.
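Not an exact answer, but here is a rough sketch of the simple approach for points that are only a few meters to ~10 km apart, as in the edit above. The assumption (mine, not from the linked questions) is that at these scales you can take CLLocation's built-in great-circle distance for the horizontal part and combine it with the altitude difference via Pythagoras. Bearing is not needed for distance at all; it only tells you the direction from one point to the other.

#import <CoreLocation/CoreLocation.h>
#include <math.h>

// Slant distance between two locations: horizontal great-circle distance
// combined with the altitude difference. Reasonable for points a few meters
// to a few kilometers apart; not a true 3D geodesic.
static CLLocationDistance DistanceIncludingAltitude(CLLocation *a, CLLocation *b)
{
    CLLocationDistance horizontal = [a distanceFromLocation:b]; // ignores altitude
    CLLocationDistance vertical   = a.altitude - b.altitude;    // meters

    return sqrt(horizontal * horizontal + vertical * vertical);
}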
I need to measure the distance of a wall from the user. When the user opens the camera and points it at any surface, I need to get the distance. I have read some links (Is it possible to measure distance to object with camera?) and I used code to find the iPhone camera angle from here: http://blog.sallarp.com/iphone-accelerometer-device-orientation.
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // Get the current device angle
    float xx = -[acceleration x];
    float yy = [acceleration y];
    float angle = atan2(yy, xx);
}
d = h * tan(angle)
But nothing happens in the NSLog output or in the camera view.
In the comments, you shared a link to a video: http://youtube.com/watch?v=PBpRZWmPyKo.
That app is not doing anything particularly sophisticated with the camera. Rather, it appears to calculate distances using basic trigonometry, and it accomplishes this by constraining the problem in several critical ways:
First, the app requires the user to specify the height at which the phone's camera lens is being held.
Second, the user is measuring the distance to something sitting on the ground and aligning the bottom of that to some known location on the screen (meaning you have a right triangle).
Those two constraints, combined with the accelerometer and the camera's lens focal length, would allow you to calculate the distance.
If your target cross-hair is in the center of the screen, the problem simplifies greatly and becomes a matter of simple trigonometry, i.e. your d = h * tan(angle).
BTW, the "angle" code in the question measures rotation about the z-axis, the clockwise/counter-clockwise rotation as the device faces you. For this problem, though, you want to measure the rotation of the device about its x-axis, the forward/backward tilt. See https://stackoverflow.com/a/16555778/1271826 for an example of how to capture the device's orientation in space. Also, that answer uses Core Motion, whereas the article referenced in your question uses an API that has since been deprecated.
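To make that concrete, here is a rough Core Motion sketch of the d = h * tan(angle) idea; it is not the code from the video or the blog post. It assumes the user has supplied cameraHeight (how high the lens is held above the ground), that the target sits on the ground under a cross-hair in the center of the screen, and that CMAttitude's pitch is a good enough approximation of the back camera's tilt away from pointing straight down.

#import <CoreMotion/CoreMotion.h>

@interface DistanceEstimator : NSObject
@property (nonatomic, strong) CMMotionManager *motionManager;
@end

@implementation DistanceEstimator

// cameraHeight: height of the lens above the ground, in meters (user-supplied).
- (void)startEstimatingWithCameraHeight:(double)cameraHeight
{
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 0.1;

    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (!motion) return;

        // attitude.pitch is the rotation about the device's x-axis (the
        // forward/backward tilt), in radians: 0 when the phone lies flat
        // (back camera pointing straight down), pi/2 when held upright.
        double tilt = motion.attitude.pitch;

        // Only meaningful while the camera is aimed at the ground in front
        // of the user, i.e. somewhere between flat and upright.
        if (tilt > 0.0 && tilt < M_PI_2 - 0.01) {
            double distance = cameraHeight * tan(tilt);
            NSLog(@"Estimated distance to the point under the cross-hair: %.2f m", distance);
        }
    }];
}

@end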
The only way this would be possible is if you could read out the setting of the auto-focus mechanism in the lens. To my knowledge this is not possible.
I want to ask this question without tying it to a specific technology. Suppose I pull a map tile from any map provider using my real-world location. How can I mark my location on this map tile? What calculation is used to convert longitude and latitude to pixels?
I have worked on OpenGL methods to view data on the earth, and I'd summarize the positioning process as follows. This is by no means the only way to do it, but hopefully it helps you think about the problem.
Treat the earth's core as the origin of a sphere and convert the polar coordinates (latitude, longitude, radius) into Cartesian (x, y, z) for every map point. Do the same for the particular mark you are interested in.
At this point, you would need to pick a view origin. Say this is your location.
Rotate everything by the view origin's negative longitude about the z-axis.
Rotate everything by the view origin's negative latitude about the y-axis.
At this point, the Cartesian coordinates of all the points should have the view location as their origin. Essentially, you are looking down at the view origin.
Finally, scale everything down and translate it so that (x, y) fits in your coordinate system. A rough sketch of these steps is below.
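Here is that sketch (my own illustration, so the rotation sign conventions may differ from yours). It converts each coordinate to Cartesian, rotates so the view origin's longitude goes to zero and the view origin lands on the +x axis, and then reads the remaining y and z components off as east/north offsets to be scaled into pixels.

#import <CoreLocation/CoreLocation.h>
#include <math.h>

typedef struct { double x; double y; double z; } Vec3;
typedef struct { double east; double north; } PlaneOffset;

// Polar (latitude, longitude, radius) -> Cartesian (x, y, z).
static Vec3 CartesianFromCoordinate(CLLocationCoordinate2D c, double radius)
{
    double lat = c.latitude  * M_PI / 180.0;
    double lon = c.longitude * M_PI / 180.0;
    return (Vec3){ radius * cos(lat) * cos(lon),
                   radius * cos(lat) * sin(lon),
                   radius * sin(lat) };
}

// Offsets of `point` relative to `viewOrigin`, in the same units as `radius`.
// Scale and flip these to get pixel coordinates on your tile.
static PlaneOffset ProjectedOffset(CLLocationCoordinate2D viewOrigin,
                                   CLLocationCoordinate2D point,
                                   double radius)
{
    Vec3 p = CartesianFromCoordinate(point, radius);

    double lon = viewOrigin.longitude * M_PI / 180.0;
    double lat = viewOrigin.latitude  * M_PI / 180.0;

    // Rotate about the z-axis so the view origin's longitude becomes 0.
    double x1 =  p.x * cos(lon) + p.y * sin(lon);
    double y1 = -p.x * sin(lon) + p.y * cos(lon);
    double z1 =  p.z;

    // Rotate about the y-axis so the view origin moves onto the +x axis;
    // you are now looking straight down at the view origin along x.
    double z2 = -x1 * sin(lat) + z1 * cos(lat);

    return (PlaneOffset){ .east = y1, .north = z2 };
}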
How can I find out the size of the currently displayed area of an MKMapView, ideally in meters?
MKMapView has a visibleMapRect property, which can be used to obtain an MKMapSize, for which the docs say:
The units of this value are map points.
What is a "map point"?
This might help:
iphone -- convert MKMapPoint distances to meters
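Building on that, here is a small sketch (mine, assuming a self.mapView outlet) that turns the visible MKMapRect into an approximate width and height in meters by measuring across the middle of the rect with MKMetersBetweenMapPoints.

#import <MapKit/MapKit.h>

- (void)logVisibleSizeInMeters
{
    MKMapRect rect = self.mapView.visibleMapRect;

    // Measure across the middle of the visible rect, west-to-east and north-to-south.
    MKMapPoint west  = MKMapPointMake(MKMapRectGetMinX(rect), MKMapRectGetMidY(rect));
    MKMapPoint east  = MKMapPointMake(MKMapRectGetMaxX(rect), MKMapRectGetMidY(rect));
    MKMapPoint north = MKMapPointMake(MKMapRectGetMidX(rect), MKMapRectGetMinY(rect));
    MKMapPoint south = MKMapPointMake(MKMapRectGetMidX(rect), MKMapRectGetMaxY(rect));

    CLLocationDistance widthMeters  = MKMetersBetweenMapPoints(west, east);
    CLLocationDistance heightMeters = MKMetersBetweenMapPoints(north, south);

    NSLog(@"Visible area is roughly %.0f m wide by %.0f m tall", widthMeters, heightMeters);
}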
I am developing an application for iPhone and I would like to know whether longitude and latitude change with height. For example, within a building, if we move to the north-east corner room on each floor.
No, there is another value for height: altitude.
When you are in a building, with GPS you can know precisely where you are.
For example, in an elevator, your latitude and longitude don't change, but your altitude does. With this third value, you can know which floor you are on.
In general, you also have two other values:
horizontal accuracy: the precision of the longitude/latitude.
vertical accuracy: the precision of the altitude.
For more details, you can see this article, which explains all the GPS terms.
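For what it's worth, a minimal sketch of where those values live, assuming a CLLocationManager already set up with this object as its delegate:

#import <CoreLocation/CoreLocation.h>

- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray<CLLocation *> *)locations
{
    CLLocation *location = locations.lastObject;

    NSLog(@"latitude: %f  longitude: %f",
          location.coordinate.latitude, location.coordinate.longitude);
    NSLog(@"altitude: %.1f m (vertical accuracy ±%.1f m)",
          location.altitude, location.verticalAccuracy);
    NSLog(@"horizontal accuracy: ±%.1f m", location.horizontalAccuracy);
}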
Longitude and latitude can be thought of as a grid on the sphere we are standing on. If you start jumping in place, the only thing that changes with respect to the grid is your height, not your coordinates.
So longitude and latitude are independent of height.