In iOS 7, an MKMapView can be rotated by the user (like in the Maps app).
I have overlays, and to decide whether to display them I need to compute the zoom scale. In iOS 6, I used to do:
MKZoomScale zoomScale = self.mapView.bounds.size.width / self.mapView.visibleMapRect.size.width;
The problem is that the result of this computation changes when the user rotates the map, whereas the actual zoom scale should stay the same (the overlay tiles keep the same size; they are just rotated).
So my problem is the following: how do I compute the real zoom scale, one that does not change when the user rotates the map? If I had the rotation angle I could correct the bias, but I could not find any property on MKMapView that exposes this angle.
A workaround would be to disable map rotation, but I want to keep this feature.
Thanks in advance.
Instead of using a computed zoom scale, you can use the new MKMapCamera altitude property. It won't change as the map rotates, and I think it stays the same even if the user changes the map's pitch angle.
MKMapCamera *camera = self.mapView.camera;
CLLocationDistance altitude = camera.altitude;
if (altitude < 3000 && altitude > 1000) {
    // do something
}
If you still need to know the rotation angle, you can get that from the map camera too:
CLLocationDirection mapAngle = camera.heading;
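If you do need to correct the iOS 6 formula yourself, here is a minimal sketch of the geometry, assuming visibleMapRect is the axis-aligned bounding box of the rotated viewport (which matches observed behaviour, but isn't documented): its width grows as the map rotates, so you can divide that growth back out. The function and parameter names are illustrative; the values would come from mapView.bounds.size, mapView.visibleMapRect and mapView.camera.heading.

```swift
import Foundation

/// Hypothetical helper: recovers a rotation-invariant zoom scale.
/// Assumes visibleMapRect is the axis-aligned bounding box of the
/// viewport rotated by `headingDegrees`.
func rotationInvariantZoomScale(viewWidth: Double,
                                viewHeight: Double,
                                mapRectWidth: Double,
                                headingDegrees: Double) -> Double {
    let theta = headingDegrees * .pi / 180
    // The bounding box of the rotated viewport is this wide in screen
    // points, so divide by the map-rect width to recover the scale.
    let boundingWidthInPoints = viewWidth * abs(cos(theta))
                              + viewHeight * abs(sin(theta))
    return boundingWidthInPoints / mapRectWidth
}
```
With heading 0 this reduces to the original width / visibleMapRect.width formula.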
Related
When I set an MKMapCamera with a specific altitude on my map view (MapKit), it sometimes doesn't zoom to the correct altitude. I think it has something to do with the map not being fully loaded, so it stops higher (950 m or so) instead of at the altitude I set (about 280 m).
I initially noticed the issue when I first loaded the map, but it seems more related to the lower altitudes. Higher altitudes seem to work OK.
Here's a video demonstrating the problem: https://streamable.com/644l1
In the video I'm setting the same camera twice.
The code for setting the camera:
let distance = currentHole!.teeCoordinate.distance(to: currentHole!.greenCoordinate)
let altitude = Double(distance) * 2.5
let camera = MKMapCamera(
    lookingAtCenter: currentHole!.centerCoordinate(),
    fromDistance: altitude,
    pitch: 0.0,
    heading: currentHole!.teeCoordinate.bearing(to: currentHole!.greenCoordinate) - 20
)
mapView.setCamera(camera, animated: true)
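distance(to:) and bearing(to:) in the snippet above look like custom extensions on the coordinate type; for reference, here is a Foundation-only sketch of what such helpers typically compute (great-circle distance via the haversine formula, and the initial bearing). The Coordinate struct is a stand-in for CLLocationCoordinate2D:

```swift
import Foundation

struct Coordinate {
    var latitude: Double
    var longitude: Double
}

/// Great-circle distance in metres (haversine formula).
func distance(from a: Coordinate, to b: Coordinate) -> Double {
    let r = 6_371_000.0  // mean Earth radius in metres
    let lat1 = a.latitude * .pi / 180
    let lat2 = b.latitude * .pi / 180
    let dLat = (b.latitude - a.latitude) * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let h = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1) * cos(lat2) * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * asin(sqrt(h))
}

/// Initial bearing from a to b in degrees (0 = north, clockwise),
/// i.e. the value you would pass as the camera's heading.
func bearing(from a: Coordinate, to b: Coordinate) -> Double {
    let lat1 = a.latitude * .pi / 180
    let lat2 = b.latitude * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let deg = atan2(y, x) * 180 / .pi
    return (deg + 360).truncatingRemainder(dividingBy: 360)
}
```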
I also tried to use something like:
UIView.animate(withDuration: 1.0, animations: { () -> Void in
    self.mapView.camera = camera
}, completion: { (done) -> Void in
    print("Animation complete")
})
to do the animation instead. It works better (not perfect) when setting the duration to something very high, like 10 seconds or so.
Any ideas on what might be the issue here?
UPDATE:
It seems to only happen with "Satellite Flyover" maps. Satellite is fine.
I don't know for certain why this is happening but I have a theory. When you're using the flyover map types the minimum altitude of the camera is restricted by the tallest structure in the centre of the map.
If you go to the Maps app, set it to 3D Satellite view and go directly above a tall building (say the Empire State Building in New York), you can only pinch to zoom to a little above the height of the building. If you pan the camera away from the tall structure, you can pinch to zoom in further. The map won't let you zoom through or inside the structure. If you zoom in to the entrance of a tall building and try to pan towards it, the map will adjust the altitude upwards, without you pinching to zoom out, to stop you passing through the building.
So before the map is fully loaded, it doesn't know what the tallest structure at the centre is going to be. To prevent you zooming inside a tall structure, the map limits the minimum height. After the map is fully loaded and it knows that there are no tall structures it lets you zoom in closer.
When you set a long duration on the animation, it's giving the map a chance to load before it gets to the lower altitude. The map knows that there are no tall structures and allows further zooming in. I would say that if you tried a longer duration animation but throttled the network bandwidth it would stop working again.
Note that the Satellite mode allows you to pass through tall structures.
As a workaround, try using mapViewDidFinishLoadingMap: or mapViewDidFinishRenderingMap:fullyRendered: to know when to zoom in more.
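A minimal sketch of that workaround's control flow, with the MapKit objects reduced to placeholders so the deferral logic is visible; in a real app, mapViewDidFinishLoadingMap() here would be called from the MKMapViewDelegate method of the same name:

```swift
import Foundation

/// Placeholder for the camera change we want to apply.
typealias CameraChange = () -> Void

/// Defers a camera change until the map reports it has finished
/// loading, so the flyover altitude limit has been lifted.
final class DeferredCameraSetter {
    private var pending: CameraChange?
    private(set) var mapHasLoaded = false

    /// Runs immediately if the map is loaded; otherwise stores the
    /// change and replays it from the did-finish-loading callback.
    func setCamera(_ change: @escaping CameraChange) {
        if mapHasLoaded {
            change()
        } else {
            pending = change
        }
    }

    /// Call this from mapViewDidFinishLoadingMap(_:).
    func mapViewDidFinishLoadingMap() {
        mapHasLoaded = true
        pending?()
        pending = nil
    }
}
```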
I am working on a program that should detect pins on the map when the user approaches within some distance, and the pin has to be within a certain angle of view. I have imported MapKit and added all the pins as annotations. My app is working, but it takes into account all pins on the map. I need to take into account only the pins that are within a 30-degree angle. How do I do this?
I think you are asking how to change the angle of the view so the pins near the distance you mention are visible in a different perspective. If I'm correct, this is the answer:
You need to use the setCamera method on the MKMapView, which receives an MKMapCamera. You can instantiate a camera like this: let camera = MKMapCamera(lookingAtCenter: CLLocationCoordinate2D, fromDistance: CLLocationDistance, pitch: CGFloat, heading: CLLocationDirection),
where pitch is the angle; all the other parameters are self-explanatory. Once you create the camera, just call map.setCamera(camera, animated: true) and that's it.
It's not written anywhere in the documentation, but you can still work out the viewing angle of an MKMapCamera manually. For comparison, SCNCamera has a property called fieldOfView, the vertical viewing angle, which equals 60 degrees by default. If MKMapCamera had the same property, it would be 30 degrees.
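The 30-degree filter from the question is then just angle arithmetic. A sketch, assuming you already have the bearing from the user to each pin and the user's heading (the names here are illustrative, not MapKit API):

```swift
import Foundation

/// Smallest absolute difference between two compass angles in degrees,
/// handling the wraparound at 0/360.
func angularDifference(_ a: Double, _ b: Double) -> Double {
    let d = abs(a - b).truncatingRemainder(dividingBy: 360)
    return d > 180 ? 360 - d : d
}

/// True when the bearing from the user to a pin falls inside a
/// field of view centred on the user's heading (all in degrees).
func pinIsVisible(bearingToPin: Double,
                  userHeading: Double,
                  fieldOfView: Double = 30) -> Bool {
    return angularDifference(bearingToPin, userHeading) <= fieldOfView / 2
}
```
You would run this per annotation (after the distance check) and only show the pins for which it returns true.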
I'm trying to achieve as follows:
User should always be at the center of the screen on MKMapView.
Route is drawn on the map as user will move.
I know I can calculate the region to cover all the tracked points on the screen.
But here's my problem:
When I calculate the MKCoordinateRegion and set it, it fits the region to the screen, but as soon as I try to place the user at the center, part of the line drawn on the MKMapView goes off the screen.
Has anybody else faced this problem, or does anyone have suggestions for handling this specific case? Any help will be highly appreciated.
Thanks in advance.
I have accomplished it as follows:
Calculate the distance of the farthest point from the user's current location (or any point you want to keep at the center).
Calculate the region with your center point (the user's current location in my case) and double the distance calculated above, using the following code:
CLLocationCoordinate2D loc = [myLocation coordinate];
MKCoordinateRegion region =
    MKCoordinateRegionMakeWithDistance(loc, distance * 2, distance * 2);
Set the region on the MapView and the trail will be shown inside the screen keeping user's location at the center.
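Step 1 can be done without CoreLocation at all; here is a sketch using a flat-earth approximation for the distances (accurate enough at track scale, and the names are illustrative):

```swift
import Foundation

/// Approximate ground distance in metres between two coordinates,
/// using an equirectangular (flat-earth) approximation.
func approxDistance(lat1: Double, lon1: Double,
                    lat2: Double, lon2: Double) -> Double {
    let metersPerDegree = 111_320.0  // metres per degree of latitude
    let dLat = (lat2 - lat1) * metersPerDegree
    let dLon = (lon2 - lon1) * metersPerDegree * cos(lat1 * .pi / 180)
    return (dLat * dLat + dLon * dLon).squareRoot()
}

/// Distance of the farthest tracked point from the centre point.
func farthestDistance(centerLat: Double, centerLon: Double,
                      points: [(lat: Double, lon: Double)]) -> Double {
    return points.map {
        approxDistance(lat1: centerLat, lon1: centerLon,
                       lat2: $0.lat, lon2: $0.lon)
    }.max() ?? 0
}
```
Feeding the result times two into MKCoordinateRegionMakeWithDistance, as in the answer above, keeps the whole trail on screen with the user at the center.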
Thanks.
I am trying to focus my map view on a particular region, but in such a way that it doesn't break the current camera (viewing angle). When I call setVisibleMapRect, the camera always resets and the view becomes completely top-down.
Is there any way to either preserve the map's camera angle, or restore it after calling setVisibleMapRect? I can't seem to get this to work no matter what I try. To be clear, I obviously don't want the exact same camera, because then calling setVisibleMapRect would be pointless, but I want to keep the relative angle of the camera while still zooming in or out based on the given visible map rect.
I've even gone so far as to attempt to compute the altitude from the angle using some trigonometry, but I can't get it to work properly by setting the camera immediately after calling setVisibleMapRect. I'm guessing they're not meant to be used together.
Is trying to use setVisibleMapRect with a custom camera a bad idea? Should I just try to figure out the appropriate values to set the camera to? This is tricky because the camera properties are not intuitive, and there don't seem to be any handy helper methods to focus on a particular region or rect on the map while using a nonzero camera angle. Any pointers would be appreciated.
You can use MKMapCamera to control the pitch and altitude. You can read the settings of the camera before changing the rect and then set them again once the new rect has been set. Here is how you set up a camera:
// create camera object
MKMapCamera *newCamera = [[MKMapCamera alloc] init];

// set a new camera angle
[newCamera setCenterCoordinate:CLLocationCoordinate2DMake(lat, lon)];
[newCamera setPitch:60.0];
[newCamera setAltitude:100.0];

[mapView setCamera:newCamera animated:YES];
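If you do want to try the trigonometry route the asker mentions, one possible relation (an assumption about the camera model, not documented behaviour) is to treat the top-down altitude that setVisibleMapRect would choose as the eye-to-centre distance, and scale the altitude by the cosine of the pitch so the eye stays roughly the same distance from the centre:

```swift
import Foundation

/// Hypothetical trig helper: given the altitude a top-down camera
/// needs to frame a region, estimate the altitude for a camera
/// pitched by `pitchDegrees` that keeps the same eye-to-centre
/// distance, so the region stays roughly the same apparent size.
func pitchedAltitude(topDownAltitude: Double,
                     pitchDegrees: Double) -> Double {
    let pitch = pitchDegrees * .pi / 180
    // altitude = distance-to-centre * cos(pitch)
    return topDownAltitude * cos(pitch)
}
```
You would read the altitude after setVisibleMapRect, compute the pitched value, and set the camera with that altitude plus the saved pitch and heading.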
The easiest way to fix this issue is to set the pitch right after setVisibleMapRect, i.e.:
mapView.setVisibleMapRect(yourRect, edgePadding: yourPadding, animated: true)
mapView.camera.pitch = 45
I have a MapView that takes up the full screen and a few elements toward the top of the screen that overlay the map. These top elements potentially cover the blue location marker.
The map is currently tracking fine with MKUserTrackingModeFollow, but I'd like to move the tracking focus down about an inch (obviously the metric will vary by zoom level) to ensure the marker isn't covered. Is there a way to do this with MKUserTrackingModeFollow?
I've considered moving the map with the top elements, but due to design constraints this isn't an option.
Instead of using the userTrackingMode of the MKMapView object to follow the user's location, you could set up a delegate of CLLocationManager to receive location tracking events, and with each location update event you receive, set the center coordinate of the map view, with your vertical offset added to the coordinate value.
You can use the various Map Kit functions and MKMapView methods to convert your offset from some fraction of the map view's bounds.height to an MKMapPoint or CLLocationDistance value.
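A sketch of the offset step, expressing the offset as a fraction of the visible latitude span rather than a fixed distance so it scales with zoom level (the names are illustrative, not MapKit API; latitudeDelta would come from mapView.region.span):

```swift
import Foundation

/// Computes the map-centre latitude that places the user's location
/// `fraction` of the visible latitude span below the centre.
/// Moving the centre north pushes the blue marker toward the bottom
/// of the view, out from under overlaid UI at the top.
func offsetCenterLatitude(userLatitude: Double,
                          latitudeDelta: Double,
                          fraction: Double = 0.25) -> Double {
    return userLatitude + latitudeDelta * fraction
}
```
On each CLLocationManager update you would set the map's center coordinate to this shifted latitude and the user's unchanged longitude.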