Preserve map view camera pitch (3D angle) on setVisibleMapRect - iOS

I am trying to focus my map view on a particular region, but in such a way that it doesn't break the current camera (viewing angle). When I call setVisibleMapRect, the camera always resets and the view becomes completely top-down.
Is there any way to either preserve the map's camera angle, or restore it after calling setVisibleMapRect? I can't seem to get this to work no matter what I try. To be clear, I obviously don't want the exact same camera, because then calling setVisibleMapRect would be pointless; I want to keep the "relative angle" of the camera while still zooming in or out based on the given visible map rect.
I've even gone so far as to attempt to compute the altitude from the angle using some trigonometry, but I can't get it to work properly by setting the camera immediately after calling setVisibleMapRect. I'm guessing they're not meant to be used together.
Is trying to use setVisibleMapRect with a custom camera a bad idea? Should I just try to figure out the appropriate values to set the camera to? This is tricky because the camera properties are not intuitive, and there don't seem to be any handy helper methods for focusing on a particular region or rect while using a nonzero camera pitch. Any pointers would be appreciated.

You can use MKMapCamera to control the pitch and altitude. You can read the settings of the camera before changing the rect and then set them again once the new rect has been set. Here is how you set up a camera:
// Create a camera object
MKMapCamera *newCamera = [[MKMapCamera alloc] init];

// Configure the new camera angle
[newCamera setCenterCoordinate:CLLocationCoordinate2DMake(lat, lon)];
[newCamera setPitch:60.0];
[newCamera setAltitude:100.0];

// Apply the camera to the map view
[mapView setCamera:newCamera animated:YES];
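For the original question, here is a minimal Swift sketch of that read-then-restore approach. The show(_:on:) helper is hypothetical; it assumes you already have the MKMapRect you want to frame, and that reusing the altitude MapKit picks for the flat view is close enough for your purposes:

import MapKit

// Hypothetical helper: frame targetRect while keeping the current pitch/heading
func show(_ targetRect: MKMapRect, on mapView: MKMapView) {
    // Remember the viewing angle before the rect change flattens the camera
    let savedPitch = mapView.camera.pitch
    let savedHeading = mapView.camera.heading

    // Let MapKit work out the framing for the rect (this resets to top-down)
    mapView.setVisibleMapRect(targetRect, animated: false)

    // Rebuild a camera at the new centre and altitude, with the old angle
    let camera = MKMapCamera()
    camera.centerCoordinate = mapView.centerCoordinate
    camera.altitude = mapView.camera.altitude
    camera.pitch = savedPitch
    camera.heading = savedHeading
    mapView.setCamera(camera, animated: true)
}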

The easiest way to fix this is to set the pitch right after setVisibleMapRect, i.e.:
mapView.setVisibleMapRect(yourRect, edgePadding: yourPadding, animated: true)
mapView.camera.pitch = 45

Related

MKMapCamera doesn't zoom to correct altitude

When I'm setting an MKMapCamera with a specific altitude on my map view (MapKit) it doesn't zoom to the correct altitude sometimes. I think it has something to do with the map not being fully loaded so it stops higher (950m or so) instead of the altitude I set (about 280m).
I initially noticed the issue when the map first loaded, but it seems more related to lower altitudes; higher altitudes seem to work OK.
Here's a video demonstrating the problem: https://streamable.com/644l1
In the video I'm setting the same camera twice.
The code for setting the camera:
let distance = currentHole!.teeCoordinate.distance(to: currentHole!.greenCoordinate)
let altitude = Double(distance) * 2.5
let camera = MKMapCamera(
    lookingAtCenter: currentHole!.centerCoordinate(),
    fromDistance: altitude,
    pitch: 0.0,
    heading: currentHole!.teeCoordinate.bearing(to: currentHole!.greenCoordinate) - 20
)
mapView.setCamera(camera, animated: true)
I also tried to use something like:
UIView.animate(withDuration: 1.0, animations: { () -> Void in
    self.mapView.camera = camera
}, completion: { (done) -> Void in
    print("Animation complete")
})
to do the animation instead. It works better (not perfect) when setting the duration to something very high, like 10 seconds or so.
Any ideas on what might be the issue here?
UPDATE:
It seems to only happen with "Satellite Flyover" maps. Satellite is fine.
I don't know for certain why this is happening but I have a theory. When you're using the flyover map types the minimum altitude of the camera is restricted by the tallest structure in the centre of the map.
If you go to the Maps app, set it to 3D Satellite view and go directly above a tall building (say the Empire State Building in New York) you can only pinch to zoom to a little above the height of the building. If you pan the camera away from the tall structure you can pinch to zoom in further. The map won't let you zoom through or inside the structure. If you zoom in to the entrance of a tall building and try to pan towards the building, the map will adjust the altitude upwards without you pinching to zoom out to prevent you passing through the building.
So before the map is fully loaded, it doesn't know what the tallest structure at the centre is going to be. To prevent you zooming inside a tall structure, the map limits the minimum height. After the map is fully loaded and it knows that there are no tall structures it lets you zoom in closer.
When you set a long duration on the animation, it's giving the map a chance to load before it gets to the lower altitude. The map knows that there are no tall structures and allows further zooming in. I would say that if you tried a longer duration animation but throttled the network bandwidth it would stop working again.
Note that the Satellite mode allows you to pass through tall structures.
As a workaround, try using mapViewDidFinishLoadingMap: or mapViewDidFinishRenderingMap:fullyRendered: to know when to zoom in more.
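A sketch of that workaround in Swift, assuming the view controller is the map view's delegate and keeps the target camera in a pendingCamera property (the class and property names are illustrative):

import UIKit
import MapKit

class HoleMapViewController: UIViewController, MKMapViewDelegate {
    @IBOutlet weak var mapView: MKMapView!

    // The camera we ultimately want, applied only once the flyover tiles are in
    var pendingCamera: MKMapCamera?

    func mapViewDidFinishRenderingMap(_ mapView: MKMapView, fullyRendered: Bool) {
        // Once everything is rendered, the tall-structure altitude limit is known,
        // so the camera can reach the lower altitude we asked for
        guard fullyRendered, let camera = pendingCamera else { return }
        pendingCamera = nil
        mapView.setCamera(camera, animated: true)
    }
}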

Orientation/rotation of a plane node using ARCamera information in ARKit

I am quite new to Apple's ARKit and am experimenting with it, and I have a question regarding the rotation information of the ARCamera. I am capturing photos and saving the current position, orientation and rotation of the camera with each image taken. The idea is to create 2D plane nodes with these images and have them appear in another view in the same position/orientation/rotation (with respect to the origin) as when they were captured (as if the images were frozen in the air at the moment of capture). The position information seems to work fine, but the orientation/rotation comes out completely off, as I'm having difficulty understanding when it's relevant to use self.sceneView.session.currentFrame?.camera.eulerAngles vs self.sceneView.pointOfView?.orientation vs self.sceneView.pointOfView?.rotation.
This is how I set up my 2d image planes:
let imagePlane = SCNPlane(width: self.sceneView.bounds.width/6000, height: self.sceneView.bounds.height/6000)
imagePlane.firstMaterial?.diffuse.contents = self.image//<-- UIImage here
imagePlane.firstMaterial?.lightingModel = .constant
self.planeNode = SCNNode(geometry: imagePlane)
Then I set self.planeNode.eulerAngles.x to the value I get from the capture view using self.sceneView.session.currentFrame?.camera.eulerAngles.x for x (and do the same for y and z as well).
I then set the rotation of the node as self.planeNode.rotation.x = self.rotX (where self.rotX is the value I get from self.sceneView.pointOfView?.rotation.x).
I have also tried to set it as follows:
let xAngle = SCNMatrix4MakeRotation(Float(self.rotX), 1, 0, 0)
let yAngle = SCNMatrix4MakeRotation(Float(self.rotY), 0, 1, 0)
let zAngle = SCNMatrix4MakeRotation(Float(self.rotZ), 0, 0, 1)
let rotationMatrix = SCNMatrix4Mult(SCNMatrix4Mult(xAngle, yAngle), zAngle)
self.planeNode.pivot = SCNMatrix4Mult(rotationMatrix, self.planeNode.transform)
The documentation states that eulerAngles is the “orientation” of the camera in roll, pitch and yaw values, but then what is self.sceneView.pointOfView?.orientation used for?
So when I specify the position, orientation and rotation of my plane nodes, is the information I get from eulerAngles enough to capture the correct orientation of the images?
Is my approach to this completely wrong or am I missing something obvious? Any help would be much appreciated!
If what you want to do is essentially create a billboard that faces the camera at the time of capture, you can take the transform matrix of the camera (it already has the correct orientation) and just apply a translation to it to move it to the object's location, then use that matrix to position your billboard. This way you don't have to deal with any of the angles or worry about the correct order in which to compose the rotations. The translation is easy to do because all you need is the offset between the object's location and the camera's location. One of the ARKit WWDC sessions actually has an example that sort of does this (it creates billboards at the camera's location). The only change you need to make is to translate the billboard away from the camera's position.
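A rough sketch of that approach; the helper name, the plane size and the 0.5 m offset are placeholders, and it assumes you already have an ARSCNView with a running session:

import ARKit
import SceneKit

func addBillboard(to sceneView: ARSCNView, image: UIImage, distance: Float = 0.5) {
    guard let frame = sceneView.session.currentFrame else { return }

    let plane = SCNPlane(width: 0.2, height: 0.15)
    plane.firstMaterial?.diffuse.contents = image
    plane.firstMaterial?.lightingModel = .constant
    let node = SCNNode(geometry: plane)

    // Start from the camera's transform so the node inherits its orientation,
    // then translate it along the camera's -z axis (its viewing direction)
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -distance
    node.simdTransform = frame.camera.transform * translation

    sceneView.scene.rootNode.addChildNode(node)
}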

MapKit viewing angle

I am working on a program that should detect the pins on the map when the user approaches within a certain distance, and the pin has to be within a certain angle of view. I have imported MapKit and added all the pins as annotations. My app is working, but it takes into account every pin on the map. I need to take into account only the pins that are within a 30-degree angle of view. How can I do this?
I think you are asking how to change the angle of the view so the pins near the distance you mention are shown in a different perspective. If I'm correct, this is the answer:
You need to use the setCamera method on the MKMapView, which takes an MKMapCamera. You can instantiate a camera like this: let camera = MKMapCamera(lookingAtCenter: CLLocationCoordinate2D, fromDistance: CLLocationDistance, pitch: CGFloat, heading: CLLocationDirection),
where pitch is the angle; all the other parameters are self-explanatory. Once you have created the camera, you just call mapView.setCamera(camera, animated: true) and that's it.
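For example (the coordinate, distance, pitch and heading values below are just placeholders, and mapView is assumed to be your MKMapView):

import MapKit

let center = CLLocationCoordinate2D(latitude: 40.7484, longitude: -73.9857) // placeholder
let camera = MKMapCamera(
    lookingAtCenter: center,
    fromDistance: 500,   // metres from the centre point
    pitch: 30,           // tilt in degrees; 0 is straight down
    heading: 0           // compass direction the camera faces
)
mapView.setCamera(camera, animated: true)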
It's not written anywhere in the documentation, but it is still possible to work out the viewing angle of MKMapCamera manually. For instance, SCNCamera has a property called fieldOfView, which is the vertical viewing angle and defaults to 60 degrees. If MKMapCamera had the same property, it would be 30 degrees.

tracking camera's position and rotation

I have the allowsCameraControl property set to true. I need my camera to tell me what its position and rotation are while I move it around with pinch and pan gestures, so I can later update my camera to that position. Is there some function that is called on every rendered frame where I can put a println statement? The other option I could think of was to add a didSet observer to the camera's position and rotation properties, but I have no idea how to do that when I'm not the one defining the property in the first place.
Found a way around it using custom buttons (moveLeft, moveRight, rotateLeft, etc.) to move the camera (and report its current position) around 3D space. Works great. Can't tell if mnuages's suggestion works, but it looks all right.
You can use delegate methods such as SCNSceneRendererDelegate's renderer(_:didRenderScene:atTime:), and you can access the "free" camera through the view's pointOfView.
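A small sketch of that approach (the CameraLogger class and the scnView name are illustrative):

import SceneKit

class CameraLogger: NSObject, SCNSceneRendererDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {
        // pointOfView is the node the built-in camera control actually moves
        guard let cameraNode = renderer.pointOfView else { return }
        print("position: \(cameraNode.position), rotation: \(cameraNode.rotation)")
    }
}

// Usage (keep a strong reference to the logger somewhere):
// scnView.allowsCameraControl = true
// scnView.delegate = cameraLogger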

MKMapView how to know rotation angle?

In iOS 7, an MKMapView can be rotated by the user (like in the Maps app).
I have overlays and to determine whether I can display them, I need to compute the zoom scale. In iOS 6, I used to do:
MKZoomScale zoomScale = self.mapView.bounds.size.width / self.mapView.visibleMapRect.size.width;
The problem is that the result of this computation changes when the user rotates the map, whereas the actual zoom scale should stay the same (the size of the overlay tiles is the same, it's just rotated).
So my problem is the following: how to compute the real zoom scale that does not change when the user rotates the map ? If I had the rotation angle, I could correct the "bias" but I could not find any property in MKMapView to have this angle.
A workaround would be to disable map rotation, but I want to keep this feature.
Thanks in advance.
Instead of using a computed zoom scale, you can use the new MKMapCamera altitude property. It won't change as the map rotates and I think it stays the same even if the user changes the map's pitch angle.
MKMapCamera *camera = self.mapView.camera;
CLLocationDistance altitude = camera.altitude;
if (altitude < 3000 && altitude > 1000) {
    // do something
}
If you still need to know the rotation angle, you can get that from the map camera too:
CLLocationDirection mapAngle = camera.heading;
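In Swift, a sketch of the same check run whenever the visible region changes (the altitude band is just an example):

import MapKit

// MKMapViewDelegate callback; fires after pans, zooms and rotations
func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
    let camera = mapView.camera
    let altitude = camera.altitude   // stays constant while the map is merely rotated
    let heading = camera.heading     // rotation angle, degrees clockwise from north

    if altitude < 3000 && altitude > 1000 {
        // decide here whether your overlay tiles should be shown at this zoom level
        print("altitude: \(altitude) m, heading: \(heading)°")
    }
}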
