Minimal zoom level in CameraZoomRange doesn't work correctly - iOS

I'm trying to set minimum and maximum zoom levels on an MKMapView.
When I set the minimum zoom range it doesn't behave as expected: for minCenterCoordinateDistance: 500, the visible diameter appears to be 163 meters (checked against Google Maps and by drawing a circle with MapKit). Also, when I change the aspect ratio of the mapView, the diameter changes.
Can someone explain this? I would be grateful.
mapView.cameraZoomRange = MKMapView.CameraZoomRange(
    minCenterCoordinateDistance: 500,
    maxCenterCoordinateDistance: 5000
)
------------- update ------------
Temporary workaround:
var ratio = mapSize.height / mapSize.width
ratio = max(ratio, 1) * 1.88 // empirically determined correction factor
let maxValue = maxCameraDistance * ratio
let minValue = minCameraDistance * ratio
mapView.cameraZoomRange = MKMapView.CameraZoomRange(
    minCenterCoordinateDistance: minValue,
    maxCenterCoordinateDistance: maxValue
)
But why is the magic number 1.88?
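One way to sanity-check the observed numbers: the center coordinate distance is the camera's distance from the map, not the width of the visible region, so the visible span also depends on the camera's effective field of view. If the span followed a simple pinhole relation span = 2 * d * tan(fov / 2) (my assumption, not documented MapKit behavior), the reported 163 m span at a 500 m camera distance would imply an effective FOV of roughly 18.5°:

```swift
import Foundation

// Back-calculate the effective field of view implied by the observed numbers.
// Assumption (not documented MapKit behavior): visibleSpan = 2 * d * tan(fov / 2).
func impliedFOVDegrees(cameraDistance: Double, visibleSpan: Double) -> Double {
    2 * atan(visibleSpan / (2 * cameraDistance)) * 180 / .pi
}

let fov = impliedFOVDegrees(cameraDistance: 500, visibleSpan: 163)
print(fov) // roughly 18.5 degrees
```

If that model is right, the 1.88 factor would fold together the FOV and the view's aspect ratio, which would also explain why the measured diameter changes when the aspect ratio changes.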

Related

Get area surface and ceiling height with RoomPlan

I can't see it in the docs, but do you know if it's possible to get the surface of a room in square meters and the ceiling height in meters? Thanks
let length = 4.0 // Length of the room in meters
let width = 5.0  // Width of the room in meters
let height = 2.5 // Height of the room in meters
let floorArea = length * width // floor area; length * width * height would be the volume
let ceilingHeight = height
print("The floor area of the room is \(floorArea) square meters.")
print("The height of the ceiling is \(ceilingHeight) meters.")

Polar coordinate point generation function upper bound is not 2Pi for theta?

So I wrote the following function to take a frame and a polar coordinate function, and graph it out by generating the cartesian coordinates within that frame. Here's the code.
func cartesianCoordsForPolarFunc(frame: CGRect, thetaCoefficient: Double, cosScalar: Double, iPrecision: Double, largestScalar: Double) -> Array<CGPoint> {
    // frame: The frame in which to fit this curve.
    // thetaCoefficient: The number to scale theta by inside the cos.
    // cosScalar: The number to multiply the cos by.
    // largestScalar: Largest cosScalar used in this frame so that scaling is relative.
    // iPrecision: The step for continuity. 0 < iPrecision <= 2π. Defaults to 0.1.

    // Clean inputs
    var precision: Double = 0.1 // Default precision
    if iPrecision != 0 { // Can't be 0.
        precision = iPrecision
    }

    // This is the polar function:
    // r = cosScalar * cos(thetaCoefficient * theta), 0 <= theta <= 2π
    var points: Array<CGPoint> = [] // We store the points here
    for theta in stride(from: 0, to: Double.pi * 2, by: precision) { // TODO: Try to recreate continuity. WHY IS IT NOT 2PI?
        let x = cosScalar * cos(thetaCoefficient * theta) * cos(theta) // Convert to cartesian
        let y = cosScalar * cos(thetaCoefficient * theta) * sin(theta) // Convert to cartesian
        // newValue = (max' - min') / (max - min) * (value - max) + max'
        let scaledX = (Double(frame.width) - 0) / (largestScalar * 2) * (x - largestScalar) + Double(frame.width) // Scale to the frame
        let scaledY = (Double(frame.height) - 0) / (largestScalar * 2) * (y - largestScalar) + Double(frame.height) // Scale to the frame
        points.append(CGPoint(x: scaledX, y: scaledY)) // Add the result
    }
    print("Done points")
    return points
}
The polar function I'm passing is r = 100*cos(9/4*theta), which looks like this.
I'm wondering why my function returns the following when theta goes from 0 to 2π. (Please note that in this image I'm drawing different-sized flowers, hence the repetition of the pattern.)
As you can see, it's wrong. The weird thing is that when theta goes from 0 to 2π*100 it works and I get this. (It also works for other values such as 2π*4 or 2π*20, but not 2π*2 or 2π*10.)
Why is this? Isn't the domain 0 to 2π? I noticed that when going to 2π*100 it redraws some petals, so there is a limit, but what is it?
PS: Precision here is 0.01 (enough to look continuous). In my images I'm drawing the function in different sizes and overlapping (the last image has 2 inner flowers).
No, the period of the curve is not 2π. Set up your code to draw slowly, taking 2 seconds for each 2π, and watch: it makes a whole series of full circles, and each time the local maxima and minima land at different points. That's what your petals are. It looks like your formula repeats after 8π.
The period appears to be the denominator of the theta coefficient times 2π. Your theta coefficient is 9/4, the denominator is 4, so the period is 4 * 2π, or 8π.
(That is based on playing in Wolfram Alpha and observing the results. I may be wrong.)
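The observation above can be made precise: r = cos((p/q)·theta) has period 2π·q/p in theta, and the traced curve closes once theta is a common multiple of 2π (a full revolution) and that r-period, which first happens after q/gcd(p, q) revolutions. A small sketch (ignoring the extra symmetry that can halve the period for some p/q combinations):

```swift
import Foundation

/// Smallest n such that the curve r = cos((p/q) * theta) closes after n * 2π.
/// r repeats every 2π * q / p in theta; the curve closes once theta is a
/// common multiple of 2π and that r-period, i.e. after q / gcd(p, q) turns.
func revolutionsToClose(p: Int, q: Int) -> Int {
    func gcd(_ a: Int, _ b: Int) -> Int { b == 0 ? a : gcd(b, a % b) }
    return q / gcd(p, q)
}

print(revolutionsToClose(p: 9, q: 4)) // 4 revolutions, i.e. a period of 8π
```

For the 9/4 coefficient in the question this gives 4 revolutions, matching the observed 8π, and it also explains why stopping at 2π*2 was still incomplete.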

Calculate real width based on picture, knowing distance

I know the distance between the camera and the object
I know the type of camera used
I know the width in pixel on the picture
Can I figure the real life width of the object?
You have to know the camera's angle of view. For example, the iPhone 5s is 61.4° in vertical and 48.0° in horizontal; call it alpha.
Then you calculate the width of the object this way:
viewWidth = distance * tan(alpha / 2) * 2
objWidth = viewWidth * (imageWidth / screenWidth)
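The two formulas above can be sketched as a single function. The example values (a 60° FOV, a 1000 px object in a 4000 px image) are illustrative assumptions, not data from the question:

```swift
import Foundation

/// Estimate the real-world width of an object from its pixel width in a photo,
/// given the camera's horizontal angle of view (in degrees) and the distance
/// to the object, assuming the object is roughly perpendicular to the camera.
func realWidth(distance: Double, fovDegrees: Double,
               objectPixels: Double, imagePixels: Double) -> Double {
    let alpha = fovDegrees * .pi / 180
    let viewWidth = 2 * distance * tan(alpha / 2)   // real-world width covered by the frame
    return viewWidth * (objectPixels / imagePixels) // object's fraction of the frame
}

// Example: object 1000 px wide in a 4000 px image, 2 m away, 60° horizontal FOV
let w = realWidth(distance: 2.0, fovDegrees: 60, objectPixels: 1000, imagePixels: 4000)
print(w) // ≈ 0.577 meters
```

Note this assumes the object lies near the center of the frame; significant perspective or lens distortion would require calibration beyond this simple model.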

How to get coordinates of corners from GPUImageHarrisCornerDetector?

I want to detect rectangles and their corners via the Harris corner detector. It provides a block that reports the corners:
filter.cornersDetectedBlock = { (cornerArray:UnsafeMutablePointer<GLfloat>, cornersDetected:UInt, frameTime:CMTime) in
The problem is that cornerArray is a pointer to GLfloat values between 0 and 1. I don't know how to create something like a CGPoint with x and y values from them. Any ideas how to achieve this?
Thanks!
I don't know the specifics, but in general you have to interpolate.
I assume that you're getting back values for x and y that both range from 0 to 1, where 0 is the left/bottom edge, and 1 is the right/top edge?
You just need to set up a ratio and convert from one coordinate system to the other. Mapping the normalized range 0...1 onto a pixel range 0...1000:

0.5     x
---  = ----
 1     1000

x = 0.5 * 1000 / 1 = 500

So if you get a value of 0.5, it is halfway between 0 and 1000: (1000 - 0) * 0.5. If your pixel rectangle has an origin of (0, 0), you just multiply your 0...1 x value by your pixel width and your 0...1 y value by your pixel height. If the pixel origin is not 0, you also need to add the origin.
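That interpolation can be sketched as a small helper. The flipY option is an assumption: if GPUImage reports y bottom-up (as the answer guesses), you need to flip it to match UIKit's top-left origin:

```swift
import Foundation

/// Map a normalized (0...1) corner position into pixel space.
/// flipY converts a bottom-up normalized y (an assumption about GPUImage's
/// output) into UIKit's top-left-origin convention.
func pixelPoint(nx: CGFloat, ny: CGFloat,
                width: CGFloat, height: CGFloat,
                flipY: Bool = true) -> CGPoint {
    CGPoint(x: nx * width,
            y: flipY ? (1 - ny) * height : ny * height)
}

let p = pixelPoint(nx: 0.5, ny: 0.25, width: 1000, height: 800)
// p.x == 500; p.y == 600 with the bottom-up flip applied
```

Inside cornersDetectedBlock you would read the GLfloats pairwise (x at index 2*i, y at 2*i+1) and pass each pair through a helper like this.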

convertCoordinate toPointToView returning bad results with tilted maps

I have a UIView overlaid on a map, and I'm drawing some graphics in screen space between two of the coordinates using
- (CGPoint)convertCoordinate:(CLLocationCoordinate2D)coordinate toPointToView:(UIView *)view
The problem is that when the map is very zoomed in and tilted (3D-like), the pixel position of a coordinate that is far off-screen stops being consistent. Sometimes the function returns NaN, sometimes it returns the right number, and other times it jumps to the other side of the screen.
I'm not sure how to explain it better. Has anyone run into this?
During my research I found several solutions; one of them might work for you.
Solution 1:
int x = (int) ((MAP_WIDTH / 360.0) * (180 + lon));
int y = (int) ((MAP_HEIGHT / 180.0) * (90 - lat));
Solution 2:
func addLocation(coordinate: CLLocationCoordinate2D) {
    // max MKMapPoint values
    let maxY = Double(267995781)
    let maxX = Double(268435456)
    let mapPoint = MKMapPointForCoordinate(coordinate)
    let normalizedPointX = CGFloat(mapPoint.x / maxX)
    let normalizedPointY = CGFloat(mapPoint.y / maxY)
    print(normalizedPointX)
    print(normalizedPointY)
}
Solution 3:
x = (total width of image in px) * (180 + longitude) / 360
y = (total height of image in px) * (90 - latitude) / 180
Note: when using a negative longitude or latitude, make sure to add or subtract the negative number, i.e. +(-92) or -(-35), which would actually be -92 and +35.
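Solutions 1 and 3 are the same idea, an equirectangular projection of latitude/longitude onto a flat world image. A minimal sketch of that mapping (note it ignores the tilt that caused the original problem, so it only helps for flat, untilted maps):

```swift
import Foundation

/// Equirectangular projection of a lat/lon coordinate onto a flat map image
/// that spans the whole world: longitude maps linearly to x, latitude to y.
func mapPoint(latitude: Double, longitude: Double,
              mapWidth: Double, mapHeight: Double) -> (x: Double, y: Double) {
    let x = mapWidth  * (180 + longitude) / 360
    let y = mapHeight * (90  - latitude)  / 180
    return (x, y)
}

let p = mapPoint(latitude: 0, longitude: 0, mapWidth: 360, mapHeight: 180)
// the equator/prime-meridian intersection lands at the image center: (180, 90)
```

Negative latitudes and longitudes need no special handling here; adding a negative longitude or subtracting a negative latitude does the sign adjustment the note above describes.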
