Programmatically move in the direction the camera is facing - iOS

I'm trying to move a user position marker in the direction the camera is facing. Kind of like you would control a character in a game.
Since the camera in MapKit is aligned north, I thought that for moving forward I'd add some latitude degrees and then rotate the resulting point by the camera angle.
I have some converters from meters to the equivalent number of degrees:
class Converter {
    fileprivate static let earthRadius = 6378000.0 // meters

    class func latitudeDegrees(fromMeter meter: Double) -> Double {
        return meter / earthRadius * (180 / Double.pi)
    }

    class func longitudeDegrees(fromMeter meter: Double, latitude: Double) -> Double {
        return meter / earthRadius * (180 / Double.pi) / cos(latitude * Double.pi / 180)
    }
}
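As a rough sanity check (the numbers here are mine, not from the post), 1000 m is about 0.009 degrees of latitude:

// 1000 m along a meridian, and along a parallel at 52 degrees north
Converter.latitudeDegrees(fromMeter: 1000)                 // ≈ 0.008983
Converter.longitudeDegrees(fromMeter: 1000, latitude: 52)  // ≈ 0.014591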
So for moving forward, my code looks like this:
let latitudeDelta = Converter.latitudeDegrees(fromMeter: speed)
let y = userLocation.latitude + latitudeDelta
let x = userLocation.longitude
let a = -self.mapView.camera.heading * Double.pi / 180
self.userLocation = CLLocationCoordinate2D(latitude: x*sin(a) + y*cos(a), longitude:x*cos(a) - y*sin(a))
I've tried different approaches, and none seem to work. Is there something fundamentally wrong? Could it be that I also need to consider the Earth's curvature in the calculations?

After some more struggling, I found out that this problem is called the "Direct Geodesic Problem". Here I found all the formulas I needed, but in the end I used a great C library.
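For anyone who wants the math without a dependency, here is a minimal Swift sketch of the spherical approximation of the direct problem (my own summary, not the C library mentioned above); it ignores the Earth's ellipsoidal flattening, which is fine for small per-frame steps:

import CoreLocation

// Spherical direct-geodesic step: from `start`, move `distance` meters
// along `heading` (degrees clockwise from north). Standard great-circle
// destination formula; longitude is not re-normalized to +/-180 here.
func destination(from start: CLLocationCoordinate2D,
                 heading: Double,
                 distance: Double) -> CLLocationCoordinate2D {
    let earthRadius = 6378000.0 // meters, matching the Converter above
    let delta = distance / earthRadius          // angular distance
    let theta = heading * Double.pi / 180       // bearing in radians
    let phi1 = start.latitude * Double.pi / 180
    let lambda1 = start.longitude * Double.pi / 180
    let phi2 = asin(sin(phi1) * cos(delta) + cos(phi1) * sin(delta) * cos(theta))
    let lambda2 = lambda1 + atan2(sin(theta) * sin(delta) * cos(phi1),
                                  cos(delta) - sin(phi1) * sin(phi2))
    return CLLocationCoordinate2D(latitude: phi2 * 180 / Double.pi,
                                  longitude: lambda2 * 180 / Double.pi)
}

Moving "forward" then becomes: userLocation = destination(from: userLocation, heading: mapView.camera.heading, distance: speed).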

Related

translating CLLocation coordinates into SCNVector3 matrices

Goal: have an AR item fixed to a location in the real world based on CLLocation coordinates (LLA), and have it stay in the same place regardless of my device's orientation when the view loads.
I've been trying to convert geo LLA coordinates into an SCNVector3. The issues I'm having are as follows:
1) Lat1 and Long1 are based on my current coordinates from a CLLocationManager on my device. I'm looking for a way to get the relative SCNVector3 [x, y, z] my device needs to place an item at lat2 and long2's location in the real world.
2) How do I get this targeted Lat2, Long2 to be in the same place every time? I feel like the .gravityAndHeading alignment isn't reliably orienting the matrix toward true north. So although I sometimes get the correct distance from my device's location, the orientation is off.
3) I would like the values to be accurate enough to recognize something at least 2-3 meters away. Even when I manually program the SCNVector3 from the same position, it seems a bit off.
Any guidance would be greatly appreciated. I've already looked at http://www.mathworks.com/help/aeroblks/llatoecefposition.html and http://www.jaimerios.com/?p=39, but they seem to return the distance, not the relative position to the phone in SCNVector form.
import SceneKit
import ARKit
import CoreLocation
...
func latLonToECEF(_ lat1: Double, _ lon1: Double, _ alt: Double?) -> [String: Double] {
    // sin/cos expect radians; 6371 is the Earth's radius in kilometers,
    // so the returned deltas are in kilometers as well.
    let latR = lat1 * .pi / 180
    let lonR = lon1 * .pi / 180
    let x = 6371 * cos(latR) * cos(lonR)
    let y = 6371 * cos(latR) * sin(lonR)
    let z = 6371 * sin(latR)
    // static_lat / static_long are the target's coordinates, defined elsewhere
    let lat2 = static_lat * .pi / 180
    let long2 = static_long * .pi / 180
    let x2 = 6371 * cos(lat2) * cos(long2)
    let y2 = 6371 * cos(lat2) * sin(long2)
    let z2 = 6371 * sin(lat2)
    return ["x": (x2 - x), "y": (y2 - y), "z": (z2 - z)]
}
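One piece worth noting (my addition, not from the original question): an ECEF difference vector is expressed in the Earth-centered frame, not the camera's. The usual fix is to rotate it into the local east-north-up (ENU) frame at the device's location, then map ENU onto ARKit's .gravityAndHeading axes, where +x = east, +y = up, and -z = north. A minimal sketch:

import SceneKit

// ECEF delta (in meters; note the snippet above uses a 6371 km radius and
// therefore returns kilometers) -> SCNVector3 for a session configured with
// worldAlignment = .gravityAndHeading (+x = east, +y = up, -z = north).
// `lat` and `lon` are the device's geodetic coordinates in degrees.
func ecefDeltaToSCNVector3(dx: Double, dy: Double, dz: Double,
                           lat: Double, lon: Double) -> SCNVector3 {
    let phi = lat * .pi / 180
    let lam = lon * .pi / 180
    // Standard ECEF -> ENU rotation at (lat, lon).
    let east  = -sin(lam) * dx + cos(lam) * dy
    let north = -sin(phi) * cos(lam) * dx - sin(phi) * sin(lam) * dy + cos(phi) * dz
    let up    =  cos(phi) * cos(lam) * dx + cos(phi) * sin(lam) * dy + sin(phi) * dz
    return SCNVector3(x: Float(east), y: Float(up), z: Float(-north))
}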

Use ARKit to detect the phone's movement trail and distance

I want to log the phone's movement trail and distance, since we can use ARKit to measure real-world objects.
extension SCNVector3 {
    func distance(from vector: SCNVector3) -> Float {
        let distanceX = self.x - vector.x
        let distanceY = self.y - vector.y
        let distanceZ = self.z - vector.z
        return sqrtf((distanceX * distanceX) + (distanceY * distanceY) + (distanceZ * distanceZ))
    }
}
I can get the distance between two SCNVector3 points, but I don't know how to log the phone's current SCNVector3 in the AR scene.
Is it possible to use ARKit to measure the phone's movement distance and trail?
When you create your ARSession, the camera is located at 0,0,0 and it moves as you move the phone. So if you want to know how far the phone is from its original position, just look at the translation portion of the camera transform matrix for the current frame:
frame.camera.transform
The x, y, and z translation components are in m41, m42, and m43 respectively (on the simd_float4x4 that ARKit returns, that is the last column, columns.3). See the Apple Core Animation docs for more on matrices if you are not familiar with the math.
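A minimal sketch of logging the trail this way (my illustration, assuming you set yourself as the session's delegate): sample the camera position on every frame and accumulate segment lengths with the distance(from:) extension above:

import ARKit
import SceneKit

class TrailLogger: NSObject, ARSessionDelegate {
    private(set) var trail: [SCNVector3] = []
    private(set) var totalDistance: Float = 0

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Translation lives in the last column of the 4x4 camera transform.
        let t = frame.camera.transform.columns.3
        let position = SCNVector3(x: t.x, y: t.y, z: t.z)
        if let last = trail.last {
            totalDistance += position.distance(from: last)
        }
        trail.append(position)
    }
}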

iOS Swift Detecting Image Rotation

I have an app which animates a needle on a meter as long as the user is pressing on the screen. When the finger is lifted, I need to know the rotation angle of the needle. I remove all animations as soon as the finger is lifted, but I can't figure out how to get the current rotation angle of the needle.
It is quite simple; here is the full solution:
Sample Setup:
imageView.transform = CGAffineTransform(rotationAngle: CGFloat.pi / 6) // just to test (it is 30 in degrees and 0.523598775598299 in radians)
Code:
let rad: Double = atan2( Double(imageView.transform.b), Double(imageView.transform.a))
let deg: CGFloat = CGFloat(rad) * (CGFloat(180) / CGFloat.pi )
print(deg) // works, printing 30
where deg = degrees and rad = radians
Explanation:
The first line gets the radians, and the second multiplies the radians by 180/pi to convert them to degrees.
NOTES:
In CGAffineTransform(rotationAngle: someValue), someValue is in radians, not degrees. More information about:
radian
degree
PI
CGFloat.pi radians equals 180 degrees, so you can verify the conversion for any angle.
Let me know if this helps!
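One caveat to add (my note, not part of the answer above): if the needle is still mid-animation when the finger lifts, view.transform holds the animation's final model value rather than what is currently on screen. Reading the presentation layer before removing the animations captures the in-flight angle:

import UIKit

// Current on-screen rotation in radians, even mid-animation; falls back
// to the model layer when no animation is in flight.
func currentRotation(of view: UIView) -> CGFloat {
    let layer = view.layer.presentation() ?? view.layer
    let t = layer.transform // CATransform3D
    return CGFloat(atan2(Double(t.m12), Double(t.m11)))
}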

Is there any way to convert miles or meters to pixels on a Google map for calculating the zoom level?

I have been searching for this for a long time.
Is there any math formula to convert miles or meters to pixels for a zoom level in Google Maps? Any kind of help is appreciated. I am currently working on iOS.
I found a solution here: https://gist.github.com/ryanhanwu/4dbcdbdf384f5a3cca1f
I rewrote it for Swift 4 as below:
let topLeft: CLLocationCoordinate2D = mapView.projection.visibleRegion().farLeft
let bottomLeft: CLLocationCoordinate2D = mapView.projection.visibleRegion().nearLeft
let zoom = mapView.camera.zoom
// Use the latitude of the visible region (not the latitude *span*):
// Web Mercator pixels cover fewer meters the farther you are from the equator.
let lat = (topLeft.latitude + bottomLeft.latitude) / 2
let metersPerPixel = cos(lat * .pi / 180) * 2 * .pi * 6378137 / (256 * pow(2, Double(zoom)))
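To go the other way and pick a zoom level for a desired scale, which is what the question actually asks, you can invert that formula (my derivation, not part of the linked gist):

import Foundation

// Solve metersPerPixel = cos(lat) * 2 * pi * 6378137 / (256 * 2^zoom) for zoom.
// `lat` is in degrees; the result can be fed to the map's camera.
func zoomLevel(forMetersPerPixel metersPerPixel: Double, latitude lat: Double) -> Float {
    let groundCircumference = cos(lat * .pi / 180) * 2 * .pi * 6378137
    return Float(log2(groundCircumference / (256 * metersPerPixel)))
}

// e.g. show roughly 10 m per pixel at latitude 37:
// mapView.animate(toZoom: zoomLevel(forMetersPerPixel: 10, latitude: 37))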

convertCoordinate toPointToView returning bad results with tilted maps

I have a UIView overlaid on a map, and I'm drawing some graphics in screen space between two of the coordinates using
- (CGPoint)convertCoordinate:(CLLocationCoordinate2D)coordinate toPointToView:(UIView *)view
The problem is that when the map is very zoomed in and tilted (3D-like), the pixel position of a coordinate that is far off-screen stops being consistent. Sometimes the function returns NaN, sometimes it returns the right number, and other times it jumps to the other side of the screen.
I'm not sure how to explain it better. Has anyone run into this?
During research I found many solutions. One of them might work for you.
Solution 1:
// Equirectangular projection onto a MAP_WIDTH x MAP_HEIGHT image
int x = (int) ((MAP_WIDTH / 360.0) * (180 + lon));
int y = (int) ((MAP_HEIGHT / 180.0) * (90 - lat));
Solution 2:
func addLocation(coordinate: CLLocationCoordinate2D) {
    // max MKMapPoint values
    let maxY = Double(267995781)
    let maxX = Double(268435456)
    let mapPoint = MKMapPointForCoordinate(coordinate)
    let normalizedPointX = CGFloat(mapPoint.x / maxX)
    let normalizedPointY = CGFloat(mapPoint.y / maxY)
    print(normalizedPointX)
    print(normalizedPointY)
}
Solution 3:
x = (total width of image in px) * (180 + longitude) / 360
y = (total height of image in px) * (90 - latitude) / 180
note: when using a negative longitude or latitude, make sure to add or subtract the negative number, i.e. +(-92) or -(-35), which would actually be -92 and +35
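None of these address the tilt problem directly, so here is a defensive sketch (my suggestion, not from the solutions above): only trust the conversion for coordinates whose map point falls inside the visible map rect, and reject non-finite results, which is what a tilted 3D camera can produce for far off-screen coordinates:

import MapKit

// Returns nil instead of an unstable point for coordinates the tilted
// camera cannot project sensibly.
func safePoint(for coordinate: CLLocationCoordinate2D,
               in mapView: MKMapView) -> CGPoint? {
    let mapPoint = MKMapPointForCoordinate(coordinate)
    guard MKMapRectContainsPoint(mapView.visibleMapRect, mapPoint) else { return nil }
    let point = mapView.convert(coordinate, toPointTo: mapView)
    guard point.x.isFinite, point.y.isFinite else { return nil }
    return point
}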
