Get distance between 2 latitude & longitude in iOS6+ - ios

I want to calculate the distance between two latitude/longitude pairs. I am able to calculate the distance as below:
CLLocation *currentLoc = [[CLLocation alloc] initWithLatitude:-24.4132995 longitude:121.0790024];
CLLocation *restaurnatLoc = [[CLLocation alloc] initWithLatitude:-32.8310013 longitude:150.1390075];
CLLocationDistance meters = [restaurnatLoc distanceFromLocation:currentLoc];
NSLog(@"Distance between 2 geo coordinates: %.2f meters", meters);
Now I want to get the direction from currentLoc to restaurnatLoc. For this I have the code below:
double DegreesToRadians(double degrees) { return degrees * M_PI / 180; }
double RadiansToDegrees(double radians) { return radians * 180 / M_PI; }
-(double) bearingToLocationFromCoordinate:(CLLocation*)fromLoc toCoordinate:(CLLocation*)toLoc
{
double lat1 = DegreesToRadians(fromLoc.coordinate.latitude);
double lon1 = DegreesToRadians(fromLoc.coordinate.longitude);
double lat2 = DegreesToRadians(toLoc.coordinate.latitude);
double lon2 = DegreesToRadians(toLoc.coordinate.longitude);
double dLon = lon2 - lon1;
double y = sin(dLon) * cos(lat2);
double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
double radiansBearing = atan2(y, x);
return RadiansToDegrees(radiansBearing);
}
It returns bearing = 114.975752. Now how can I decide whether the restaurant is to the North, South, East, West, NW, NE, SW, or SE of my current location?
I found one solution at this link: Direction based off of 2 Lat, Long points. But if I consider this solution, then I have a doubt about the bearing of 114 from my location (red circle) to the restaurant (green circle) as shown below. Correct me if I am wrong.
The current location is in Western Australia and the restaurant location is in Sydney, as shown in Google Maps.
Can anybody tell me what's going wrong here? Thanks.
///////////////////////////// Update /////////////////////////////
My compass diagram was wrong. Here is the correct diagram, all thanks to AlexWien.
Now I am getting the correct output.

Your compass rose is totally wrong. Have you ever looked at a compass? Open the iPhone compass app and look where 90 degrees is located. It is east, not west like in your graphic.
Geographical direction is measured clockwise!
So 114 degrees is east, which matches your expectation.

Related

Translating CLLocation coordinates into SCNVector3 matrices

Goal: Have an AR item fixed to a location in the real world based on CLLocation coordinates (LLA), and have it in the same place regardless of my orientation on view load.
I've been trying to convert geo LLA coordinates into an SCNVector3 matrix. The issues I'm having are as follows:
1) Lat1 and Long1 are based on my current coordinates from a CLLocationManager on my device. I'm looking for a way to get the relative SCNVector3 [x, y, z] needed by my device to place an item at lat2 and long2's location in the real world.
2) How do I get this targeted Lat2, Long2 to be in the same place every time? I feel like the .gravityAndHeading alignment isn't working to always orient the matrix facing true north. So though sometimes I may get the correct distance from my device location, the orientation is off.
3) I would like the values to be accurate enough to recognize something at least 2-3 meters away. Even when I manually program the SCNVector3 from the same position, it seems a bit off.
Any guidance would be greatly appreciated. I've already looked at http://www.mathworks.com/help/aeroblks/llatoecefposition.html and http://www.jaimerios.com/?p=39 but they seem to return the distance, not the relative position to the phone in SCNVector format.
import SceneKit
import ARKit
import CoreLocation
...
func latLonToECEF(_ lat1: Double, _ lon1: Double, _ alt: Double?) -> [String: Double] {
    // Note: 6371 is the Earth's radius in km, and cos/sin expect radians,
    // so lat/lon must be converted from degrees before calling this.
    let x = 6371 * cos(lat1) * cos(lon1)
    let y = 6371 * cos(lat1) * sin(lon1)
    let z = 6371 * sin(lat1)
    let lat2 = static_lat   // target latitude, stored elsewhere
    let long2 = static_long // target longitude, stored elsewhere
    let x2 = 6371 * cos(lat2) * cos(long2)
    let y2 = 6371 * cos(lat2) * sin(long2)
    let z2 = 6371 * sin(lat2)
    return ["x": (x2 - x), "y": (y2 - y), "z": (z2 - z)]
}

Calculate bounding box of a position on map in iOS

How can I calculate the South West and North East coordinates from a position? I do not have an MKMapView; I need to calculate solely on the basis of a CLLocationCoordinate2D.
Found it:
#define DEGREE(RADIANS) ((180 / M_PI) * (RADIANS))
#define RADIANS(DEGREE) ((M_PI / 180) * (DEGREE))
double lat = [[_locManager location] coordinate].latitude;
double lon = [[_locManager location] coordinate].longitude;
double R = 6371; // earth radius in km
double radius = 0.5; // bounding box spacing from current location in kilometers
double y2 = lon + DEGREE(radius/R/cos(RADIANS(lat)));
double y1 = lon - DEGREE(radius/R/cos(RADIANS(lat)));
double x2 = lat + DEGREE(radius/R);
double x1 = lat - DEGREE(radius/R);
x1, y1 is the South West latitude and longitude, and x2, y2 is the North East latitude and longitude.

Calculating bounding box given center lat /long and distance on iOS?

In my iOS app I have a lat and long that gives me my location. I want to be able to calculate the bounding box that extends a certain distance d from my location. How can I do this?
TRY 1:
So I tried the solution given by @Neeku. I can see how it's supposed to return the right information, but unfortunately it's off, so I don't think I can use it.
The code I wrote is this and I pass in 1000 meters:
MKCoordinateRegion startRegion = MKCoordinateRegionMakeWithDistance(center, meters, meters);
CLLocationCoordinate2D northWestCorner, southEastCorner;
northWestCorner.latitude = startRegion.center.latitude + .5 * startRegion.span.latitudeDelta;
northWestCorner.longitude = startRegion.center.longitude - .5 * startRegion.span.longitudeDelta;
southEastCorner.latitude = startRegion.center.latitude - .5 * startRegion.span.latitudeDelta;
southEastCorner.longitude = startRegion.center.longitude + .5 * startRegion.span.longitudeDelta;
NSLog(@"CENTER <%@,%@>", @(center.latitude), @(center.longitude));
NSLog(@"NW <%@,%@>, SE <%@,%@>", @(northWestCorner.latitude), @(northWestCorner.longitude), @(southEastCorner.latitude), @(southEastCorner.longitude));
So then the result is:
CENTER <38.0826682,46.3028721>
NW <38.08717278501047,46.29717303828632>, SE <38.07816361498953,46.30857116171368>
I then put that in Google Maps and get this: (see screenshot)
So, to my understanding, the 1000 meters should go from the center to the sides of the box. The map is measuring to the corner, which should be OVER 1000 meters, and it's actually just over 800 meters. This is the problem I am trying to solve.
I tried this method before and the distances simply aren't accurate, so this solution has not worked for me. If you have more suggestions, or want to point out what is being done wrong here, please let me know.
Thank you
Let's say that your desired distance is 111 meters. Then you use the following code:
// 111 kilometers / 1000 = 111 meters.
// 1 degree of latitude = ~111 kilometers.
// 1 / 1000 means an offset of coordinate by 111 meters.
double offset = 1.0 / 1000.0;
double latMax = location.latitude + offset;
double latMin = location.latitude - offset;
// With longitude, things are a bit more complex.
// 1 degree of longitude = 111km only at the equator (it gradually shrinks to zero at the poles),
// so the longitude offset in degrees must grow as 1/cos(lat) to cover the same distance.
double lngOffset = offset / cos(location.latitude * M_PI / 180.0);
double lngMax = location.longitude + lngOffset;
double lngMin = location.longitude - lngOffset;
latMax, latMin, lngMax, lngMin will give you your bounding box coordinates.
(You can change this code pretty easily if you need a distance other than 111 meters. Just update the offset variable accordingly.)
You can add/subtract half of the span from the latitude and longitude respectively and you get the values that you need:
CLLocationCoordinate2D centerCoord = CLLocationCoordinate2DMake(38.0826682, 46.3028721);
MKCoordinateRegion region = MKCoordinateRegionMakeWithDistance(centerCoord, 1000, 1000);
double latMin = region.center.latitude - .5 * region.span.latitudeDelta;
double latMax = region.center.latitude + .5 * region.span.latitudeDelta;
double lonMin = region.center.longitude - .5 * region.span.longitudeDelta;
double lonMax = region.center.longitude + .5 * region.span.longitudeDelta;
Just remember that:
latitudeDelta
The amount of north-to-south distance (measured in degrees) to display on the map. Unlike longitudinal distances, which vary based on the latitude, one degree of latitude is always approximately 111 kilometers (69 miles).
longitudeDelta
The amount of east-to-west distance (measured in degrees) to display for the map region. The number of kilometers spanned by a longitude range varies based on the current latitude. For example, one degree of longitude spans a distance of approximately 111 kilometers (69 miles) at the equator but shrinks to 0 kilometers at the poles.

How to show heading direction from Current location to other location on map in ios?

I have a requirement where I have to show the heading direction towards any location on the map from the user's current location. Say we have 4 location annotations on the map apart from the current location, and I want to show the heading towards any of them after tapping on it.
How can we achieve this? I have gone through the Map API documentation and the Location API. I found that we get the heading value in the delegate method provided by the API when we call the startUpdatingHeading method. What I am not getting is how we can externally get the heading between two locations.
Please help me out.
Thanks in advance.
This will calculate the initial bearing to get from (lat1, lon1) to (lat2, lon2) along the straight (great-circle) path.
double lat1=48.0; // source latitude, example data
double lon1=17.0; // source longitude
double lat2=49.0; // destination latitude
double lon2=18.0; // destination longitude
double lat1Rad = lat1 * M_PI / 180;
double lat2Rad = lat2 * M_PI / 180;
double dLon = (lon2 - lon1) * M_PI / 180;
double y = sin(dLon) * cos(lat2Rad);
double x = cos(lat1Rad) * sin(lat2Rad) - sin(lat1Rad) * cos(lat2Rad) * cos(dLon);
double bearingRad = atan2(y, x);
// this is the bearing from source to destination in degrees, normalized to 0..360
double bearing = fmod((bearingRad * 180 / M_PI + 360),360);
Swift version:
func getHeading(fromLoc: CLLocationCoordinate2D, toLoc: CLLocationCoordinate2D) -> Double {
let lat1Rad = fromLoc.latitude * Double.pi / 180
let lat2Rad = toLoc.latitude * Double.pi / 180
let dLon = (toLoc.longitude - fromLoc.longitude) * Double.pi / 180
let y = sin(dLon) * cos(lat2Rad)
let x = cos(lat1Rad) * sin(lat2Rad) - sin(lat1Rad) * cos(lat2Rad) * cos(dLon)
let bearingRad = atan2(y, x)
// this is the bearing from/to in degrees, normalized to 0..360
return fmod((bearingRad * 180 / Double.pi + 360), 360)
}

How do I calculate a Rectangle around a Geographic Point?

CLLocationManager will hand off to my delegate a new CLLocation whenever the location has changed. The coordinates of that location are expressed as a CLLocationCoordinate2D object, which simply contains a latitude and a longitude. I'd like to take this location and determine the latitude and longitude 1000m south and 1000m west, and the latitude and longitude 1000m north and 1000m east. This way I end up with one coordinate southwest of the location and one northeast of the location.
I have no clue how to do this, and my GoogleFoo seems quite poor tonight. What information I have found has offered up impenetrable mathematics. Anybody help a brotha hacker out? I'm fine to use an iOS API if there is one, but an equation that just operates on double values for the lat and long would be even better. It doesn't have to be accurate within centimeters, though within meters would be nice. Ideally, it would look something like this:
NSArray *rect = CalculateRectangleFromLocation(
clLocationCoordinate2D,
1000.0
);
And then *rect would have four values: the lat and long of the southwest corner and the lat and long of the northeast corner.
Here is the code to get the top/right/bottom/left coordinates of the bounding rectangle.
LatLon.h
#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>
extern double radians(double degrees);
extern double degrees(double radians);
extern CLLocationCoordinate2D LatLonDestPoint(CLLocationCoordinate2D origin, double bearing, CLLocationDistance distance);
LatLon.m
const CLLocationDistance kLatLonEarthRadius = 6371.0; // mean Earth radius in km; distance arguments below are in km
double radians(double degrees) {
return degrees * M_PI / 180.0;
}
double degrees(double radians) {
return radians * 180.0 / M_PI;
}
CLLocationCoordinate2D LatLonDestPoint(CLLocationCoordinate2D origin, double bearing, CLLocationDistance distance) {
double brng = radians(bearing);
double lat1 = radians(origin.latitude);
double lon1 = radians(origin.longitude);
CLLocationDegrees lat2 = asin(sin(lat1) * cos(distance / kLatLonEarthRadius) +
                              cos(lat1) * sin(distance / kLatLonEarthRadius) * cos(brng));
CLLocationDegrees lon2 = lon1 + atan2(sin(brng) * sin(distance / kLatLonEarthRadius) * cos(lat1),
                                      cos(distance / kLatLonEarthRadius) - sin(lat1) * sin(lat2));
lon2 = fmod(lon2 + M_PI, 2.0 * M_PI) - M_PI;
CLLocationCoordinate2D coordinate;
if (! (isnan(lat2) || isnan(lon2))) {
coordinate.latitude = degrees(lat2);
coordinate.longitude = degrees(lon2);
}
return coordinate;
}
Usage
CLLocationCoordinate2D location = ...;
double distance = ...;
CLLocationCoordinate2D right = LatLonDestPoint(location, 90.0, distance);
CLLocationDegrees rectRight = right.longitude;
CLLocationCoordinate2D top = LatLonDestPoint(location, 0.0, distance);
CLLocationDegrees rectTop = top.latitude;
CLLocationCoordinate2D left = LatLonDestPoint(location, 270.0, distance);
CLLocationDegrees rectLeft = left.longitude;
CLLocationCoordinate2D bottom = LatLonDestPoint(location, 180.0, distance);
CLLocationDegrees rectBottom = bottom.latitude;
Swift
extension CLLocationCoordinate2D {
fileprivate func radians(degrees: Double) -> Double { return degrees * .pi / 180.0 }
fileprivate func degrees(radians: Double) -> Double { return radians * 180.0 / .pi }
func coordinate(bearing: Double, distanceInMeter distance: CLLocationDistance) -> CLLocationCoordinate2D {
let kLatLonEarthRadius: CLLocationDistance = 6371000.0 // mean Earth radius in meters, matching the distanceInMeter parameter
let brng: Double = radians(degrees: bearing)
let lat1: Double = radians(degrees: self.latitude)
let lon1: Double = radians(degrees: self.longitude)
let lat2: CLLocationDegrees = asin(
sin(lat1) * cos(distance / kLatLonEarthRadius) +
cos(lat1) * sin(distance / kLatLonEarthRadius) * cos(brng)
)
var lon2: CLLocationDegrees = lon1 + atan2(
sin(brng) * sin(distance / kLatLonEarthRadius) * cos(lat1),
cos(distance / kLatLonEarthRadius) - sin(lat1) * sin(lat2)
)
lon2 = fmod(lon2 + .pi, 2.0 * .pi) - .pi
var coordinate = CLLocationCoordinate2D()
if !lat2.isNaN && !lon2.isNaN {
coordinate.latitude = degrees(radians: lat2)
coordinate.longitude = degrees(radians: lon2)
}
return coordinate
}
func rect(distanceInMeter meter: CLLocationDistance) -> (north: Double, west: Double, south: Double, east: Double) {
let north = coordinate(bearing: 0, distanceInMeter: meter).latitude
let south = coordinate(bearing: 180, distanceInMeter: meter).latitude
let east = coordinate(bearing: 90, distanceInMeter: meter).longitude
let west = coordinate(bearing: 270, distanceInMeter: meter).longitude
return (north: north, west: west, south: south, east: east)
}
}
I usually do this by using the PROJ4 library to convert latitude and longitude to a projection in meters that's useful for my region (UTM works well if you don't have more information; I'm in Northern California, so surveyors in my region all work in EPSG:2226), adding the appropriate offset in meters to that, and then using PROJ4 to convert back.
Later edit: The answer given by Jayant below is fine, depending on how accurate your meters rectangle needs to be. The Earth isn't a sphere, it isn't even an oblate spheroid, so the projection in which you add your distance to your latitude and longitude may matter. Even using PROJ4, those meters are at sea level. Geography is harder than you'd think.