Invalid latitude and longitude in android - geolocation

I am testing the code snippet below. I suddenly get a weird location (0,0) as latitude and longitude, which disturbs my distance calculation.
if (loc != null) {
    double currentLatitude = loc.getLatitude();
    double currentLongitude = loc.getLongitude();
    if (first_time) {
        loc.reset();
        first_time = false;
    }
    synchronized (testObject) {
        // ... do some work
        previousLat = currentLatitude;
        previousLon = currentLongitude;
    }
}
I would like to know in what situations the latitude and longitude become 0. I would also like to know where my code bugs out such that I am getting 0 latitude and longitude. Does location.reset() result in 0 latitude and longitude?

Of course it is 0 when you reset it. Why do you do that? Which value would you expect when calling reset()?
A location delivered by Android location services has an accuracy set (getAccuracy()). If you create your own Location object or reset() it, the accuracy will be 0.
Further information I have written before shows why I always check validity before calculating:
lat,lon (0,0) is theoretically a valid location, but after my years of GPS experience, I give you this tip: ignore all locations with coordinates (0,0).
This has no impact on real-world behaviour, because a device cannot be located exactly at (0,0) (with centimetre accuracy), and even if it could, not for more than one second.
If you want, you can initialise your lat,lon with a special value outside the valid coordinate range, but even then you are not safe from other people's errors.
Generally, check whether the location is valid: if there is a valid flag (which should exist on Android), use it, look at the horizontal accuracy value, and read the documentation on how to distinguish invalid locations.
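The validity checks described above can be sketched in a few lines. This is a minimal illustration, not the Android API; the `Fix` record and its field names are hypothetical stand-ins for a location object:

```python
# Minimal sketch of a location-validity filter. `Fix` is a hypothetical
# stand-in for a platform location object, not an Android class.
from dataclasses import dataclass

@dataclass
class Fix:
    lat: float
    lon: float
    accuracy: float  # horizontal accuracy in metres; 0 means "never set"

def is_valid_fix(fix: Fix) -> bool:
    # Reject the (0, 0) sentinel produced by reset()/default-constructed fixes.
    if fix.lat == 0.0 and fix.lon == 0.0:
        return False
    # A fix without an accuracy estimate was never filled in by the provider.
    if fix.accuracy <= 0.0:
        return False
    return True

print(is_valid_fix(Fix(0.0, 0.0, 0.0)))      # False: reset/default fix
print(is_valid_fix(Fix(52.52, 13.40, 8.0)))  # True: plausible GPS fix
```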

Related

polyline.encode strange format

I am working with Google's polyline format. I would like to give a set of coordinates and generate the correct polyline, and vice versa. In particular, in the end I would like to URL-encode the result (the polyline).
When I insert a polyline like:
code = '%28%28akntGkozv%40kcCka%40us%40y%7BDfvAm%7BBnuCj_Aus%40fzG%29%29'
I use the polyline package: https://pypi.org/project/polyline/, and first I decode the polyline in order to see the coordinates:
coordinates = polyline.decode(code)
print(coordinates)
>> [(3e-05, -0.0001), (-0.0001, -7e-05), (-0.0002, -0.0002), (45.46221, 35.36626), (45.4621, 35.36617), (45.48328, 35.39727), (45.48317, 35.39718), (45.5172, 35.39707), (45.51711, 35.39816), (45.51723, 35.39814), (45.5172, 35.38418), (45.51823, 35.3843), (45.51821, 35.38428), (45.49413, 35.37398), (45.52816, 35.37387), (45.52807, 35.32855), (45.5281, 35.32845), (45.52823, 35.32848), (45.52813, 35.32861)]
and everything here is fine. The problem comes when I try to encode the coordinates back to the polyline (which is my ultimate goal, since in the end I would like to give some coordinates and obtain the corresponding polyline):
new_code = polyline.encode(coordinates)
print(new_code)
>> ERXERXakntGkozvETPkcCkaETPusETPyEWBDfvAmEWBBnuCj_AusETPfzGERYERY
Which is slightly different from the original, and if put back into the URL it doesn't work!
So my questions here are:
What kind of encoding is new_code? I have tried to percent-encode it with urllib.parse.quote(new_code), but the result is exactly the same; maybe I need to specify some particular encoding style, but I didn't find anything.
The polyline that I used is a square inside the city of Milan (so only 4 points, 5 at most, are required to identify this area), but polyline.decode gives me back a list of 19 points with coordinates that are not even close to the city of Milan. Why?
OK, so basically all of my problems came from the fact that the string I was considering: %28%28akntGkozv%40kcCka%40us%40y%7BDfvAm%7BBnuCj_Aus%40fzG%29%29
contains %28%28 and %29%29, which are not part of the polyline but are simply two (( and )) inserted by the particular URL of the site I was using. A simple replace and a URL-decode return the correct polyline:
import urllib.parse

code = '%28%28akntGkozv%40kcCka%40us%40y%7BDfvAm%7BBnuCj_Aus%40fzG%29%29'
code = code.replace('%28', '').replace('%29', '')
code = urllib.parse.unquote(code)
print(code)
>> irotG_hzv@woBmE}i@yjE`oBwkDf|ChRhMzeG}~BxcB
Which, in fact, if put into polyline.decode, returns exactly the coordinates that I used:
coordinates = polyline.decode(code)
print(coordinates)
>> [(45.46869, 9.15088), (45.48673, 9.15191), (45.4936, 9.18452), (45.47567, 9.21216), (45.45051, 9.20907), (45.44822, 9.16701), (45.46869, 9.15088)]
These are exactly 7 points (I have now changed the shape, so a hexagon instead of a square) that lie exactly in the city of Milan.
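Note that the decoded output shown in the answer comes from the changed (hexagon) polyline. The cleanup step itself needs only the standard library; applied to the original square string from the question, it recovers the raw polyline like this (a sketch):

```python
# Sketch: strip the site's "((" / "))" wrapper and percent-decode the rest,
# using only the standard library.
from urllib.parse import unquote

code = '%28%28akntGkozv%40kcCka%40us%40y%7BDfvAm%7BBnuCj_Aus%40fzG%29%29'
code = code.replace('%28', '').replace('%29', '')  # drop the (( and ))
code = unquote(code)                               # %40 -> '@', %7B -> '{'
print(code)  # akntGkozv@kcCka@us@y{DfvAm{BnuCj_Aus@fzG
```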

Buffer country borders with st_buffer and a SpatialPolygonsDataFrame?

I'm using CoordinateCleaner's country test cc_coun, but it's flagging entries with coordinates near the edges of country borders. To try to keep them, I wanted to buffer the terrestrial area of countries, essentially expanding their borders a little so that it doesn't flag these entries.
I've tried:
require(rnaturalearth)
require(sf)
require(dplyr)  # for the %>% pipe
world <- ne_countries(scale = "large", returnclass = "sf") %>%
  st_buffer(dist = .001)
Using st_buffer(dist = .001) does change the geometry, but I noticed that whatever I put into dist doesn't seem to matter, as it changes the geometry to the same thing regardless (I tried .001, 0.1, 1, 2, and -1, though any negative number removes the geometry altogether).
I found that maybe I need to transform my sf object into a projected CRS and then use st_buffer, so that the distance is in metres rather than degrees(?). I saw this question/answer but I don't understand it well enough to apply it to my situation: What unit is the `dist` argument in `st_buffer` set to by default?
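For intuition on the degrees-vs-metres point: the number of degrees corresponding to 100 m depends on latitude. A rough sketch under a spherical-Earth assumption (illustrative only, not the sf or CoordinateCleaner API):

```python
import math

def metres_to_degrees(metres, latitude_deg):
    """Approximate degree offsets for a distance in metres (spherical Earth)."""
    deg_lat = metres / 111_320.0  # ~111.32 km per degree of latitude
    deg_lon = metres / (111_320.0 * math.cos(math.radians(latitude_deg)))
    return deg_lat, deg_lon

# 100 m near the equator vs. at 60 degrees N: the longitude offset doubles.
print(metres_to_degrees(100, 0))   # ≈ (0.000898, 0.000898)
print(metres_to_degrees(100, 60))  # ≈ (0.000898, 0.001796)
```

This is why a constant `dist` in degrees buffers countries by different real-world amounts at different latitudes, and why reprojecting to a metric CRS before buffering is usually recommended.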
In the end I'm looking to create a SpatialPolygonsDataFrame reference file to feed into cc_coun. Using the above code I followed on with:
world <- sf:::as_Spatial(world)
coun_test <- cc_coun(x = data,
                     lon = "Decimal_Long",
                     lat = "Decimal_Lat",
                     iso3 = "Country_code",
                     value = "flagged",
                     ref = world,
                     verbose = TRUE)
This ended up flagging more entries than when I didn't use st_buffer on the reference file.
In summary, I want to add a buffer of around 100 metres to the edge of every country border, in a file I can use as a reference in this test. Is this the best way to go about it, or is there a better/easier way? I'd appreciate any advice.
Thank you

How to reduce the overhead when there are a lot of checks per element in lists (from O(n²) to O(n))

I have a class (let's call it Path) that contains a list of location points.
However, only a Path that contains points near my current location interests me, so I may need to iterate over the list of points to see whether the path is suitable.
As you can see, each check is quite heavy, even if there are only a few points within a path: O(n).
The problem gets worse when I have a pile of paths to check, since more than one path may suit my interest: O(n²).
So I would like to know whether there is a faster way to reduce the overhead. I have created some mock data in the program to illustrate the idea (given that the data comes from NoSQL).
Below is my pseudocode for checking just one path:
import 'dart:math';

class Path {
  Path({required this.pointOfInterests});
  final List<PointOfInterest> pointOfInterests;
}

class PointOfInterest {
  PointOfInterest({required this.lat, required this.lng});
  final double lat;
  final double lng;

  // Euclidean distance between this point and [other].
  double calculateDistance(PointOfInterest other) {
    return sqrt(pow(other.lat - lat, 2) + pow(other.lng - lng, 2));
  }
}

const double rangeToBeConsideredAsNearby = 50;

// Loop over the path's points and check the distance one by one: O(n).
bool isItContainPOINearbyCurrentLocation(PointOfInterest myCurrent, Path path) {
  for (final poi in path.pointOfInterests) {
    if (myCurrent.calculateDistance(poi) <= rangeToBeConsideredAsNearby) {
      return true;
    }
  }
  return false;
}

void main() {
  final pointOfInterests = [
    PointOfInterest(lat: 22.24970405555096, lng: 114.1545153839728),
    PointOfInterest(lat: 22.24970405555096, lng: 114.1545153839728),
  ];
  final path = Path(pointOfInterests: pointOfInterests);
  // I would like to check if the path contains a POI within my current region.
  final myCurrent =
      PointOfInterest(lat: 22.247636119175315, lng: 114.15919047431217);
  print(isItContainPOINearbyCurrentLocation(myCurrent, path));
}
There is clearly no easier way to check whether a path, which is just a list of points, contains a point which satisfies something, than checking each point.
If you omit checking any point, that point might be the one satisfying the test, and then the result is wrong.
What you might be able to do is to precompute some "summary" information about the path, and then reuse that later for multiple checks.
Say, for each path, you compute a bounding shape (a rectangle would be simplest, just remember the minimum and maximum for each coordinate), then you check first whether your current position is close to that rectangle. If not, you can ignore the entire path. If so, you'll still have to check each point.
If you only do one check on each path, that might not be faster than just checking each point first, but if you do multiple checks on the same path, the summary computation can be amortized.
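The bounding-box summary described above can be sketched as follows (plain Python, planar coordinates assumed; the names are illustrative):

```python
# Sketch of the "summary" idea: precompute a bounding box per path, then use it
# to skip paths whose box is farther than `range_` from the query point.

def bounding_box(points):
    """Axis-aligned bounding box (min_x, min_y, max_x, max_y) of a path."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return min(xs), min(ys), max(xs), max(ys)

def box_might_contain_nearby(box, point, range_):
    """False only when no point inside the box can be within range_ of point."""
    min_x, min_y, max_x, max_y = box
    x, y = point
    # Distance components from the point to the box (0 if inside the box).
    dx = max(min_x - x, 0, x - max_x)
    dy = max(min_y - y, 0, y - max_y)
    return dx * dx + dy * dy <= range_ * range_

path = [(0, 0), (10, 0), (10, 10)]
box = bounding_box(path)
print(box_might_contain_nearby(box, (11, 11), 5))    # True: check the points
print(box_might_contain_nearby(box, (100, 100), 5))  # False: skip this path
```

A `True` from the box test is only a "maybe": you still have to check the individual points, but a `False` lets you discard the whole path in O(1) after the one-time O(n) summary.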
Also, if the check on each point is very expensive, doing two runs over the path can still be faster on average, if the first run does something cheaper and less precise, but which can still disqualify a large percentage of paths which would have failed the more expensive check.
That is, even if you compute the boundary rectangle each time, it might be cheap enough compared to checking all the points that it's worth it.
However, it does mean looking at all the elements, even if an early one of them alone would be enough to fail the expensive test.
To avoid that, you can also try to make the check on individual elements cheaper on average.
Here you compute the distance as sqrt(pow(x2 - x1, 2) + pow(y2 - y1, 2)).
The most expensive part of that is sqrt, and it's not needed if all you do with the result is compare it to a number: dist <= x is the same as distSquared <= x*x.
Look for optimizations like that.
If you want to avoid the multiplications too, you can start out with a simpler check, like whether either direction is already too far away.
/// Checks whether [p1] is within [dist] distance of [p2].
///
/// The [distSquared] must be `dist * dist`.
/// The [distOverSqrt2] must be `dist / sqrt(2)`.
bool withinDist(Point p1, Point p2, num dist, num distSquared, num distOverSqrt2) {
  // True if p1 is within a circle with radius dist from p2.
  var dx = (p2.x - p1.x).abs();
  var dy = (p2.y - p1.y).abs();
  // Bail out early if outside the square circumscribing the circle around p2.
  if (dy > dist || dx > dist) return false;
  // Bail out early if inside the square inscribed in the circle around p2
  // (its half-side is dist / sqrt(2), which guarantees dx*dx + dy*dy <= distSquared).
  if (dy <= distOverSqrt2 && dx <= distOverSqrt2) return true;
  // Properly compute whether inside a `dist` radius circle.
  return dx * dx + dy * dy <= distSquared;
}
In short, if you can't avoid looking at all the elements, either:
Look at them ahead of time and compute a summary which might allow you to skip the entire path, and reduce the need for the more precise and expensive check. Or ...
Optimize the expensive check by providing cheap early answers, and only do the most expensive computations when the cheap check is inconclusive. This may be more expensive if the cheap answer is almost always inconclusive. Tailor it to your data and use-cases.

CLLocationmanger - decimal points for lat and long

To find my phone's lat and long (current location), I am using CLLocationManager.
Whenever I fetch the location, the values show up with 6 decimal places, like 10.007841 and 76.147856.
How can I make it 8 decimal places?
I have set:
self.locationManager.desiredAccuracy=kCLLocationAccuracyBest;
I think what you mean is that when you print them out, you see 6 decimal places. This is a property of printf-style formatting: the default precision is 6. The actual location information is stored as a double. So if you print the number as:
NSLog(@"location %.8f", location.latitude);
you will get 8 decimal places.
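The same 6-decimal default applies to printf-style formatting in other languages as well; for example, in Python:

```python
lat = 10.00784112345678

# '%f' (and the f-string 'f' specifier) defaults to 6 decimal places.
print("%f" % lat)    # 10.007841
print(f"{lat:.8f}")  # 10.00784112  <- ask for 8 explicitly
```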

distance queries by coordinates in rails 3.x

I have a location object with latitude and longitude attributes and I want to query the database for objects within a specific distance from a current latitude / longitude location.
I read about the 'geocoder' gem which seems to be able to do what I'm looking for, but in order to add the obj.near functionality I need to add geocoding to the object.
I don't need geocoding or reverse geocoding at all, just distance queries by coordinates.
Can I add the .near functionality to an object with 'geocoder' without adding any geocoding functionality? Is there another or a better way to do what I want?
Why don't you look at the geocoder source? It's on GitHub. I think the class you are looking for is in stores/active_record.rb.
Obviously you won't get the extensibility or other benefits of using a library, but this will give you the code you need.
Edit: this will give you a UDF in MySQL that calculates the distance:
DROP FUNCTION IF EXISTS calculateDistance;
delimiter |
CREATE FUNCTION calculateDistance (originLatitude FLOAT, originLongitude FLOAT, targetLatitude FLOAT, targetLongitude FLOAT)
  returns FLOAT
  deterministic
BEGIN
  DECLARE radianConversion FLOAT;
  DECLARE interim FLOAT;
  DECLARE earthRadius FLOAT;
  SET radianConversion = 180 / pi(); /* 57.2957795130823 */
  SET interim = SIN(originLatitude/radianConversion) * SIN(targetLatitude/radianConversion)
              + COS(originLatitude/radianConversion) * COS(targetLatitude/radianConversion)
              * COS(targetLongitude/radianConversion - originLongitude/radianConversion);
  IF interim > 1 THEN
    SET interim = 1;
  ELSEIF interim < -1 THEN
    SET interim = -1;
  END IF;
  SET earthRadius = 6371.0072;
  RETURN earthRadius * ACOS(interim);
END|
delimiter ;

select calculateDistance(43.6667,-79.4167,45.5000,-73.5833);
Then you just need to pass the correct values from a method on your class to fire this off. Obviously this won't work on SQLite.
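The formula in the UDF is the spherical law of cosines, and it can be sanity-checked outside the database. A Python sketch mirroring the UDF, using the sample coordinates from the SELECT above (roughly Toronto to Montreal):

```python
import math

EARTH_RADIUS_KM = 6371.0072  # mean Earth radius, same constant as the UDF

def great_circle_km(lat1, lon1, lat2, lon2):
    """Spherical law of cosines, mirroring the MySQL calculateDistance UDF."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    interim = (math.sin(p1) * math.sin(p2)
               + math.cos(p1) * math.cos(p2)
               * math.cos(math.radians(lon2 - lon1)))
    interim = max(-1.0, min(1.0, interim))  # clamp against rounding error
    return EARTH_RADIUS_KM * math.acos(interim)

# Same sample points as the SQL SELECT above.
print(great_circle_km(43.6667, -79.4167, 45.5000, -73.5833))  # ~505 km
```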
There is a RailsCast about Geocoder; from what I've seen, you need to add latitude and longitude columns to your model and Geocoder will work.
