Adding elevation to the pARk sample code - iOS

I'm trying to add altitude to the pARk sample code, so that each label appears near the top of its place of interest.
The first thing I did was add an altitude property to PlaceOfInterest.h and fill it in the ViewController, with the altitude for each POI given in meters.
Then, in ARView, I made the following changes:
Added the altitude to the LLA-to-ECEF conversion of the device's location: latLonToEcef(location.coordinate.latitude, location.coordinate.longitude, location.altitude, &myX, &myY, &myZ);
The same for the POI's location: latLonToEcef(poi.location.coordinate.latitude, poi.location.coordinate.longitude, poi.location.altitude, &poiX, &poiY, &poiZ);
Added the up (U) component to the placesOfInterestCoordinates 4D vector: placesOfInterestCoordinates[i][2] = (float)u;
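For reference, the WGS-84 math behind latLonToEcef, with the altitude term included, can be sketched in self-contained Swift like this (function and variable names are mine, not the sample's):

```swift
import Foundation

// WGS-84 ellipsoid constants (the same model the sample's latLonToEcef uses).
let wgs84A = 6378137.0             // semi-major axis, meters
let wgs84E2 = 6.69437999014e-3     // first eccentricity squared

// lat/lon in degrees, altitude in meters above the ellipsoid.
func latLonAltToEcef(lat: Double, lon: Double, alt: Double) -> (x: Double, y: Double, z: Double) {
    let phi = lat * .pi / 180
    let lambda = lon * .pi / 180
    // Prime vertical radius of curvature at this latitude.
    let n = wgs84A / sqrt(1 - wgs84E2 * sin(phi) * sin(phi))
    let x = (n + alt) * cos(phi) * cos(lambda)
    let y = (n + alt) * cos(phi) * sin(lambda)
    let z = (n * (1 - wgs84E2) + alt) * sin(phi)
    return (x, y, z)
}
```

At lat/lon (0, 0) and zero altitude this returns x ≈ 6,378,137 m (the semi-major axis), and adding 100 m of altitude increases x by exactly 100 m, which is a handy sanity check that the altitude actually reaches the conversion.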
I thought that was it; pretty easy. I ran the project and... the labels lay flat on the floor. Comparing the label position of a POI with and without the changes above, it is pretty much the same in both cases, even though adding the altitude should change the results of the ECEF and ENU conversions a bit.
Before looking at the code, I had little knowledge of perspective projection, the matrices used, ECEF, ENU, etc. I've been reading about all of these concepts on Wikipedia to be able to understand the code, plus reading other questions about this sample code here on SO, like
Use of maths in the Apple pARk sample code
and similar questions for Android... but I still can't figure out why the altitude doesn't show on screen.
I tried multiplying the POI's altitude by 5, latLonToEcef(poi.location.coordinate.latitude, poi.location.coordinate.longitude, 5*poi.location.altitude, &poiX, &poiY, &poiZ);, to see if there was a noticeable change in the y position of the label. While there was some, it was very small, and the label was still far from the real altitude.
So if somebody could give me some hints about what I am missing, and why the altitude isn't showing on screen, I would appreciate it very much.
I uploaded my attempt to a repository here: https://github.com/Tovkal/pARk-with-altitude, in case you want to see the code with the changes. If you run it, you should change the places of interest in the ViewController to some near you.
Thank you!

Related

ARKit Level measuring

I am wondering how to create a level-measuring app with ARKit, like the iOS 12 Measure app does.
I have been searching for a solution or an idea for the last week. I have watched a number of videos and tutorials, but I haven't figured out how to do it.
What I think is that ARSCNView's pointOfView node, which represents the camera, could be used for this, e.g. by reading its eulerAngles, but I am not able to figure it out.
Any help or suggestion would be appreciated.
To get a value for how level the phone is, I'd recommend using self.sceneView.pointOfView?.orientation (if you only want one of the components of this SCNQuaternion, just add .x, .y or .z at the end).
For instance if I hold my phone perfectly upright, the value of self.sceneView.pointOfView?.orientation.x will be approximately 0. Likewise, holding it upside-down will give a value of around 1 or -1.
You can use these values to figure out just how level the phone is on all three axes, and convert them to degrees.
I'd recommend placing the following code in a project, triggering it via a button or a loop that runs constantly, and watching how the values change. Hopefully this helps. (Replace sceneView with whatever your ARKit scene view is named.)
print(self.sceneView.pointOfView?.orientation.x)
print(self.sceneView.pointOfView?.orientation.y)
print(self.sceneView.pointOfView?.orientation.z)
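If you want readings in degrees rather than raw quaternion components, you can convert the quaternion to Euler angles with the standard formulas. A minimal, ARKit-free Swift sketch (names are mine; SCNQuaternion stores x, y, z, w in this same layout):

```swift
import Foundation

// Converts a unit quaternion (x, y, z, w) into roll/pitch/yaw in degrees.
func eulerDegrees(x: Double, y: Double, z: Double, w: Double) -> (roll: Double, pitch: Double, yaw: Double) {
    let roll = atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    let sinPitch = max(-1, min(1, 2 * (w * y - z * x)))   // clamp for numeric safety
    let pitch = asin(sinPitch)
    let yaw = atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    let toDeg = 180.0 / Double.pi
    return (roll * toDeg, pitch * toDeg, yaw * toDeg)
}
```

The identity quaternion (0, 0, 0, 1) yields all zeros, and a 90° rotation about the x axis yields a roll of 90°, so you can compare these numbers directly against a physical level.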

Achieving equal size of square/pixel on Mapbox anywhere on the world map?

The problem I'm facing is similar and closely related to this issue on GitHub, but that one is about the Unity SDK; my question is about the iOS SDK.
I want to achieve the same thing. Let me explain: basically, I have a pixel grid in which every pixel has equal size. A pixel is set to be 10 m x 10 m in the real world. What I noticed is that if a pixel is located toward the northern or southern part of the world, its size is stretched, like this:
[screenshot: stretched pixels toward the top of the map]
But when a pixel is located along the equator, or simply around the middle of the world map, it looks fine:
[screenshot: square pixels near the equator]
There is no problem with rendering or positioning on Mapbox. The thing is, I want every pixel to be visually square.
I've read through the issue I linked above. It relates to the Mercator projection: the world is not flat, which causes this visual effect, with things looking stretched toward the northern and southern parts of the world map. I also found that the iOS SDK has no equivalent of the functionality the Unity SDK offers for this particular problem, so I'm not sure which approach to take.
How can I achieve an equal pixel size on the grid using the Mapbox iOS SDK? Is there already a solution provided in the SDK?
FYI:
My requirement also needs real distances as shown on the map. I'm not sure whether that affects the solution presented in the issue I linked above.
I use Mapbox iOS SDK 3.7.6.
My initial approach is straightforward: I fix the size of a pixel at 10 m x 10 m, calculate the corresponding latitude and longitude values, and use those to position the pixels in Mapbox, treating the entire world map as a tilemap. I didn't take the Mercator projection into account in the calculation, so this might be the cause; if so, how do I account for it? The only relevant API I found in the iOS SDK is MGLMapView's metersPerPoint(atLatitude:); there is no tile ID system or Conversions.cs as seen in the Unity SDK. So I'm not sure how to go about solving this problem.
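The stretching in the screenshots is a direct consequence of the Web Mercator projection: the ground distance covered by one map pixel shrinks with the cosine of the latitude, which is essentially what metersPerPoint(atLatitude:) reports. A Swift sketch of the underlying formula (256 px tiles assumed here; names are mine):

```swift
import Foundation

// Ground resolution of one Web Mercator tile pixel, in meters, at a given
// latitude (degrees) and zoom level. 256 px tiles assumed; Mapbox's 512 px
// tiles halve this value at the same zoom.
func metersPerPixel(latitude: Double, zoom: Double, tileSize: Double = 256) -> Double {
    let earthCircumference = 2 * .pi * 6378137.0
    return cos(latitude * .pi / 180) * earthCircumference / (tileSize * pow(2, zoom))
}

// How many screen pixels a 10 m edge needs at that latitude and zoom.
func pixelsPerTenMeters(latitude: Double, zoom: Double) -> Double {
    return 10 / metersPerPixel(latitude: latitude, zoom: zoom)
}
```

At the equator and zoom 0 this gives about 156,543 m per pixel, and the value at 60° latitude is exactly half the equator value at the same zoom, which is why a square sized purely in degrees looks stretched away from the equator.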
Update
I managed to solve it and got it working! Here is the solution.
My solution is to port sphericalmercator.js to Swift and use it in my code. I use a fixed zoom level of 22, as its visuals are closest to what I need and to what I had before. I went with the approach of making the pixels at least look visually equal, not necessarily equal in physical size.
Thanks to a hint in this answer on how to use sphericalmercator.js.
From my testing, the tile size set when creating a SphericalMercator instance seems to have no effect, no matter what value I pass; only the zoom level determines the number of tiles across the world map. Note that the upper-left corner is the origin, i.e. tile index 0,0. A lower zoom level produces larger tiles, and a higher one produces smaller tiles.
You can take a look at SphericalMercator-swift, the code I ported from the original JS implementation linked above, along with instructions on how to use it to get a tile index or a longitude/latitude bounding box in Swift code, in order to render on top of Mapbox.
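For reference, the core tile math such a port implements is the standard Web Mercator "slippy map" formula, with the origin tile (0,0) at the upper-left. A Swift sketch (not guaranteed to match SphericalMercator-swift line for line):

```swift
import Foundation

// Longitude/latitude (degrees) -> tile index at a given zoom level.
// Tile (0,0) is the upper-left corner of the world map.
func tileIndex(lon: Double, lat: Double, zoom: Int) -> (x: Int, y: Int) {
    let n = pow(2.0, Double(zoom))                         // tiles per axis
    let x = Int(floor((lon + 180) / 360 * n))
    let latRad = lat * .pi / 180
    let y = Int(floor((1 - log(tan(latRad) + 1 / cos(latRad)) / .pi) / 2 * n))
    return (x, y)
}
```

At zoom 22 (the level used above) there are 2^22 tiles per axis, so the point (0°, 0°) falls in tile (2097152, 2097152), the center of the map.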

Indoor Atlas: iOS SDK doesn't give accurate position when device stops moving

I downloaded the IndoorAtlas iPhone SDK and generated path maps and test paths for my venue. The SDK navigates me perfectly while I am moving from one place to another, but when I stop moving it produces scattered output with a position radius of 10 to 25. I am expecting precise coordinates in both of the above cases in my project.
Is there any way to get more precision?
IndoorAtlas technology uses the history of magnetic field observations to compute a precise location. This means that the device needs to move some distance in order to collect enough data to converge to a correct location estimate, i.e., to get a location fix. We are constantly improving our service to decrease the time needed for the first fix.
If you experience your position moving after you've already stopped walking, please contact support@indooratlas.com with details of your application and the venue where this happens, and we'll look into it. Thanks!

Snapping user location in MKMapView to a line/trail

I am working on a trails/maps app that has custom trails mapped out in a region and will help the user navigate around some trails in a forested area.
Currently, I am using MKMapView to get the user's location and loading the trails as overlays from a KML file. The problem is that while testing the app, I noticed that in some situations the blue dot representing the user's position drifts off the trail overlays. This is expected, since GPS (especially on phones) is not that accurate, plus there is some error in the values recorded for the trails in the KML file.
I apologize if all of that is a bit confusing. My question is: is it possible to "snap" the user location (the blue dot that we all love) to a trail/overlay/path placed on the map, with a specific tolerance? For example, if the blue dot appears a few pixels off the trail, it would be placed right in the middle of the trail; if it is far off, the user has probably walked off the trail, and no snapping would happen.
First off, I wouldn't bother. If the dot is just a few pixels off, users won't care, but if it is further away, then it's important that they know where they are as accurately as possible. They could be lost in the snow and looking for trail markings.
If you do go ahead with it, you'll have to abandon the userLocation dot and build your own. Using a CLLocationManager, you can be notified every time the device gets new location information and move your custom annotation dot to wherever you think the user should be. More trouble than it's worth, IMHO.
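If you do go the custom-annotation route, the snapping itself is simple geometry: project the location onto each trail segment, take the nearest projection, and use it only if it lies within the tolerance. A planar Swift sketch (coordinates assumed to be already in a flat, meter-like space, e.g. converted via MKMapPoint; all names are mine):

```swift
import Foundation

struct Point { var x: Double; var y: Double }

// Closest point to p on the segment a-b.
func closest(on a: Point, _ b: Point, to p: Point) -> Point {
    let dx = b.x - a.x, dy = b.y - a.y
    let lengthSquared = dx * dx + dy * dy
    guard lengthSquared > 0 else { return a }   // degenerate segment
    var t = ((p.x - a.x) * dx + (p.y - a.y) * dy) / lengthSquared
    t = max(0, min(1, t))                       // clamp to the segment
    return Point(x: a.x + t * dx, y: a.y + t * dy)
}

// Snap p to the polyline if the nearest point is within `tolerance`;
// otherwise return p unchanged (the user probably left the trail).
func snap(_ p: Point, to trail: [Point], tolerance: Double) -> Point {
    var best = p
    var bestDistance = Double.infinity
    for i in 0..<max(0, trail.count - 1) {
        let candidate = closest(on: trail[i], trail[i + 1], to: p)
        let d = hypot(candidate.x - p.x, candidate.y - p.y)
        if d < bestDistance { bestDistance = d; best = candidate }
    }
    return bestDistance <= tolerance ? best : p
}
```

Feed this the location updates from your CLLocationManager and place the custom dot at the returned point; with a tolerance of a few meters, on-trail positions stick to the line while genuinely off-trail positions are left alone.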

iOS Dev: Map Offset in China

I made a very simple app in which I can drop a pin right on the location I am standing at (just a beginner's practice). But I found a problem.
I swear I was not moving, nor did the device think I was. I directly used the geolocation to set the pin, but the pin and the current-location blue dot are hundreds of meters apart.
(By the way, the blue dot showed my real location at the time.)
This is a famous problem with Google Maps on iOS in China. Putting aside the complicated issue of so-called national security, where I want help is in what we should do as developers. Technically, is there a way, in code, to figure out exactly what the offset is and correct it?
Does anyone have any idea?
At what time did you place the pin? iOS has up to three sources of location data (cell-tower triangulation, Wi-Fi sniffing and GPS) and will keep you updated with the most accurate. So you often get a not-very-accurate location first, then a more accurate one, then an even more accurate one.
If you have a MKMapView open then something you can do is key-value observe on its userLocation property rather than starting any sort of CLLocationManager. That way you'll always be updated with whatever the map view has decided is the current location, meaning that you don't need to try to match your logic to its.
I did some research on the offset but haven't gotten a satisfying result yet. The added offset is deterministic, i.e. given a location, the deviated location is fixed. So my goal is to find the deviation function f(p) = p', where both p and p' are 2D points. You can check here if you are interested.
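For what it's worth, the deviation described here is the GCJ-02 ("Mars coordinates") transform applied to maps in China. There is no official specification; the sketch below is the widely circulated, reverse-engineered WGS-84 → GCJ-02 approximation, ported to Swift (names are mine), and should be treated as a community approximation of f(p), not an exact or sanctioned algorithm:

```swift
import Foundation

let krasovskyA = 6378245.0                  // Krasovsky 1940 semi-major axis
let krasovskyEE = 0.00669342162296594323    // eccentricity squared

// Crude bounding box used by the circulated implementations:
// outside China the transform is simply not applied.
func isOutsideChina(lat: Double, lon: Double) -> Bool {
    return lon < 72.004 || lon > 137.8347 || lat < 0.8293 || lat > 55.8271
}

func transformLat(_ x: Double, _ y: Double) -> Double {
    var r = -100.0 + 2.0 * x + 3.0 * y + 0.2 * y * y + 0.1 * x * y + 0.2 * sqrt(abs(x))
    r += (20.0 * sin(6.0 * x * .pi) + 20.0 * sin(2.0 * x * .pi)) * 2.0 / 3.0
    r += (20.0 * sin(y * .pi) + 40.0 * sin(y / 3.0 * .pi)) * 2.0 / 3.0
    r += (160.0 * sin(y / 12.0 * .pi) + 320.0 * sin(y * .pi / 30.0)) * 2.0 / 3.0
    return r
}

func transformLon(_ x: Double, _ y: Double) -> Double {
    var r = 300.0 + x + 2.0 * y + 0.1 * x * x + 0.1 * x * y + 0.1 * sqrt(abs(x))
    r += (20.0 * sin(6.0 * x * .pi) + 20.0 * sin(2.0 * x * .pi)) * 2.0 / 3.0
    r += (20.0 * sin(x * .pi) + 40.0 * sin(x / 3.0 * .pi)) * 2.0 / 3.0
    r += (150.0 * sin(x / 12.0 * .pi) + 300.0 * sin(x / 30.0 * .pi)) * 2.0 / 3.0
    return r
}

// WGS-84 -> GCJ-02: the deterministic deviation f(p) = p'.
func wgsToGcj(lat: Double, lon: Double) -> (lat: Double, lon: Double) {
    if isOutsideChina(lat: lat, lon: lon) { return (lat, lon) }
    var dLat = transformLat(lon - 105.0, lat - 35.0)
    var dLon = transformLon(lon - 105.0, lat - 35.0)
    let radLat = lat / 180.0 * .pi
    var magic = sin(radLat)
    magic = 1 - krasovskyEE * magic * magic
    let sqrtMagic = sqrt(magic)
    dLat = (dLat * 180.0) / ((krasovskyA * (1 - krasovskyEE)) / (magic * sqrtMagic) * .pi)
    dLon = (dLon * 180.0) / (krasovskyA / sqrtMagic * cos(radLat) * .pi)
    return (lat + dLat, lon + dLon)
}
```

Because f is deterministic, the inverse (GCJ-02 → WGS-84) is usually computed iteratively: subtract the current offset estimate, re-apply f, and refine until it converges.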
