Currently I'm working on a university project where I want to create a smartphone indoor navigation solution without beacons. Is there any way to combine a map of the area with the smartphone's sensors to get good accuracy? In my research, most of the solutions rely on BLE beacons, and some on other technologies.
I only found one company, Mapsted, that seems to rely solely on its own algorithm, but I can't figure out how that is supposed to work. I thought indoor positioning always needs some fixed reference (e.g. a beacon) to determine the smartphone's location via triangulation.
If anyone has some thoughts on this topic, thank you :)
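To make the question more concrete, this is roughly the sensor-only idea I have in mind: dead reckoning from the pedometer and compass, which would then have to be corrected against the floor plan somehow. All names and values below are purely illustrative, not a working solution:

```swift
import CoreMotion
import CoreLocation

/// Minimal pedestrian dead-reckoning sketch: advance an (x, y) position on a
/// floor-plan coordinate system using step distance and compass heading.
/// The starting position must be known (e.g. the user taps it on the map).
final class DeadReckoningTracker: NSObject, CLLocationManagerDelegate {
    private let pedometer = CMPedometer()
    private let locationManager = CLLocationManager()

    private(set) var position: (x: Double, y: Double)   // metres on the floor plan
    private var headingDegrees: Double = 0               // latest magnetic heading
    private var lastDistance: Double = 0                  // cumulative metres from CMPedometer

    init(start: (x: Double, y: Double)) {
        self.position = start
        super.init()
        locationManager.delegate = self
        locationManager.startUpdatingHeading()

        pedometer.startUpdates(from: Date()) { [weak self] data, _ in
            guard let self = self, let distance = data?.distance?.doubleValue else { return }
            let step = distance - self.lastDistance
            self.lastDistance = distance
            // Move the estimated position along the current heading.
            let radians = self.headingDegrees * .pi / 180
            self.position.x += step * sin(radians)
            self.position.y += step * cos(radians)
            // A real solution would now snap this estimate to the walkable
            // areas of the floor plan (map matching) to bound the drift.
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        headingDegrees = newHeading.magneticHeading
    }
}
```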
I am also working on indoor positioning without beacons, but I don't have any idea how to plot the position on a map or floor plan. If you find a solution, please let me know. You can look into OpenStreetMap, Mapbox and OpenLayers, where you can create an indoor map and use it for plotting.
I have started working on an application where I will be using iBeacons to fetch and show the distance between your iPhone and the beacons around it. I have gone through a lot of material about beacons and the SDKs I could use to build such an application, like Estimote, Eddystone and iBeacon.
The only part I am not able to understand is how to add a custom map to it, so that it can link to the beacons and show each beacon's location on the map.
This question might be naive, but I am stuck and unable to figure it out. Any help would be appreciated.
Thank you in advance.
There are a number of map SDKs you can use on iOS, including Apple's MapKit and Google Maps for iOS. You can also simply draw your own map as an SVG image (a scalable vector format that lets you zoom in and out without losing resolution). You can then record in your app the latitude and longitude of each beacon and, when one is detected, center the map on that latitude and longitude, perhaps with a blue dot at the center.
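In code, that flow might look roughly like this with MapKit and the current CoreLocation ranging API. The UUID, major/minor values and coordinates are placeholders you would replace with your own:

```swift
import MapKit
import CoreLocation

// Rough sketch: map a ranged beacon to a coordinate you recorded for it,
// then centre an MKMapView there. UUID, major/minor and coordinates are placeholders.
final class BeaconMapController: NSObject, CLLocationManagerDelegate {
    let mapView = MKMapView()
    private let locationManager = CLLocationManager()

    // Coordinates you measured for each beacon, keyed by "major.minor".
    private let beaconCoordinates: [String: CLLocationCoordinate2D] = [
        "1.1": CLLocationCoordinate2D(latitude: 37.3318, longitude: -122.0312)
    ]

    func startRanging() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!  // placeholder UUID
        locationManager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: uuid))
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        guard let nearest = beacons.first,
              let coordinate = beaconCoordinates["\(nearest.major).\(nearest.minor)"] else { return }

        // Centre the map on the beacon and mark it (the "blue dot" idea).
        let region = MKCoordinateRegion(center: coordinate,
                                        latitudinalMeters: 50, longitudinalMeters: 50)
        mapView.setRegion(region, animated: true)
        let pin = MKPointAnnotation()
        pin.coordinate = coordinate
        mapView.addAnnotation(pin)
    }
}
```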
I have a requirement mentioned below:
Already have a floor plan map image
First detect current location on floor
Then select the destination location using floor plan map image
Now application should provide direction & distance for that source to destination path
This is like how Google directions work, but it requires an in-house map.
For example,
- Current position of user is: At his desk
- Where is Meeting Room #11
- So the application should provide direction and distance updates on the map/floor plan image.
Any kind of suggestions/help would be great.
Thanks in advance
Couple of points...
You could create various audio files and play them as waypoints based on routing. Same principle as 'turn right at the next light'.
You definitely want to set your accuracy to kCLLocationAccuracyBest, but this will still probably only get you accuracy of around +/- 10 meters at best.
Do a floor plan overlay using MKOverlayView (see the sketch after these points).
If you are indoors, the iPhone uses cell towers or Wi-Fi for a location fix. This might be a problem for you because if you are looking to map multiple floors, only GPS can give you altitude readings: ground floor, second floor, etc.
I don't want to pour cold water on your idea, but I have not heard of anyone successfully doing an indoor navigation app on an iPhone using standard stuff. If you really want to move forward on this project, your best bet for accuracy might be to use indoor Bluetooth transmitters as navigational beacons...?
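To make the accuracy and overlay points concrete, here is a rough sketch, assuming the overlay is drawn with an MKOverlayRenderer subclass (the modern equivalent of MKOverlayView). The image and its corner coordinates are placeholders:

```swift
import MapKit
import UIKit

// A floor-plan image anchored between two known geographic corners.
final class FloorPlanOverlay: NSObject, MKOverlay {
    let coordinate: CLLocationCoordinate2D
    let boundingMapRect: MKMapRect
    let image: UIImage

    init(image: UIImage, topLeft: CLLocationCoordinate2D, bottomRight: CLLocationCoordinate2D) {
        self.image = image
        let tl = MKMapPoint(topLeft)
        let br = MKMapPoint(bottomRight)
        self.boundingMapRect = MKMapRect(x: tl.x, y: tl.y,
                                         width: br.x - tl.x, height: br.y - tl.y)
        self.coordinate = CLLocationCoordinate2D(
            latitude: (topLeft.latitude + bottomRight.latitude) / 2,
            longitude: (topLeft.longitude + bottomRight.longitude) / 2)
    }
}

final class FloorPlanRenderer: MKOverlayRenderer {
    override func draw(_ mapRect: MKMapRect, zoomScale: MKZoomScale, in context: CGContext) {
        guard let overlay = overlay as? FloorPlanOverlay,
              let cgImage = overlay.image.cgImage else { return }
        let rect = self.rect(for: overlay.boundingMapRect)
        // Core Graphics draws images flipped relative to MapKit's coordinate space.
        context.scaleBy(x: 1.0, y: -1.0)
        context.translateBy(x: 0.0, y: -rect.size.height)
        context.draw(cgImage, in: rect)
    }
}

// In the hosting view controller (placeholders throughout):
// locationManager.desiredAccuracy = kCLLocationAccuracyBest   // still roughly +/- 10 m indoors
// mapView.addOverlay(FloorPlanOverlay(image: floorPlanImage, topLeft: nwCorner, bottomRight: seCorner))
// and return FloorPlanRenderer(overlay: overlay) from mapView(_:rendererFor:).
```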
What you want is path planning on the map, is that it? If so, there are lots of algorithms you can use. You can choose a block size based on your map and resolution needs, divide the map into blocks, and mark each block as navigable or not. Then, starting from the first block and heading in the direction of the destination block, check whether each neighbouring block is blocked or not and keep going, until you reach the destination block (or find that it is not reachable).
That's a pseudo-implementation; you have several options for doing it, if I understand your needs correctly.
(I don't know your hardware; as said by others, with plain GPS and indoor navigation, assuming a 15 m resolution is a good balance between an optimistic and a pessimistic signal. If it's for robot navigation, GPS is not a good approach, but the algorithm still is.)
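A minimal sketch of that block-by-block idea, using a plain breadth-first search over a walkable/blocked grid (the grid contents here are made up):

```swift
// Divide the floor plan into blocks, mark each as walkable or not,
// then breadth-first search from the start block to the goal block.
struct GridPoint: Hashable {
    let row: Int
    let col: Int
}

func shortestPath(grid: [[Bool]], from start: GridPoint, to goal: GridPoint) -> [GridPoint]? {
    var queue = [start]
    var cameFrom: [GridPoint: GridPoint] = [:]
    var visited: Set<GridPoint> = [start]

    while !queue.isEmpty {
        let current = queue.removeFirst()
        if current == goal {
            // Walk back through cameFrom to reconstruct the route.
            var path = [current]
            var node = current
            while let previous = cameFrom[node] {
                path.append(previous)
                node = previous
            }
            return Array(path.reversed())
        }
        for (dr, dc) in [(0, 1), (0, -1), (1, 0), (-1, 0)] {
            let next = GridPoint(row: current.row + dr, col: current.col + dc)
            guard next.row >= 0, next.row < grid.count,
                  next.col >= 0, next.col < grid[0].count,
                  grid[next.row][next.col],            // true = walkable block
                  !visited.contains(next) else { continue }
            visited.insert(next)
            cameFrom[next] = current
            queue.append(next)
        }
    }
    return nil   // destination not reachable
}

// Example: true = walkable, false = wall.
// let grid = [[true, true, false],
//             [false, true, true],
//             [true, true, true]]
// shortestPath(grid: grid, from: GridPoint(row: 0, col: 0), to: GridPoint(row: 2, col: 2))
```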
I'm wondering if any of you knows of a method (library, category, etc.) to cluster iOS map annotations when there are many of them at the same location (e.g. 4 of them within about 10 m).
Zooming in doesn't help because they still overlap. I've already tried https://github.com/applidium/ADClusterMapView (and some other libs), but all of them are made for showing clusters in zoom-out scenarios. None of them really respects the distance between annotations when zoomed in.
I'm working on an app with an offline DB, so a server-side solution is not an option.
Thank you for your help!
You don't need third-party frameworks anymore; iOS 11 has native clustering support.
You need to implement the mapView:clusterAnnotationForMemberAnnotations: delegate method.
Get more details in the Apple example: https://developer.apple.com/sample-code/wwdc/2017/MapKit-Sample.zip
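A minimal sketch of the native approach: give your annotation views a clusteringIdentifier and optionally customise the cluster annotation in that delegate method. The class and identifier names here are just examples:

```swift
import MapKit
import UIKit

// Annotation views that share a clusteringIdentifier are grouped automatically;
// the delegate method lets you customise the cluster annotation itself.
final class ClusteredMapViewController: UIViewController, MKMapViewDelegate {
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.delegate = self
        view.addSubview(mapView)
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier)
    }

    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        guard !(annotation is MKClusterAnnotation), !(annotation is MKUserLocation) else { return nil }
        let view = mapView.dequeueReusableAnnotationView(
            withIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier, for: annotation)
        (view as? MKMarkerAnnotationView)?.clusteringIdentifier = "places"
        return view
    }

    // Optional: customise the cluster pin created for overlapping members.
    func mapView(_ mapView: MKMapView,
                 clusterAnnotationForMemberAnnotations memberAnnotations: [MKAnnotation]) -> MKClusterAnnotation {
        let cluster = MKClusterAnnotation(memberAnnotations: memberAnnotations)
        cluster.title = "\(memberAnnotations.count) places"
        return cluster
    }
}
```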
You should have a look at the CCHMapClusterController project; it looks like exactly what you are looking for.
One alternative is the MapBox iOS SDK, which is an open source (BSD) library replicating MapKit behavior. It does both annotation clustering and offline map layers.
http://mapbox.com/mobile
I have a question about the MKMapView component. I saw that in the native iOS "Maps" app, you can draw the roads between different points and display the traffic on those roads. Well, that's exactly what I want to do in my app :)
So, I have two questions:
1) Firstly, how can I draw roads on a map? I've read a lot about MKOverlay and about some samples that do this with JavaScript in a UIWebView, but what's the best way to do it?
2) How can I find out the traffic on a particular road, in order to draw the road in green, orange or red?
Thanks a lot!
Regards,
Sébastien ;)
Unfortunately, as of now the CLGeocoder class from iOS 5 only supports converting an address to a geolocation and the reverse: finding the address around a geocode you have, or finding the geocode of an address you provide. There is no way to access the road graph to create routes (i.e. to select and draw roads precisely), let alone to get traffic for a specific place. The only thing you can do with the standard tools is draw overlays on the MKMapView with your own data.
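That "overlays with your own data" option might look roughly like this, assuming you already have road coordinates and a traffic level from some other source (both are placeholders here):

```swift
import MapKit
import UIKit

// A road segment drawn as an MKPolyline, coloured by a traffic level you
// obtained elsewhere. Coordinates and traffic values are placeholders.
final class TrafficOverlayDelegate: NSObject, MKMapViewDelegate {

    enum TrafficLevel: String { case light, moderate, heavy }

    func addRoad(_ coordinates: [CLLocationCoordinate2D],
                 traffic: TrafficLevel,
                 to mapView: MKMapView) {
        let polyline = MKPolyline(coordinates: coordinates, count: coordinates.count)
        polyline.title = traffic.rawValue          // carry the level to the renderer
        mapView.addOverlay(polyline)
    }

    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        guard let polyline = overlay as? MKPolyline else { return MKOverlayRenderer(overlay: overlay) }
        let renderer = MKPolylineRenderer(polyline: polyline)
        switch TrafficLevel(rawValue: polyline.title ?? "") {
        case .heavy:    renderer.strokeColor = .red
        case .moderate: renderer.strokeColor = .orange
        default:        renderer.strokeColor = .green
        }
        renderer.lineWidth = 4
        return renderer
    }
}
```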
To achieve the results you want, I would suggest using third-party resources, for example the Google Maps API, and using a UIWebView to present a customized map: see the Google Maps JavaScript API v3 TrafficLayer.
I'm trying to develop a mini "Around Me"-like app using the camera, compass and location. I would like to display images of nearby places on my screen.
For the moment I have my location and my orientation from the compass. I would like to know how I can determine the on-screen position of the place I want to display.
Thanks for your help ;)
Once you have the relative distance and bearing, which you can determine from two points in the same coordinate space using algorithms found on this page, figuring out where a known coordinate is with respect to a known viewpoint is basically a perspective projection; the math is outlined in this Wikipedia article. The rotation of the camera is given by the compass, and the tilt by the accelerometer (the position is, of course, GPS).
I'm trying to find a better document; there are a couple of extra things to consider, like the camera parameters, but this is a good starting point.
If it's too involved (like if you're not comfortable with rotation matrices) we can break it right down to the simple trig.
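Broken down to the simple trig, the core is roughly: compute the bearing from the viewer to the place, compare it with the compass heading, and map the difference to a horizontal screen position. The field-of-view value below is just a placeholder:

```swift
import CoreLocation
import UIKit

/// Initial bearing from the viewer to the place, in degrees (0° = north, clockwise).
func bearing(from viewer: CLLocationCoordinate2D, to place: CLLocationCoordinate2D) -> Double {
    let lat1 = viewer.latitude * .pi / 180
    let lat2 = place.latitude * .pi / 180
    let dLon = (place.longitude - viewer.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return degrees < 0 ? degrees + 360 : degrees
}

/// Horizontal screen x for a place, or nil if it is outside the camera's field of view.
func screenX(placeBearing: Double, heading: Double,
             screenWidth: CGFloat, horizontalFOV: Double = 60) -> CGFloat? {
    var delta = placeBearing - heading
    if delta > 180 { delta -= 360 }                // normalise to -180...180
    if delta < -180 { delta += 360 }
    guard abs(delta) <= horizontalFOV / 2 else { return nil }
    return screenWidth * CGFloat(0.5 + delta / horizontalFOV)
}
```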
The code in the iPhone ARKit project does this, and quite a bit more. While you may not be able to use their complete library, it is a great reference on the subject of augmented reality.
Check out 3DAR; it lets you add an AR view to an MKMapView app very easily. There's a video tutorial on this process, as well as some sample code, on the 3DAR site, www.3dar.us
You can create a location-based AR app in Junaio. It's an AR browser, free to use and deploy in (as long as it's not a custom app and runs inside Junaio).