I am writing an app in Swift to capture and save the location of the phone.
I am planning to use Core Location framework. This framework provides several services that you can use to get and monitor the device’s current location.
Here are a few questions I have:
1. Does the Core Location framework use the phone's GPS to get the longitude and latitude, or cell data?
2. What if the phone is in your car, pocket, bag, or purse? Can it still get a correct location (longitude and latitude)?
3. If the answer to #2 is no, what other alternatives do I have?
Does the Core Location framework use the phone's GPS to get the longitude and latitude, or cell data?
It does if it needs GPS for the specified accuracy. When you use Core Location, you tell it how accurate you need to be, and Core Location will turn on GPS if a) GPS is available on the device, and b) it needs GPS to get the accuracy you requested.
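As a minimal sketch of how the requested accuracy drives this behavior, the snippet below asks for the best available accuracy, which nudges Core Location toward using GPS when the hardware has it; a coarser setting may be satisfied by Wi-Fi or cell positioning alone. The class name is illustrative:

```swift
import CoreLocation

final class LocationFetcher: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // Requesting high accuracy is what prompts Core Location to
        // power up GPS if the device has it and needs it.
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        print("lat: \(location.coordinate.latitude), lon: \(location.coordinate.longitude)")
        manager.stopUpdatingLocation() // stop as soon as possible to save power
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        print("Location update failed: \(error)")
    }
}
```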
What if the phone is in your car, pocket, bag, or purse? Can it still get a correct location (longitude and latitude)?
It can, unless it can't. GPS works reasonably well in a car. Pockets, bags, etc. aren't a problem. But GPS receivers do need to be able to receive radio signals from satellites, so if your car is parked on the bottom floor of an underground parking garage, or even if you're just inside a building, it may not work. Core Location will then use whatever other means are available for determining location, such as WiFi and cell tower triangulation.
If the answer to #2 is no, what other alternatives do I have?
You really don't have any alternatives to Core Location, but Core Location has a number of ways to determine location. So use it as necessary (and avoid using it when you don't need it, since it does use power) and trust it to do the right thing.
Related
My application is extremely dependent on user location; accuracy is crucial to the use of the app. Speaking with my team, we realized a scenario in which, if a user is in close proximity to another geofence we have created, Core Location may not recognize the difference. Are there currently any frameworks that work better than Core Location or CLLocationManager on iOS? I checked the CocoaPods libraries, but they all seem pretty close to one another in functionality.
You cannot overcome the physical and technological limitations of the GPS system. If you call CLLocationManager's requestLocation when the location manager is configured with the highest degree of desiredAccuracy (namely kCLLocationAccuracyBestForNavigation), the response in locationManager(_:didUpdateLocations:) is as accurate as you are going to get. Using a "library" is not going to change that.
You can set the accuracy to kCLLocationAccuracyBestForNavigation and then you will get the best data available. If you are not happy with iOS's handling of geofences, you could try implementing the geofencing yourself by subscribing to the delegate methods.
That way you can also, for example, filter out CLLocations with low accuracy and ignore them.
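A minimal sketch of such a hand-rolled geofence check, performed inside your own delegate callback; the fence center, radius, and accuracy threshold are made-up values you would tune for your app:

```swift
import CoreLocation

// Hypothetical fence definition and accuracy cutoff.
let fenceCenter = CLLocation(latitude: 37.3349, longitude: -122.0090)
let fenceRadius: CLLocationDistance = 50          // meters
let maxAcceptableAccuracy: CLLocationAccuracy = 20 // meters

func handle(_ location: CLLocation) {
    // Ignore fixes whose error circle is too large to trust
    // (negative horizontalAccuracy means the fix is invalid).
    guard location.horizontalAccuracy > 0,
          location.horizontalAccuracy <= maxAcceptableAccuracy else { return }

    let inside = location.distance(from: fenceCenter) <= fenceRadius
    print(inside ? "inside fence" : "outside fence")
}
```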
As others have said, there is no magic bullet to give you accuracy better than what the GPS is able to provide.
What you can do is
Set the desiredAccuracy to the max
Check the accuracy value you get back from location updates, and reject readings with a horizontal accuracy value that is too large (since the "accuracy" reading is actually a "circle of confusion", a radius in meters within which the phone may actually be located).
Note that step 2 may cause it to take a LONG time to get an acceptable location fix, or you may not be able to get an acceptable reading at all. Based on the testing I did several years ago, requiring a horizontal accuracy reading of 16 meters was the best I was able to achieve, and I could only get that signal quality in an open area free of buildings and without a lot of tree cover. (I haven't tried it in an open prairie, however.)
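Step 2 can be sketched as a small filter over the fixes a location update delivers. The 16-meter threshold echoes the figure from the testing described above; tune it for your environment, and pair it with a timeout, since an acceptable fix may never arrive:

```swift
import CoreLocation

// Return the most accurate fix that meets the threshold, or nil if
// none of the readings is good enough yet.
func bestFix(from locations: [CLLocation],
             requiring threshold: CLLocationAccuracy = 16) -> CLLocation? {
    locations
        .filter { $0.horizontalAccuracy > 0 && $0.horizontalAccuracy <= threshold }
        .min { $0.horizontalAccuracy < $1.horizontalAccuracy }
}
```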
This is a general question seeking advice on the pattern required to calculate a user's velocity / pace / speed, when running or swimming.
Specifically, I want to be able to calculate this from watch OS, disconnected from the companion phone.
With GPS capabilities of Watch 3 / Watch OS 10.0 would the best approach be to:
Start Location Manager
Calculate distance and time between location points...
Calculate average speed?
Or are there better alternatives?
There is a good article here (https://www.bignerdranch.com/blog/watchkit-2-hardware-bits-the-accelerometer/) that recommends using Core Motion for device speed. However, in my view this would represent the 'device speed' rather than the user's speed over distance.
Any advice or experiences would be much appreciated.
Thanks.
The article you linked to is about watchOS 2, not the Apple Watch 2. The motion tracking is pretty good, but to get accurate device speed you will still need to use GPS.
If you don't need to do any other location-related calculations, and don't need real-time data (edit: you can get near-real-time data with an HKAnchoredObjectQuery, which is sufficient for most situations), then you don't need to start a location manager, just an HKWorkoutSession. This will default to using GPS or motion data (whichever is more accurate and available at the time) and manage everything for you. When the workout is over, you can query for the distance samples and calculate pace from that.
If you need live motion data then the steps you outlined are correct, however you should check that the user is outdoors first. If the user is indoors or has a weak GPS signal switch to using Motion Data (and be sure to set the HKMetadataKeyIndoorWorkout appropriately if using HealthKit).
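A minimal sketch of starting such a workout session on watchOS so HealthKit collects distance samples for you (this uses the watchOS 5+ initializer; authorization requests and error handling are abbreviated):

```swift
import HealthKit

let healthStore = HKHealthStore()
let configuration = HKWorkoutConfiguration()
configuration.activityType = .running
configuration.locationType = .outdoor // set .indoor when GPS is unavailable

do {
    let session = try HKWorkoutSession(healthStore: healthStore,
                                       configuration: configuration)
    session.startActivity(with: Date())
    // ... later, end the session and query the accumulated distance
    // samples (e.g. with an HKAnchoredObjectQuery for near-real-time pace).
} catch {
    print("Could not start workout session: \(error)")
}
```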
I am working on a project that requires indoor navigation using iBeacons. I have searched a lot, but I have only found paid SDKs and other tools. I understand how iBeacons are used for indoor navigation, but I have a problem: I want to move the user's location from the first beacon to another, but only along a specific path. Currently, when the user moves, the reported location does not follow the path I defined.
Please let me know. Thanks in advance!!
While it is possible to build an indoor navigation system using beacons, it is not a trivial exercise. Beacons only provide a very small building block needed to create the overall system. Think of beacons as being a brick used to build a house. Are you up for building a house from scratch out of a pile of bricks and many other components?
You may be better off using an off-the-shelf SDK, even if it is paid, rather than building this yourself. If you do want to build it from scratch, there are several components you must build:
Beacon location configuration: You need a system to register the location of each beacon in latitude/longitude and get this configuration into the mobile app.
Position determination: Based on detecting the closest beacon(s), you must build a module that determines the position of the user's mobile phone based on the configuration above.
Map rendering engine
Coordinate system conversion from the beacon location configuration reference frame to the map coordinate frame.
Wayfinding module: Based on configured routes on the map, the wayfinding module would determine where to direct the user along these routes to get to a destination.
I worked on a team that built a beacon-based indoor nav system for the Consumer Electronics Show. It took multiple team members a few months to build the system from scratch using hundreds of beacons and low-level tools. Don't underestimate the effort involved.
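The first two components above, beacon location configuration and position determination, can be sketched as a lookup table from beacon identity to coordinates, with the phone's position estimated as the location of the nearest ranged beacon. The keys and coordinates here are placeholders:

```swift
import CoreLocation

// Hypothetical configuration: beacon minor value -> surveyed position.
let beaconPositions: [Int: CLLocationCoordinate2D] = [
    1: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
    2: CLLocationCoordinate2D(latitude: 37.3351, longitude: -122.0088),
]

// Crude position determination: take the beacon with the smallest
// estimated distance, ignoring readings with unknown accuracy (-1).
func estimatedPosition(from beacons: [CLBeacon]) -> CLLocationCoordinate2D? {
    guard let nearest = beacons
        .filter({ $0.accuracy >= 0 })
        .min(by: { $0.accuracy < $1.accuracy }) else { return nil }
    return beaconPositions[nearest.minor.intValue]
}
```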
This answer is assuming that the user will have no other location services (GPS etc), it could be achieved using multiple iBeacons.
All possible routes between the start and end destinations would need to have iBeacons on them.
Register that the user has arrived at the first beacon, and display your chosen route to them.
If you detect them getting close to any beacons which are not on your chosen route, then you know they're probably not following it.
So with enough beacons, you can accurately plot the user's location in an indoor environment (provided you know the exact location of the iBeacons beforehand).
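The route-following idea above can be sketched by ranging beacons and flagging when the user comes close to a beacon that is not on the chosen route. This uses the iOS 13+ ranging API; the UUID and the set of route beacon minors are placeholders:

```swift
import CoreLocation

final class RouteMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let routeMinors: Set<Int> = [1, 2, 3] // beacons on the chosen route

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: uuid))
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        for beacon in beacons
        where beacon.proximity == .near || beacon.proximity == .immediate {
            if !routeMinors.contains(beacon.minor.intValue) {
                print("User has strayed from the route near beacon \(beacon.minor)")
            }
        }
    }
}
```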
I have a project which is constrained to working with apple maps, corelocation and apple's mapkit -- 3rd party interfaces aren't allowed.
I need to display a map view showing the user's current location (standard blue dot) and simultaneously place a pin on the road nearest the user's current location -- a 'snap to road' which is a best estimate of the user's nearest location.
I know how to retrieve the user's current address and use the returned street name and address, but I don't know how to add the pin to the resulting street in MKMapView. I've looked at MKDirectionsRequest but am unsure how to adapt the returned data to solve this problem.
thanks in advance for your help!
I think you can get the road-snapping done by Location Services, by adjusting the location service mode.
Apple's Developer docs on CLLocationManager offer four 'Activity' modes:
Other ~ unspecified
Automotive ~ in a car, truck, road vehicle
Fitness ~ pedestrian related activity
OtherNavigation ~ trains, planes, boats
(no comment about the ambiguity between the first and fourth... there's a bit more detail in the developer docs too than my few-word summaries above)
Basically you set CLActivityType to CLActivityTypeAutomotiveNavigation, and it will do sensible things on the assumption you're in a road vehicle. From what I've seen, this more aggressively snaps to vehicular roads, though you're still limited by the fidelity of Apple's maps.
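A minimal sketch of that configuration; everything here is standard CLLocationManager API, combined with the navigation-grade accuracy mentioned in the other answers:

```swift
import CoreLocation

let manager = CLLocationManager()
manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
// Hint to Core Location that the device is in a road vehicle, so fixes
// are biased toward plausible road positions.
manager.activityType = .automotiveNavigation
manager.startUpdatingLocation()
```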
For a snap-to-road feature you need the sequence of latitude and longitude coordinates of that street. You usually never get that vector data, because the owners of the data (TomTom and Nokia) do not allow it.
Only OpenStreetMap based data is free, and vector data is available.
iOS uses TomTom data, and as expected you will not get the coordinates of the polyline of a street.
I would like to know if an altimeter is available on iOS, and, if not, whether there is a way to calculate height with as much accuracy as possible.
The iPhone and 3G-enabled iPads have GPS units in them. They can use the GPS to determine current altitude along with other position and velocity variables. The API used to access this information is called CoreLocation. Specifically, you would use a CLLocationManager object to send your app CLLocation objects which contain an "altitude" property.
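As a minimal sketch, reading the GPS-derived altitude from a CLLocation fix delivered to your delegate might look like this; note that a non-positive verticalAccuracy means the altitude value is invalid:

```swift
import CoreLocation

func reportAltitude(of location: CLLocation) {
    guard location.verticalAccuracy > 0 else {
        print("No valid altitude in this fix")
        return
    }
    // altitude is meters above sea level; verticalAccuracy is its error margin.
    print("Altitude: \(location.altitude) m ± \(location.verticalAccuracy) m")
}
```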
See:
https://developer.apple.com/library/ios/#documentation/CoreLocation/Reference/CLLocation_Class/CLLocation/CLLocation.html
Core Location has an altitude attribute. There is a basic tutorial available here; note that it has multiple parts.
For the Core Location docs, check out the CLLocation Class Reference here.