Is this concept possible with iOS navigation tools?

Let's say we have a starting point, (x, y). By using iOS navigation, can we tell how far we have moved from that starting point to another location (a, b)? So if I walked 20 feet in a certain direction after starting, would it be able to tell me how far I've moved and in which direction?
If this technology exists, can I get info on where to start learning about it?
This also needs to be done without GPS, sorry.

As rmaddy mentioned, with the Core Location framework and GPS you can obtain the distance traveled by the person who is walking. I found a great step-by-step tutorial for you which has a sample project you can build and take a look at. Here is the link: http://www.perspecdev.com/blog/2012/02/22/using-corelocation-on-ios-to-track-a-users-distance-and-speed/
Also, here is the link to the CLLocation class reference for further study: https://developer.apple.com/library/iOS/documentation/CoreLocation/Reference/CLLocation_Class/CLLocation/CLLocation.html
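For what it's worth, here is a minimal Swift sketch of that accumulate-the-distance-between-fixes approach (the tutorial itself predates Swift); the class name and accuracy setting are illustrative, not taken from the tutorial:

    import CoreLocation

    // Sketch: sum the distance between successive CLLocation fixes.
    final class DistanceTracker: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()
        private var lastLocation: CLLocation?
        private(set) var totalDistance: CLLocationDistance = 0   // meters

        func start() {
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyBest
            manager.requestWhenInUseAuthorization()
            manager.startUpdatingLocation()
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            for location in locations {
                if let last = lastLocation {
                    totalDistance += location.distance(from: last)   // meters between fixes
                }
                lastLocation = location
            }
        }
    }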

No, you can't determine location changes accurately without GPS, and even with GPS it is difficult to accurately measure a position change as small as 20 feet (GPS 5 m accuracy means a +/-15 foot error).
In theory you might be able to write software to create an inertial navigation system using the built-in accelerometers, gyros, and magnetometers, but in practice they are too noisy and have too much error for this kind of use (see this question). A better rocket scientist than me might be able to make it work, but it would also need to use the GPS to keep it from drifting. The M7 chip on the 5S might make this feasible.
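If GPS really is off the table, the closest built-in option is the step/distance estimate from the M7+ motion coprocessor via CMPedometer. Note this is an estimate of distance walked, not a displacement vector, so it won't give you the direction you moved; a rough, hedged sketch:

    import CoreMotion

    let pedometer = CMPedometer()

    // Sketch: distance estimate from the motion coprocessor, no GPS involved.
    // This reports distance walked in meters, not a heading or displacement.
    if CMPedometer.isDistanceAvailable() {
        pedometer.startUpdates(from: Date()) { data, error in
            guard let data = data, error == nil else { return }
            if let meters = data.distance?.doubleValue {
                print("Estimated distance walked: \(meters) m (~\(meters * 3.28084) ft)")
            }
        }
    }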

Related

determining user location with near pinpoint accuracy iOS

My application is EXTREMELY dependent on user location, so much so that accuracy is crucial to the use of the app. Speaking with my team, we have realized that if a user is in close proximity to another geofence that we have created, Core Location may not recognize the difference. Are there currently any frameworks that work better than Core Location or CLLocationManager in iOS? I checked the CocoaPods libraries, but they all seem pretty close to one another in functionality.
You cannot overcome the physical and technological limitations of the GPS system. If you call CLLocationManager's requestLocation when the location manager is configured with the highest degree of desiredAccuracy (namely kCLLocationAccuracyBestForNavigation), the response in locationManager(_:didUpdateLocations:) is as accurate as you are going to get. Using a "library" is not going to change that.
You can set the accuracy to kCLLocationAccuracyBestForNavigation and then you will get the best data available. If you are not happy with iOS's handling of geofences, you could try to implement the geofencing yourself by subscribing to the delegate methods.
That way you can also filter out CLLocations with low accuracy and ignore them, for example.
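If you go that route, a rough sketch of a hand-rolled geofence check in the delegate might look like this (the coordinates, radius, and accuracy cutoff are made-up example values):

    import CoreLocation

    // Example region; replace with your own geofence.
    let fence = CLCircularRegion(center: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
                                 radius: 50,                 // meters
                                 identifier: "my-fence")

    // Inside your CLLocationManagerDelegate:
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            // Skip fixes whose circle of confusion is too large (or invalid).
            guard location.horizontalAccuracy > 0, location.horizontalAccuracy <= 20 else { continue }
            let inside = fence.contains(location.coordinate)
            print(inside ? "Inside the fence" : "Outside the fence")
        }
    }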
As others have said, there is no magic bullet to give you accuracy better than what the GPS is able to provide.
What you can do is:
1. Set the desiredAccuracy to the max.
2. Check the accuracy value you get back from location updates, and reject readings with a horizontal "accuracy" value that is too large (since the "accuracy" reading is actually a "circle of confusion", or radius in meters, within which the phone may actually be located).
Note that step 2 may cause it to take a LONG time to get an acceptable location fix, or you may not be able to get an acceptable reading at all. Based on the testing I did several years ago, requiring a horizontal accuracy reading of 16 meters was the best I was able to achieve, and I could only get that signal quality in an open area free of buildings and without a lot of tree cover. (I haven't tried it in an open prairie, however.)
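A small sketch of steps 1 and 2; the 16 m cutoff simply mirrors the figure mentioned above and is not a magic number:

    import CoreLocation

    let manager = CLLocationManager()
    manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation   // step 1

    // Step 2, inside your delegate: horizontalAccuracy is the radius in meters of the
    // circle of confusion; a negative value means the fix is invalid.
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        if location.horizontalAccuracy > 0 && location.horizontalAccuracy <= 16 {
            // Acceptable fix; use it.
        } else {
            // Reject it and keep waiting for a better reading.
        }
    }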

HealthKit Watch OS - Calculate velocity or 'Pace'

This is a general question seeking advice on the pattern required to calculate a user's velocity / pace / speed, when running or swimming.
Specifically, I want to be able to calculate this from watch OS, disconnected from the companion phone.
With GPS capabilities of Watch 3 / Watch OS 10.0 would the best approach be to:
Start Location Manager
Calculate distance and time between location points...
Calculate average speed?
Or are there better alternatives?
There is a good article here, https://www.bignerdranch.com/blog/watchkit-2-hardware-bits-the-accelerometer/, that recommends using Core Motion for device speed. However, in my view this would rather represent the 'device speed' and not necessarily the user's speed over distance.
Any advice or experiences would be much appreciated.
Thanks.
The article you linked to is for watchOS 2, not the Apple Watch 2. The motion tracking is pretty good, but to get accurate device speed you will still need to use the GPS.
If you don't need to do any other location-related calculations, and don't need real-time data (EDIT: you can get near-real-time data with an HKAnchoredObjectQuery, which is sufficient for most situations), then you don't need to start a location manager, just an HKWorkoutSession. This will default to using the GPS or motion data (whichever is more accurate/available at the time) and manage everything for you. When the workout is over, you can query for the distance samples and calculate pace from that.
If you need live motion data then the steps you outlined are correct; however, you should check that the user is outdoors first. If the user is indoors or has a weak GPS signal, switch to using motion data (and be sure to set HKMetadataKeyIndoorWorkout appropriately if using HealthKit).
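A hedged sketch of the workout-session approach on watchOS (watchOS 3-era API, assuming HealthKit read authorization for the distance type has already been granted); the activity/location types and the pace arithmetic are illustrative:

    import HealthKit

    let healthStore = HKHealthStore()

    // Let an HKWorkoutSession drive GPS/motion collection.
    let configuration = HKWorkoutConfiguration()
    configuration.activityType = .running
    configuration.locationType = .outdoor   // use .indoor (+ HKMetadataKeyIndoorWorkout) otherwise

    do {
        let session = try HKWorkoutSession(configuration: configuration)
        healthStore.start(session)
    } catch {
        print("Could not start workout session: \(error)")
    }

    // Near-real-time distance via an anchored object query, then derive average pace.
    // (For swimming, .distanceSwimming would be the analogous quantity type.)
    let distanceType = HKObjectType.quantityType(forIdentifier: .distanceWalkingRunning)!
    let workoutStart = Date()
    var totalMeters = 0.0

    let query = HKAnchoredObjectQuery(type: distanceType, predicate: nil,
                                      anchor: nil, limit: HKObjectQueryNoLimit) { _, _, _, _, _ in }
    query.updateHandler = { _, samples, _, _, _ in
        let newMeters = (samples as? [HKQuantitySample])?
            .reduce(0.0) { $0 + $1.quantity.doubleValue(for: .meter()) } ?? 0
        totalMeters += newMeters
        guard totalMeters > 0 else { return }
        let paceSecondsPerKm = Date().timeIntervalSince(workoutStart) / (totalMeters / 1000)
        print("Average pace so far: \(paceSecondsPerKm / 60) min/km")
    }
    healthStore.execute(query)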

Is the HoloLens VR-ready?

The question is already quite direct and short:
Can the HoloLens be used as virtual reality glasses?
Sorry beforehand if the answer is obvious to those who have tried them out, but I have not yet had the chance.
From what I read I know that they have been designed to be a very good augmented reality tool. This approach is clear for everybody.
I'm just thinking that there may be applications where you simply don't want the user to have any spatial contact with reality for a few moments, or others where you want the user to become so immersed in the experience that they forget where they are; in those cases a complete environment should be shown, as we are used to with virtual reality glasses.
Is the HoloLens ready for this? I think there are two key sub-questions to be answered:
How solid are the holograms?
Does the screen where holograms can be placed cover the complete view?
As others already pointed out, this is a solid No due to the limited viewing window.
Apart from that, the current hardware of the HoloLens is not capable of providing a fully immersive experience. You can check the specifications here.
As of now, when the environment is populated with more than a few holograms (depending on the triangle count of each hologram), the device's fps count drops and a certain lag is visible. I'm sure more processing power will be added to the device in future versions, but with the current power of the device I seriously doubt its capability to populate an entire environment and give a fully immersive experience.
1) The hologram quality is defined by the following specs:
- Holographic Resolution: 2.3M total light points
- Holographic Density: 2.5k light points per radian
It is worth noting that Microsoft's holograms disappear below a certain distance, indicated here as 0.85 m.
Personal note: in the past I also worked on Google's Project Tango, and from that personal experience I can tell you that the stability of Microsoft's holograms is absolutely superior. Also, the holograms are kept once the device is turned off, so if you place something and reboot the device you will find it again where you left it, without needing to start from scratch.
2) Absolutely not: "[The field of view] amounts to the size of a monitor in front of you – equivalent to 15 inches", as stated here, and it will not be addressed, as also reported here. So if a hologram's size exceeds this space, it will be shown only partially [i.e. cut]. Moreover, the surrounding environment is always visible, because the device's purpose is to interact with the real environment by adding another layer on top.
The HoloLens is not intended to be a VR rig; there is no complete immersion that I am aware of. Yes, you can have solid holograms, but you can always see the real world.
VR is about substituting the real world, which is why VR goggles are always opaque. The HoloLens is a see-through type of device, so you can see the hologram and the real world. It is created for augmented reality, where you augment the real world. That is why you can't use the HoloLens for VR purposes.
Actually, my initial question is: can the HoloLens be used AS WELL for VR applications?
No is the answer, because of its small window (equivalent to a 15'' screen) where the holograms can be placed.
I am sure this will evolve sooner or later in order to improve the AR experience. As long as the screen does not cover the complete view, VR won't be possible with the HoloLens.
The small FOV is a problem for total immersion, but there is an app for HoloLens called HoloTour, which is VR (with a few AR scenes in the beginning). In the game, the user can travel to Rome and Peru. While you can still see through the holograms, in my personal experience, people playing it will really get into it and will forget about the limitations. After a scene or two, they feel immersed. So while it certainly isn't as good at VR as a machine designed for that, it is capable, and it is still typically enjoyable to the users. There are quite a few measures to prevent nausea in the users (I can use mine for hours at a time with no problem) so I would actually prefer it to poorer VR implementations, such as a GearVR (which made me sick after 10 minutes of use!). Surely a larger FOV is in the works, so this will be less of a limitation in future releases.

Using Twisted to track GPS Locations on an iPhone

Recently, while developing an app on the iPhone, I came across the problem of tracking vehicles. It was easy to track the vehicles on a map if they were stationary using Parse (although I'm not sure if that was the best method), but the issue was tracking vehicles while they were moving. I didn't want to query for geopoints in Parse unnecessarily if the location of the vehicle did not change. I was steered towards using Twisted, and after doing some investigation I realized this might be a solution. Using the reactor loop, when locations change I could notify the other users and update their maps appropriately. Conceptually I understand this problem, but I'm having trouble finding information or help regarding GPS with Twisted.
I have been running the GPS example from the site: http://twistedmatrix.com/documents/12.0.0/core/examples/gpsfix.py
Using my MacBook Pro to test, I found the available serial port and it attempts to open as an NMEAReceiver, but I was expecting a GPS location to be written. Once I understand how to interact with the GPS, I feel I could tackle communicating this information through the iPhone with NSStreams, in the fashion of this tutorial, except instead of sending text messages it will be sending GPS locations:
http://www.raywenderlich.com/3932/networking-tutorial-for-ios-how-to-create-a-socket-based-iphone-app-and-server
Overall, my question is: how can I access the GPS coordinates of a device using Twisted, per the tutorial provided? I hope my question was detailed enough, and I would be more than happy to correspond with someone on any more details. Thank you.
I (eventually) wrote twisted.positioning, which is essentially a better version of the twisted.protocols.gps thing you're using. It has much nicer abstractions over concepts like positions, as well as receivers. That may be interesting to you, because it provides abstractions that you can use to e.g. combine information from GPS and other sources (like a compass). However, I think that in iOS-land that job is already (mostly) handled by Core Location. I'd assume that the best course of action is to hook that up to twisted.positioning (it shouldn't be particularly difficult; it can't be anywhere near as hard as NMEA, at least!). Lacking iOS development experience, I can't tell you how to access Core Location from Python; I can only point at the docs.
twisted.positioning is also an improvement when it comes to documentation. Unfortunately, that wasn't very difficult, because its predecessor came with none at all. I hope the one scant example that is provided helps, though; and I'd be more than happy to elaborate if it doesn't.

CMMotionActivityManager ignores cycling

I've been researching the new M7 chip's CMMotionActivityManager, for determining whether the user of the device is walking, running, in a car, etc. (see Apple's documentation). This seemed like a great step forward over trying to determine this previously from LocationManager and accelerometer data only.
I notice, however, that CMMotionActivityManager does not have a cycling activity, which is disappointing, and almost a deal-breaker for complete usage as a new activity manager. Has anyone else found a convenient way to use CMMotionActivityManager with cycling as well, without having to reincorporate CLLocationManager + accelerometer just to test for cycling too?
Note, this also does not include general transport options for things like a Train. For instance, I commute an hour a day on the train. Automotive could be made more generic at least, similar to how Moves uses Transport.
CMMotionActivity has these defined motion types only:
stationary
walking
running
automotive
unknown
Useful notes from Apple's code that do not necessarily solve the issue, but are helpful:
CMMotionActivity
An estimate of the user's activity based on the motion of the device.
The activity is exposed as a set of properties, the properties are not
mutually exclusive.
For example, if you're in a car stopped at a stop sign the state might
look like:
stationary = YES, walking = NO, running = NO, automotive = YES
Or a moving vehicle, stationary = NO, walking = NO, running = NO,
automotive = YES
Or the device could be in motion but not walking or in a vehicle.
stationary = NO, walking = NO, running = NO, automotive = NO. Note in this case all of the properties are NO.
[Direct Source: Apple iOS Framework, CoreMotion/CMMotionActivity.h #interface CMMotionActivity, inline code comments]
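For reference, a small Swift sketch of reading those (non-mutually-exclusive) flags with live updates; the queue choice and the logging are incidental:

    import CoreMotion

    let activityManager = CMMotionActivityManager()

    // The flags are not mutually exclusive, so inspect them together.
    if CMMotionActivityManager.isActivityAvailable() {
        activityManager.startActivityUpdates(to: .main) { activity in
            guard let activity = activity else { return }
            print("stationary=\(activity.stationary) walking=\(activity.walking) " +
                  "running=\(activity.running) automotive=\(activity.automotive) " +
                  "unknown=\(activity.unknown) confidence=\(activity.confidence.rawValue)")
        }
    }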
First of all, is this a question or just informative details on the M7?
Has anyone else found a convenient way to use CMMotionActivityManager
with cycling also without having to reincorporate LocationManager +
accelerometer just to try to test for cycling too?
There is a lot of confusion if you want to check whether the activity is cycling, because the classification just depends on the accelerometer.
An accelerometer contains microscopic crystal structures that get stressed by accelerative forces, which causes a voltage to be generated, and the result is parsed from that voltage. So, as far as I know, it just classifies your speed and tells you whether you are running, walking, or automotive. If you are cycling, sometimes very fast, very slow, or at a medium pace, it may report walking, running, or maybe automotive, because the M7 cannot tell whether it is automotive, cycling, or running; there is not much speed variance while you are cycling.
Even for running and walking it sometimes gives wrong results in some cases, so there is a chance that your app will give wrong information too.
One more thing you asked is
Note, this also does not include general transport options for things
like a Train. For instance, I commute an hour a day on the train.
Automotive could be made more generic at least, similar to how Moves
uses Transport.
So Apple is also working on other mapping features. Apple is said to be planning notable updates to its Maps app in iOS 8, and the company is currently working on implementing both public transit directions and indoor mapping features (which Google already has on iOS).
http://www.macrumors.com/2013/09/12/apple-working-to-leverage-new-m7-motion-sensing-chip-for-mapping-improvements/ (Useful Link)
So, not sure if you still need an answer to this, but here is the latest from the iOS 8 SDK:
@property(readonly, nonatomic) BOOL cycling NS_AVAILABLE(NA, 8_0);
In session 612 at WWDC 2014, the two presenting Apple engineers provided some information: In the slides they stated:
Performance is very sensitive to location
Works best if device is worn on upper arm
Longest latency
Best for retrospective use cases
In the video they explain on the audio track (starting at about 11:00) that
Cycling is new, something we introduced in iOS 8.
Cycling is very challenging, and again you need the dynamics and so
it's going to be very sensitive to location.
If it was mounted on the upper arm the latency is going to be fairly
reasonable.
And if it's anywhere else, it's going to take a lot longer. So definitely I would not suggest using cycling activity classification as a hint for the context here and now. It's really something that you'll want to use in a retrospective manner for a journaling app, for example.
I made a simple test setup with iOS 8 and 9 on an iPhone 5s and 6, and cycling was not detected, not a single time in over 1.5 h of cycling. Whether the new iPhone 6s makes good on this major deficit in motion activity detection is unclear; Phil Schiller announced it in September 2015.
tl;dr
Currently, cycling detection in Core Motion does not work the way it does for stationary, walking, running, and automotive. It will not be detected reliably and can only be used retrospectively.
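Given that, a retrospective query is the more realistic way to look for cycling; a minimal sketch (the seven-day window and the confidence filter are arbitrary choices):

    import CoreMotion

    let activityManager = CMMotionActivityManager()
    let weekAgo = Date(timeIntervalSinceNow: -7 * 24 * 60 * 60)

    // Ask for the recorded history and pick out intervals flagged as cycling (iOS 8+).
    activityManager.queryActivityStarting(from: weekAgo, to: Date(), to: .main) { activities, error in
        guard let activities = activities, error == nil else { return }
        let cyclingSamples = activities.filter { $0.cycling && $0.confidence != .low }
        print("Found \(cyclingSamples.count) activity samples flagged as cycling")
    }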
