How to get user's step count using accelerometer data? - ios

I want to calculate the user's steps (like a pedometer). I know that on the iPhone 5s, 6, and 6+ we can use the CMStepCounter or CMPedometer class (which use the device's M7 chip), but the iPhone 5 and earlier don't have the M7 chip, so we can't use those Core Motion classes. By searching all over the internet I came to know that we can use the accelerometer sensor for this purpose, but after spending a lot of time I'm still not able to come up with an accurate algorithm that works.
Edit 2: After several days of searching Google I have tried a lot, but I am still unable to find a working algorithm for counting steps using the accelerometer.
Can anybody out there help me?

CMMotionManager is what you are looking for if you are targeting later versions of iOS.
However, if you want to continue supporting iOS 5 or lower you need to use the following, although it is deprecated:
UIAccelerometer * accelerometer = [UIAccelerometer sharedAccelerometer];
accelerometer.delegate = self;
The delegate method where you receive the x, y, z values is:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // Check the x, y, z values here to detect a step ...
}
If you need to know the logic behind a step counter, you can search and read up on it on Google :)
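For reference, here is a rough sketch of one common approach (peak detection on the magnitude of the acceleration vector) using CMMotionManager. The 0.3 g threshold and the 0.4 s minimum interval between steps are assumptions that would need tuning against real walking data:
import Foundation
import CoreMotion

final class SimpleStepCounter {
    private let motionManager = CMMotionManager()
    private var lastStepDate = Date.distantPast
    private(set) var stepCount = 0

    func start() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 50.0 // 50 Hz sampling
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Magnitude of the acceleration vector; roughly 1.0 g when the phone is at rest.
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            // Treat a spike above the threshold as a step, with a refractory period
            // so a single heel strike is not counted several times.
            if abs(magnitude - 1.0) > 0.3,
               Date().timeIntervalSince(self.lastStepDate) > 0.4 {
                self.stepCount += 1
                self.lastStepDate = Date()
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}
A low-pass filter in front of the threshold test usually makes this much more robust; the sketch is only meant to show the shape of the algorithm.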

Related

Minimizing speed variability with kCLLocationAccuracyBestForNavigation when sailing

I am working on a GPS Apple Watch app for use when sailing, which requires high accuracy for a start-line scenario. I suspect the additional gyro/accelerometer inputs are actually hurting, not helping, GPS accuracy. For example, CLLocation.speed variability seems very high compared to other instruments (e.g. 4.4 knots, 2.2 knots, 6.8 knots, 4.6 knots, ... when other sailing GPS instruments read 4.5, 4.3, 4.6, 4.5, ...).
I understand that having the iPhone nearby will make the watch offload GPS processing to the iPhone. This definitely helps accuracy, but does not seem to help variability. When I am testing off the water (walking or riding a bike), the variability is much lower than sailing. I get similar results running my code on iPhone app and on watch (with iPhone nearby).
func startReceivingLocationChanges(locationManager: CLLocationManager) -> Bool {
    // Do not start services that aren't available.
    if !CLLocationManager.locationServicesEnabled() {
        // Location services are not available.
        return false
    }
    // Configure and start the service.
    locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation // seems to have accuracy issues when sailing; try kCLLocationAccuracyBest?
    locationManager.distanceFilter = kCLDistanceFilterNone // In meters.
    locationManager.delegate = self
    locationManager.startUpdatingLocation()
    return true
}
I am considering trying kCLLocationAccuracyBest to turn OFF the Gyro/accel in hopes that this may help. I am assuming speed and course are simple calculations based on lat/long changes, but perhaps not? I can do these calculations myself if needed. Also interested in possibly using some 3rd party filtering code to "smooth" the curves. Curious what others have tried. (I suspect this might also be an issue for swimmers, and/or mountain biking tracking scenarios where movement is not "smooth" or easily filtered out.)
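One low-effort thing to try before reaching for a full Kalman filter is a simple moving average over the reported speeds. A minimal sketch (the window size of 5 is an arbitrary assumption and trades smoothness against lag):
import CoreLocation

// Keeps the last few CLLocation.speed readings and returns their running average.
final class SpeedSmoother {
    private var samples: [CLLocationSpeed] = []
    private let windowSize = 5 // assumption: about 5 s of history at 1 Hz updates

    func add(_ location: CLLocation) -> CLLocationSpeed? {
        // A negative speed means the value is invalid; skip those samples.
        guard location.speed >= 0 else { return nil }
        samples.append(location.speed)
        if samples.count > windowSize {
            samples.removeFirst()
        }
        return samples.reduce(0, +) / Double(samples.count)
    }
}
Feeding this from locationManager(_:didUpdateLocations:) and comparing the smoothed value against the other instruments would at least show whether the variability is noise or genuine speed changes.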

Interpolating and predicting CLLocationManager

I need to get an updated user location at least 10 Hz to animate the location smoothly in MapBox for iOS while driving. Since Core Location only provides one point every second, I believe I need to do some prediction.
I have tried iKalman, but it doesn't seem to make any difference when updated once a second and queried at 10 Hz.
How do I tackle this, please?
What you're looking for is extrapolation, not interpolation.
I'm really, really surprised that there are so few resources on extrapolation on the internet. If you want to know more, you should read a numerical methods/math book and implement the algorithm yourself.
Maybe simple linear extrapolation will suffice?
// You need the two most recent points to extrapolate.
// X is time; Y is either longitude or latitude.
- (double)getExtrapolatedValueAt:(double)x withPointA:(Point *)A andPointB:(Point *)B
{
    return A.y + (x - A.x) / (B.x - A.x) * (B.y - A.y);
}

- (Coord *)getExtrapolatedPointAtTime:(double)X fromLatitudeA:(Point *)latA andLatitudeB:(Point *)latB andLongitudeA:(Point *)longA andLongitudeB:(Point *)longB
{
    double extrapolatedLatitude = [self getExtrapolatedValueAt:X withPointA:latA andPointB:latB];
    double extrapolatedLongitude = [self getExtrapolatedValueAt:X withPointA:longA andPointB:longB];
    Coord *extrapolatedPoint = [Coord new];
    extrapolatedPoint.latitude = extrapolatedLatitude;
    extrapolatedPoint.longitude = extrapolatedLongitude;
    return extrapolatedPoint;
}
I'm not sure if I got the function exactly right, but you can check here:
http://en.wikipedia.org/wiki/Extrapolation
It's really easy.
You should implement linear extrapolation first.
If you find that linear extrapolation isn't enough (for curves, for example), you can iterate and swap in some other extrapolation algorithm.
Another approach would be to have a 1 sec delay in animation and animate between two known points using interpolation. I don't know if that's acceptable for your use case.
This problem is typically solved with something called "dead reckoning", and you're on the right track trying to use a Kalman filter for it. If iKalman isn't working for you, you can resort to a simpler approach.
There's a lot of this sort of problem solving when dealing with games and network latency, so you can likely reuse an algorithm developed for this purpose.
This seems like a pretty thorough example.
The wiki on Kalman filters may help out as well.
I ended up solving this by using long (2-3 second) UIView animations with easing that start from the current state. This gives the impression of smooth position and heading updates "for free".
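In that spirit, the animation might look roughly like the following (a sketch, assuming a custom annotationView whose center and rotation you control; the 2.5 s duration and the ease-out curve are assumptions):
import UIKit

func animate(annotationView: UIView, to newCenter: CGPoint, heading: CGFloat) {
    // A long ease-out animation that always starts from the view's current
    // (possibly mid-animation) state, so successive 1 Hz fixes blend smoothly.
    UIView.animate(withDuration: 2.5,
                   delay: 0,
                   options: [.beginFromCurrentState, .curveEaseOut, .allowUserInteraction],
                   animations: {
                       annotationView.center = newCenter
                       annotationView.transform = CGAffineTransform(rotationAngle: heading)
                   },
                   completion: nil)
}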

Contiki OS CC2538: Reducing current / power consumption

I am trying to drive down the current consumption of Contiki OS running on the CC2538 development kit.
I would like to operate the device from a CR2032 with a run life of 2 years. To achieve this I would need an average current less than 100uA.
However, when I run the following examples at 3 V, I get these results:
contiki/examples/hello-world = 0.4mA - 2mA
contiki/examples/er-rest-example/er-example-client = 27mA
contiki/examples/er-rest-example/er-example-server = 27mA
thingsquare websocket example = 4mA
I have also designed my own target platform based on the cc2538 and get similar results.
I have read the guide at https://github.com/contiki-os/contiki/blob/648d3576a081b84edd33da05a3a973e209835723/platform/cc2538dk/README.md
and have ensured that in the contiki-conf.h file:
- LPM_CONF_ENABLE 1
- LPM_CONF_MAX_PM 2
Can anyone give me some pointers as to how I can get the current down? It would be most appreciated.
Regards,
Shane
How did you measure the current?
You have to be aware that using a basic ammeter to measure the current consumption of Contiki OS won't give you relevant results. The system turns the radio on and off at a relatively high rate (8 Hz by default) in order to perform the CCA (clear channel assessment), which may not be easy to catch with an ammeter.
To get an idea of the current consumption when the device is in deep sleep (and then calculate the averaged current consumption from that), I'd rather put the device into the PM state before the program reaches the infinite while loop. I used the following code to do that:
lpm_enter();
REG(SYS_CTRL_PMCTL) = SYS_CTRL_PMCTL_PM2;
do { asm("wfi"::); } while(0);
leds_on(LEDS_RED); // should not reach here
while(1){
...
On the CC2538, the CCA check consumes about 10-15 mA and lasts approximately 2 ms. When the radio transmits a packet, it consumes 25 mA. Have a look at this post: Contiki UDP packet transmission duration with CC2538.
Furthermore, to save a little more current, turn off the serial output:
#define CC2538_CONF_QUIET 1
Are you using the SmartRF board? If you want to make proper current measurements with this board, you have to remove all of the jumpers: P486, P487, P411 and P408. Keep only the jumpers for BTN_SEL and the RESET signals.
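To see how those figures turn into an average, here is a quick back-of-the-envelope duty-cycle calculation. It is only a sketch: the 13 mA CCA current is the mid-point of the range quoted above, and the 10 uA deep-sleep figure is a placeholder assumption, not a measured value:
// Rough average current with 8 Hz CCA checks (illustrative numbers only).
let ccaCurrentMilliAmps = 13.0        // mid-point of the 10-15 mA quoted above
let ccaDurationSeconds = 0.002        // about 2 ms per check
let checksPerSecond = 8.0             // default channel check rate
let sleepCurrentMilliAmps = 0.010     // placeholder deep-sleep current (10 uA)

let awakeFraction = ccaDurationSeconds * checksPerSecond
let average = ccaCurrentMilliAmps * awakeFraction +
              sleepCurrentMilliAmps * (1.0 - awakeFraction)
print("average current is about \(average) mA") // about 0.22 mA with these numbers
Even with the radio asleep almost all of the time, the periodic channel checks alone already push the average above the 100 uA budget in the question with these assumed numbers, which is why the check rate (or the duty-cycling strategy itself) is usually the first knob to look at.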

CGPointFromString, but for OS X

I'm trying to follow a tutorial for making an iOS game in SpriteKit, but also simultaneously porting it to OS X. (When I'm making my own game, I intend to do exactly that, plus it helps the learning sink in a little bit more than just copying spoon fed code)
So far, everything has gone spritely (pun intended) and I've been able to troubleshoot or research every problem I've come across, but alas, it could only last so long. It's probably something stupid simple too, but it evades me.
I'm trying to pull in a string from a plist dictionary:
self.position = CGPointFromString([characterData objectForKey:@"StartLocation"]);
StartLocation is the key for the coordinates, in the format of "{0,0}".
When building for iOS, it works beautifully. When building for OS X, it fails. I believe the issue is that CGPointFromString is unique to the iOS SDK; it only appears in the iOS documentation. The thing is, I can't find any OS X equivalent.
I realize that I could PROBABLY get it to work by breaking the coordinates into two float values with two separate dictionary entries for x and y, but I don't know whether the writer of the tutorial did this intentionally and whether that would break something down the road. Plus, I want to know how to convert from one data type to another on OS X too.
Please help! /Thank you!
On OS X you have NSPointFromString but not CGPointFromString. You also have NSPointToCGPoint and NSPointFromCGPoint if you want to convert between the two types.
It's not as direct, but this works:
NSPoint cocoaPoint = NSPointFromString(theString);
CGPoint point = NSPointToCGPoint(cocoaPoint);
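If you want a single call site that compiles for both targets, a small wrapper along these lines might work. This is only a sketch in Swift (in Objective-C the same idea works with #if TARGET_OS_IPHONE), and pointFromPlistString is a hypothetical helper name:
#if canImport(UIKit)
import UIKit
#else
import AppKit
#endif

// Turns a "{x,y}" plist string into a CGPoint on either platform.
func pointFromPlistString(_ string: String) -> CGPoint {
    #if canImport(UIKit)
    return CGPointFromString(string)   // iOS
    #else
    return NSPointFromString(string)   // OS X: NSPoint is a typealias for CGPoint in Swift
    #endif
}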

Decoding the CLLocationAccuracy constants

The following are listed in CLLocation.h, but in my experience they are deceiving names, possibly originally thought up to serve two purposes: 1. to test the accuracy of the location returned, but also 2. to set how hard the location manager works, specifically what is enabled (GPS (how many satellite channels), how hard the Wi-Fi works, triangulation, etc.).
extern const CLLocationAccuracy kCLLocationAccuracyBestForNavigation; // (raw value: -2)
extern const CLLocationAccuracy kCLLocationAccuracyBest; // (raw value: -1)
extern const CLLocationAccuracy kCLLocationAccuracyNearestTenMeters; // (raw value: 10)
extern const CLLocationAccuracy kCLLocationAccuracyHundredMeters; // (raw value: 100)
extern const CLLocationAccuracy kCLLocationAccuracyKilometer; // (raw value: 1000)
extern const CLLocationAccuracy kCLLocationAccuracyThreeKilometers; // (raw value: 3000)
I would love to take a look at CLLocation.m, but as that is not likely to happen any time soon, does anyone have any field testing showing what they think is going on with these different modes?
E.g. kCLLocationAccuracyBest = 10 satellite channels/trunks, 100% power to Wi-Fi, etc.
I'm kind of grasping at straws here; I think this is the type of information Apple should have provided.
What I really want to know is: what is actually happening with kCLLocationAccuracyThreeKilometers in relation to battery draw? Is the GPS on? One satellite trunk? Wi-Fi enabled? Wi-Fi on a timer? Who knows? I know I'd like to.
I agree with Olie that hiding the details of the algorithm is intended to protect the app developer from worrying about how location is determined. That said, I believe it's still reasonable to ask the question: "what are the power implications of my accuracy selection?".
I have a little bit of information that might guide your decision on which to use, but I don't know the true details of Apple's implementation.
First, assume that as the reading becomes more accurate, the system will need to use more power-hungry radios. For example, the GPS will be required for the most detailed readings, inside 100 meters, and it uses the most power.
Here is an educated guess at the mechanism used to determine the accuracy. List is ordered with (1) being the highest battery drain.
1. GPS - kCLLocationAccuracyBestForNavigation
2. GPS - kCLLocationAccuracyBest
3. GPS - kCLLocationAccuracyNearestTenMeters
4. WiFi (or GPS in rural area) - kCLLocationAccuracyHundredMeters
5. Cell Tower - kCLLocationAccuracyKilometer
6. Cell Tower - kCLLocationAccuracyThreeKilometers
When choosing, Apple recommends that you select the most coarse-grained accuracy that your application can afford.
Hope that helps a little.
In the business district of a major city, Wi-Fi and cell tower triangulation are both very good. In residential suburbs they're not so good. In rural areas they barely work, if they work at all.
GPS doesn't work very well indoors, and can take a very long time to get any fix at all without cell tower assistance (possibly 20 minutes!). It takes that long for the satellites to broadcast enough information to determine your location, and there can be packet loss (clouds, buildings, trees, mountains, etc.). It's worth noting that a proper high-end GPS will have an antenna the size of a basketball; no handheld GPS can get a perfect signal.
Even outdoors with a perfect signal, GPS is inaccurate when you change direction rapidly (such as on the highway or a winding road). The BestForNavigation setting uses the accelerometer and gyroscope to offset this.
Currently, the iOS platform uses:
- GPS: very accurate, but high power draw, slow, and not always available. Some hardware doesn't have a GPS.
- WiFi: lots of power draw, and only works in the city. Can also be flat-out wrong (e.g. place you in the wrong city).
- Cell tower: almost no power draw at all, and works well in the city. Not so great in rural areas. Doesn't exist on some hardware.
- Accelerometer: slight improvements to other location fixes, but huge power draw.
- Gyroscope: slight improvements to other location fixes, but huge power draw. iPhone 4 only.
You give it the accuracy in meters that you need (the constants are just nice names for meter values), and it will use a combination of the above to get you that level of accuracy with the fastest possible fix and the lowest possible power draw. The technique it uses will change from one user to another, and will change depending on where in the world the user is standing at the time.
The whole point of exposing opaque constants rather than what is actually happening is so that the inner workings can change and your code doesn't have to care; it just picks up the improvements.
That said, CLLocationAccuracy is typedef'd to double, so I think it's fair to guess that kCLLocationAccuracyNearestTenMeters = 10.0, kCLLocationAccuracyHundredMeters = 100.0, etc. Best is likely either 0, 1, or kCLLocationAccuracyNearestTenMeters, and BestForNavigation is probably one they tossed in to help folks like TomTom, etc.
If you REALLY want to know, you can print out the values -- they're just doubles.
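For example (a quick check; the numeric values are not documented, so don't hard-code them; the values in the comments are the raw values listed in the question):
import CoreLocation

print(kCLLocationAccuracyBestForNavigation) // -2.0
print(kCLLocationAccuracyBest)              // -1.0
print(kCLLocationAccuracyNearestTenMeters)  // 10.0
print(kCLLocationAccuracyHundredMeters)     // 100.0
print(kCLLocationAccuracyKilometer)         // 1000.0
print(kCLLocationAccuracyThreeKilometers)   // 3000.0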
I do not believe that the number of satellites or the power to the Wi-Fi is altered based on your desired accuracy. The way I understand the algorithms, there is an approximation calculation that gets more accurate the more times it runs through the loop; hence, less accurate just bails out earlier.
But, again, the more important point is: it doesn't matter. Apple specifically doesn't describe what goes on behind the scenes because that's not part of the design. The design is: if you use kCLLocationAccuracyKilometer, you'll get an answer that's within a kilometer, etc., and Apple is free to change how they arrive at that without you caring. This sort of isolation is a basic tenet of object-oriented programming.
EDIT:
CORRECTION -- I'm just now watching the WWDC session on location (Session 115) and, at about 22:00 or so, he talks about how, when using BestForNavigation, this adds in some gyroscope correction (when available). However, he warns that this is power- and CPU-intensive and should only be used when necessary, as with turn-by-turn navigation.
I'm not sure how much more I can talk about this publicly, but if you're a registered developer you can get the sessions from iTunes U.
(This is WWDC 2010, by the way.)
