In my app I want to detect how strong the surrounding magnetic/electromagnetic field is. Specifically, I want to measure changes in the magnetic field so I can tell whether it is stronger or weaker than a control measurement. This is my code:
- (void)setupLocationManager {
    self.locationManager = [[CLLocationManager alloc] init];
    if ([CLLocationManager headingAvailable] == NO) {
        self.locationManager = nil;
    } else {
        self.locationManager.headingFilter = kCLHeadingFilterNone;
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingHeading];
    }
}
// CLLocationManagerDelegate
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)heading {
    CGFloat magnitude = sqrt(heading.x * heading.x + heading.y * heading.y + heading.z * heading.z);
    if (self.defaultMagnitudeValue == 0.f) {
        self.defaultMagnitudeValue = magnitude;
    }
    self.currentMagnitudeValue = magnitude;
}
The magnitude value does react to magnetic fields surrounding the device; the problem is that you need to be REALLY close to the source of such a field.
So, the question is: Is there any possibility for an iOS app to measure magnetic fields on distances more than 10-20 centimeters? If so, how?
P.S.: I have also checked Apple's Teslameter sample app; it uses nearly the same code and has exactly the same problem.
Sure you can, the magnetic field just has to be very strong!
Your first limit might be the hardware. As noted in "In iOS, what is the difference between the Magnetic Field values from the Core Location and Core Motion frameworks?", the range of those raw values may be limited: some hardware and iOS versions restricted them to -128 to +128 microteslas. On an iPhone 6 I can get microtesla readings well outside that range, but that doesn't necessarily help. Apple doesn't seem to publish any specification for the accuracy of the magnetometer readings; we can read values down to nanotesla resolution, but the results may be meaningless noise.
Earth's magnetic field at the surface will be 25 to 65 microteslas. Any field you hope to measure is going to need to be measurably stronger than that at the desired measurement distance. What are you trying to measure, and is it really strong enough to move a compass needle at the distance you want to measure it?
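If you want to experiment with the raw (uncalibrated) magnetometer values that Core Motion exposes, rather than the calibrated field Core Location feeds into the heading, a minimal sketch in Swift could look like this. The property names (motionManager, baselineMagnitude) are mine, not from the question, and the readings still include Earth's field plus the device's own magnetic bias:

import CoreMotion

let motionManager = CMMotionManager()
var baselineMagnitude: Double?

func startMagnetometer() {
    guard motionManager.isMagnetometerAvailable else { return }
    motionManager.magnetometerUpdateInterval = 0.1
    motionManager.startMagnetometerUpdates(to: .main) { data, _ in
        guard let field = data?.magneticField else { return }
        // Field strength in microteslas; includes Earth's field and device bias.
        let magnitude = sqrt(field.x * field.x + field.y * field.y + field.z * field.z)
        if baselineMagnitude == nil { baselineMagnitude = magnitude }
        let delta = magnitude - (baselineMagnitude ?? magnitude)
        print("magnitude: \(magnitude) µT, change vs. baseline: \(delta) µT")
    }
}

Whether the change you see at 10-20 cm is signal or noise still depends on how strong the source is compared to the ambient field.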
I am working on a driver behavior app and I am using SOMotionDetector (thanks to its MIT license). It gives the speed and motion type (not moving, walking, running, automotive) of the device. I will use the automotive type, since I need to detect driver behavior. SOMotionDetector determines the motion type from speed, with thresholds set for walking, running, and automotive, or it uses the M7 chip if available. It updates the location approximately every second (the interval varies with GPS) in [SOMotionDetector sharedInstance].locationChangedBlock.

To detect aggressive acceleration or braking I check the change in speed over the last second. If it increases by more than a threshold (I am using kAggressiveSpeedIncrementFactor 8.0f), it counts as aggressive acceleration; if the speed decreases by more than that (the difference is negative), it counts as aggressive braking. For turns I work with the angle between successive latitude/longitude points. The following code implements this logic:
#define kAggressiveSpeedIncrementFactor 8.0f  // speed change of 8 km/h within the last second
#define kAggressiveAngleIncrementFactor 30.0f // 30 degree turn angle
#define kAggressiveTurnIncrementFactor  5.0f  // minimum speed (km/h) for a turn to count
SOMotionDetector *motionDetector = [SOMotionDetector sharedInstance];
motionDetector.locationChangedBlock = ^(CLLocation *location) {
    if (motionDetector.motionType == MotionTypeAutomotive) {
        SOLocationManager *locationManager = [SOLocationManager sharedInstance];
        float currSpeed = motionDetector.currentSpeed * 3.6f; // m/s -> km/h
        float lastSpeed = motionDetector.lastSpeed * 3.6f;
        float currAngle = locationManager.currAngle;
        float lastAngle = locationManager.lastAngle;

        self.speedDiff = currSpeed - lastSpeed;
        self.angleDiff = currAngle - lastAngle;

        if (fabs(self.speedDiff) > kAggressiveSpeedIncrementFactor && fabs(self.angleDiff) < kAggressiveAngleIncrementFactor) {
            NSString *msg = @"Aggressive Speed";
            if (self.speedDiff < 0)
                msg = @"Aggressive Brake";
            NSLog(@"%@", msg);
        }
        if (fabs(self.angleDiff) > kAggressiveAngleIncrementFactor && currSpeed > kAggressiveTurnIncrementFactor) {
            NSLog(@"Aggressive turn");
        }
    }
};
I have created currentSpeed and lastSpeed in the SOMotionDetector class (for my speed difference) and currAngle and lastAngle in SOLocationManager. Please have a look at the code; the aggressive-speed detection sometimes works perfectly.
My question is:
Is this the right approach?
For detecting an aggressive turn from the angle (calculated from the current and previous latitude/longitude), this sometimes happens: my vehicle is travelling at a 50 degree bearing on a straight road, but the GPS occasionally places the fix to the left or right of the road, which produces a big jump in the angle (the path becomes a zig-zag). Any suggestions for this?
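One way to reduce that zig-zag effect is to smooth the GPS-derived bearing (or CLLocation's course property) with an exponential moving average, so a single noisy fix does not register as a large turn. This is only a sketch: CourseSmoother and alpha are made-up names, and the smoothing factor would need tuning on real drives:

import CoreLocation

final class CourseSmoother {
    private var smoothedCourse: Double?
    private let alpha = 0.3 // smoothing factor in 0...1; lower = smoother but more lag

    // Feed in each new course (degrees); returns the smoothed heading.
    func update(with rawCourse: CLLocationDirection) -> Double {
        guard rawCourse >= 0 else { return smoothedCourse ?? 0 } // negative course = invalid fix
        guard let previous = smoothedCourse else {
            smoothedCourse = rawCourse
            return rawCourse
        }
        // Smallest signed angular difference, handling the 359° -> 1° wraparound.
        var delta = rawCourse - previous
        if delta > 180 { delta -= 360 }
        if delta < -180 { delta += 360 }
        var smoothed = (previous + alpha * delta).truncatingRemainder(dividingBy: 360)
        if smoothed < 0 { smoothed += 360 }
        smoothedCourse = smoothed
        return smoothed
    }
}

Comparing consecutive smoothed headings (instead of raw point-to-point angles) lets a genuine turn show up over two or three updates, while a one-off GPS jump largely cancels out.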
An interesting case: when we move the iPhone toward an iBeacon device, the accuracy value changes much faster than when we move the iPhone away from it.
How can I make this process faster?
As you have noted, Core Location averages past signal measurements to come up with the accuracy value (a distance estimate in meters). The algorithm used for the averaging is unpublished, but my measurements have shown a lag that stabilizes after about 20 seconds. I have not noticed a difference in lag between getting closer and moving further away.
You have no control over this averaging interval. The only thing you can do is average RSSI yourself over whatever time period you wish. You can then use a custom calculation to convert average RSSI to distance.
To do this you will need beacons that all have identical transmitter power, because you will not have access to the measured-power calibration constant in the beacon advertisement (Apple does not allow reading this value). Instead, this constant must be hard-coded into your custom distance calculation.
You can see sample code that does this here:
+ (double)distanceForRSSI:(double)rssi forPower:(int)txPower {
    // use coefficient values from spreadsheet for iPhone 4S
    double coefficient1 = 2.922026;  // multiplier
    double coefficient2 = 6.672908;  // power
    double coefficient3 = -1.767203; // intercept
    if (rssi == 0) {
        return -1.0; // if we cannot determine accuracy, return -1.0
    }
    double ratio = rssi * 1.0 / txPower;
    double distance;
    if (ratio < 1.0) {
        distance = pow(ratio, 10);
    }
    else {
        distance = (coefficient1) * pow(ratio, coefficient2) + coefficient3;
    }
    if (distance < 0.1) {
        NSLog(@"Low distance");
    }
    return distance;
}
https://github.com/AltBeacon/ios-beacon-tools/blob/master/ios-beacon-tools/RNLBeacon%2BDistance.m
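As a rough sketch of the "average RSSI yourself" approach described above, here is how it might look in Swift. The window length and the hard-coded txPower are assumptions, and the distance curve simply reuses the iPhone 4S coefficients from the Objective-C sample rather than Apple's unpublished algorithm:

import CoreLocation

final class BeaconDistanceEstimator {
    private var rssiSamples: [Int] = []
    private let windowSize = 20   // roughly 20 ranging callbacks, i.e. about 20 seconds
    private let txPower = -59     // hard-coded calibration constant (assumption)

    // Call this for the beacon of interest from your ranging delegate callback.
    func add(_ beacon: CLBeacon) {
        guard beacon.rssi != 0 else { return }  // 0 means "no reading this cycle"
        rssiSamples.append(beacon.rssi)
        if rssiSamples.count > windowSize { rssiSamples.removeFirst() }
    }

    func estimatedDistance() -> Double {
        guard !rssiSamples.isEmpty else { return -1 }
        let averageRSSI = Double(rssiSamples.reduce(0, +)) / Double(rssiSamples.count)
        let ratio = averageRSSI / Double(txPower)
        // Same shape as the linked sample: near-field power curve vs. fitted curve.
        return ratio < 1.0 ? pow(ratio, 10) : 2.922026 * pow(ratio, 6.672908) - 1.767203
    }
}

A shorter window reacts faster but is noisier; that trade-off is exactly what you are buying by doing the averaging yourself.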
I am doing some MapKit and Core Location programming where I map out a user's route, e.g. they go for a walk and it shows the path they took.
On the simulator things are working 100% fine.
On the iPhone I've run into a major snag and I don't know what to do. To determine if the user has 'stopped' I basically check if the speed is (almost) 0 for a certain period of time.
However, just keeping the phone still spits out the following log of newly reported locations, i.e. successive updates in the locationManager(_:didUpdateLocations:) callback:
speed 0.021408926025254 with distance 0.192791659974976
speed 0.0532131983839802 with distance 0.497739230237728
speed 11.9876451887096 with distance 15.4555990691609
speed 0.230133198005176 with distance 3.45235789063791
speed 0.0 with distance 0.0
speed 0.984378335092039 with distance 11.245049843458
speed 0.180509147029171 with distance 2.0615615724029
speed 0.429749086272364 with distance 4.91092459284206
Now I have the accuracy set to best:
_locationManager = CLLocationManager()
_locationManager.delegate = self
_locationManager.distanceFilter = kCLDistanceFilterNone
_locationManager.desiredAccuracy = kCLLocationAccuracyBest
Do you know if there is a setting I can change to prevent this back-and-forth behaviour? Even the user's pin moves wildly left and right every few seconds while the phone is still.
Or is there something else I need to code to account for this wild wandering?
I check if the user has moved a certain distance within a certain time to determine if they have stopped (thanks to rmaddy for the info):
/**
 Returns true if the user has stopped. Because GPS is inaccurate, the user must stay within a threshold distance for a period of time to be considered stopped.
 */
private func userHasStopped() -> Bool
{
// No stop checks yet so false and set new location
if (_lastLocationForStopAnalysis == nil)
{
_lastLocationForStopAnalysis = _currentLocation
return false
}
// If the distance is greater than the 'not stopped' threshold, set a new location
if (_lastLocationForStopAnalysis.distanceFromLocation(_currentLocation) > 50)
{
_lastLocationForStopAnalysis = _currentLocation
return false
}
// The user has been 'still' for long enough they are considered stopped
if (_currentLocation.timestamp.timeIntervalSinceDate(_lastLocationForStopAnalysis.timestamp) > 180)
{
return true
}
// The user hasn't moved past the threshold, but hasn't been still long enough yet to count as stopped
return false
}
I'm currently working on a location tracking app and I have difficulties with inaccurate location updates from my CLLocationManager. This causes my app to track distance which is in fact only caused by inaccurate GPS readings.
I can even leave my iPhone on the table with my app turned on and in few minutes my app tracks hundreds of meters worth of distance just because of this flaw.
Here's my initialization code:
- (void)initializeTracking {
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    self.locationManager.distanceFilter = 5;
    [self.locationManager startUpdatingLocation];
}
Thanks in advance! :-)
One of the ways I solved this in a similar application is to discard location updates where the distance change is somewhat less than the horizontal accuracy reported in that location update.
Given a previousLocation, compute the distance from previousLocation for each newLocation. If that distance >= (horizontalAccuracy * 0.5), use that location and make it the new previousLocation. If the distance is less, discard the update, leave previousLocation unchanged, and wait for the next location update.
That worked well for our purposes, you might try something like that. If you still find too many updates that are noise, increase the 0.5 factor, maybe try 0.66.
You may also want to guard against cases when you are just starting to get a fix, where you get a series of location updates that appear to move but really what is happening is that the accuracy is improving significantly.
I would avoid starting any location tracking or distance measuring with a horizontal accuracy > 70 meters. Those are poor quality positions for GNSS, although that may be all you get when in an urban canyon, under heavy tree canopy, or other poor signal conditions.
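A minimal sketch of that filtering, using the 0.5 factor and the 70 m cut-off mentioned above (the property names are mine):

import CoreLocation

var previousLocation: CLLocation?
var totalDistance: CLLocationDistance = 0

func process(_ newLocation: CLLocation) {
    // Ignore poor fixes entirely (cold start, urban canyon, heavy tree cover, ...).
    guard newLocation.horizontalAccuracy > 0, newLocation.horizontalAccuracy <= 70 else { return }

    guard let previous = previousLocation else {
        previousLocation = newLocation
        return
    }
    let distance = newLocation.distance(from: previous)
    // Only count movement that is large compared to the reported accuracy.
    if distance >= newLocation.horizontalAccuracy * 0.5 {
        totalDistance += distance
        previousLocation = newLocation
    }
    // Otherwise: likely GPS jitter; keep the old anchor point and wait for the next update.
}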
I've used this method to obtain a location with the desired accuracy (in Swift):
let TIMEOUT_INTERVAL = 3.0
func locationManager(manager: CLLocationManager!, didUpdateLocations locations: [AnyObject]!) {
    let newLocation = locations.last as! CLLocation
    println("didupdateLastLocation \(newLocation)")

    // The last location must not have been captured more than 3 seconds ago
    if newLocation.timestamp.timeIntervalSinceNow > -3 &&
        newLocation.horizontalAccuracy > 0 {

        var distance = CLLocationDistance(DBL_MAX)
        if let location = self.lastLocation {
            distance = newLocation.distanceFromLocation(location)
        }

        if self.lastLocation == nil ||
            self.lastLocation!.horizontalAccuracy > newLocation.horizontalAccuracy {
            self.lastLocation = newLocation
            if newLocation.horizontalAccuracy <= self.locationManager.desiredAccuracy {
                // Desired location found
                println("LOCATION FOUND")
                self.stopLocationManager()
            }
        } else if distance < 1 {
            let timerInterval = newLocation.timestamp.timeIntervalSinceDate(self.lastLocation!.timestamp)
            if timerInterval >= TIMEOUT_INTERVAL {
                // Force stop
                stopLocationManager()
            }
        }
    }
}
Where:
if newLocation.timestamp.timeIntervalSinceNow > -3 &&
newLocation.horizontalAccuracy > 0 {
The last location retrieved must not have been captured more than 3 seconds ago, and it must have a valid horizontal accuracy (a value of zero or less means it is not a valid reading).
Then we're going to set a distance with a default value:
var distance = CLLocationDistance(DBL_MAX)
Calculate the distance from the last location retrieved to the new location:
if let location = self.lastLocation {
distance = newLocation.distanceFromLocation(location)
}
If our stored last location hasn't been set yet, or if the new location's horizontal accuracy is better than the stored one's, we store the new location:
if self.lastLocation == nil ||
self.lastLocation!.horizontalAccuracy > newLocation.horizontalAccuracy {
self.lastLocation = newLocation
The next step is to check whether the accuracy of the retrieved location is good enough. To do that, we check whether its horizontalAccuracy is less than or equal to the desiredAccuracy. If it is, we can stop the manager:
if newLocation.horizontalAccuracy <= self.locationManager.desiredAccuracy {
//Desired location Found
self.stopLocationManager()
}
With the last branch we check whether the distance between the last stored location and the new location is less than one meter (meaning the two locations are very close). If so, we take the time interval between the last stored location and the new one; if that interval exceeds the timeout, it means that for more than 3 seconds we have not received a location more accurate than the one we already have, so we can stop the location services:
else if distance < 1 {
let timerInterval = newLocation.timestamp.timeIntervalSinceDate(self.lastLocation!.timestamp)
if timerInterval >= TIMEOUT_INTERVAL {
//Force Stop
println("Stop location timeout")
stopLocationManager()
}
}
This is always a problem with satellite positioning. Each fix is an estimate, and estimates vary; every new report is a new estimate. What you need is a position clamp that ignores small changes when there is no real movement.
You might try to use sensors to know if the device is actually moving. Look at accelerometer data, if it isn't changing then the device isn't moving even though GPS says it is. Of course, there is noise on the accelerometer data so you have to filter that out also.
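A sketch of that accelerometer check; the 0.03 g threshold and the two-second window are guesses that would need tuning on a real device:

import CoreMotion

let motionManager = CMMotionManager()
var magnitudes: [Double] = []

func startStationaryCheck(onUpdate: @escaping (Bool) -> Void) {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 0.1
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        magnitudes.append(sqrt(a.x * a.x + a.y * a.y + a.z * a.z))
        if magnitudes.count > 20 { magnitudes.removeFirst() }  // keep ~2 seconds of samples
        guard magnitudes.count == 20 else { return }
        // Standard deviation of the magnitude; close to zero when the phone is sitting still.
        let mean = magnitudes.reduce(0, +) / 20
        let variance = magnitudes.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / 20
        let isStationary = sqrt(variance) < 0.03
        onUpdate(isStationary)  // if true, treat GPS "movement" as noise
    }
}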
This is a tricky problem to solve.
There's really not a whole lot more you can do to improve what the operating system and your current reception gives to you. On first look it doesn't look like there's anything wrong with your code - when it comes to iOS location updates you're really at the mercy of the OS and your service.
What you CAN do is control which locations you pay attention to. In your didUpdateLocations callback, you could ignore any locations with a horizontal accuracy greater than some predefined threshold, maybe 25 m. You would end up with fewer location updates to use, but you'd have less noise.
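A minimal version of that filter, with 25 m used only because it is the example figure above:

import CoreLocation

// Keep only fixes whose reported accuracy is good enough.
func usableLocations(from locations: [CLLocation],
                     maxAccuracy: CLLocationAccuracy = 25) -> [CLLocation] {
    return locations.filter { $0.horizontalAccuracy > 0 && $0.horizontalAccuracy <= maxAccuracy }
}

Call it on the array you receive in locationManager(_:didUpdateLocations:) and work only with what it returns.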
I'm trying to detect three actions: when a user begins walking, jogging, or running. I then want to know when the stop. I've been successful in detecting when someone is walking, jogging, or running with the following code:
- (void)update:(CMAccelerometerData *)accelData {
    [(id)self setAcceleration:accelData.acceleration];
    NSTimeInterval secondsSinceLastUpdate = -([self.lastUpdateTime timeIntervalSinceNow]);
    if (fabs(_acceleration.x) >= 0.10000) {
        NSLog(@"walking: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 2.0) {
        NSLog(@"jogging: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 4.0) {
        NSLog(@"sprinting: %f", _acceleration.x);
    }
}
The problem I run into is two-fold:
1) update is called multiple times every time there's a motion, probably because it checks so frequently that when the user begins walking (i.e. _acceleration.x >= .1000) it is still >= .1000 when it calls update again.
Example Log:
2014-02-22 12:14:20.728 myApp[5039:60b] walking: 1.029846
2014-02-22 12:14:20.748 myApp[5039:60b] walking: 1.071777
2014-02-22 12:14:20.768 myApp[5039:60b] walking: 1.067749
2) I'm having difficulty figuring out how to detect when the user has stopped. Does anybody have advice on how to implement "stop detection"?
According to your logs, accelerometerUpdateInterval is about 0.02 s. Updates could be made less frequent by changing that property of CMMotionManager.
Checking only the x-acceleration isn't very accurate. I can put a device on a table in such a way (say, on its left edge) that the x-acceleration equals 1, or simply tilt it a bit. That would put the program in walking mode (x > 0.1) even though the device is idle.
Here's a link to the publication ADVANCED PEDOMETER FOR SMARTPHONE-BASED ACTIVITY TRACKING. The authors track changes in the direction of the acceleration vector; their metric d is the cosine of the angle between two consecutive acceleration vector readings.
Obviously, without any motion the angle between the two vectors is close to zero, so cos(0) = 1; during other activities d < 1. To filter out noise, they use a weighted moving average of the last 10 values of d.
After implementing this, the values for walking and running separate clearly (in the original plot, red is walking and blue is running).
Now you can set a threshold for each activity to separate them. Note that the average step frequency is 2-4 Hz, so you should expect the current value to exceed the threshold at least a few times per second in order to identify the action.
Other helpful publications:
ERSP: An Energy-efficient Real-time Smartphone Pedometer (analyzes peaks and troughs)
A Gyroscopic Data based Pedometer Algorithm (threshold detection of gyro readings)
UPDATE
_acceleration.x, _acceleration.y, and _acceleration.z are the coordinates of a single acceleration vector. You use all three coordinates in the formula for d, which for consecutive readings a_i and a_(i-1) is d = (a_i · a_(i-1)) / (|a_i| |a_(i-1)|). To calculate d you also need to store the acceleration vector from the previous update (the one with index i-1 in the formula).
The WMA simply takes the last 10 values of d with different weights: the most recent values have the highest weight and therefore the most impact on the result. You need to store the 9 previous d values in order to calculate the current one, and you compare the WMA value to the corresponding threshold.
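Here is a sketch of that metric as I read the answer: d is the cosine of the angle between consecutive acceleration vectors, smoothed with a weighted moving average over the last 10 values, the most recent weighted highest. The linear weighting is my assumption; the paper may use different weights:

import CoreMotion

var previousAcceleration: CMAcceleration?
var recentD: [Double] = []

// Returns the weighted moving average of d, or nil until enough data has arrived.
func process(_ a: CMAcceleration) -> Double? {
    defer { previousAcceleration = a }
    guard let p = previousAcceleration else { return nil }

    // d = (a · p) / (|a| |p|), the cosine of the angle between consecutive readings
    let dot = a.x * p.x + a.y * p.y + a.z * p.z
    let norms = sqrt(a.x * a.x + a.y * a.y + a.z * a.z) * sqrt(p.x * p.x + p.y * p.y + p.z * p.z)
    guard norms > 0 else { return nil }
    recentD.append(dot / norms)
    if recentD.count > 10 { recentD.removeFirst() }

    // Weighted moving average, newest value weighted highest (weights 1, 2, ..., n).
    let weights = (1...recentD.count).map { Double($0) }
    var weightedSum = 0.0
    for (value, weight) in zip(recentD, weights) { weightedSum += value * weight }
    return weightedSum / weights.reduce(0, +)  // close to 1 when idle, noticeably below 1 when moving
}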
If you are using iOS 7 and an iPhone 5S, I suggest you look into CMMotionActivityManager, which is available on the iPhone 5S because of the M7 chip (it is also available on a couple of other devices that have the M7 chip).
Here is a code snippet I put together to test when I was learning about it.
#import <CoreMotion/CoreMotion.h>

@property (nonatomic, strong) CMMotionActivityManager *motionActivityManager;

-(void) inSomeMethod
{
    self.motionActivityManager = [[CMMotionActivityManager alloc] init];

    // register for Core Motion notifications
    [self.motionActivityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMMotionActivity *activity)
    {
        NSLog(@"Got a core motion update");
        NSLog(@"Current activity date is %f", activity.timestamp);
        NSLog(@"Current activity confidence from a scale of 0 to 2 - 2 being best - is: %ld", activity.confidence);
        NSLog(@"Current activity type is unknown: %i", activity.unknown);
        NSLog(@"Current activity type is stationary: %i", activity.stationary);
        NSLog(@"Current activity type is walking: %i", activity.walking);
        NSLog(@"Current activity type is running: %i", activity.running);
        NSLog(@"Current activity type is automotive: %i", activity.automotive);
    }];
}
I tested it and it seems to be pretty accurate. The only drawback is that it will not give you a confirmation as soon as you start an action (walking for example). Some black box algorithm waits to ensure that you are really walking or running. But then you know you have a confirmed action.
This beats messing around with the accelerometer. Apple took care of that detail!
You can use this simple library to detect whether the user is walking, running, in a vehicle, or not moving. It works on all iOS devices and does not need the M7 chip.
https://github.com/SocialObjects-Software/SOMotionDetector
In repo you can find demo project
I'm following this paper (PDF via RG) in my indoor navigation project to determine user dynamics (static, slow walking, fast walking) from accelerometer data alone, in order to assist location determination.
Here is the algorithm proposed in the paper, in short: take the Euclidean norm of each accelerometer reading, compute the variance of those norms over a one-second window, and classify the activity by comparing that variance against two thresholds.
And here is my implementation in Swift 2.0:
import CoreMotion
let motionManager = CMMotionManager()
motionManager.accelerometerUpdateInterval = 0.1
motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (accelerometerData: CMAccelerometerData?, error: NSError?) -> Void in
    if (error != nil) {
        print(error)
    } else {
        self.estimatePedestrianStatus((accelerometerData?.acceleration)!)
    }
}
After all of the classic Swifty iOS code to initiate CoreMotion, here is the method crunching the numbers and determining the state:
func estimatePedestrianStatus(acceleration: CMAcceleration) {
    // Obtain the Euclidean norm of the accelerometer data
    accelerometerDataInEuclidianNorm = sqrt((acceleration.x.roundTo(roundingPrecision) * acceleration.x.roundTo(roundingPrecision)) + (acceleration.y.roundTo(roundingPrecision) * acceleration.y.roundTo(roundingPrecision)) + (acceleration.z.roundTo(roundingPrecision) * acceleration.z.roundTo(roundingPrecision)))

    // Significant figure setting
    accelerometerDataInEuclidianNorm = accelerometerDataInEuclidianNorm.roundTo(roundingPrecision)

    // record 10 values
    // meaning values in a second
    // accUpdateInterval(0.1s) * 10 = 1s
    while accelerometerDataCount < 1 {
        accelerometerDataCount += 0.1

        accelerometerDataInASecond.append(accelerometerDataInEuclidianNorm)
        totalAcceleration += accelerometerDataInEuclidianNorm

        break // required since we want to obtain data every acc cycle
    }

    // when acc values are recorded, interpret them
    if accelerometerDataCount >= 1 {
        accelerometerDataCount = 0 // reset for the next round

        // Calculating the variance of the Euclidean norm of the accelerometer data
        let accelerationMean = (totalAcceleration / 10).roundTo(roundingPrecision)
        var total: Double = 0.0

        for data in accelerometerDataInASecond {
            total += ((data - accelerationMean) * (data - accelerationMean)).roundTo(roundingPrecision)
        }
        total = total.roundTo(roundingPrecision)

        let result = (total / 10).roundTo(roundingPrecision)
        print("Result: \(result)")

        if (result < staticThreshold) {
            pedestrianStatus = "Static"
        } else if ((staticThreshold < result) && (result <= slowWalkingThreshold)) {
            pedestrianStatus = "Slow Walking"
        } else if (slowWalkingThreshold < result) {
            pedestrianStatus = "Fast Walking"
        }
        print("Pedestrian Status: \(pedestrianStatus)\n---\n\n")

        // reset for the next round
        accelerometerDataInASecond = []
        totalAcceleration = 0.0
    }
}
Also I've used the following extension to simplify significant figure setting:
extension Double {
    func roundTo(precision: Int) -> Double {
        let divisor = pow(10.0, Double(precision))
        return round(self * divisor) / divisor
    }
}
With the raw (unrounded) values from Core Motion, the algorithm went haywire.
Hope this helps someone.
EDIT (4/3/16)
I forgot to provide my roundingPrecision value: I defined it as 3. Three significant figures are decent enough for this purpose; you can use more precision if you like.
Also, one more thing to mention: at the moment this algorithm requires the iPhone to be held in your hand while walking.
My GitHub Repo hosting Pedestrian Status
You can use Apple's machine learning framework, Core ML, to work out the user's activity. First you need to collect labeled data and train a classifier; then you can use that model in your app to classify the user's activity. You may follow this series if you are interested in Core ML activity classification:
https://medium.com/@tyler.hutcherson/activity-classification-with-create-ml-coreml3-and-skafos-part-1-8f130b5701f6
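As a very rough sketch of how such a trained classifier might be used at runtime: the model name, the input feature name ("acceleration"), its shape, and the output feature name ("label") are all hypothetical here and depend entirely on how you train and export your own model:

import CoreML
import Foundation

func classifyActivity(window: [[Double]]) throws -> String? {
    // "ActivityClassifier" is a placeholder for whatever compiled model you ship.
    guard let url = Bundle.main.url(forResource: "ActivityClassifier", withExtension: "mlmodelc") else { return nil }
    let model = try MLModel(contentsOf: url)

    // Pack a window of (x, y, z) accelerometer samples into the shape the model expects.
    let input = try MLMultiArray(shape: [NSNumber(value: window.count), 3], dataType: .double)
    for (i, sample) in window.enumerated() {
        for j in 0..<3 { input[i * 3 + j] = NSNumber(value: sample[j]) }
    }

    let features = try MLDictionaryFeatureProvider(dictionary: ["acceleration": MLFeatureValue(multiArray: input)])
    let output = try model.prediction(from: features)
    return output.featureValue(for: "label")?.stringValue  // e.g. "walking", "running"
}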