How do I programmatically find direction on a BlackBerry?

How do I programmatically find direction on a BlackBerry using GPS?

With GPS, positional accuracy is around 3 meters at best. If you take consecutive GPS readings and look for significant changes in a given direction, that will give you a rough estimate of the direction of travel, and thus a probable direction the person is facing.
This is not nearly as good as having a magnetic compass, which none of the Blackberries (Blackberrys?) on the market currently have.
Some GPS systems have two GPS receivers placed side by side in a known orientation; by comparing the two readings they can calculate which direction the unit is facing. This is called a GPS compass. However, these systems are too bulky to be included in a phone at this point.
You can use the BlackBerry API to get the GPS information, including the course made good (the getCourse method). It gives you a heading in degrees, with 0.0 being North.

GPS data cannot give you direction by itself; it only gives you positions. If you have two positions (such as where you were one second ago and where you are now), most implementations, including the BlackBerry's, will give you the bearing (direction) from one point to the other.
Android devices with digital magnetic compasses, and IIRC the iPhone 3GS, can give you direction. I don't believe there are any BlackBerrys equipped with compasses yet.
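If your platform only hands you raw coordinates, the initial bearing (forward azimuth) between two fixes is simple to compute yourself. A minimal sketch, written in Swift purely to illustrate the math (the function name is mine):

import Foundation

// Initial bearing (forward azimuth) in degrees from point 1 to point 2,
// with 0 = North and 90 = East. Inputs are in degrees.
func initialBearing(lat1: Double, lon1: Double, lat2: Double, lon2: Double) -> Double {
    let phi1 = lat1 * Double.pi / 180
    let phi2 = lat2 * Double.pi / 180
    let deltaLambda = (lon2 - lon1) * Double.pi / 180
    let y = sin(deltaLambda) * cos(phi2)
    let x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(deltaLambda)
    let theta = atan2(y, x) * 180 / Double.pi
    return (theta + 360).truncatingRemainder(dividingBy: 360)
}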

The Java ME Location API (JSR 179) that the BlackBerry uses will supply you with the direction the phone is heading. Here is a snippet of a GPS class that retrieves most of the basic GPS info:
import javax.microedition.location.*;

public class GPS {

    // Fields populated by the location listener (declarations assumed from the snippet)
    private float heading;
    private double longitude, latitude;
    private float altitude;
    private float speed;
    private float accuracy;
    private String satCountStr;
    private int interval = -1; // -1 = default update interval

    /**
     * This will start the GPS.
     */
    public GPS() {
        // Start getting GPS data
        if (currentLocation()) {
            // This is going to start to try and get me some data!
        }
    }

    private boolean currentLocation() {
        boolean retval = true;
        try {
            LocationProvider lp = LocationProvider.getInstance(null);
            if (lp != null) {
                lp.setLocationListener(new LocationListenerImpl(), interval, 1, 1);
            } else {
                // GPS is not supported, that sucks!
                // Here you may want to use UiApplication.getUiApplication() and post a
                // Dialog box saying that it does not work
                retval = false;
            }
        } catch (LocationException e) {
            System.out.println("Error: " + e.toString());
        }
        return retval;
    }

    private class LocationListenerImpl implements LocationListener {
        public void locationUpdated(LocationProvider provider, Location location) {
            if (location.isValid()) {
                heading = location.getCourse();
                longitude = location.getQualifiedCoordinates().getLongitude();
                latitude = location.getQualifiedCoordinates().getLatitude();
                altitude = location.getQualifiedCoordinates().getAltitude();
                speed = location.getSpeed();

                // This is to get the number of satellites
                String NMEA_MIME = "application/X-jsr179-location-nmea";
                satCountStr = location.getExtraInfo("satellites");
                if (satCountStr == null) {
                    satCountStr = location.getExtraInfo(NMEA_MIME);
                }

                // This is to get the accuracy of the GPS coordinates
                QualifiedCoordinates qc = location.getQualifiedCoordinates();
                accuracy = qc.getHorizontalAccuracy();
            }
        }

        public void providerStateChanged(LocationProvider provider, int newState) {
            // no-op
        }
    }
}

Related

How can I customise proximity zones for iBeacons using CoreLocation

How can I set my own radius limits for Immediate, Near and Far zones when I range iBeacons using CoreLocation?
While you cannot configure the Proximity thresholds, it is easy enough to roll your own based on the accuracy field, which is effectively a distance estimate in meters:
if beacon.accuracy < 0 {
    // unknown
} else if beacon.accuracy < 0.5 {
    // immediate
} else if beacon.accuracy < 3.0 {
    // near
} else {
    // far
}
The values above approximate CoreLocation's default behavior, but you can alter them to suit your needs.
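If it helps, the same cutoffs can be wrapped in a small helper you call while ranging; this is only a sketch, and the enum and function names are mine, not part of Core Location:

import CoreLocation

enum CustomProximity {
    case unknown, immediate, near, far
}

// Hypothetical helper mapping CLBeacon.accuracy (a rough distance estimate in meters)
// onto custom zones, using the same cutoffs as above
func customProximity(for beacon: CLBeacon) -> CustomProximity {
    let d = beacon.accuracy          // negative means the estimate is unknown
    if d < 0 { return .unknown }
    if d < 0.5 { return .immediate }
    if d < 3.0 { return .near }
    return .far
}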

Using iOS device as teslameter (magnetometer discussion)

In my app I want to detect how strong the surrounding magnetic/electromagnetic fields are. What I want to achieve is to measure changes in the magnetic field, to know whether it is stronger or weaker than a control measurement. This is my code:
- (void)setupLocationManager {
    self.locationManager = [[CLLocationManager alloc] init];
    if ([CLLocationManager headingAvailable] == NO) {
        self.locationManager = nil;
    } else {
        self.locationManager.headingFilter = kCLHeadingFilterNone;
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingHeading];
    }
}

// CLLocationManagerDelegate
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)heading {
    CGFloat magnitude = sqrt(heading.x * heading.x + heading.y * heading.y + heading.z * heading.z);
    if (self.defaultMagnitudeValue == 0.f) {
        self.defaultMagnitudeValue = magnitude;
    }
    self.curentMagnitudeValue = magnitude;
}
The magnitude value reacts to the magnetic fields surrounding the device; the problem is that you need to be REALLY close to the source of such a field.
So, the question is: is there any possibility for an iOS app to measure magnetic fields at distances of more than 10-20 centimeters? If so, how?
P.S.: I have also checked the Teslameter app by Apple; it uses nearly the same code and has exactly the same problems.
Sure you can; the magnetic field just has to be very strong!
Your first limit might be the hardware. As noted in "In iOS, what is the difference between the Magnetic Field values from the Core Location and Core Motion frameworks?", the range of those raw values may be limited. Some hardware and iOS versions restricted them to -128 to +128 microteslas. On an iPhone 6 I can get microtesla readings well outside that range, but that doesn't necessarily help: Apple doesn't seem to provide any reference for the accuracy of magnetometer readings, so while we can read values down to nanotesla precision, the results we get back might be meaningless noise.
Earth's magnetic field at the surface ranges from about 25 to 65 microteslas. Any field you hope to measure is going to need to be measurably stronger than that at the desired measurement distance. What are you trying to measure, and is it really strong enough to move a compass needle at the distance you want to measure it?
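If you want to look at the raw field values rather than the tilt-compensated heading, Core Motion exposes the magnetometer directly. A minimal sketch (Swift here, although the question's code is Objective-C); the 0.1 s interval is arbitrary and no calibration is applied:

import Foundation
import CoreMotion

let motionManager = CMMotionManager()

if motionManager.isMagnetometerAvailable {
    motionManager.magnetometerUpdateInterval = 0.1
    motionManager.startMagnetometerUpdates(to: .main) { data, _ in
        guard let field = data?.magneticField else { return }
        // Raw, uncalibrated readings in microteslas (device bias is not removed)
        let magnitude = sqrt(field.x * field.x + field.y * field.y + field.z * field.z)
        print("Raw field magnitude: \(magnitude) µT")
    }
}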

iPhone GPS User Location is moving back and forth while the phone is still

I am doing some MapKit and Core Location programming where I map out a user's route, e.g. they go for a walk and it shows the path they took.
On the simulator things are working 100% fine.
On the iPhone I've run into a major snag and I don't know what to do. To determine if the user has 'stopped' I basically check if the speed is (almost) 0 for a certain period of time.
However, just keeping the phone still spits out this log from the location manager delegate. These are successive updates in the locationManager(_:didUpdateLocations:) callback:
speed 0.021408926025254 with distance 0.192791659974976
speed 0.0532131983839802 with distance 0.497739230237728
speed 11.9876451887096 with distance 15.4555990691609
speed 0.230133198005176 with distance 3.45235789063791
speed 0.0 with distance 0.0
speed 0.984378335092039 with distance 11.245049843458
speed 0.180509147029171 with distance 2.0615615724029
speed 0.429749086272364 with distance 4.91092459284206
Now I have the accuracy set to best:
_locationManager = CLLocationManager()
_locationManager.delegate = self
_locationManager.distanceFilter = kCLDistanceFilterNone
_locationManager.desiredAccuracy = kCLLocationAccuracyBest
Do you know if there is a setting I can change to prevent this back-and-forth behaviour? Even the user pin moves wildly left and right every few seconds while the phone is still.
Or is there something else I need to code to account for this wild wandering?
I check if the user has moved a certain distance within a certain time to determine if they have stopped (thanks to rmaddy for the info):
/**
 Returns true if the user is stopped. Because GPS is inaccurate, the user is allowed
 to drift within a threshold distance and still be considered stopped.
 */
private func userHasStopped() -> Bool
{
    // No stop checks yet, so return false and record the current location
    if (_lastLocationForStopAnalysis == nil)
    {
        _lastLocationForStopAnalysis = _currentLocation
        return false
    }

    // If the distance is greater than the 'not stopped' threshold, record a new location
    if (_lastLocationForStopAnalysis.distanceFromLocation(_currentLocation) > 50)
    {
        _lastLocationForStopAnalysis = _currentLocation
        return false
    }

    // The user has been 'still' for long enough that they are considered stopped
    if (_currentLocation.timestamp.timeIntervalSinceDate(_lastLocationForStopAnalysis.timestamp) > 180)
    {
        return true
    }

    // There hasn't been a timeout or a threshold pass, so they haven't stopped yet
    return false
}
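Independent of the stop check, it also helps to discard low-quality fixes before they reach the map or the logic above. A sketch of one common approach (written with the current Swift delegate signature; the 20 m cutoff is arbitrary):

func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    for location in locations {
        // Negative horizontalAccuracy means the fix is invalid; large values mean a poor fix
        guard location.horizontalAccuracy >= 0, location.horizontalAccuracy <= 20 else { continue }

        // Only reasonably accurate fixes reach the map and the stop-detection logic
        _currentLocation = location
        if userHasStopped() {
            // handle the stopped state here
        }
    }
}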

Detecting when someone begins walking using Core Motion and CMAccelerometer Data

I'm trying to detect three actions: when a user begins walking, jogging, or running. I then want to know when they stop. I've been successful in detecting when someone is walking, jogging, or running with the following code:
- (void)update:(CMAccelerometerData *)accelData {
    [(id) self setAcceleration:accelData.acceleration];
    NSTimeInterval secondsSinceLastUpdate = -([self.lastUpdateTime timeIntervalSinceNow]);

    // Check the largest threshold first, otherwise the walking branch swallows everything
    if (fabs(_acceleration.x) > 4.0) {
        NSLog(@"sprinting: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 2.0) {
        NSLog(@"jogging: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) >= 0.1) {
        NSLog(@"walking: %f", _acceleration.x);
    }
}
The problem I run into is two-fold:
1) update is called multiple times for every motion, probably because it checks so frequently that when the user begins walking (i.e. _acceleration.x >= 0.1) it is still >= 0.1 when update is called again.
Example Log:
2014-02-22 12:14:20.728 myApp[5039:60b] walking: 1.029846
2014-02-22 12:14:20.748 myApp[5039:60b] walking: 1.071777
2014-02-22 12:14:20.768 myApp[5039:60b] walking: 1.067749
2) I'm having difficulty figuring out how to detect when the user has stopped. Does anybody have advice on how to implement "stop detection"?
According to your logs, accelerometerUpdateInterval is about 0.02 seconds. Updates could be less frequent if you change that property of CMMotionManager.
Checking only the x-acceleration isn't very accurate. I can put a device on a table in such a way (say, on its left edge) that the x-acceleration equals 1, or just tilt it a bit. This will put the program in walking mode (x > 0.1) instead of idle.
Here's a link to the publication ADVANCED PEDOMETER FOR SMARTPHONE-BASED ACTIVITY TRACKING. They track changes in the direction of the acceleration vector, i.e. the cosine of the angle between two consecutive acceleration readings:
d_i = (a_i · a_(i-1)) / (|a_i| · |a_(i-1)|)
Obviously, without any motion the angle between the two vectors is close to zero and cos(0) = 1. During other activities d < 1. To filter out noise, they use a weighted moving average of the last 10 values of d.
After implementing this, your values will clearly separate the activities (in the original plot, red = walking, blue = running).
Now you can set a threshold for each activity to separate them. Note that the average step frequency is 2-4 Hz, so you should expect the current value to cross the threshold at least a few times per second in order to identify the action.
Other helpful publications:
ERSP: An Energy-efficient Real-time Smartphone Pedometer (analyzes peaks and troughs)
A Gyroscopic Data based Pedometer Algorithm (threshold detection of gyro readings)
UPDATE
_acceleration.x, _acceleration.y and _acceleration.z are the coordinates of the same acceleration vector. You use each of these coordinates in the d formula. In order to calculate d you also need to store the acceleration vector from the previous update (the one with index i-1 in the formula).
The WMA just takes the last 10 d values into account with different weights. The most recent d values have more weight and therefore more impact on the resulting value. You need to store the 9 previous d values in order to calculate the current one. You should compare the WMA value to the corresponding threshold.
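As a concrete illustration of the d and WMA steps described above, here is a sketch in Swift; the variable names and the linear weights are mine, not from the paper:

import Foundation
import CoreMotion

var previousAcceleration: CMAcceleration?
var recentDValues: [Double] = []

// d = cosine of the angle between the current and previous acceleration vectors
func directionChange(_ current: CMAcceleration) -> Double? {
    defer { previousAcceleration = current }
    guard let prev = previousAcceleration else { return nil }
    let dot = current.x * prev.x + current.y * prev.y + current.z * prev.z
    let magCurrent = sqrt(current.x * current.x + current.y * current.y + current.z * current.z)
    let magPrev = sqrt(prev.x * prev.x + prev.y * prev.y + prev.z * prev.z)
    guard magCurrent > 0, magPrev > 0 else { return nil }
    return dot / (magCurrent * magPrev)
}

// Weighted moving average of the last 10 d values, newest sample weighted highest
func weightedMovingAverage(adding newD: Double) -> Double {
    recentDValues.append(newD)
    if recentDValues.count > 10 { recentDValues.removeFirst() }

    var weightedSum = 0.0
    var weightTotal = 0.0
    for (index, value) in recentDValues.enumerated() {
        let weight = Double(index + 1)   // oldest gets weight 1, newest the largest weight
        weightedSum += value * weight
        weightTotal += weight
    }
    return weightedSum / weightTotal
}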
If you are using iOS 7 and an iPhone 5S, I suggest you look into CMMotionActivityManager, which is available on the iPhone 5S because of the M7 chip. It is also available in a couple of other devices:
M7 chip
Here is a code snippet I put together to test when I was learning about it.
#import <CoreMotion/CoreMotion.h>

@property (nonatomic, strong) CMMotionActivityManager *motionActivityManager;

- (void)inSomeMethod
{
    self.motionActivityManager = [[CMMotionActivityManager alloc] init];

    // Register for Core Motion activity updates
    [self.motionActivityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMMotionActivity *activity)
    {
        NSLog(@"Got a core motion update");
        NSLog(@"Current activity date is %f", activity.timestamp);
        NSLog(@"Current activity confidence from a scale of 0 to 2 - 2 being best - is: %ld", (long)activity.confidence);
        NSLog(@"Current activity type is unknown: %i", activity.unknown);
        NSLog(@"Current activity type is stationary: %i", activity.stationary);
        NSLog(@"Current activity type is walking: %i", activity.walking);
        NSLog(@"Current activity type is running: %i", activity.running);
        NSLog(@"Current activity type is automotive: %i", activity.automotive);
    }];
}
I tested it and it seems to be pretty accurate. The only drawback is that it will not give you a confirmation as soon as you start an action (walking for example). Some black box algorithm waits to ensure that you are really walking or running. But then you know you have a confirmed action.
This beats messing around with the accelerometer. Apple took care of that detail!
You can use this simple library to detect whether the user is walking, running, in a vehicle, or not moving. It works on all iOS devices and does not need the M7 chip.
https://github.com/SocialObjects-Software/SOMotionDetector
In the repo you can find a demo project.
I'm following this paper (PDF via RG) in my indoor navigation project to determine user dynamics (static, slow walking, fast walking) from accelerometer data alone, in order to assist location determination.
The algorithm proposed in the paper is, roughly: take the Euclidean norm of each accelerometer sample, collect the samples over a one-second window, compute the variance of those norms, and classify the window against static and slow-walking thresholds.
And here is my implementation in Swift 2.0:
import CoreMotion

let motionManager = CMMotionManager()

motionManager.accelerometerUpdateInterval = 0.1
motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (accelerometerData: CMAccelerometerData?, error: NSError?) -> Void in
    if (error != nil) {
        print(error)
    } else {
        self.estimatePedestrianStatus((accelerometerData?.acceleration)!)
    }
}
After all of the classic Swifty iOS code to initiate CoreMotion, here is the method crunching the numbers and determining the state:
func estimatePedestrianStatus(acceleration: CMAcceleration) {
    // Obtain the Euclidian norm of the accelerometer data
    accelerometerDataInEuclidianNorm = sqrt((acceleration.x.roundTo(roundingPrecision) * acceleration.x.roundTo(roundingPrecision)) + (acceleration.y.roundTo(roundingPrecision) * acceleration.y.roundTo(roundingPrecision)) + (acceleration.z.roundTo(roundingPrecision) * acceleration.z.roundTo(roundingPrecision)))

    // Significant figure setting
    accelerometerDataInEuclidianNorm = accelerometerDataInEuclidianNorm.roundTo(roundingPrecision)

    // Record 10 values, meaning the values in one second:
    // accUpdateInterval (0.1 s) * 10 = 1 s
    while accelerometerDataCount < 1 {
        accelerometerDataCount += 0.1

        accelerometerDataInASecond.append(accelerometerDataInEuclidianNorm)
        totalAcceleration += accelerometerDataInEuclidianNorm

        break // required since we want to obtain data every acc cycle
    }

    // When the acc values have been recorded, interpret them
    if accelerometerDataCount >= 1 {
        accelerometerDataCount = 0 // reset for the next round

        // Calculating the variance of the Euclidian norm of the accelerometer data
        let accelerationMean = (totalAcceleration / 10).roundTo(roundingPrecision)
        var total: Double = 0.0

        for data in accelerometerDataInASecond {
            total += ((data - accelerationMean) * (data - accelerationMean)).roundTo(roundingPrecision)
        }

        total = total.roundTo(roundingPrecision)

        let result = (total / 10).roundTo(roundingPrecision)
        print("Result: \(result)")

        if (result < staticThreshold) {
            pedestrianStatus = "Static"
        } else if ((staticThreshold < result) && (result <= slowWalkingThreshold)) {
            pedestrianStatus = "Slow Walking"
        } else if (slowWalkingThreshold < result) {
            pedestrianStatus = "Fast Walking"
        }

        print("Pedestrian Status: \(pedestrianStatus)\n---\n\n")

        // Reset for the next round
        accelerometerDataInASecond = []
        totalAcceleration = 0.0
    }
}
Also I've used the following extension to simplify significant figure setting:
extension Double {
    func roundTo(precision: Int) -> Double {
        let divisor = pow(10.0, Double(precision))
        return round(self * divisor) / divisor
    }
}
With raw values from Core Motion, the algorithm went haywire.
Hope this helps someone.
EDIT (4/3/16)
I forgot to provide my roundingPrecision value. I defined it as 3; three significant figures are decent enough for this purpose, but you can use more if you like.
Also, one more thing to mention: at the moment, this algorithm requires the iPhone to be in your hand while walking (the original answer included a photo illustrating this).
My GitHub Repo hosting Pedestrian Status
You can use Apple's machine learning framework Core ML to detect user activity. First you need to collect labeled data and train a classifier; then you can use the model in your app to classify user activity. You may follow this series if you are interested in Core ML activity classification.
https://medium.com/@tyler.hutcherson/activity-classification-with-create-ml-coreml3-and-skafos-part-1-8f130b5701f6

Distance between 2 Arduinos using RF links

I currently have a setup where I send a char over a 434 MHz RF link from an Uno with a transmitter to a Mega with a receiver. The Mega counts how many times it receives the char, and if the count falls below a certain number it triggers an alarm. Is this a viable way to measure the distance between two microcontrollers indoors, or is there a better way?
Transmitter (Uno)
#include <SoftwareSerial.h>

int rxPin = 2;  // Goes to the receiver pin
int txPin = 5;  // Make sure it is set to pin 5 going to the data input of the RF transmitter

SoftwareSerial txSerial = SoftwareSerial(rxPin, txPin);
SoftwareSerial rxSerial = SoftwareSerial(txPin, rxPin);

char sendChar = 'H';

void setup() {
  pinMode(rxPin, INPUT);
  pinMode(txPin, OUTPUT);
  txSerial.begin(2400);
  rxSerial.begin(2400);
  Serial.begin(2400);  // needed for the Serial.print in loop()
}

void loop() {
  txSerial.println(sendChar);
  Serial.print(sendChar);
}
Receiver
#include <SoftwareSerial.h>

// Make sure it is set to pin 5 going to the data output of the RF receiver module
int rxPin = 5;
int txPin = 3;  // No connection needed
int LED = 13;
int BUZZ = 9;
int t = 0;

char incomingChar = 0;
int counter = 0;

SoftwareSerial rxSerial = SoftwareSerial(rxPin, txPin);

void setup() {
  pinMode(rxPin, INPUT);   // initialize rxPin as input
  pinMode(BUZZ, OUTPUT);   // initialize buzzer for output
  pinMode(LED, OUTPUT);    // initialize LED for output
  rxSerial.begin(2400);    // set baud rate for the RF link
  Serial.begin(2400);      // see above
}

void loop() {
  for (int i = 0; i < 200; i++) {
    incomingChar = rxSerial.read();  // read incoming msg from the transmitter
    if (incomingChar == 'H') {
      counter++;                     // count each 'H' we receive
    }
    delay(5);                        // 5 ms x 200 iterations = roughly one second per window
  }

  Serial.println(incomingChar);
  Serial.println(counter);           // print how many chars we received in this window

  if (counter < 55) {
    // fewer than 55 chars received: assume out of range and trigger the alarm
    Serial.println("out of range");
    tone(BUZZ, 5000, 500);
    digitalWrite(LED, HIGH);
  } else {
    // enough chars received: we are within range, turn off the alarm
    noTone(BUZZ);
    digitalWrite(LED, LOW);
    Serial.println("in range");
  }

  counter = 0;
  incomingChar = 0;
}
In theory you could measure distance by making the Uno send a message which the Mega echoes back. That would give the Uno a round-trip time for message propagation between the Arduinos; you would have to approximate the processing delays, and after that it is basic physics. This is essentially how radar works. The total delay would be something like
t_roundtrip = t_uno_send + 2 * t_propagation + t_mega_receive + t_mega_send + t_uno_receive
I am guessing the distance you are trying to measure is on the order of meters. The required resolution is going to be a problem, because s = v·t ⇒ t = s/v, where s is the distance between your Arduinos and v = c in the case of radio waves. Since the transmission delays should stay constant, you basically have to be able to resolve time differences on the order of 1/c seconds per meter of resolution. I am not very familiar with Arduinos, so I do not know whether they are capable of this kind of measurement.
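To put rough numbers on that (assuming a stock 16 MHz AVR clock):
t = s / c = 1 m / (3 x 10^8 m/s) ≈ 3.3 ns of one-way propagation per meter
one 16 MHz clock cycle = 62.5 ns, during which an RF signal travels about 18.7 m
So even a perfect single-cycle timer would only resolve the round trip in steps of roughly 9 m of one-way distance, which is why RF time-of-flight at meter scale is not realistic on these boards.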
I would suggest you use an ultrasonic range finder like the Maxbotix HRLV-EZ4 sold by Sparkfun.
It is within your price range and it should be able to measure distances up to 5m/195 inches with 1mm resolution.
It is actually possible; I have seen it done with other microcontrollers. To do it on an Arduino you would have to work out the equations, port them to Arduino code, and take a lot of measurements to characterize the variation introduced by the communication itself. Do not forget about atmospheric attenuation, which needs to be known and included in the equations; humidity can also affect electromagnetic wave propagation.
