CMDeviceMotion does not provide attitude info - ios

For some reason my iPad 2 is not providing motion attitude information. As far as I can tell I am doing precisely what people say to do, but still... no data.
float angle = 0;
CMDeviceMotion *deviceMotion;
CMAttitude *attitude;
deviceMotion = motionManager.deviceMotion;
if (deviceMotion) {
attitude = deviceMotion.attitude;
[attitude multiplyByInverseOfAttitude:referenceAttitude];
angle = [attitude roll];
} else {
NSLog (#"Cannot get angles.");
}
Earlier in my code I do this:
motionManager = [[CMMotionManager alloc] init];
if (motionManager.gyroAvailable) {
[motionManager startGyroUpdates];
}
However I never get the angle. Help?

You only get attitude if you use device motion updates, i.e. for initialisation you have to call:
if (![motionManager isDeviceMotionActive]) {
[motionManager startDeviceMotionUpdates];
}
And call stopDeviceMotionUpdates when going to the background.
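A rough sketch of the whole flow, reusing the motionManager and referenceAttitude variables from the question:
// Start device motion updates once, e.g. when the view appears:
motionManager = [[CMMotionManager alloc] init];
if (motionManager.isDeviceMotionAvailable && !motionManager.isDeviceMotionActive) {
    [motionManager startDeviceMotionUpdates];
}

// Later, when you need the angle (deviceMotion stays nil until the first sample arrives):
CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
if (deviceMotion) {
    if (referenceAttitude == nil) {
        referenceAttitude = deviceMotion.attitude;   // capture the baseline once
    } else {
        CMAttitude *attitude = deviceMotion.attitude;
        [attitude multiplyByInverseOfAttitude:referenceAttitude];
        NSLog(@"roll relative to baseline: %f", attitude.roll);
    }
}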

Related

Geofencing for CLBeacons

I am trying to establish a geofence for CLBeacons, which works like this:
a) Any beacon whose accuracy is <= 2.5 metres should get detected.
Now, when I place the beacons about 7 m apart, both get detected. What is more shocking is that the accuracy sometimes jumps to something like 15.70 m for a beacon (checked by running the AirLocate app), which happens randomly and thereby makes the geofencing impossible to construct.
I tried to apply a custom formula to calculate the beacon distance, double accuracy = (0.89976) * pow(ratio,7.7095) + 0.111; where double ratio = rssi*1.0/txPower; but since the txPower for CLBeacons is not provided, the function depends on me providing a static value as txPower.
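For reference, that formula as code looks roughly like this (kAssumedTxPower is just a hard-coded guess, since CLBeacon does not expose the transmit power):
// The formula above as a helper; kAssumedTxPower is a placeholder
// (roughly the RSSI measured at 1 m), not a value CLBeacon provides.
static const double kAssumedTxPower = -59.0;

static double estimatedDistanceForRSSI(NSInteger rssi)
{
    if (rssi >= 0) {
        return -1.0; // an RSSI of 0 means the signal strength could not be read
    }
    double ratio = rssi * 1.0 / kAssumedTxPower;
    return (0.89976) * pow(ratio, 7.7095) + 0.111;
}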
Can anyone give guidance on how the geofencing for these CLBeacons should be constructed, then?
You are correct in that the accuracy value of the beacon can fluctuate drastically over short periods. The way I handle this (we have a similar need to determine when devices have been returned to a base location) is a combination of two approaches: First, we are tweaking the power on our iBeacons to lower it so that the didDetermineState: delegate call does not get called too many times for entering and leaving the beacon's range. Second, my iBeacon model keeps track of the accuracy for any beacons in range and averages them out. That way someone walking between the device and the beacon, or the user turning the device a particular way, won't cause huge fluctuations in the accuracy value that mess up your logic.
I don't believe Apple intended developers to use iBeacons for indoor geolocation. The geofencing aspect of it is simply to adjust the transmit power so that you get notified when your device can or cannot detect the signal. The accuracy value can be used, but it is so inaccurate that it should be used with caution.
There is a developer who claims to have developed an algorithm for using iBeacons for indoor positioning, but I have no experience with it. Also, if it were possible with any useful level of accuracy, I feel that Apple would be using it for its indoor location capabilities, which they are not.
Here's some of the code I use:
Here's my custom MyBeacon class:
@interface MyBeacon()
@property NSMutableDictionary *accuracyHistory;
@end
@implementation MyBeacon
- (id) init
{
self = [super init];
if (self!=nil) {
self.accuracyHistory = [[NSMutableDictionary alloc] init];
}
return self;
}
- (void) addAccuracyValue:(CGFloat)rangeValue forDate:(NSDate *)rangeDateTime
{
[self removeOldRangeHistoryItems];
[self.accuracyHistory setObject:[NSNumber numberWithFloat:rangeValue] forKey:rangeDateTime];
}
- (double) getBeaconAverageAccuracy
{
[self removeOldRangeHistoryItems];
if( self.accuracyHistory.count == 0 )
{
return -1;
}
CGFloat sumRangeVals = 0.0;
int numRangeVals = 0;
for(NSDate *accuracyDateTime in self.accuracyHistory) {
NSNumber *curValue = [self.accuracyHistory objectForKey:accuracyDateTime];
if( [curValue floatValue] >= 0.0 )
{
sumRangeVals += [curValue floatValue];
numRangeVals++;
}
else // let's toy with giving unknown readings a value of 30.
{
sumRangeVals += 30;
numRangeVals++;
}
}
CGFloat averageRangeVal = sumRangeVals / numRangeVals;
return averageRangeVal;
}
- (void) removeOldRangeHistoryItems
{
NSMutableArray *keysToDelete = [[NSMutableArray alloc] init];
for(NSDate *accuracyDateTime in self.accuracyHistory) {
// remove anything older than 10 seconds.
if( [accuracyDateTime timeIntervalSinceNow] < -10.0 )
{
[keysToDelete addObject:accuracyDateTime];
}
}
for( NSDate *key in keysToDelete )
{
[self.accuracyHistory removeObjectForKey:key];
}
}
@end
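A usage sketch of how the ranging callback can feed it (beaconsByIdentifier here is an assumed NSMutableDictionary property that keeps one MyBeacon per physical beacon, and the 2.5 m check is the threshold from the question):
// Feed each accuracy reading into the matching MyBeacon and act on the average.
- (void)locationManager:(CLLocationManager *)manager
        didRangeBeacons:(NSArray *)beacons
               inRegion:(CLBeaconRegion *)region
{
    for (CLBeacon *beacon in beacons) {
        NSString *key = [NSString stringWithFormat:@"%@/%@/%@",
                         beacon.proximityUUID.UUIDString, beacon.major, beacon.minor];
        MyBeacon *myBeacon = self.beaconsByIdentifier[key];
        if (myBeacon == nil) {
            myBeacon = [[MyBeacon alloc] init];
            self.beaconsByIdentifier[key] = myBeacon;
        }
        [myBeacon addAccuracyValue:beacon.accuracy forDate:[NSDate date]];

        double averageAccuracy = [myBeacon getBeaconAverageAccuracy];
        if (averageAccuracy >= 0 && averageAccuracy <= 2.5) {
            // averaged accuracy is within the 2.5 m fence
        }
    }
}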

CMMotionManager: Device Calibration does not work on a real device

I'm seeing strange behavior with CMMotionManager. I try to calibrate the starting attitude of my device to enable my app to support multiple device orientations.
When I debug my App on a real device (not in Simulator), everything is working fine.
When I run the same App without debugging, the calibration does not work.
Here's my code:
static CMMotionManager* _motionManager;
static CMAttitude* _referenceAttitude;
// Returns a vector with the current orientation values
// At the first call a reference orientation is saved to ensure the motion detection works for multiple device positions
+(GLKVector3)getMotionVectorWithLowPass{
// Motion
CMAttitude *attitude = self.getMotionManager.deviceMotion.attitude;
if (_referenceAttitude==nil) {
// Cache Start Orientation
_referenceAttitude = [_motionManager.deviceMotion.attitude copy];
} else {
// Use start orientation to calibrate
[attitude multiplyByInverseOfAttitude:_referenceAttitude];
NSLog(#"roll: %f", attitude.roll);
}
return [self lowPassWithVector: GLKVector3Make(attitude.pitch,attitude.roll,attitude.yaw)];
}
+(CMMotionManager*)getMotionManager {
if (_motionManager==nil) {
_motionManager=[[CMMotionManager alloc]init];
_motionManager.deviceMotionUpdateInterval=0.25;
[_motionManager startDeviceMotionUpdates];
}
return _motionManager;
}
I've found a solution. The issue was caused by the different timing behavior between debug and non-debug mode. CMMotionManager needs a little time to initialize before it returns correct values. The solution was to postpone the calibration by 0.25 seconds.
This code works:
+(GLKVector3)getMotionVectorWithLowPass{
// Motion
CMAttitude *attitude = self.getMotionManager.deviceMotion.attitude;
if (_referenceAttitude==nil) {
// Cache Start Orientation
// NEW:
[self performSelector:@selector(calibrate) withObject:nil afterDelay:0.25];
} else {
// Use start orientation to calibrate
[attitude multiplyByInverseOfAttitude:_referenceAttitude];
NSLog(#"roll: %f", attitude.roll);
}
return [self lowPassWithVector: GLKVector3Make(attitude.pitch,attitude.roll,attitude.yaw)];
}
// NEW:
+(void)calibrate {
_referenceAttitude = [self.getMotionManager.deviceMotion.attitude copy];
}

CMMotionManager vs UIAccelerometer efficiency

I've been working on an AR framework for a while now and am trying to update from UIAccelerometer (deprecated) to CMMotionManager, but am running into some efficiency problems.
Basically it seems like CMMotionManager is MUCH larger and slower than UIAccelerometer is. Has anyone experienced performance issues with CMMotionManager before?
As you can see here, I had this:
accelerometer = [UIAccelerometer sharedAccelerometer];
accelerometer.updateInterval = 0.01;
[accelerometer setDelegate:self];
and
-(void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
rollingZ = (acceleration.z * kFilteringFactor) + (rollingZ * (1.0 - kFilteringFactor));
rollingX = (acceleration.y * kFilteringFactor) + (rollingX * (1.0 - kFilteringFactor));
if (rollingZ > 0.0) currentInclination = inc_avg(atan(rollingX / rollingZ) + M_PI / 2.0);
else if (rollingZ < 0.0) currentInclination = inc_avg(atan(rollingX / rollingZ) - M_PI / 2.0);
else if (rollingX < 0) currentInclination = inc_avg(M_PI/2.0);
else if (rollingX >= 0) currentInclination = inc_avg(3 * M_PI/2.0);
}
and all works great even on "older" devices like the iPhone 4 (not really old but yea...).
But when trying the exact same code but with CMMotionManager:
motionManager = [[CMMotionManager alloc] init];
with
[motionManager setAccelerometerUpdateInterval:0.01];
[motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler: ^(CMAccelerometerData *accelerometerData, NSError *error){
rollingZ = (accelerometerData.acceleration.z * kFilteringFactor) + (rollingZ * (1.0 - kFilteringFactor));
rollingX = (accelerometerData.acceleration.y * kFilteringFactor) + (rollingX * (1.0 - kFilteringFactor));
if (rollingZ > 0.0) currentInclination = inc_avg(atan(rollingX / rollingZ) + M_PI / 2.0);
else if (rollingZ < 0.0) currentInclination = inc_avg(atan(rollingX / rollingZ) - M_PI / 2.0);
else if (rollingX < 0) currentInclination = inc_avg(M_PI/2.0);
else if (rollingX >= 0) currentInclination = inc_avg(3 * M_PI/2.0);
}];
The math seems to slow the crap out of it! I say this because when I remove all the math it works great.
An iPhone 5 will work alright but an iPhone 4S will show signs of lag and the iPhone 4 will just freeze...
(I can give you more details if you want but its relatively complicated to explain)
I was just having this same problem, and wouldn't you know it, the solution was in the documentation ;)
The problem is with the block format. All of the tutorials seem to favor that method, but Apple recommends periodic polling of the CMMotionManager as a more performance-oriented approach. The block format adds overhead.
From the CMMotionManager Class Reference:
To handle motion data by periodic sampling, the app calls a “start” method taking no arguments and periodically accesses the motion data held by a property for a given type of motion data. This approach is the recommended approach for apps such as games. Handling accelerometer data in a block introduces additional overhead, and most game apps are interested only in the latest sample of motion data when they render a frame.
So what you want to do, from the docs again:
Call startAccelerometerUpdates to begin updates and periodically access CMAccelerometerData objects by reading the accelerometerData property.
Something along these lines:
CMMotionManager *mManager = [(AppDelegate *)[[UIApplication sharedApplication] delegate] sharedManager];
[mManager startAccelerometerUpdates];
Then, in some sort of periodically updating method of your choosing:
CMMotionManager *mManager = [(SEPAppDelegate *)[[UIApplication sharedApplication] delegate] sharedManager];
CMAccelerometerData *aData = mManager.accelerometerData;
This solution appears to work as well as UIAccelerometer on an iPhone 4 from the limited testing I've done.
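To make that concrete, the periodically called method can reuse the filtering math from the question more or less unchanged (readAccelerometer is just an assumed name for whatever your timer or display link invokes, and sharedManager is the same app-delegate accessor used above):
// Poll the latest sample instead of using the block handler.
// rollingZ, rollingX, kFilteringFactor and inc_avg come from the question's code.
- (void)readAccelerometer
{
    CMMotionManager *mManager = [(AppDelegate *)[[UIApplication sharedApplication] delegate] sharedManager];
    CMAccelerometerData *aData = mManager.accelerometerData;
    if (aData == nil) return; // no sample delivered yet

    rollingZ = (aData.acceleration.z * kFilteringFactor) + (rollingZ * (1.0 - kFilteringFactor));
    rollingX = (aData.acceleration.y * kFilteringFactor) + (rollingX * (1.0 - kFilteringFactor));

    if (rollingZ > 0.0)      currentInclination = inc_avg(atan(rollingX / rollingZ) + M_PI / 2.0);
    else if (rollingZ < 0.0) currentInclination = inc_avg(atan(rollingX / rollingZ) - M_PI / 2.0);
    else if (rollingX < 0)   currentInclination = inc_avg(M_PI / 2.0);
    else                     currentInclination = inc_avg(3 * M_PI / 2.0);
}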
I use CADisplayLink.
First, setup CMMotionManager instance.
-(void)viewDidLoad
{
[super viewDidLoad];
self.motionManager = [[CMMotionManager alloc]init];
if(self.motionManager.isDeviceMotionAvailable)
{
[self.motionManager startDeviceMotionUpdates];
}
[self setupDisplayLink];
}
Secondly setup displaylink instance like this:
-(void)setupDisplayLink
{
CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(update:)];
displayLink.frameInterval = 10; // notify the target only once every 10 frames
[displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];
}
A display link is an object tied to your device's screen refresh, which can run a specified selector at a specified frame rate; more on them here.
Thirdly, you should implement the update: method that you specified as the display link selector.
-(void)update:(CADisplayLink *)displayLink
{
// handle new accelerometer data here
CMDeviceMotion *deviceMotion = self.motionManager.deviceMotion;
NSLog(#"Acceleration: %#", deviceMotion.accelerometerData.acceleration);
}

How to detect when the phone has been put down

I would like an action to take place when the phone is stationary for 2 seconds. I've searched for ages around Google and Stack Overflow. I discovered that accelerometer:didAccelerate: has been deprecated and that Core Motion is the replacement. Everything I have seen has to do with the 'shaking' motion. I've tried reading through Apple's documentation but it just confuses me!
Basically, I want the app to detect that the g-forces on the phone have remained within a small range for a certain amount of time (suggesting that the phone has been laid down on a table or something) and for it to then make the app do something.
Any help would be greatly appreciated.
It's similar to the problem described in Simple iPhone motion detect. The basic setup for CMMotionManager is described in the Apple docs, as Mike Pollard stated in his comment. I especially recommend the Handling Processed Device Motion Data section.
What you then need is CMDeviceMotion.userAcceleration which contains the pure acceleration without gravity.
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
// UPDATE: set interval to 0.02 sec
motionManager.deviceMotionUpdateInterval = 1.0 / 50.0;
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
withHandler:^(CMDeviceMotion *deviceMotion, NSError *error) {
CMAcceleration userAcceleration = deviceMotion.userAcceleration;
double totalAcceleration = sqrt(userAcceleration.x * userAcceleration.x +
userAcceleration.y * userAcceleration.y + userAcceleration.z * userAcceleration.z);
// UPDATE: print debug information
NSLog (#"total=%f x=%f y=%f z=%f", totalAcceleration, userAcceleration.x, userAcceleration.y, userAcceleration.z);
// if(totalAcceleration < SOME_LIMIT) ...
Then proceed as codeplasma has described in his answer.
Also be aware that the solution might not be precise if used on the underground, on a bus, etc., because of external accelerations.
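Putting the two pieces together, the handler above could be extended roughly like this (SOME_LIMIT and the 2-second threshold are placeholder values to tune):
// Sketch: accumulate how long the user acceleration stays below SOME_LIMIT.
__block NSTimeInterval stationaryTime = 0.0;
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                   withHandler:^(CMDeviceMotion *deviceMotion, NSError *error) {
    CMAcceleration a = deviceMotion.userAcceleration;
    double totalAcceleration = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    if (totalAcceleration < SOME_LIMIT) {
        stationaryTime += 0.02; // matches the 1/50 s update interval set above
    } else {
        stationaryTime = 0.0;   // the device moved, start over
    }
    if (stationaryTime >= 2.0) {
        stationaryTime = 0.0;
        // the device has been (nearly) still for 2 seconds - do something
    }
}];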
You can do something like this:
CMMotionManager *mManager = [[CMMotionManager alloc] init];
if ([mManager isAccelerometerAvailable] == YES) {
__block float lastActivityBefore = 0.0;
[mManager setAccelerometerUpdateInterval:0.1];
[mManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
double totalAcceleration = sqrt(accelerometerData.acceleration.x * accelerometerData.acceleration.x + accelerometerData.acceleration.y * accelerometerData.acceleration.y + accelerometerData.acceleration.z * accelerometerData.acceleration.z);
if(totalAcceleration < SOME_LIMIT)
lastActivityBefore = lastActivityBefore + 0.1;
else
lastActivityBefore = 0.0;
if(lastActivityBefore >= 2.0)
{
//do something
}
}];
}
The accelerometer will report roughly 1 g (gravity) plus a little noise even when your device is steady, so you should do some testing to determine a suitable SOME_LIMIT value.
Also be advised that you should have only one instance of the CMMotionManager class in your app, so it's best to put it in your AppDelegate and initialize it only once.

CMMotionManager with multitasking

I'm using CMMotionManager in my app so I can get the device motion info. I have these two methods:
- (void)startDeviceMotion {
motionManager = [[CMMotionManager alloc] init];
motionManager.showsDeviceMovementDisplay = YES;
motionManager.deviceMotionUpdateInterval = 1.0 / 120.0;
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical];
}
The second is:
- (void)stopDeviceMotion {
[motionManager stopDeviceMotionUpdates];
[motionManager release];
motionManager = nil;
}
They're launched when the app starts and when the app finishes, respectively. My problem now is multitasking. If I send my app to the background and then bring it to the foreground again, I get a message (with NSZombies enabled) telling me that a [CMMotionManager retain] message was sent to a deallocated instance.
Where could my problem be?
Thanks!
Try using Jonathan's suggestion here. Basically, to make sure only one instance of your CMMotionManager is created, put your motionManager in the AppDelegate and retrieve it with this method wherever you want to use it.
-(CMMotionManager *)motionManager
{
CMMotionManager *motionManager = nil;
id appDelegate = [UIApplication sharedApplication].delegate;
if ([appDelegate respondsToSelector:@selector(motionManager)]) {
motionManager = [appDelegate motionManager];
}
return motionManager;
}
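For completeness, the AppDelegate side can look roughly like this (a sketch, assuming a single lazily created motionManager property; it needs #import <CoreMotion/CoreMotion.h>):
// In AppDelegate.h (sketch):
@property (nonatomic, strong) CMMotionManager *motionManager;

// In AppDelegate.m (sketch): create the single shared instance lazily.
- (CMMotionManager *)motionManager
{
    if (_motionManager == nil) {
        _motionManager = [[CMMotionManager alloc] init];
    }
    return _motionManager;
}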
Let me know if this works for you.
