Sphero concept of "Shake"? - ios

What would be the best way to tell that the user is shaking the Sphero?
I need to differentiate when the user tilts the Sphero left/right/up/down and when they shake it rapidly a few times in any direction.
Is there a sample project that would be good to look at?

If you're collecting the filtered accelerometer values and also the "IMU" values, the accelerometer values are best for detecting shaking, while the IMU values (roll, pitch, yaw) are best for detecting tilt.
If you don't care about which axis it is shaken on, combine the axes by taking the square root of the sum of their squares: sqrt(x^2 + y^2 + z^2) > 2000. This gives you the magnitude of the acceleration vector. It's a good value for "general acceleration-ness", and it's great for detecting shaking.
If you want to isolate which axis it is being shaken on, then for each axis check whether its absolute value of acceleration is above a threshold: abs(x) > 2000, since the positive or negative value of a single axis is its own vector magnitude.
Then, just use the IMU data's roll, pitch and yaw values to determine the tilt of the Sphero.
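Putting the two checks above into code, here is a minimal sketch (plain Python for illustration; the 2000 threshold assumes the same raw accelerometer scale as above, where 1 g is roughly 1000):

import math

SHAKE_THRESHOLD = 2000  # roughly 2 g in the raw accelerometer units used above

def is_shaking(x, y, z):
    # Magnitude of the acceleration vector: "general acceleration-ness".
    return math.sqrt(x * x + y * y + z * z) > SHAKE_THRESHOLD

def shaken_axes(x, y, z):
    # Per-axis test: the absolute value of a single axis is its own magnitude.
    return {"x": abs(x) > SHAKE_THRESHOLD,
            "y": abs(y) > SHAKE_THRESHOLD,
            "z": abs(z) > SHAKE_THRESHOLD}

# In practice you may want to require several threshold crossings within a short
# time window before calling it a "shake", and use the IMU roll/pitch/yaw for tilt.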

Related

How to get the actual acceleration via CMMotionManager in iOS

I am getting x, y and z device acceleration using startDeviceMotionUpdates() and reading the userAcceleration data structure with a Timer. The Apple documentation states:
The total acceleration of the device is equal to gravity plus the acceleration the user imparts to the device.
The values that I am getting, even if I jerk the phone around, are at most 5.7nnnn on the X axis, for example. Now, if the gravity acceleration is 9.81 m/sec squared, what does the value 5.7nnnn represent in m/sec squared? That is, how do I get the actual m/sec squared value from the raw axis values that userAcceleration gives? And how does one interpret the difference between acceleration and deceleration?
CoreMotion (CM) outputs acceleration in g's, so you need to multiply the values by ~9.81 m/s^2.
Also, CM acceleration readings are reversed compared to a more conventional accelerometer; i.e., when the device is stationary on a table, CM measures approximately -1.0 on the z axis, while a conventional accelerometer would measure approximately 9.81 m/s^2 (note that the former value is negative, while the latter is positive). So, if you multiply the CM readings by -9.81, you get intuitive results: a positive value along an axis means acceleration and a negative value means deceleration.
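As a quick sanity check of that conversion (plain Python; the example reading is the value from the question):

G = 9.81                       # m/s^2 per g
reading_in_g = 5.7             # example userAcceleration value from the question
accel_ms2 = reading_in_g * G   # ~55.9 m/s^2 of user-imparted acceleration
# Following the sign convention described above, multiply by -G instead if you
# want positive values to mean acceleration and negative values deceleration.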

Transforming Pitch and Roll values in iOS

I have been trying to compare pitch and roll pattern readings (2-second recordings).
What I have done is record Pitch and Roll values for 10 seconds. In those 10 seconds some similar motion is applied to the device.
When I plot these on a graph, the values are also the same.
This only happens when the motion is applied to the device from the same orientation/position. For example, the device is lying on the table and a motion is applied to it. All the readings are the same.
But if the device is rotated 180 deg. CCW, the readings are inverted.
Is there a way I can achieve the same readings for every position, by applying some transformation formula? I have done it for the acceleration values using pitch, roll and yaw, but I don't know how to achieve this for pitch and roll themselves.
Basically, what I want to achieve is that the Pitch and Roll values should be independent of yaw.
Here is the plot of the pitch values. All the readings are the same except the two starting from 1.5 in the graph; those were the two times when the device was rotated 180 deg. CCW.
UPDATE:
I tried to store the CMAttitude in NSUserDefaults and then applied multiplyByInverseOfAttitude. But the graph plot is still inverted.
CMAttitude *currentAtt = motion.attitude;
if (firstAttitude) // firstAttitude is the stored reference CMAttitude
{
    // Express the current attitude relative to the stored reference attitude.
    [currentAtt multiplyByInverseOfAttitude:firstAttitude];
}
CMQuaternion quat = currentAtt.quaternion;

Comparison: TYPE_ROTATION_VECTOR with Complementary Filter

I have been working on orientation estimation and I need to estimate the correct heading when I am walking in a straight line. After facing some roadblocks, I started from the basics again.
I have implemented a Complementary Filter from here, which uses the Gravity vector obtained from Android (not raw acceleration), raw gyro data and raw magnetometer data. I am also applying a low-pass filter to the gyro and magnetometer data and using the result as input.
The output of the Complementary Filter is Euler angles, and I am also recording TYPE_ROTATION_VECTOR, which outputs the device orientation as a 4D quaternion.
So I thought to convert the quaternions to Euler angles and compare them with the Euler angles obtained from the Complementary Filter. The Euler angle output with the phone kept stationary on a table is shown below.
As can be seen, the Yaw values are off by a huge margin.
What am I doing wrong in this simple case, when the phone is stationary?
Then I walked around my living room and got the following output.
The shape of the Complementary Filter output looks very good and is very close to that of Android, but the values are off by a huge margin.
Please tell me what I am doing wrong.
I don't see any need to apply a low-pass filter to the Gyro. Since you're integrating the gyro to get rotation, it could mess everything up.
Be aware that TYPE_GRAVITY is a composite sensor reading synthesized from gyro and accel inside Android's own sensor fusion algorithm. Which is to say that this has already been passed through a Kalman filter. If you're going to use Android's built-in sensor fusion anyway, why not just use TYPE_ROTATION_VECTOR?
Your angles are in radians by the looks of it, and the error in the first set wasn't too far from 90 degrees. Perhaps you've swapped X and Y in your magnetometer inputs?
Here's the approach I would take: first write a test that takes accel and mag and synthesizes Euler angles from them. Ignore gyro for now. Walk around the house and confirm that it does the right thing, but is jittery.
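For reference, here is a sketch of what computeFromAccelMag() could look like (Python for brevity): a tilt-compensated compass built from the gravity and magnetometer vectors. Axis conventions and signs vary between devices, so treat this as the shape of the computation rather than drop-in code.

import math

def compute_from_accel_mag(ax, ay, az, mx, my, mz):
    # Roll and pitch from the gravity vector reported by the accelerometer.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Tilt-compensate the magnetometer, then take yaw from its horizontal part.
    bfx = (mx * math.cos(pitch)
           + my * math.sin(roll) * math.sin(pitch)
           + mz * math.cos(roll) * math.sin(pitch))
    bfy = my * math.cos(roll) - mz * math.sin(roll)
    yaw = math.atan2(-bfy, bfx)
    return roll, pitch, yaw  # radians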
Next, slap an aggressive low-pass filter on your algorithm, e.g.
yaw0 = yaw;
yaw = computeFromAccelMag(); // yaw in radians
factor = 0.2; // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1-factor);
Confirm that this still works. It should be much less jittery but also sluggish.
Finally, add gyro and make a complementary filter out of it.
dt = time_since_last_gyro_update;
yaw += gyroData[2] * dt; // test: might need to subtract instead of add
yaw0 = yaw;
yaw = computeFromAccelMag(); // yaw in radians
factor = 0.2; // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1-factor);
The key thing is to test every step of the way as you develop your algorithm, so that when a mistake happens, you'll know what caused it.

Relative Camera Pose Estimation using OpenCV

I'm trying to estimate the relative camera pose using OpenCV. The cameras in my case are calibrated (I know the intrinsic parameters of the camera).
Given the images captured at two positions, I need to find the relative rotation and translation between the two cameras. A typical translation is about 5 to 15 meters, and the yaw rotation between the cameras ranges between 0 and 20 degrees.
To achieve this, the following steps are adopted (see the sketch after the list):
a. Finding point correspondences using SIFT/SURF
b. Fundamental Matrix identification
c. Estimation of the Essential Matrix by E = K'FK and modifying E for the singularity constraint
d. Decomposition of the Essential Matrix to get the rotation, R = UWVt or R = UW'Vt (U and Vt are obtained from the SVD of E)
e. Obtaining the real rotation angles from rotation matrix
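For concreteness, the same pipeline can be sketched with OpenCV's Python bindings; findEssentialMat and recoverPose wrap steps b-d (recoverPose also keeps only the decomposition that places triangulated points in front of both cameras). The function and variable names here are illustrative:

import cv2
import numpy as np

def relative_pose(pts1, pts2, K):
    # pts1, pts2: matched keypoint coordinates (Nx2 float arrays) from step a.
    # Steps b-c: essential matrix estimated with RANSAC using the known intrinsics K.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    # Step d: decompose E and keep the (R, t) that passes the cheirality check.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    # Step e: Euler angles from R (one common ZYX convention); t is only known up to scale.
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return R, t, (yaw, pitch, roll)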
Experiment 1: Real Data
For the real data experiment, I captured images by mounting a camera on a tripod. Images were captured at Position 1; then I moved to another aligned position, changed the yaw angle in steps of 5 degrees, and captured images for Position 2.
Problems/Issues:
The signs of the estimated yaw angles do not match the ground-truth yaw angles. Sometimes 5 deg is estimated as 5 deg, but 10 deg as -10 deg, and then 15 deg as 15 deg again.
In the experiment only the yaw angle is changed, yet the estimated Roll and Pitch angles have nonzero values close to 180/-180 degrees.
Precision is very poor: in some cases the error between the estimated and ground-truth angles is around 2-5 degrees.
How to find out the scale factor to get the translation in real world measurement units?
The behavior is the same on simulated data as well.
Has anybody experienced similar problems? Any clue on how to resolve them?
Any help from anybody would be highly appreciated.
(I know there are already many posts on similar problems; going through all of them has not saved me. Hence posting one more time.)
In chapter 9.6 of Hartley and Zisserman, they point out that, for a particular essential matrix, if one camera is held in the canonical position/orientation, there are four possible solutions for the second camera matrix: [UWV' | u3], [UWV' | -u3], [UW'V' | u3], and [UW'V' | -u3].
The difference between the first and third (and second and fourth) solutions is that the orientation is rotated by 180 degrees about the line joining the two cameras, called a "twisted pair", which sounds like what you are describing.
The book says that in order to choose the correct combination of translation and orientation from the four options, you need to test a point in the scene and make sure that the point is in front of both cameras.
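That test can be done directly: triangulate one (or a few) of the matched points with each of the four candidate poses and keep the pose for which the point has positive depth in both cameras. A rough sketch with OpenCV in Python (the candidate list and point names are placeholders):

import cv2
import numpy as np

def pick_valid_pose(candidates, K, pt1, pt2):
    # candidates: the four (R, t) pairs [UWV'|u3], [UWV'|-u3], [UW'V'|u3], [UW'V'|-u3]
    # pt1, pt2: one matched point in each image, as length-2 pixel coordinates
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])      # first camera at the origin
    for R, t in candidates:
        P2 = K @ np.hstack([R, t.reshape(3, 1)])
        Xh = cv2.triangulatePoints(P1, P2,
                                   np.asarray(pt1, float).reshape(2, 1),
                                   np.asarray(pt2, float).reshape(2, 1))
        X = (Xh[:3] / Xh[3]).ravel()                        # dehomogenise
        if X[2] > 0 and (R @ X + t.ravel())[2] > 0:         # in front of both cameras?
            return R, t
    return None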
For problems 1 and 2,
Look for "Euler angles" in wikipedia or any good math site like Wolfram Mathworld. You would find out the different possibilities of Euler angles. I am sure you can figure out why you are getting sign changes in your results based on literature reading.
For problem 3,
It should mostly have to do with the accuracy of your individual camera calibration.
For problem 4,
Not sure. How about measuring the distance to a point from the camera with a tape measure and comparing it with the translation norm to get the scale factor?
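One way to read that suggestion in code (the measured value is a placeholder for a distance you measure by hand, and t stands for the unit-scale translation from the decomposition):

import numpy as np

t = np.array([0.6, 0.0, 0.8])                    # example unit-norm translation from recoverPose
measured_baseline_m = 7.5                        # e.g. tape-measured camera displacement
scale = measured_baseline_m / np.linalg.norm(t)  # metres per unit of the estimated translation
t_metric = scale * t                             # translation in real-world metres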
Possible reasons for bad accuracy:
1) There is a difference between getting reasonable accuracy and precise accuracy in camera calibration. See this thread.
2) The accuracy with which you are moving the tripod. How are you ensuring that there is no rotation of the tripod around an axis perpendicular to the surface during the change in position?
I did not get your simulation concept, but I would suggest the test below.
Take images without moving the camera or the object. If you now calculate the relative camera pose, the rotation should be an identity matrix and the translation should be a null vector. Due to numerical inaccuracies and noise, you might see a rotation deviation of a few arc minutes.

Blackberry Device Movement Angle Difference

I am working on a BlackBerry application in which I need to retrieve the angle difference when the device moves, meaning the difference in angle between when the movement starts and when it ends. When that difference reaches 25 degrees, some function should be called.
In simple words, call a function when the device moves by 25 degrees.
Please read the AccelerometerSensor docs; it is available in API 4.7.0 and higher. All the data you can retrieve is described in the AccelerometerData class: orientation and acceleration (gravity data).
How to get an angle from gravity sensor data is described in more detail in the Java ME docs, in the "Mobile Sensor API" section:
If the phone was placed flat, the accelerometer would tell us that the acceleration along the z-axis (up and down) is about 1000 (this value represents 1G). The accelerations along the X and Y axises (sideways) would be about 0 since the phone is sitting still and gravity only works downwards. Flipping the phone over with the screen facing down, the accelerometer would give us the value of -1000 on the Z-axis. Standing on its side, would give us a value of 1000 or -1000 along either the X- or the Y-axis, depending on which side you put it. Putting the phone in a 45 degree angle along the X-axis would give us a value of ±707 on the Z-axis and ±707 on the Y-axis, since gravity cannot affect either axis with its full force (You can easily calculate what the value should be for a certain angle for each axis using the sine and cosine functions). Using the values from the X and Y-axis from the accelerometer, we can determine the position of the phone at any time, and then use that value to move our ship to avoid the incoming asteroids.
So, having accelerometer data for all 3 axes, we can figure out what the tilt angle of the device is.
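As a sketch of that calculation (Python for brevity; the BlackBerry AccelerometerData values are scaled so that 1 g is about 1000, but the ratios below cancel the scale out):

import math

def tilt_angles_deg(ax, ay, az):
    # Tilt of the device derived from the gravity vector alone.
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))  # tilt about the Y axis
    roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))   # tilt about the X axis
    return pitch, roll

# A flat device reads roughly (0, 0, 1000) -> (0, 0); tilted 45 degrees along one axis
# it reads roughly (0, 707, 707) -> a roll of about 45 degrees, matching the +-707 above.
# For the 25-degree trigger, record the angles when the movement starts and compare
# each new reading against them.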
