The iPhone accelerometer provides a user acceleration and a gravity reading. It's clear from the Apple guide what user acceleration means: the acceleration that the phone gets from the user. But then what is the meaning of the gravity reading? They also mention that the total acceleration is the sum of both! I'm confused by the gravity reading. Can anyone explain the theory behind it?
Thank you in advance!
The accelerometer measures the sum (user acceleration + gravity). Unfortunately this raw sum is of little use: in your application you typically need one or the other. So it is split into its components (user acceleration and gravity) with a sophisticated procedure called sensor fusion.
Gravity points downwards; it corresponds to gravitation and tells you which way down is. You can compute tilt angles from it, which is handy for developing tilt games.
The user acceleration is handy when you are trying to figure out how the phone is being shaken. It is good for shaking games.
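For illustration, here is a minimal sketch of turning a gravity vector into tilt angles (Python rather than Swift, and the sign conventions are my assumption for this example, not Apple's definition):

```python
import math

def tilt_angles(gx, gy, gz):
    """Pitch and roll (radians) from a normalized gravity vector such as
    CMDeviceMotion.gravity. Sign conventions here are an assumption for
    illustration, not Apple's definition."""
    pitch = math.atan2(gy, math.sqrt(gx * gx + gz * gz))
    roll = math.atan2(-gx, -gz)
    return pitch, roll

# Device flat on a table, screen up: gravity is roughly (0, 0, -1).
pitch, roll = tilt_angles(0.0, 0.0, -1.0)
```

With the device flat, both angles come out near zero; tilting it toward one axis moves the corresponding angle toward ±π/2.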
See also: accelerometer uses in smartphones.
Gravity is an acceleration which is constant and points downward. Along its vector it is indistinguishable from any other constant acceleration. Rotating the phone changes the contribution of gravity to each acceleration axis in proportion to the angle with the ground. Since gravity is constant, any change in the combined vector length of the three axes should be due to user acceleration.
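That observation can be sketched as follows (Python; assuming raw iOS readings in units of g, so the resting vector length is 1):

```python
import math

def user_accel_magnitude(ax, ay, az):
    """Deviation of the total acceleration vector length from 1 g.
    Assumes raw iOS readings in units of g: zero at rest in any
    orientation, nonzero while the user accelerates the device."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)

# At rest, flat on a table, the reading is about (0, 0, -1).
resting = user_accel_magnitude(0.0, 0.0, -1.0)
# A shake pushes the combined vector length away from 1 g.
shaking = user_accel_magnitude(0.0, 0.3, -1.0)
```

Note this only detects changes in magnitude; a user acceleration that momentarily keeps the vector length at 1 g would be invisible to this check.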
Related
I am currently developing an app which should calculate the speed from the accelerations of the device. To achieve this I am using Sensing Kit. The app records data of the user's motion and saves it to local storage. When the user stops the recording, it is possible to plot the collected accelerations as velocity over time (three graphs, one per accelerometer axis).
Sensing Kit uses startAccelerometerUpdates of CMMotionManager to get the accelerations. To calculate the velocity I do some signal processing and integrate the acceleration (multiplying by 9.81, because Apple reports acceleration in units of g). The result is the velocity over the recorded time (it's not very precise, but that doesn't matter in my case).
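As a sanity check, the integration step could look like this rough sketch (Python, trapezoidal rule; the helper name and interface are mine, not Sensing Kit's, and the 50 Hz interval is just a common update rate):

```python
G = 9.81  # m/s^2 per g; iOS reports raw accelerometer values in g

def integrate_velocity(samples_g, dt):
    """Trapezoidal integration of acceleration samples (in g, assumed
    already filtered) at a fixed interval dt (seconds) into a velocity
    trace in m/s, starting from rest."""
    v = [0.0]
    for a0, a1 in zip(samples_g, samples_g[1:]):
        v.append(v[-1] + 0.5 * (a0 + a1) * G * dt)
    return v

# A constant 1 g held for one second at 50 Hz integrates to ~9.81 m/s.
trace = integrate_velocity([1.0] * 51, 1.0 / 50.0)
```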
I tested my app by sliding the phone over a table with the screen up and the top edge of the screen pointing in the direction of movement. The movement later shows up in the resulting graph for the Y axis, but with a negative velocity (the accelerations are negative, too). I expected the velocity and acceleration to be positive, because I moved the device in the positive direction of the Y axis.
The same happens with the X axis when I move the phone on the table in the direction of that axis.
I tested it today without Sensing Kit and got the same results.
Gravity always behaves as I expect: a negative value on the Z axis, because it is an acceleration toward the ground.
Can somebody explain to me why the acceleration of the sensor has the wrong sign?
Thank you.
https://developer.apple.com/documentation/foundation/unitacceleration
https://developer.apple.com/documentation/foundation/units_and_measurement
The Foundation framework has types for this; you could use UnitAcceleration and UnitSpeed to convert between units.
I'm new to Core Motion and I'm very confused. Can somebody please explain what these inputs measure and how they can be useful in simple terms?
Accelerometers measure movement relative to gravity by "feeling" the force of movement applied to the device. The force of movement can be described as the rate of acceleration and deceleration of the device, hence the name of this sensor.
Gyroscopes measure changes in rotation by virtue of a suspended element reporting its rotation relative to the device. As the device rotates, this suspended element doesn't rotate, so there's a report coming from it that tells you how far the phone's rotated.
Magnetometers get their idea of rotational position from the north/south magnetic field that compasses use to know where they are relative to the poles. This data is used primarily to help the gyroscope, because gyroscopes suffer from drift and inertia.
Combined, the information from these sensors, when filtered well (which Apple does for you with Core Motion), gives you all the movement of a phone.
So you can know if the user is swinging the phone around like a table tennis bat, or steering like a Wii Remote Mario style game controller, or simply walking.
It's now standard practice to fuse the measurements from accelerometer and gyro through a Kalman filter, for applications like self-balancing two-wheel carts. For example: http://www.mouser.com/applications/sensor_solutions_mems/
The accelerometer gives a reading of the tilt angle through arctan(a_x/a_y). It's confusing to use the term "acceleration" here, since what it really measures is the projection of gravity along the device's axes (though I understand that, physically, gravity really is just an acceleration).
Here is the big problem: when the cart tries to move, the motor drives the cart and creates a non-trivial horizontal acceleration. This means a_x is no longer just the projection of gravity along the device's x axis; in fact it makes the measured tilt angle appear larger. How is this handled? Given the maturity of the Segway, there must be existing ways to deal with it. Does anybody have pointers?
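One common, simpler alternative to a full Kalman filter is a complementary filter: trust the gyro over short time scales (its rate measurement is unaffected by the cart's horizontal acceleration) and pull slowly toward the accelerometer's tilt estimate to cancel gyro drift. A sketch under those assumptions (Python; the alpha value is a tuning assumption, not a recommendation):

```python
def complementary_tilt(gyro_rates, accel_angles, dt, alpha=0.98):
    """Complementary filter: integrate the gyro rate (rad/s), which is
    unaffected by horizontal acceleration, and blend in a small fraction
    of the accelerometer tilt estimate (rad) to cancel gyro drift.
    alpha is a tuning assumption."""
    angle = accel_angles[0]
    out = [angle]
    for rate, acc in zip(gyro_rates, accel_angles[1:]):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
        out.append(angle)
    return out

# Stationary cart: zero rotation rate, accelerometer reads 0.1 rad tilt.
est = complementary_tilt([0.0] * 100, [0.1] * 101, 0.01)
```

Because the accelerometer only contributes a small fraction per step, a transient motor kick that corrupts the arctan estimate barely moves the fused angle.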
thanks
Yang
You are absolutely right. You can estimate pitch and roll angles using the projection of the gravity vector. You can obtain the gravity vector from a motionless accelerometer, but if the accelerometer moves, it measures gravity plus a linear acceleration component, and the main problem is separating the gravity component from the linear accelerations. A common way to do it is to pass the accelerometer signal through a low-pass filter.
Please refer to
Low-Pass Filter: The Basics or
Android Accelerometer: Low-Pass Filter Estimated Linear Acceleration
to learn more about Low-Pass filter.
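A minimal illustration of that low-pass idea on a single axis (Python; the smoothing factor alpha is a tuning assumption, not a recommended value):

```python
def split_gravity(samples, alpha=0.9):
    """Exponential low-pass on one accelerometer axis: the filtered
    signal tracks the slow gravity component, and the residual is the
    linear acceleration. alpha is a tuning assumption."""
    gravity = samples[0]
    gravity_est, linear = [gravity], [0.0]
    for a in samples[1:]:
        gravity = alpha * gravity + (1.0 - alpha) * a
        gravity_est.append(gravity)
        linear.append(a - gravity)
    return gravity_est, linear

# Constant -1 g with a brief shake in the middle.
data = [-1.0] * 50 + [-0.5] * 5 + [-1.0] * 50
g_est, lin = split_gravity(data)
```

The gravity estimate stays near -1 g throughout, while the shake shows up only in the linear-acceleration residual.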
A sensor fusion algorithm should be interesting for you as well.
I'm making an app in which the user can move their device left, right, up, and down in order to view an image. I'm doing this by reading the accelerometer data to calculate how far in each direction the user moved the device. The issue is that these readings are heavily influenced by the tilt of the device, because the axes change slightly with every tiny tilt. I know the gyroscope measures rotation rates, so is there a way I could use those readings to take tilt out of the picture and fix the axes so they are the same at whatever angle the user holds the device? I realize this is a very elaborate problem, so if you don't understand something, feel free to ask.
I did a lot of experiments using the accelerometer to detect the movement size (magnitude), a single value derived from the x, y, z accelerations. I am using an iPhone 4 with an accelerometer update interval of 1.0 / 50.0 (50 Hz), but I've also tried 100 Hz, 150 Hz, and 200 Hz.
Examples: plots of the acceleration on the X, Y, and Z axes (images omitted).
I assume (I hope correctly) that the accelerations are the small peaks on the graph, not the big steps. From my experiments I think the big steps show the device position: if the position changes, the step changes too.
If my previous assumption is correct, I need to cut the peaks from the graph and sum them. Here comes my question: how can I cut those peaks without losing the information, i.e. the peak sizes?
I know that a high-pass filter does this kind of thing (passes the high peaks and blocks the slow-changing part); I've read some papers about filters. But for me the filter cut a lot of information from my "signal" (the accelerometer data).
I think that there should be a better way for getting the information out from the data.
I've tried a simple one which looks nice, but it isn't correct.
I produced the data using my function magnitude:
for i = 2 : length(x)
    converted(i-1) = x(i-1) - x(i);   % difference of adjacent samples
end
Where x is my data and converted array is the result.
The following line generated the image below, which looks nice:
xyz = magnitude(datay) + magnitude(dataz) + magnitude(datax)
However, the problem with that solution is that with a continuous acceleration the graph just shows the first point and then drops back down. I know I need a better filter somehow, but I am a bit confused. Could you give me some advice on how to do this properly?
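For illustration of why the first point then drops back down: differencing adjacent samples is effectively a very aggressive high-pass filter. A one-pole high-pass that subtracts a slowly adapting baseline keeps a sustained acceleration visible for longer. A sketch (Python; alpha is a tuning assumption):

```python
def high_pass(samples, alpha=0.95):
    """One-pole high-pass: subtract a slowly adapting baseline instead
    of differencing adjacent samples, so a sustained acceleration decays
    gradually instead of vanishing after one sample. alpha is a tuning
    assumption."""
    baseline = samples[0]
    out = []
    for a in samples:
        baseline = alpha * baseline + (1.0 - alpha) * a
        out.append(a - baseline)
    return out

# A sustained 0.2 g step on top of the -1 g resting value.
hp = high_pass([-1.0] * 20 + [-0.8] * 20)
```

With differencing, only the first sample of the step would be nonzero; here the step's contribution decays gradually, so its size stays recoverable over several samples.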
Thanks for your time,
I really appreciate your help
Edit (answers to Zaph's questions):
What are you trying to accomplish?
I want to measure the movement when the iPhone is placed on a desk, chair, or bed. The accelerometer is so sensitive that if I put down a pencil on the desk, it shows up. I want to measure all movement that happens within a specific time window.
What are the scale units?
I'm not scaling the data.
When you say "device position", what do you mean? An accelerometer provides movement (in iPhones with gyros).
I am using only the accelerometer. When I hold the device like the picture below, I get values around -1 on the x coordinate and 0.0 on the z and y coordinates. This is what I mean by device position.
The measurements that are returned from the accelerometer are acceleration, not position.
I'm not sure what you mean by "big steps", but the peaks show a change of acceleration. The values are not 0 when holding the device still because gravity accelerates the device at 9.81 m/s^2 (the magnitude of the acceleration vector).
You are potentially trying to do something quite difficult, especially with the low-quality sensors that are embedded in phones. That is, getting the actual coordinate acceleration of the phone.
What you can do is detect the time periods when the phone was moved or touched. First calculate the magnitude (norm) of the acceleration signal and then, with a moving window, flag areas where the sample standard deviation exceeds some threshold (where it stays below the threshold, the device is stationary). Determining how the phone moved is a more complicated issue. Of course you can check the orientation in the stationary areas between movements.
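That detection step can be sketched as follows (Python; the window size and threshold are assumptions to tune against real data):

```python
import statistics

def movement_windows(magnitudes, window=10, threshold=0.02):
    """For each full window over the acceleration-magnitude signal,
    report True when the sample standard deviation exceeds the
    threshold, i.e. the device is moving. Window size and threshold
    are assumptions to tune on real data."""
    flags = []
    for i in range(len(magnitudes) - window + 1):
        flags.append(statistics.stdev(magnitudes[i:i + window]) > threshold)
    return flags

still = [1.0] * 20          # resting: magnitude pinned near 1 g
shaken = [1.0, 1.2] * 10    # alternating readings while being moved
```

A resting phone produces near-zero standard deviation regardless of its orientation, which is what makes this check robust to how the device is lying on the desk.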