I want to measure punch power with an iPhone. I think I need to measure the distance the iPhone moves from the starting point to the end point, and the time taken for this movement.
I can calculate speed = distance / time
I need the mass of the user's body; I can obtain the mass of the user's arm.
From the speed and the mass of the user's arm, I can calculate the punch power (in newtons).
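As a unit check on the reasoning above (note that the contact time of the punch, Δt, is an extra quantity not measured by this setup):

```latex
v = \frac{d}{t}\ \text{(m/s)}, \qquad
p = m_{\text{arm}}\, v\ \text{(kg·m/s, momentum)}, \qquad
F = \frac{\Delta p}{\Delta t}\ \text{(N, force)}, \qquad
P = F\, v\ \text{(W, power)}
```

Mass times speed gives momentum; only dividing the change in momentum by the contact time gives a force in newtons, and force times speed gives power in watts.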
Is my thinking right? Or is there a method that I don't know about?
Should I use the accelerometer or GPS for the measurement?
Thanks
I am trying to calculate the real-world length of an arbitrary line drawn across the field of view in a one-point-perspective, single-camera setup.
I will have a known distance running parallel to it. How can I find the compensation factor I need to apply to the pixel length of the measuring line?
Do I have to take into account the distance from the vanishing point, as the length per pixel increases the nearer you get to the vanishing point? Do I need to use the gradient of the known line to give me a rate of change?
A good study on this and similar problems can be found in Antonio Criminisi's papers and Ph.D. thesis on single-view metrology. This is a good link to start, and the whole paperdump is here
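By way of illustration, here is a minimal sketch of the core trick from single-view metrology, assuming the measuring segment and the known reference segment are collinear in the image (so they share the same vanishing point). The function names are made up for this sketch, not taken from Criminisi's code:

```swift
import CoreGraphics

// For collinear image points a, b on a line whose vanishing point is v
// (the image of that world line's point at infinity), the world length
// of AB is proportional to |ab| / (|va| * |vb|). One segment of known
// world length fixes the proportionality constant k.
func dist(_ p: CGPoint, _ q: CGPoint) -> CGFloat {
    hypot(p.x - q.x, p.y - q.y)
}

func projectiveRatio(_ a: CGPoint, _ b: CGPoint, vanishingPoint v: CGPoint) -> CGFloat {
    dist(a, b) / (dist(v, a) * dist(v, b))
}

// Calibrate with the known segment, then measure an arbitrary one.
func worldLength(of a: CGPoint, _ b: CGPoint,
                 refA: CGPoint, refB: CGPoint, refWorldLength: CGFloat,
                 vanishingPoint v: CGPoint) -> CGFloat {
    let k = refWorldLength / projectiveRatio(refA, refB, vanishingPoint: v)
    return k * projectiveRatio(a, b, vanishingPoint: v)
}
```

This also addresses the rate-of-change question in the post: the growth in length-per-pixel towards the vanishing point is captured entirely by the |va|·|vb| term, so no separate gradient is needed. If the known segment lies on a parallel but distinct line, the two lines still share the vanishing point but have different constants k, and you need the plane-homography machinery from the papers above.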
I'm looking to create a function for my app which records the distance travelled in the vertical direction. More specifically, I want to record how far the device has been 'dropped' - this could mean dropped at arm's length onto the floor, or lowered slowly with the user as they go down ten floors in an elevator. I'm looking for advice on the best way to calculate this with a relatively high level of accuracy.
I've read a little about the difficulty of accurately measuring distance travelled using Core Motion - especially as I need it to work even if the device rotates during the movement. From what I've researched, it seems as though it would be impossible, or at least very difficult, to achieve this using Core Motion.
Would I be able to achieve this effect with Core Location instead? I've seen posts about calculating lateral distance, as in during a car journey, but nothing about vertical distance.
Is it as simple as 'startingAltitude - endingAltitude = distanceTravelled'?
If so - how accurate is the altitude measurement of Core Location and how could I get started with this behaviour? I'm fairly new to iOS programming and would appreciate any pointers on the most appropriate method of achieving the function I want.
Thanks
There are serious limitations to both approaches.
Using an accelerometer to measure distance travelled requires very precise, accurate real-time measurement of acceleration. Any error in the acceleration reading leads to error in the computed velocity, which makes your location estimate drift away from the real location. The drift gets worse over time, to the point where the error swamps the actual location.
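To see how fast the drift grows: a constant bias ε in the measured acceleration, after double integration, produces a position error of

```latex
e(t) = \tfrac{1}{2}\,\varepsilon\, t^{2}
```

so a bias of only 0.02 m/s² accumulates to a 1 m error after 10 seconds, and a 100 m error after 100 seconds.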
Based on my testing, the altitude reading on iOS GPS devices is poor: errors of ±100 meters or more are not uncommon. Indoors, GPS readings degrade further, and the altitude reading is bad enough to begin with.
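If you still want to try the Core Location route from the question, a minimal sketch looks like the following; `altitude` and `verticalAccuracy` are standard `CLLocation` properties, while the class name and logging are just illustrative:

```swift
import CoreLocation

// Requires NSLocationWhenInUseUsageDescription in Info.plist.
final class AltitudeTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var startAltitude: CLLocationDistance?

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let loc = locations.last,
              loc.verticalAccuracy > 0   // <= 0 means the altitude is invalid
        else { return }

        if startAltitude == nil { startAltitude = loc.altitude }
        if let start = startAltitude {
            let dropped = start - loc.altitude   // meters, positive = downward
            print("vertical distance: \(dropped) m (±\(loc.verticalAccuracy) m)")
        }
    }
}
```

In practice the ± term will often dwarf a one-floor elevator ride, which is exactly the limitation described above.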
I am trying to use Core Motion user-acceleration values and double-integrate them to derive the distance covered. I move my iPhone linearly along its Y axis, against a 30 cm long ruler, on the table. First, I let the device rest for 10 seconds, and I calculate my offsets along the three axes by averaging the respective user-acceleration values.
The X, Y and Z offsets are subtracted from the acceleration values when I calculate the distance covered. After offset subtraction, these values are passed through a low-pass filter and a median filter, separately of course. The filters are linear, and the cut-off frequency is set by the number of neighbouring values whose mean is taken in the low-pass filter, and whose median is taken in the median filter. I have experimented with values of this number from 1 to 100. Finally, these filtered values are double-integrated using the trapezoidal rule to get distances. But the calculated distance is nowhere close to 30 cm. The closest value I got was about -22 cm (I am wondering why I get negative values even though I move the device in the positive Y direction). I also came across this:
http://ajnaware.wordpress.com/2008/09/05/accelerating-iphones/
It's an old post about the same thing, which says that the accelerometer readings appeared to come in quanta of about 0.18 m/s^2 (i.e. about 0.018 g), resulting in a large cumulative error very quickly. Going by that, for this error to really not matter, one would have to accelerate the device by almost 1.8 m/s^2, which is practically impossible for distance/length measurement purposes. For small movements, it does not look like there is any possibility of calculating distances using an optimal filter and a higher-order numerical integration method, without an impractical velocity/acceleration constraint like that. Is it possible?
How about using my acceleration vs. timestamp data to interpolate a polynomial that grows over time, as I get more and more motion updates, and which approximates the acceleration vs. time curve? Double integration of this polynomial would be a piece of cake. But for small distances the polynomial will have a big error component. Using a predictable, known motion that my device will be subjected to, I wish to take a huge number of snapshots (calculated distance vs. actual known distance) to calculate my error polynomial in a similar way, and then subtract it from my first polynomial. Can this work?
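For reference, here is a minimal sketch of the pipeline described in the question (rest-period bias estimation, moving-average low-pass, trapezoidal double integration) for a single axis. It reproduces the approach, drift and all, rather than fixing it; the sample type and non-empty inputs are assumptions:

```swift
import Foundation

// One accelerometer reading along a single axis, in m/s^2.
struct Sample { let t: TimeInterval; let a: Double }

// 1. Estimate the constant bias while the device is at rest.
func bias(of restSamples: [Sample]) -> Double {
    restSamples.map(\.a).reduce(0, +) / Double(restSamples.count)
}

// 2. Moving-average low-pass over `window` neighbours on each side.
func lowPass(_ xs: [Double], window: Int) -> [Double] {
    xs.indices.map { i -> Double in
        let lo = max(0, i - window), hi = min(xs.count - 1, i + window)
        return xs[lo...hi].reduce(0, +) / Double(hi - lo + 1)
    }
}

// 3. Trapezoidal-rule integration (acceleration -> velocity -> position).
func integrate(_ ys: [Double], times: [TimeInterval]) -> [Double] {
    var out = [0.0]
    for i in 1..<ys.count {
        out.append(out[i - 1] + 0.5 * (ys[i] + ys[i - 1]) * (times[i] - times[i - 1]))
    }
    return out
}

func estimateDistance(moving: [Sample], rest: [Sample], window: Int) -> Double {
    let b = bias(of: rest)
    let accel = lowPass(moving.map { $0.a - b }, window: window)
    let times = moving.map(\.t)
    let velocity = integrate(accel, times: times)
    let position = integrate(velocity, times: times)
    return position.last ?? 0   // meters; sign follows the axis convention
}
```

A negative result like the -22 cm above does not necessarily mean the maths is wrong; it typically indicates that a residual bias dominated the integration, or that the axis sign convention is opposite to the direction of motion.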
Although this does not really fit Stack Overflow, because it's not a question but a discussion, I'll try to sum up my thoughts on it.
As already said, the accelerometer is very inaccurate, and you would need very good accuracy for this kind of task, especially if you are trying to measure such short distances. Also, accelerometers differ from device to device; you will get different results for the same movement on different devices. On top of that, there is a large random error.
My guess is that you can eliminate a large part of the randomness/error by calibrating the device and repeating the "measurement move" several times, say 10 times. After that you have enough data to compute an average that might come close to the real value.
Calibration is a key part here; you have to think of a clever way to calibrate, such as having the user move the device over different known distances at different speeds.
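A minimal sketch of that calibration idea, assuming you can collect pairs of (raw estimate, true distance) from the user's moves; the least-squares scale factor used here is one simple choice among many, and the numbers are invented for illustration:

```swift
// Fit a single scale factor s minimising sum((actual - s * measured)^2),
// which has the closed form s = Σ(m·a) / Σ(m²).
func calibrationScale(measured: [Double], actual: [Double]) -> Double {
    var num = 0.0, den = 0.0
    for (m, a) in zip(measured, actual) {
        num += m * a
        den += m * m
    }
    return num / den
}

// Hypothetical example: three calibration moves over a known 0.30 m distance.
let s = calibrationScale(measured: [0.22, 0.26, 0.19],
                         actual:   [0.30, 0.30, 0.30])
let corrected = s * 0.21   // apply to a new raw estimate of 0.21 m
```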
But all this is just theory. I would really like to see your results; however, I doubt you will get it working well enough even with the best possible filters/algorithms, since there is simply too much noise.
There are n points in the 2D plane. A robot wants to visit all of them but can only move horizontally or vertically. How should it visit all of them so that the total distance it covers is minimal?
This is the Travelling Salesman Problem, where the distance between each pair of points is |y2-y1|+|x2-x1| (called rectilinear distance or Manhattan distance). It's NP-hard, which basically means that there is no known efficient solution.
Methods to solve it are listed on Wikipedia.
The simplest algorithm is a naive brute-force search, where you calculate the distance for every possible permutation of the points and take the minimum. This has a running time of O(n!); it will work for up to about 10 points, but it very quickly becomes too slow for larger numbers of points.
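A sketch of that brute-force search for the open-path variant (visit every point once, no return to the start), using the Manhattan metric; none of this depends on any library:

```swift
struct Point { let x: Int; let y: Int }

// Rectilinear (Manhattan) distance: |x2 - x1| + |y2 - y1|.
func manhattan(_ p: Point, _ q: Point) -> Int {
    abs(p.x - q.x) + abs(p.y - q.y)
}

// Total length of an ordered visiting sequence.
func tourLength(_ tour: [Point]) -> Int {
    guard tour.count > 1 else { return 0 }
    var total = 0
    for i in 1..<tour.count { total += manhattan(tour[i - 1], tour[i]) }
    return total
}

// Backtracking over all n! visiting orders.
func shortestTour(_ points: [Point]) -> (order: [Point], length: Int) {
    var best: (order: [Point], length: Int) = (points, Int.max)
    var current: [Point] = []
    var remaining = points

    func search() {
        if remaining.isEmpty {
            let len = tourLength(current)
            if len < best.length { best = (current, len) }
            return
        }
        for i in remaining.indices {
            let p = remaining.remove(at: i)
            current.append(p)
            search()
            current.removeLast()
            remaining.insert(p, at: i)
        }
    }
    search()
    return best
}

// Tiny example: the best order is (0,0) -> (2,3) -> (5,1), length 10.
let result = shortestTour([Point(x: 0, y: 0), Point(x: 2, y: 3), Point(x: 5, y: 1)])
```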