Detect if iPhone is thrown in the air using Core Motion - iOS

I'm trying to detect if my iPhone has been thrown into the air. I've tried using Core Motion's acceleration API and its altitude API. However, because the axes are fixed to the phone, detecting the changes is incredibly difficult. Is there a better way to do what I want? Is it possible to speed up the refresh rate of the CMAltitude API?

In free fall, you should see all three accelerometer values go to 0. Even in a projectile type of fall (a throw), the phone is in free fall as soon as it leaves the thrower's hand.
This white paper talks about using an MCU, but the concept is there.
http://www.nxp.com/files/sensors/doc/app_note/AN3151.pdF
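
For reference, here's a minimal Swift sketch of that idea, assuming a ~0.1 g threshold and a 100 Hz update rate (both illustrative, not tuned values). Checking the magnitude of the acceleration vector also sidesteps the fixed-axes problem from the question:

```swift
import Foundation
import CoreMotion

// Minimal sketch: in free fall the raw accelerometer magnitude drops
// toward 0 g, regardless of how the phone's axes are oriented.
let motionManager = CMMotionManager()

func startFreefallDetection() {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 1.0 / 100.0  // 100 Hz (illustrative)

    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Magnitude of the raw acceleration vector, in units of g.
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        if magnitude < 0.1 {  // hypothetical threshold; tune per device
            print("Free fall detected: the phone is airborne")
        }
    }
}
```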

Related

How can I get a GPS signal on my iPhone indoors?

I developed an app which calculates the distance between the current position (lat and long) and another stored location. The current position comes from the iPhone's built-in GPS. In some buildings it works indoors (even 8 meters away from windows or doors). Why does it work indoors? How can I still get a GPS signal without a WiFi/3G/4G connection?
I hope someone can answer me.
Welcome to SO.
The short answer is, it's out of your control. GPS is not reliable indoors. Sometimes you are able to get a signal, sometimes not. Metal-frame buildings are worse than wood-frame buildings, and brick also tends to interfere with GPS signals. Fluorescent lights tend to interfere as well.
In a given indoor setting you'll either get a GPS signal or not.

Indoor Atlas: iOS SDK doesn't give accurate position when device stops moving

I downloaded the IndoorAtlas iPhone SDK and also generated path maps and test paths for my venue. The SDK navigates me perfectly while I am moving from one place to another, but when I stop moving it produces scattered output with a position radius of 10 to 25. I am expecting precise coordinates in both of the above cases in my project.
Is there any way to get more precision?
IndoorAtlas technology uses the history of magnetic field observations to compute a precise location. This means the device needs to move some distance in order to collect enough data to converge to a correct location estimate, i.e., to get a location fix. We are constantly improving our service to decrease the time needed for the first location fix.
If you experience your position moving after you've already stopped walking, please contact support@indooratlas.com with details of your application and the venue where this is experienced and we'll look into it. Thanks!

Is there a way to calculate small distances with CoreMotion?

Is it possible to calculate small distances with CoreMotion?
For example, a user moves his iOS device up or down, or left and right, while holding the device in front of him (landscape).
EDIT
Link as promised...
https://www.youtube.com/watch?v=C7JQ7Rpwn2k position stuff starts at about 23 minutes in.
His summary...
The best thing to do is to try and not use position in your app.
There is a video that I will find to show you. But short answer... No. The margin for error is too great and the integration that you have to do (twice) just amplifies this error.
At best you will end up with the device telling you it is slowly moving in one direction all the time.
At worst it could think it's hurtling around the planet.
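To make that concrete, here is a deliberately naive dead-reckoning sketch with CMDeviceMotion (the 100 Hz rate is illustrative). It is the broken approach, shown only so you can watch the position estimate run away:

```swift
import CoreMotion

// Naive dead reckoning: userAcceleration (in g) integrated twice.
// A constant bias b becomes a velocity error of b*t and a position
// error of b*t^2/2, so the estimate drifts even on a desk. This sketch
// also ignores rotating device-frame acceleration into a world frame,
// which makes real results even worse.
let motionManager = CMMotionManager()
var velocity = SIMD3<Double>(repeating: 0)
var position = SIMD3<Double>(repeating: 0)

func startNaiveDeadReckoning() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    let dt = motionManager.deviceMotionUpdateInterval

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let ua = motion?.userAcceleration else { return }
        let a = SIMD3(ua.x, ua.y, ua.z) * 9.81    // g -> m/s^2
        velocity += a * dt         // first integration: error grows linearly
        position += velocity * dt  // second integration: error grows quadratically
        print("estimated position (m):", position)
    }
}
```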
2020 Update
So, iOS has added the Measure app, which does what the OP wanted. It uses a combination of the accelerometer, gyroscope, and magnetometer in the phone, along with ARKit, to get the external reference that I was talking about in this answer.
I'm not 100% certain, but if you wanted to do something like the OP was asking, you should be able to dig into ARKit and find some APIs in there that do what you want.
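If you do go digging, a minimal sketch of reading the device's world-space position via ARKit might look like this (ARSession, ARWorldTrackingConfiguration, and the frame-update delegate callback are the real ARKit APIs; the class wrapper is just illustrative scaffolding):

```swift
import ARKit

// Sketch: ARKit fuses camera tracking with the motion sensors, giving
// the external reference that raw Core Motion integration lacks.
final class PositionTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called roughly once per rendered frame with the camera's pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let t = frame.camera.transform  // 4x4 pose matrix in world space
        let position = SIMD3<Float>(t.columns.3.x, t.columns.3.y, t.columns.3.z)
        print("Device position (m):", position)
    }
}
```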
šŸ‘šŸ»

How to detect only/specifically human voice?

I am developing an application where I will be plotting a real-time pitch-frequency graph based on the sound produced by the speaker.
Example: User says "hmmmmmmmmmmmmmm..." and a graph is being plotted simultaneously showing the frequency reached by the user at every 1/10th of a second.
Now, I have developed everything from top to bottom, but the one problem that remains is that background noise is also captured while the user speaks. Even if the user speaks with the phone close to his lips, noise is still captured and plotted.
I want to remove that noise.
I have tried the Shout ToolKit and Sphinx, but neither is that effective, and they slow down the plotting of the graph.
I am making this app using PhoneGap.
Are there any better noise-cancellation APIs available [pref: open source]?
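
Not a full answer, but one cheap pre-processing step is to band-pass the audio around the voiced-speech range before pitch estimation, so broadband background noise contributes less to the plot. A minimal Swift sketch using the standard RBJ "audio EQ cookbook" biquad coefficients (since the app is PhoneGap-based, something like this would live in a native plugin; the 44.1 kHz sample rate, 200 Hz center, and Q of 0.7 are illustrative values):

```swift
import Foundation

// Biquad band-pass filter (RBJ cookbook coefficients) that attenuates
// energy outside the chosen band before pitch analysis.
struct BandPassFilter {
    private var b0, b1, b2, a1, a2: Double
    private var x1 = 0.0, x2 = 0.0, y1 = 0.0, y2 = 0.0

    init(sampleRate: Double, centerHz: Double, q: Double) {
        let w = 2 * Double.pi * centerHz / sampleRate
        let alpha = sin(w) / (2 * q)
        let a0 = 1 + alpha
        b0 = alpha / a0
        b1 = 0
        b2 = -alpha / a0
        a1 = -2 * cos(w) / a0
        a2 = (1 - alpha) / a0
    }

    // Direct form I: one sample in, one filtered sample out.
    mutating func process(_ x: Double) -> Double {
        let y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2 = x1; x1 = x
        y2 = y1; y1 = y
        return y
    }
}

// Usage: run every incoming sample through the filter before analysis.
var filter = BandPassFilter(sampleRate: 44_100, centerHz: 200, q: 0.7)
let filtered = filter.process(0.25)
```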

iOS Accelerometer and Spatial Navigation?

I'm trying to build a web app specifically for iOS that relies on accelerometer data for navigation. (A site that you could theoretically move through (from page to page) spatially.)
For example, taking a step (or moving the device) forward would take you to one web page or URL, and left, right, or backwards would take you to their own unique URLs. Any ideas on how to make this happen?
I spent quite a lot of time looking in to this for an experiment I was writing.
The trouble is that it is very hard/impossible to get an accurate (or even approximate) location in space from your device.
The device measures acceleration, which needs a couple of calculations (two integrations) to get down to position.
You can use this value to measure location relative to a start point, but the next problem is the noise the accelerometer picks up.
The closest I could get to making it work was to smooth out the accelerometer noise and then integrate back to try to determine location. But due to the noise, I found that the device would constantly think it was moving in one direction.
After a number of days of trying different methods I determined that it really isn't possible without external tracking of the device.
What you can do is use the orientation of the device to navigate, i.e. tilt forwards, tilt backwards, tilt left, or tilt right to do different things.
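
A minimal Core Motion sketch of that tilt approach, where the 0.5 rad (~30 degree) threshold and the goForward/goBackward/goLeft/goRight hooks are hypothetical; in a web app those hooks would call into the page, e.g. via WKWebView's evaluateJavaScript:

```swift
import CoreMotion

// Tilt-based navigation: attitude (pitch/roll, in radians) is reliable
// in a way that integrated position is not.
let motionManager = CMMotionManager()

func startTiltNavigation() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        let threshold = 0.5  // radians; illustrative, tune to taste

        if attitude.pitch > threshold {
            goForward()
        } else if attitude.pitch < -threshold {
            goBackward()
        } else if attitude.roll > threshold {
            goRight()
        } else if attitude.roll < -threshold {
            goLeft()
        }
    }
}

// Hypothetical navigation hooks.
func goForward() {}
func goBackward() {}
func goLeft() {}
func goRight() {}
```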
