Detect Bump On Bottom Of iPhone [duplicate]

This question already has answers here:
Detect when an iphone has been bumped
(5 answers)
Closed 6 years ago.
I have been doing a lot of research into the iPhone's magnetometer, gyroscope, and accelerometer, but can't seem to come to a good conclusion.
Things I have tried so far:
Watching for acceleration in one direction followed by a sudden stop. (This triggers whenever the person holding the phone simply moves it in that direction.)
Watching the gyroscope angle change direction relative to how the person is holding the device. (Not consistent, because I am also trying to get shake detection working alongside it.)
The Bump API / existing sample code. (Does not work.)
Has anyone come across a solid solution for detecting a tap of the bottom of an iPhone against an object? Preferably with sample code.

Perhaps this is what you are looking for: a set of custom wrappers around the hardware designed specifically for detecting "accidents", which sounds close to your use case.
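If that library does not pan out, one common alternative is to watch for a short spike in user acceleration along the device's Y axis (the long axis, through the top and bottom edges) with Core Motion. A minimal sketch, where the 1.5 g threshold and the 100 Hz update rate are untuned assumptions:

#import <CoreMotion/CoreMotion.h>
#include <math.h>

// Assumes a retained property: @property (nonatomic, retain) CMMotionManager *motionManager;
- (void)startBumpDetection
{
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 100.0; // 100 Hz

    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (error) return;
        // userAcceleration has gravity removed, so ordinary handling stays near zero
        // while a knock against a surface shows up as a brief spike.
        double ay = motion.userAcceleration.y; // Y runs along the long axis of the device
        if (fabs(ay) > 1.5) { // placeholder threshold in g; tune on a real device
            NSLog(@"Possible bump on the top or bottom edge");
        }
    }];
}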

iOS using both cameras at the same time? [duplicate]

This question already has an answer here:
How can I use front camera and back camera simultaneously on the iPhone
(1 answer)
Closed 9 years ago.
Is there anything in the HIG or in Apple's docs that says you cannot use the front camera and the back camera simultaneously to take a photo? Or is this technically impossible because of some limitation? Thank you.
It is not possible, and not allowed, to access both cameras simultaneously.
So you cannot use both cameras at the same time: as soon as one camera's capture begins, the other stops.
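You can see the limitation for yourself with a quick check along these lines: a single AVCaptureSession accepts the first camera input, and canAddInput: rejects the second (the method and device names are just for illustration):

#import <AVFoundation/AVFoundation.h>

// Rough demonstration: a single AVCaptureSession refuses the second camera input.
- (void)tryBothCameras
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (input && [session canAddInput:input]) {
            [session addInput:input];
            NSLog(@"Added %@", device.localizedName);
        } else {
            // The second camera ends up here: the session rejects it.
            NSLog(@"Could not add %@", device.localizedName);
        }
    }
    [session startRunning];
}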

ios detect heart rate [duplicate]

This question already has answers here:
Detecting heart rate using the camera
Closed 10 years ago.
I need the same functionality as the application Instant Heart Rate.
The basic process requires the user to:
Place the tip of the index finger gently on the camera lens.
Apply even pressure and cover the entire lens.
Hold it steady for 10 seconds and get the heart rate.
This can be accomplished by turning the flash on and watching the light change as blood moves through the index finger.
How can I start?
I'd start by using AVFoundation to turn the light on. The answers in the linked post below include examples of how to do this:
How to turn the iPhone camera flash on/off?
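In case that post is not handy, a minimal torch-on sketch with AVFoundation might look roughly like this (error handling omitted):

#import <AVFoundation/AVFoundation.h>

// Keep the back-camera LED on continuously so the fingertip is lit from behind.
- (void)turnTorchOn
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device lockForConfiguration:nil]) {
        device.torchMode = AVCaptureTorchModeOn;
        [device unlockForConfiguration];
    }
}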
Then, as far as detecting the light change goes, you can probably use Brad Larson's GPUImage framework. It includes a few classes that may help here, including:
GPUImageAverageLuminanceThresholdFilter
GPUImageAverageColor
GPUImageLuminosity
Using the filters listed above, you should be able to measure color variations in the finger and monitor the time intervals between these changes. You may even be able to specify an arbitrary variance requirement for the color/luminosity change.
From there, all you have to do is convert the time interval between the color changes into a pulse. Here's an example of how to calculate a pulse rate:
http://www.wikihow.com/Calculate-Your-Target-Heart-Rate
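As a rough sketch of how the two halves could be wired together: this assumes GPUImageLuminosity's luminosityProcessingFinishedBlock callback as declared in the GPUImage headers, and the 0.02 rise threshold plus the naive beat detection are placeholders rather than tuned values:

#import <AVFoundation/AVFoundation.h>
#import "GPUImage.h"

// Assumes retained properties self.videoCamera (GPUImageVideoCamera *) and
// self.luminosity (GPUImageLuminosity *), and that the torch is already on.
- (void)startPulseMonitoring
{
    self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                           cameraPosition:AVCaptureDevicePositionBack];
    self.luminosity = [[GPUImageLuminosity alloc] init];

    __block CGFloat lastLuminosity = 0.0f;
    __block CFAbsoluteTime lastBeat = 0.0;

    [self.luminosity setLuminosityProcessingFinishedBlock:^(CGFloat luminosity, CMTime frameTime) {
        // Treat a sufficiently sharp rise in average luminosity as one beat (very naive).
        if (luminosity - lastLuminosity > 0.02f) {   // placeholder variance threshold
            CFAbsoluteTime now = CFAbsoluteTimeGetCurrent();
            if (lastBeat > 0.0) {
                double bpm = 60.0 / (now - lastBeat); // seconds per beat -> beats per minute
                NSLog(@"Approximate pulse: %.0f BPM", bpm);
            }
            lastBeat = now;
        }
        lastLuminosity = luminosity;
    }];

    [self.videoCamera addTarget:self.luminosity];
    [self.videoCamera startCameraCapture];
}

In practice you would average the beat-to-beat intervals over several seconds before reporting a number, rather than printing every single interval.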

iOS/C/C++/Cocos2D: how to "simulate" an electric guitar pick slide sound [closed]

Closed 10 years ago.
A rather weird question, I suppose, but let's put it in the right context. I am not looking to simulate the entire guitar sound range or to write my own synthesizer, nor to simulate an entire guitar (please see answers to that type of question here). I "just" want to give an idea of the pick slide sound without irritating a musician, if possible. If the sound type is not clear, please have a look here:
http://youtu.be/y4_wjNRLe4M?t=1m39s
http://youtu.be/VdrFTyUCGYs?t=2m57s
I'd like to give the user the sensation that if the slide bar is moved right, a slide-up sound is produced for a length of time proportional to the relative movement of the slide bar. If the user instead moves the slide bar left, a slide-down sound is played using the same concept.
For reference. I am using Cocos2D 2.0 and iOS 5.1.
I found something vaguely similar to the effect I am after in the "Modifying audio properties" section of Chapter 6 of the Cocos2d Cookbook (here is a link to a free app built from the demo examples of Chapters 4 to 6), where the author modifies a mono synthesizer tone, changing the pitch and gain according to the user's finger position on the screen (see this video to understand which example I am talking about, in case it is still unclear).
//Play our sound with custom pitch and gain
//(sae is the audio engine instance; point is the touch location on screen)
CDSoundSource *sound = [[sae soundSourceForFile:@"synth_tone_mono.caf"] retain];
[sound play];
sound.looping = YES;
[notes setObject:sound forKey:key];
sound.pitch = point.x / 240.0f; // pitch follows the finger's horizontal position
sound.gain = point.y / 320.0f;  // gain follows the finger's vertical position
This is not the best solution, but it is a decent approximation. Would anyone be able to suggest something better?
EDIT: I am working towards an approximation of this by recording a slide-up and a slide-down sound equal to the maximum length. The sound starts once the slide pick bar is moved and stops once the movement is finished (in Cocos2D this corresponds to touch-began and touch-ended on the particular sprite representing the slide pick). The slide pick is then restored to its original position to avoid having to deal with slides starting from different positions. Of course it is an approximation, and I would use it because it is cheap both in development time and in computational cost (I guess anything that actually simulates the sound will be more computationally expensive). I leave the question open to see if anyone comes up with a better idea or solution (libraries are still welcome, because I am still considering the possibility of fully simulating the pick slide sound).
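A sketch of that approximation in Cocos2D 2.x / CocosDenshion terms. The file names slide_up.caf and slide_down.caf, the retained slideSound property, and starting the sample on the first movement (so the direction is already known) are my assumptions, not code from the book:

#import "cocos2d.h"
#import "SimpleAudioEngine.h"

// On a CCLayer with touchEnabled = YES (all-at-once touch mode).
// Assumes a retained property: @property (nonatomic, retain) CDSoundSource *slideSound;
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGFloat dx = [touch locationInView:touch.view].x - [touch previousLocationInView:touch.view].x;

    if (self.slideSound == nil && dx != 0.0f) {
        // The direction of the very first movement picks which sample to start.
        NSString *file = (dx > 0.0f) ? @"slide_up.caf" : @"slide_down.caf";
        self.slideSound = [[SimpleAudioEngine sharedEngine] soundSourceForFile:file];
        [self.slideSound play];
    }
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Movement finished: cut the sample and reset for the next slide.
    [self.slideSound stop];
    self.slideSound = nil; // the retained property releases it
    // The pick sprite would also be snapped back to its start position here.
}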

How to add continuous movement to iPad app

I'm writing a 3D car designer iPad app and I want to add a little "life" to it. I want it to be similar to what the "3D Car Builder" iPad app does: from the moment the app starts, there is just the slightest amount of "movement" in the scene. Even if the iPad is sitting on a table, there is movement going on.
I'm testing on a 1st-gen iPad (with iOS 5 installed), so whatever they've implemented works on my device. I've looked up several things, thinking this was done with the accelerometer, possibly the magnetometer, or Core Motion of some sort, but I can't figure out where to start. It might be something as simple as moving the 3D scene a small amount in the x direction, then the y direction, then negative x, then negative y. I don't know; it's probably something simple.
Anyone know how I/they might have implemented this?
Okay, never mind... I figured this one out. It worked great using CCActionInterval.
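For anyone landing here later, a minimal sketch of the kind of CCActionInterval loop that gives a scene a subtle, continuous drift (the 3-point offsets and 2-second durations are arbitrary placeholders):

#import "cocos2d.h"

// Gently nudge the scene's root node around a tiny square forever, so the view
// never sits perfectly still even when the iPad is resting on a table.
- (void)startIdleMovementOn:(CCNode *)sceneNode
{
    id right = [CCMoveBy actionWithDuration:2.0f position:ccp( 3,  0)];
    id up    = [CCMoveBy actionWithDuration:2.0f position:ccp( 0,  3)];
    id left  = [CCMoveBy actionWithDuration:2.0f position:ccp(-3,  0)];
    id down  = [CCMoveBy actionWithDuration:2.0f position:ccp( 0, -3)];

    id loop = [CCRepeatForever actionWithAction:
                  [CCSequence actions:right, up, left, down, nil]];
    [sceneNode runAction:loop];
}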

Motion detection of iOS device in 3D space

I've been working with the iOS sensors a bit of late, and I wanted to write an app that would accurately track the motion of the phone in space. I want to know if it's possible to track the motion of the device and detect gestures, such as drawing a circle with your phone or even moving it in a straight line.
I've been searching online about this, and I wanted to know two things:
1. Is it possible to do this with the CoreMotion framework?
2. If yes, what is the alternative for older devices that do not support CoreMotion, without the double-integration method using the accelerometer?
This would really help!
Any other alternative ideas are most welcome!
Thanks in advance!
As you write, you cannot do the double integral; accelerometer noise makes the integrated position estimate drift within seconds.
For gesture recognition, I would try dynamic time warping. See my earlier answer here.
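For reference, a minimal dynamic-time-warping distance sketch over 1-D sequences (for example, acceleration magnitudes of a live gesture versus a stored template); the template set and any acceptance threshold are up to you:

#include <stdlib.h>
#include <math.h>
#include <float.h>

// Classic O(n*m) dynamic time warping distance between two 1-D sequences.
static double DTWDistance(const double *a, size_t n, const double *b, size_t m)
{
    size_t cols = m + 1;
    double *cost = malloc((n + 1) * cols * sizeof(double));
    for (size_t i = 0; i <= n; i++)
        for (size_t j = 0; j <= m; j++)
            cost[i * cols + j] = DBL_MAX;
    cost[0] = 0.0; // cost[0][0]

    for (size_t i = 1; i <= n; i++) {
        for (size_t j = 1; j <= m; j++) {
            double d = fabs(a[i - 1] - b[j - 1]);
            double best = fmin(cost[(i - 1) * cols + j],         // insertion
                          fmin(cost[i * cols + (j - 1)],         // deletion
                               cost[(i - 1) * cols + (j - 1)])); // match
            cost[i * cols + j] = d + best;
        }
    }
    double result = cost[n * cols + m];
    free(cost);
    return result;
}

Recording a few template traces per gesture, computing DTWDistance of a live trace against each, and accepting the closest match below a tuned threshold is the usual way this is applied to accelerometer gestures.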
