iOS detect heart rate [duplicate]

Possible Duplicate: Detecting heart rate using the camera
Closed 10 years ago.
I need the same functionality as the application Instant Heart Rate.
The basic process requires the user to:
Place the tip of the index finger gently on the camera lens.
Apply even pressure and cover the entire lens.
Hold it steady for 10 seconds and get the heart rate.
This can be accomplished by turning the flash on and watching the light change as the blood moves through the index finger.
How can I start?

I'd start by using AVFoundation to turn the light on. The answers in the linked post below include examples of how to do this:
How to turn the iPhone camera flash on/off?
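For reference, here is a minimal sketch of that torch step (the method name is mine; error handling kept short, and it assumes a device that actually has a torch):

#import <AVFoundation/AVFoundation.h>

// Turn the torch on so the camera can see the blood pulsing
// through the fingertip pressed against the lens.
- (void)turnTorchOn
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch]) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = AVCaptureTorchModeOn;
            [device unlockForConfiguration];
        }
    }
}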
Then, as far as detecting the light change goes, you can probably use Brad Larson's GPUImage framework. It includes a few filters that you may be able to use to achieve this, including:
GPUImageAverageLuminanceThresholdFilter
GPUImageAverageColor
GPUImageLuminosity
Using the filters listed above, you should be able to measure color variations in your finger and monitor the time intervals between the occurrences of these changes. You may even be able to specify an arbitrary variance threshold for the color/luminosity change.
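A hedged sketch of what that could look like, assuming GPUImageAverageColor's colorAverageProcessingFinishedBlock as declared in the framework's headers (verify the block signature against the GPUImage version you use):

#import "GPUImage.h"

// Feed the back camera into an average-color filter and watch the
// red channel, which rises and falls with each pulse while the
// finger covers the lens.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
GPUImageAverageColor *averageColor = [[GPUImageAverageColor alloc] init];

averageColor.colorAverageProcessingFinishedBlock =
    ^(CGFloat red, CGFloat green, CGFloat blue, CGFloat alpha, CMTime frameTime) {
        // Compare `red` against a running average; when it crosses
        // upward, record CMTimeGetSeconds(frameTime) as one beat.
        NSLog(@"average red: %f", red);
    };

[camera addTarget:averageColor];
[camera startCameraCapture];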
From there, all you have to do is convert the time interval between the color changes into a pulse. Here's an example of how to calculate pulse:
http://www.wikihow.com/Calculate-Your-Target-Heart-Rate
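In code, the conversion is just the average beat-to-beat interval turned into beats per minute. A sketch, assuming beatTimes holds the detected beat timestamps in seconds:

// Convert detected beat timestamps (in seconds) into BPM.
- (double)beatsPerMinuteFromBeatTimes:(NSArray *)beatTimes
{
    if (beatTimes.count < 2) return 0.0;
    double first = [[beatTimes firstObject] doubleValue];
    double last  = [[beatTimes lastObject] doubleValue];
    double averageInterval = (last - first) / (double)(beatTimes.count - 1);
    return 60.0 / averageInterval;
}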

Related

iOS timecode-synced downloadable animation system

As an introduction and context, I'm currently a novice iOS app developer and I want to make sure I'm not reinventing the wheel too much as I make this app (reinventing wheels can get very expensive.)
The app will allow the user to download our videos off the internet and will allow storage for offline usage. The problem with storing these videos on the device is that many of them will be too long and thus too big to be practical to store.
The videos are quite simple however, consisting of a couple short "real" video clips at the beginning and end, with the bulk of the video being still images animated around the screen. The animations would consist solely of opacity and simple transformation keyframes (translate, scale, rotate around static anchor point), and would require a variety of easing functions for each transition.
The hardest part likely would be that the "video" player will also have to be able to track with an audio player's timecode, and will have to support seeking to any arbitrary point like a normal video player.
So, now that I've described the problem, here's the solution I've come up with so far. Hopefully doing it this way will reduce the probability of XY problems. :)
The idea is to basically do a dumbed-down version of what Final Cut and other editing programs do with animations—have a bunch of clips, sometimes overlapping, and be able to animate the position, scale, rotation, and opacity of each using keyframes.
My first instinct, as far as implementation goes, is to use some of iOS's game-engine stuff for the animations (maybe SceneKit, because it seems to allow animations to run on scene time rather than real time, even though it's primarily 3D and I'm doing 2D animations). I would manually handle syncing time with the audio player, as well as adding and removing nodes from the scene when seeking through the video and when clips begin and end.
What are some built-in systems, plugins, etc. that I can take advantage of to make this easier and faster to develop and maintain? Double points if I don't have to transcode the animations by hand to some custom format.
As I mentioned in my comment, your question is rather broad and contains multiple questions in one. I will address what you said is likely the hardest part:
https://developer.apple.com/documentation/avfoundation/avplayeritem
https://developer.apple.com/documentation/avfoundation/avasset
Instead of SceneKit, take a look at SpriteKit and its SKVideoNode.
Also, research Metal video processing. There are quite a few example projects available that you could use as a starting point.
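To make the SpriteKit suggestion concrete, here is a hedged sketch of wiring an AVPlayer into a scene so the video and the keyframed sprites can share one clock (videoURL and the keyframe evaluation are placeholders you would supply):

#import <SpriteKit/SpriteKit.h>
#import <AVFoundation/AVFoundation.h>

- (void)setUpVideoInScene:(SKScene *)scene withURL:(NSURL *)videoURL
{
    // Wrap the AVPlayer in an SKVideoNode so playback lives inside
    // the SpriteKit scene graph alongside the animated sprites.
    AVPlayer *player = [AVPlayer playerWithURL:videoURL];
    SKVideoNode *videoNode = [SKVideoNode videoNodeWithAVPlayer:player];
    videoNode.position = CGPointMake(scene.size.width / 2.0,
                                     scene.size.height / 2.0);
    [scene addChild:videoNode];
    [videoNode play];

    // In the scene's -update: method, read the player's clock and
    // evaluate your opacity/transform keyframes against it, so that
    // seeking the player also "seeks" the sprite animation:
    // NSTimeInterval t = CMTimeGetSeconds(player.currentTime);
}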

Detect Bump On Bottom Of iPhone [duplicate]

Possible duplicate of: Detect when an iphone has been bumped (5 answers)
Closed 6 years ago.
I have been doing a lot of research in the area of the iPhone's magnetometer, gyroscope, and accelerometer, but can't seem to come to a good conclusion.
Things I have tried so far:
Watching for acceleration in one direction to detect a sudden stop. (This always triggers while the person is simply holding the phone and moving it in that direction.)
Watching for the gyroscope angle changing direction relative to how the person is holding the device. (Not consistent, because I am trying to get shake detection to work alongside this.)
The Bump API / existing code. (Does not work.)
Has anyone come across a solid solution for detecting a tap on the bottom of an iPhone, against an object? Preferably with sample code.
Perhaps this is what you are looking for? It's a set of custom wrappers around the motion hardware designed specifically for detecting "accidents", which sounds close to what you need.
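If that doesn't pan out, here is a hedged raw-CoreMotion sketch of the accelerometer route; the 50 Hz update interval and the 1.5 g threshold are guesses that would need tuning against false positives from normal handling:

#import <CoreMotion/CoreMotion.h>

// Watch for a short spike along the device's long (y) axis, which
// is roughly what a tap against the bottom edge produces. The
// motion manager must be kept alive, e.g. in a strong property.
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = 0.02; // 50 Hz

[motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                    withHandler:^(CMAccelerometerData *data, NSError *error) {
    if (data && fabs(data.acceleration.y) > 1.5) { // in g's; tune this
        NSLog(@"possible bump on the bottom edge");
    }
}];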

What logic is used for creating an Equalizer meter

Basically, I'm going to be working on an iOS music app which, while a song is playing, shows a fancy equalizer meter with animated bars going up and down.
After looking into this and not finding enough resources, I really want to carry this out as a project, perhaps also making a web version using jQuery.
I'm not really asking for specific code; I just want to know how the animation works in general.
Thanks a million!
Check out the Cocoa Waveform Audio Player Control project. It's a Cocoa audio player component which displays the waveform of the audio file.
Also, there are already a lot of questions on this topic:
iOS FFT Accerelate.framework draw spectrum during playback
Using the Apple FFT and Accelerate Framework
iOS FFT Draw spectrum
Animation would be pretty straightforward: it is just animating changes to the heights of rectangles.
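A hedged sketch of that animation step, assuming you already have one normalized level per bar from the FFT (the method and parameter names are placeholders):

#import <UIKit/UIKit.h>

// Animate each bar's height toward its current level, keeping the
// bottom edge fixed. Call once per refresh tick.
- (void)updateBars:(NSArray *)bars withLevels:(NSArray *)levels
{
    [UIView animateWithDuration:0.05 animations:^{
        for (NSUInteger i = 0; i < bars.count; i++) {
            UIView *bar = bars[i];
            CGFloat level = [levels[i] floatValue]; // normalized 0.0 .. 1.0
            CGFloat maxHeight = 100.0;              // assumed tallest bar
            CGRect frame = bar.frame;
            CGFloat newHeight = MAX(1.0, level * maxHeight);
            frame.origin.y += frame.size.height - newHeight; // pin the bottom
            frame.size.height = newHeight;
            bar.frame = frame;
        }
    }];
}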

How to detect only/specifically human voice?

I am developing an application where I shall be plotting a realtime pitch-frequency graph based on the sound produced by the speaker.
Example: User says "hmmmmmmmmmmmmmm..." and a graph is being plotted simultaneously showing the frequency reached by the user at every 1/10th of a second.
Now, I have developed everything from top to bottom, but the one problem that remains is that background noise is also captured while the user speaks or says something. Even if the user speaks with the phone close to his lips, noise is still captured and plotted.
I want to remove that noise.
I have tried the Shout toolkit and Sphinx, but neither is effective enough, and both slow down the plotting of the graph.
I am making this app using PhoneGap.
Are there any better noise-cancellation APIs available? [preferably open source]

iOS/C/C++/Cocos2D: how to "simulate" an electric guitar pick slide sound [closed]

Closed 10 years ago as too localized (unlikely to help future visitors).
Rather weird question, I suppose, but let's put it in the right context. I am not looking to simulate the entire guitar sound range or to write my own synthesizer, nor to simulate an entire guitar (please see answers to this type of question here). I "just" want to give an idea of the slide pick sound without irritating a musician, if possible. If the sound type is not clear, please have a look here:
http://youtu.be/y4_wjNRLe4M?t=1m39s
http://youtu.be/VdrFTyUCGYs?t=2m57s
I'd like to give the user the sensation that if the slide bar is moved right, a slide-up sound is produced for a length of time proportional to the relative movement of the slide bar. If instead the user moves the slide bar left, then a slide-down sound is played using the same concept.
For reference. I am using Cocos2D 2.0 and iOS 5.1.
I found something vaguely similar to the effect I am after in the paragraph "Modifying audio properties" in Chapter 6 of the Cocos2d Cookbook (here is a link to a free app with the demo examples of Chapters 4 to 6), where the author modifies a mono synthesizer tone, changing the pitch and gain according to the user's finger position on the screen (see this video to understand which example I am talking about, in case it is still unclear).
//Play our sound with custom pitch and gain
CDSoundSource *sound = [[sae soundSourceForFile:@"synth_tone_mono.caf"] retain]; // was #"...", a typo
[sound play];
sound.looping = YES;
[notes setObject:sound forKey:key];
// Map the touch position to pitch (x) and gain (y)
sound.pitch = point.x / 240.0f;
sound.gain  = point.y / 320.0f;
This does not give the best solution, but a decent approximation. Would anyone be able to suggest something better?
EDIT: I am working towards an approximation of this by recording a slide-up and a slide-down sound equal to the maximum length. The sound starts once the slide pick bar is moved and stops once the movement is finished (in Cocos2D this corresponds to the touch beginning and ending on a particular sprite that represents the slide pick). The slide pick is then restored to its original position to avoid having to deal with slides that start from different positions. Of course it is an approximation, and I would use it because it is cheap in terms of development time as well as computational performance (I guess that anything that fully simulates the sound will be more computationally expensive). I leave the question open to see if anyone comes up with a better idea or solution (libraries are still welcome, because I am still considering the possibility of fully simulating the slide pick sound). A sketch of the approximation follows.
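Here is that approximation in Cocos2D 2.0 terms, hedged: slideUpSound, slideDownSound, currentSound, startX, pickBar, and restPosition are placeholder ivars, and CDSoundSource's play/stop come from CocosDenshion as in the book example.

// Play the prerecorded slide while the pick bar is being dragged,
// choosing up/down from the drag direction, and stop on release.
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    startX = [self convertTouchToNodeSpace:touch].x;
    currentSound = nil;
    return YES;
}

- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event
{
    if (currentSound == nil) {
        CGFloat x = [self convertTouchToNodeSpace:touch].x;
        currentSound = (x > startX) ? slideUpSound : slideDownSound;
        [currentSound play];
    }
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    [currentSound stop];
    // Restore the pick bar so every slide starts from the same spot.
    pickBar.position = restPosition;
}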
