SuperpoweredSDK Frequencies Example - iOS

I'm building an iOS app using the SuperpoweredFrequencies project as an example. Everything is working great. I've increased the number of bands to 55 and experimented with widths of 1/12 and 1/24 to tighten up the filtering range surrounding the individual frequencies in question.
I've noticed something when testing with a musical instrument: when I play lower notes, starting at approximately A 110, the amplitudes of those frequencies register much lower than when playing higher notes, say A 220 or A 440. This makes detecting the fundamental frequency more difficult when lower notes are played, as it often appears as if I am playing the note an octave higher (the harmonic frequencies show up more prominently than the fundamental for low notes).
Can someone shed some light on this phenomenon? It doesn't appear to be due to the iPhone's mic, because the same thing happens when testing on both my iMac and my MacBook. Is there a way of dealing with this issue using Superpowered's API so that the fundamental frequency can be detected when lower notes are played?
Correction: I was testing a little more this morning with a guitar, and what I noticed is that for the E (82.4069 Hz) and F (87.3071 Hz) the fundamental frequencies (82.x and 87.x Hz) register less prominently than the perfect fifth above each, B and B# respectively.
Maybe it is just due to the nature of the guitar as an instrument. Unfortunately I don't have a piano to test with. How do the responses look when playing the low notes on a piano?

The sensitivity of the iPhone's microphone may be lower in that region: https://blog.faberacoustical.com/2009/ios/iphone/iphone-microphone-frequency-response-comparison/
That's why the harmonics may be picked up at a higher volume than the fundamental.
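If you want to work around that rolloff at the analysis stage, one heuristic (a sketch of my own, not part of the Superpowered API; the function name and tolerance are made up) is to pick the lowest-frequency band that is within some tolerance of the loudest band, instead of the loudest band itself:

```swift
import Foundation

/// Picks a likely fundamental from per-band results by preferring the
/// lowest-frequency band that is still "close enough" to the loudest one.
/// `frequencies` and `magnitudes` are assumed to be your existing band
/// centers and levels; `toleranceDb` is a tuning parameter.
func likelyFundamental(frequencies: [Double],
                       magnitudes: [Double],
                       toleranceDb: Double = 12.0) -> Double? {
    guard let peak = magnitudes.max(), peak > 0 else { return nil }
    let floorLevel = peak * pow(10.0, -toleranceDb / 20.0)
    // Scan from the lowest band upward: the first band above the floor is
    // treated as the fundamental, even if a harmonic above it is louder.
    for (f, m) in zip(frequencies, magnitudes) where m >= floorLevel {
        return f
    }
    return nil
}
```

The tolerance trades off the two failure modes: too small and you fall back into the octave error you describe, too large and low-frequency noise bands win.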

Related

How does ultra wide band determine position?

Apple iPhones now have a U1 chip, which is described as "Ultra Wideband technology for spatial awareness". I've heard the technology can do time-of-flight calculations to determine range, but that doesn't explain how it determines relative position. How does the positioning work?
How does ultra-wideband work?
Travelling at the speed of light
The idea is to send radio waves from one module to another and measure the time of flight (TOF): in other words, how long it takes. Because radio waves travel at the speed of light (c = 299792458 m/s), we can simply multiply the time of flight by this speed to get the distance.
However, perhaps you've noticed that radio waves travel fast. Very fast! In a single nanosecond, which is a billionth of a second, a wave has travelled almost 30 cm. So if we want to perform centimetre-accurate ranging, we have to be able to measure the timing very, very accurately. So now the question is: how can we do this? How can we even measure the timing of… a wave?
It's all about the bandwidth
In physics there is something called the Heisenberg uncertainty principle, which states that it is impossible to know both the exact frequency and the exact timing of a signal. Consider, for example, a sinusoid: a signal with a well-known frequency but very ill-determined timing, since it has no beginning or end. However, if we combine multiple sinusoids with slightly different frequencies, we can create a 'pulse' with better-defined timing, i.e. a clear peak. This is seen in the following figure from Wikipedia, which sequentially adds sinusoids to a signal to obtain a sharper pulse:
[Figure 1: sinusoids with slightly different frequencies summed to form an increasingly sharp pulse]
The range of frequencies used for this signal is called the bandwidth Δf. Using the uncertainty principle we can roughly determine the width Δt of the pulse for a given bandwidth Δf:
Δf · Δt ≥ 1/(4π)
From this simple formula we can see that if we want a narrow pulse, which is necessary for accurate timing, we need to use a large bandwidth. For example, using the Δf = 20 MHz of bandwidth available to Wi-Fi systems, we obtain a pulse width of Δt ≥ 4 ns. At the speed of light this corresponds to a pulse 1.2 m 'long', which is too much for accurate ranging: firstly because it is hard to accurately determine the peak of such a wide pulse, and secondly because of reflections. Reflections come from the signal bouncing off objects (walls, ceilings, closets, desks, etc.) in the surrounding environment. These reflections are also captured by the receiver and may overlap with the line-of-sight pulse, which makes it very hard to measure the true peak. With pulses 4 ns wide, any object within 1.2 m of the receiver or the transmitter will cause an overlapping pulse. Because of this, Wi-Fi time-of-flight ranging is not suitable for indoor applications.
Ultra-wideband signals typically have a bandwidth of 500 MHz, resulting in pulses only 0.16 ns wide! This timing resolution is so fine that, at the receiver, we are able to distinguish the individual reflections of the signal. Hence, accurate ranging remains possible even in places with many reflectors, such as indoor environments.
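To make those numbers concrete, here is a small sketch of the Δf · Δt ≥ 1/(4π) arithmetic (plain math, no UWB hardware or API involved):

```swift
import Foundation

let c = 299_792_458.0   // speed of light, m/s

/// Minimum pulse width for a given bandwidth, from Δf · Δt ≥ 1/(4π).
func minPulseWidth(bandwidthHz: Double) -> Double {
    1.0 / (4.0 * Double.pi * bandwidthHz)
}

let wifiDt = minPulseWidth(bandwidthHz: 20e6)    // ≈ 4.0e-9 s  (≈ 4 ns)
let uwbDt  = minPulseWidth(bandwidthHz: 500e6)   // ≈ 1.6e-10 s (≈ 0.16 ns)

print(wifiDt * c)   // ≈ 1.2 m:  spatial 'length' of the Wi-Fi pulse
print(uwbDt * c)    // ≈ 0.05 m: spatial 'length' of the UWB pulse
```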
[Figure 2: a received UWB signal with individually distinguishable reflections]
Where to put all this bandwidth?
So we need a lot of bandwidth. Unfortunately, everybody wants it: in wireless communication systems, more bandwidth means faster downloads. If everybody transmitted on the same frequencies, all the signals would interfere and nobody would be able to receive anything meaningful. Because of this, the use of the frequency spectrum is highly regulated.
So how is it possible that UWB gets 500 MHz of precious bandwidth while most other systems have to make do with a lot less? Well, UWB systems are only allowed to transmit at very low power (the power spectral density must stay below -41.3 dBm/MHz). This very strict power constraint means a single pulse cannot reach far: at the receiver, the pulse will likely be below the noise level. To solve this, the transmitter sends a train of pulses (typically 128 or 1024) to represent a single bit of information. The receiver accumulates the received pulses, and with enough of them the power of the 'accumulated pulse' rises above the noise level and reception is possible. Hooray!
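As a toy illustration of that accumulation trick (made-up pulse shape and noise level, purely to show the principle that the pulse grows linearly with the number of repetitions while the noise grows only like its square root):

```swift
import Foundation

// Hypothetical weak pulse shape, buried in noise of comparable amplitude.
let pulse: [Double] = [0, 0, 0.2, 1.0, 0.2, 0, 0]
let repetitions = 128   // e.g. 128 pulses per bit
var accumulated = [Double](repeating: 0, count: pulse.count)

for _ in 0..<repetitions {
    for i in pulse.indices {
        let noise = Double.random(in: -1...1)   // noise as strong as the pulse
        accumulated[i] += pulse[i] + noise
    }
}
// After accumulation the peak (index 3, ≈ 128) stands well above the
// residual noise floor (standard deviation ≈ sqrt(128) * 0.58 ≈ 6.5).
print(accumulated)
```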
The IEEE 802.15.4 standard for Low-Rate Wireless Personal Area Networks defines a number of UWB channels, each at least 500 MHz wide. Depending on the country you're in, some of these channels are allowed and others are not. In general, the lower-band channels (1 to 4) can be used in most countries under some limitations on update rate (using mitigation techniques). Channel 5 is accepted in most parts of the world without any limitations, with the notable exception of Japan. Purely from the physics, the lower the channel's center frequency, the better the range.
A note on the received signal strength (RSS)
There is another way to measure the distance between two points using radio waves: the received signal strength (RSS). The further apart the two points, the smaller the received signal strength, so from this RSS value we should be able to derive the distance. Unfortunately, it's not that simple. In general, the received signal strength is a combination of the power of all the reflections, not only of the desired line-of-sight path. Because of this, it becomes very hard to relate the RSS value to the true distance. The figure below shows just how bad it is.
In this figure, the RSS value of a Bluetooth signal is measured at various distances. At each distance, the error bars show how the RSS value varies. Clearly, the variation is very large, which makes RSS unsuitable for accurate ranging or positioning.
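For context, the usual way an RSS value is mapped to a distance is a log-distance path-loss model; the sketch below (with made-up parameter values) shows how a few dB of fluctuation translates into metres of error:

```swift
import Foundation

/// Log-distance path-loss model: rss(d) = rssAt1m - 10 * n * log10(d).
/// Inverting it gives a distance estimate from a measured RSS value.
func estimatedDistance(rss: Double,
                       rssAt1m: Double = -40.0,          // assumed calibration
                       pathLossExponent n: Double = 2.0) -> Double {
    pow(10.0, (rssAt1m - rss) / (10.0 * n))
}

// A few dB of fluctuation changes the estimate dramatically:
print(estimatedDistance(rss: -70))   // ≈ 31.6 m
print(estimatedDistance(rss: -64))   // ≈ 15.8 m (only 6 dB of difference)
```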

Can the frequency of a flashing light be counted using a video camera?

Is there a formula to determine the maximum flash rate countable by a video camera? I am thinking that any flash rate greater than the number of fps is not practical. I get hung up on the fact that the shutter is open for only a fraction of the time required to produce a frame: 30 fps is roughly 33.33 ms per frame, and a shutter set to, say, 1/125 is about 8 ms, or roughly 25% of the frame time. Does the shutter speed matter? I am thinking that unless they are synced, the shutter could open at any point in the lamp's flash, ultimately making counting very difficult.
The application is a general one: with today's higher-frame-rate cameras (60 fps or 120 fps), can one reliably determine the flash rate of a lamp? Think of alarm panels, breathing monitors, heart-rate monitors, or trying to determine a duty cycle by visual means.
What you describe is related to the sampling problem; you can refer to the Nyquist–Shannon sampling theorem.
Given a certain acquisition frequency (# of FPS), you can be sure of your count (in every case, regardless of synchronization) if
# of FPS >= 2 × flashing-light frequency (in Hz)
Of course this is a general theoretical rule; things can work quite differently in practice (I am answering only with regard to the number of FPS in the general case).
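A small sketch of that rule plus the standard aliasing formula, showing what rate you would actually observe when the flash is too fast for the frame rate (plain math, nothing camera-specific):

```swift
import Foundation

/// Frequency actually observed when a flash of `flashHz` is sampled at `fps`:
/// the true rate folds down to |flashHz - fps * round(flashHz / fps)|.
func observedRate(flashHz: Double, fps: Double) -> Double {
    abs(flashHz - fps * (flashHz / fps).rounded())
}

print(observedRate(flashHz: 10, fps: 30))   // 10.0 -> countable (10 < 30/2)
print(observedRate(flashHz: 25, fps: 30))   // 5.0  -> aliased, looks like 5 Hz
print(observedRate(flashHz: 60, fps: 60))   // 0.0  -> looks constant
```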

How does knocktounlock work?

I am trying to figure out how knocktounlock.com is able to detect "knocks" on the iPhone. I am sure they use the accelerometer to achieve this; however, all my attempts produce false positives (if the user moves, jumps, etc., it sometimes fires).
Basically, I want to be able to detect when a user knocks/taps/smacks their phone, and to distinguish that from other things that also register on the accelerometer. So I am looking for sharp, high peaks. The device will be in a pocket, so it will not be moving much.
I have tried things like high-pass/low-pass filtering (not sure if there would be a better option).
This is a duplicate of this: Detect hard taps anywhere on iPhone through accelerometer But it has not received any answers.
Any help/suggestions would be awesome! Thanks.
EDIT: Looking for more thoughts before I accept the answer below. I did hear back from Knocktounlock: they use the fourth derivative (jounce) to get better values to analyse, which is interesting.
I would consider a knock on the iPhone to be exactly the same as bumping two phones together. Check out this GitHub repo:
https://github.com/joejcon1/iOS-Accelerometer-visualiser
Build & run the app on an iPhone and check out the spikes on the green line. You can see the value of each spike clearly.
Knocking the iPhone:
As you can see, the actual spike is very short when you knock the phone. The spike patterns are a little different for a hard knock and a soft knock, but they can be distinguished programmatically.
Now let's see the accelerometer pattern when the iPhone moves freely in space:
As you can see, the spikes are bell-shaped, which means it takes a little time for the spike value to return to 0.
From these patterns it becomes easier to recognize a knock. Good luck.
Also, this will drain your battery, as the sensor is always running and the iPhone needs to keep a persistent Bluetooth connection to the Mac.
P.S.: Also check this answer, https://stackoverflow.com/a/7580185/753603
I think the way to go here is using pattern recognition with accelerometer data.
You could (write and) train a classifier (e.g. k-nearest neighbors) with data you have gathered and classified by hand. Neural networks are also an option. There will be many different ways to solve this problem, but there is probably no straightforward one.
Some papers showing pattern recognition approaches to similar topics (activity, movement), like
http://www.math.unipd.it/~cpalazzi/papers/Palazzi-Accelerometer.pdf
(some more, but I am not allowed to post them with my reputation count. You can search for "pattern recognition accelerometer data")
There is also a master thesis about gesture recognition on the iPhone:
http://klingmann.ch/msc_thesis_marco_klingmann_iphone_gestures.pdf
In general you won't achieve 100% correct classification. Depending on the time and knowledge you have, the result will vary between good-and-usable and we-could-use-random-classification-instead.
Just a thought, but it could be useful to add the microphone's output to the mix, listening for really short, loud noises at the same time that a possible "knock" movement is detected.
I am surprised that the 4th derivative is needed; intuitively it feels to me that the 3rd ("jerk", the derivative of acceleration) should be enough. It is a big hint about what to keep an eye on, though.
It seems quite simple to me: collect accelerometer data at a high rate, plot it on a chart, observe. Calculate the first derivative from that, plot and observe. Then rinse and repeat with the derivative of the previous result. Draw conclusions. I highly doubt you will need pattern recognition per se (clustering/classifiers/what-have-you); I think you will see a very distinct peak on one of your charts and may only need to tune the collection rate and smoothing.
It is more interesting to me how come you don't have to be running the KnockToUnlock app for this to work. And if it runs in the background, who lets it run there for unlimited time? I don't think the accelerometer qualifies for unlimited background running. After some pondering, my guess is that the app connects to the Mac over Bluetooth as an accessory, and as such gets a pass from iOS to run in the background (and suck your battery, shhht).
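A minimal sketch of the derivative-chain idea described above, using finite differences over buffered acceleration magnitudes (the threshold is a made-up tuning value, not something from KnockToUnlock):

```swift
import Foundation

/// Successive finite differences of a sampled signal: applied once to the
/// acceleration magnitude this gives jerk, applied twice it gives the
/// jounce (4th derivative of position) that KnockToUnlock reportedly uses.
func finiteDifference(_ samples: [Double], dt: Double) -> [Double] {
    guard samples.count > 1 else { return [] }
    return (1..<samples.count).map { (samples[$0] - samples[$0 - 1]) / dt }
}

func looksLikeKnock(accelMagnitudes: [Double],
                    dt: Double,
                    threshold: Double = 500.0) -> Bool {   // hypothetical value
    let jerk = finiteDifference(accelMagnitudes, dt: dt)
    let jounce = finiteDifference(jerk, dt: dt)
    // A knock shows up as a single very sharp extreme in the higher derivatives.
    return jounce.contains { abs($0) > threshold }
}
```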
To solve this problem you need to select the right sampling frequency. A tap (knock) has very high-frequency content, so you should choose an accelerometer rate of no lower than 50 Hz (perhaps even 100 Hz) for quality tap detection in the presence of noise from other movements.
The use of a classifier is necessary, but to save battery you should not call it very often. You should write a simple algorithm that finds only taps and tap-like situations and reports when your program needs to call the classifier.
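A sketch of that cheap-gate idea using CoreMotion at the suggested 100 Hz (the gate value is a made-up tuning parameter, and the classifier call is left as a stub):

```swift
import CoreMotion

let motionManager = CMMotionManager()
var lastMagnitude = 1.0   // ≈ 1 g at rest

motionManager.accelerometerUpdateInterval = 1.0 / 100.0   // 100 Hz sampling
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
    // Cheap gate: only wake the expensive classifier on a sharp jump.
    if abs(magnitude - lastMagnitude) > 0.75 {   // hypothetical gate value
        // hand the recent window of samples to the real classifier here
    }
    lastMagnitude = magnitude
}
```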
Note the gyroscope signal: it also responds to knocks. Moreover, unlike the accelerometer signal, it does not need to be separated from a constant (gravity) component, and it contains less noise.
Here is a good video about the basics of working with smartphone sensors: http://talkminer.com/viewtalk.jsp?videoid=C7JQ7Rpwn2k#.UaTTYUC-2Sp

Comparing pitches with digital audio

I am working on an application which will compare musical notes with digital audio. My first idea was to analyze a WAV file (or sound in real time) with some polyphonic pitch-detection algorithm, get the notes and chords from the file, and then compare them with notes in a dataset. I went through a lot of pages, and it seems to be a lot of hard work, because existing implementations and algorithms mainly (or only) focus on monophonic sound.
Now I have the idea to do this the opposite way. In the dataset I have, for example, the note A4, or a better example, the chord A4 B4 H4. My idea is to generate some waveform (or whatever, I don't know what) from this note or chord and then compare it with a piece of the digital audio.
Is this a good idea? Is it a better or a harder solution?
If yes, can you recommend how to do it?
The easiest solution is to take the FFT (Fast Fourier Transform) of the waveform: all the notes (and their harmonics) will be present in the signal. You then look for the frequencies that correspond to notes, and there's your solution.
Note: in order to get decent frequency resolution you need a sufficiently long sample and a high enough sample rate. But try it and you will see.
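Two bits of arithmetic behind that note, as a sketch (plain math, independent of any audio library): an FFT bin is sampleRate/fftSize wide, and a detected frequency maps to the nearest note with the standard MIDI formula.

```swift
import Foundation

// Frequency resolution of an FFT bin is sampleRate / fftSize.
// e.g. 44100 Hz / 4096 points ≈ 10.8 Hz per bin: too coarse to separate
// E2 (82.4 Hz) from F2 (87.3 Hz). 16384 points (≈ 2.7 Hz per bin) would do.
let binWidth = 44100.0 / 4096.0

/// Nearest MIDI note number for a frequency, via f = 440 * 2^((n - 69) / 12).
func midiNote(for frequency: Double) -> Int {
    Int((69.0 + 12.0 * log2(frequency / 440.0)).rounded())
}

print(midiNote(for: 440.0))   // 69 = A4
print(midiNote(for: 82.4))    // 40 = E2
```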
Here are a couple of screenshots from an app called SpectraWave that I took sitting in front of my piano. The first is of middle A (f = 440 Hz, as you know):
and the second is of an A-minor chord (as you can see, my middle finger is a little stronger, and the C shows up as the note with the greatest volume). The harmonics soon make it hard to see more than just a few notes…
Your "solution" most likely makes matching even more difficult, since you will have no idea what waveform to make for each note. Most musical instruments and voices not only produce waveforms that are significantly different from single sinewaves or any other familiar waveform, but these waveforms evolve over time. Thus guessing the proper
waveform to use for each note for a match is extremely improbable.

Is there a more accurate way to detect ball to ball collisions with Sphero API?

I'm writing a game for Sphero, the robotic ball (I'm having issues with their forums and can't seem to ask a question). I'm trying to do ball-to-ball collision detection for 2 or more players.
First of all, they give a sample here:
https://github.com/orbotix/Sphero-iOS-SDK/tree/master/samples/CollisionDetection
The thresholds they supply are WAY too sensitive; on a wooden floor it triggers all the time. Setting that aside for the minute, I have to use the impact timestamps from both devices to see if they triggered collisions at roughly the same time.
My issue is that when subtracting timestamps, in some cases I'm getting very wide variations, and I think the difference is quite long to begin with. I'm storing several timestamps so I don't miss the correct one, and I tried playing with the dead time to see if lowering it would help.
Most commonly, subtracting two NSTimeIntervals gives me a difference between 0.68 and 0.72 (I would have expected something on the order of 0.01), so I'm checking whether the difference is under 0.72. Three times in a row I got between 0.72 and 0.73, and several times I got 1.5, 2.6, 1.1 and even 3.8.
It doesn't seem reliable. The documentation says this time comes from the iPhone's reference clock, and both devices are set to get the time automatically, so they are as close to each other as possible.
Has anyone tried this and come up with a reliable solution that doesn't involve keeping one ball still?
I did a significant amount of research on the subject of ball to ball collisions when I started as a developer for Orbotix, the makers of Sphero.
This is a very complicated problem to solve. The closest I came to making it work (for an infected-zombies research game) was about 80% accuracy in detecting which ball hit which, with a sample size of 3. The more balls you put into the game, the lower the accuracy becomes. Hence, we decided to eliminate the issue by requiring one ball to stop moving before it became vulnerable, as in Sphero TAG.
There are a few factors that limit this capability, and it seems you have discovered them. I believe the biggest issue is that collision detection performs poorly while the ball is driving, especially on a rough surface or when the ball makes quick, jerky movements. This alone causes major problems when coupled with the dead time.
I was able to get collision timestamps to within 50 ms on average. Are you taking into consideration the Wi-Fi latency in transmitting the packets between phones?
The solution is something you probably don't want to hear, but you should tweak your gameplay to work within the capabilities of collision detection: have the ball drive really slowly when it can be contacted, or even come to a stop, as in TAG. Ask yourself: how can I make this fun without ball-to-ball collisions?
I just want to say, first, that we are moving our developer support forum here, to StackOverflow, and that's why you can't post on the forums. So, you did the right thing, Simon, by coming to StackOverflow, and you should be proud.
We just changed the forums to redirect here instead of leaving people confused.
The timestamps are generated by Sphero, but they only make sense if you're using the Poll Packet Times command to generate delay and offset values. Please refer to DID 00h, CID 50h in the API commands document.
That being said, collision detection is an ever-evolving technology on our end. We employ a cleverly coded DFT frequency transform on a sliding data window, running in real time inside the robot. The parameters allow tuning for the surface you're running on; there are no universal settings. If you're getting too many false positives, please experiment. If you have ideas for improving the algorithm, contact us directly and maybe we can include it as a new filtering method. We're always open to clever ideas!
You could sync the internal timers of each Sphero at the beginning of the game; these can be matched against a synced timer within each host phone. The clocks may be different, but a millisecond is a millisecond. You could also lower the collision-detection threshold, making it so that the 'event' (damage, infection, etc.) can only occur if the 'attacking' Sphero is moving at a certain speed, or some variation thereof.
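On the timer-sync point, the usual approach is an NTP-style round-trip exchange (a generic sketch, not a Sphero-specific API) to estimate each device's clock offset before comparing collision timestamps; a tolerance around the 50 ms mentioned above seems a reasonable starting point:

```swift
import Foundation

/// Classic NTP-style offset estimate from one request/response exchange:
/// t0 = local send, t1 = remote receive, t2 = remote send, t3 = local receive.
func clockOffset(t0: TimeInterval, t1: TimeInterval,
                 t2: TimeInterval, t3: TimeInterval) -> TimeInterval {
    ((t1 - t0) + (t2 - t3)) / 2.0
}

/// Two collision timestamps count as the same impact if, after removing the
/// estimated offset, they fall within a tolerance (made-up default here).
func isSameCollision(localStamp: TimeInterval, remoteStamp: TimeInterval,
                     offset: TimeInterval,
                     tolerance: TimeInterval = 0.05) -> Bool {
    abs(localStamp - (remoteStamp - offset)) < tolerance
}
```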
