Recording from microphone and replaying sound with a delay in ChucK

I'm trying to replay the sound currently being recorded on the mic with an adjustable delay.
This should help improve pronunciation ability. I found an example which seems to do what I want:
adc => DelayL delay => dac;              // route mic input through an interpolating delay line to the speakers
.05::second => delay.max => delay.delay; // set both the maximum and the current delay to 50 ms
while( true ) 5::second => now;          // keep the shred running
Unfortunately, the delay is not exactly the time I define. The replay starts about 0.25 s later, not the 50 ms I want.
Is there a way to set a smaller delay?

Related

How to get lowest audio latency response to touch event?

I'm building a rhythm game and trying to provide extremely low-latency audio response to the user using AudioKit.
I'm new to AudioKit; following the Hello World example, I built a very simple test app using AKOscillator:
...
let oscillator = AKOscillator()
...
AudioKit.output = oscillator
oscillator.frequency = 880
AKSettings.bufferLength = .shortest
AKSettings.ioBufferDuration = 0.002
AudioKit.start()
... // On Touch event ///
oscillator.start()
... // 20 ms later ///
oscillator.stop()
I measured the latency between the touch event and the first sound coming out; it's around 100 ms, which is way too slow for us...
A few possibilities I can think of:
100 ms is hitting the hardware limit of the audio output delay
some more magic settings can fix this
oscillator.start() has some delay; to achieve the lowest latency I should use something else
something is wrong with the other parts of the code (touch handling etc.)
Since I have no experience with AudioKit (or the iOS audio system...), any piece of information will be really appreciated!

Mute the audio at a particular interval of time while casting video in iOS using Chromecast

I am working on a Chromecast-based application using the Google Cast API. Initially, I join the Chromecast session from YouTube and play the video. Later I join this session from my application.
There is a requirement in my application to mute the audio at a particular interval of time.
I need to mute the audio from 00:01:34:03 (hh:mm:ss:ms) to 00:01:34:15 (hh:mm:ss:ms).
I convert the time to seconds as follows.
Time to seconds conversion: (00*60*60)+(01*60)+34+(03/1000) = 94.003 -> mute start time
I call the mute method after an interval of: mute start time - current streaming position
I am using the approximateStreamPosition value (in the GCKMediaControlChannel header file) to know the stream position of the casting video. It returns the value as a double, say 94.70801001358.
Here 94 is the duration in seconds; what does the value after the decimal point (.70801001358) indicate? Is it milliseconds? If so, can I round it to three digits?
Since I need to mute the audio with millisecond accuracy, will rounding off the value cause the mute to happen early or late?
The 0.70801001358 is in seconds; I am not sure what you mean by asking if that is in milliseconds. In milliseconds, that number would be 708.01001358.
You won't be able to have millisecond accuracy in controlling mute (or any other control command, for that matter); just setting up a command, plus the transfer time from your iOS device to the Chromecast, will throw your calculations off by a good number of milliseconds.
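To make the arithmetic above concrete, here is a small sketch in C (my own illustration, not Cast SDK code; the function names are made up, and approximateStreamPosition stands in for the value read from GCKMediaControlChannel):
// Convert an hh:mm:ss:ms cue to seconds, e.g. 00:01:34:03 -> 94.003.
double cueToSeconds(int hh, int mm, int ss, int ms)
{
    return hh * 3600.0 + mm * 60.0 + ss + ms / 1000.0;
}

// How long to wait before sending the mute command, given the current
// approximateStreamPosition reported by the control channel.
// A negative result means the cue has already passed.
double secondsUntilMute(double approximateStreamPosition)
{
    return cueToSeconds(0, 1, 34, 3) - approximateStreamPosition;
}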

Millisecond (and greater) precision for audio file elapsed time in iOS

I am looking for a low-latency way of finding out how many seconds have elapsed in an audio file, with guaranteed millisecond precision, in real time. According to the AVAudioPlayer class reference, a call to -currentTime will return "the offset of the current playback position, measured in seconds from the start of the sound"; an NSTimeInterval is a double, so this implies fractions of a second are possible.
As a testing scenario, I have an audio file playing and the user taps a button. Playback DOES NOT pause/stop, but at the moment the button was tapped I would like to obtain information about the elapsed time. In the real application, the "button may be pressed" many times in one second, hence the need for millisecond precision.
My files are stored as AIFF files and are around 1-10 minutes in length. Ideally I would like to find out exactly which sample frame is 'up-next' when playback resumes - however, this level of precision is a little excessive and millisecond precision is perfectly acceptable.
Is AVAudioPlayer's -currentTime method sufficient to achieve guaranteed millisecond precision for a currently-playing audio file? Or, would it be preferable to use a lower-level API such as iOS's Audio Units?
If you want sub-millisecond relative time resolution, convert to raw PCM and count buffers * length + samples using a low-latency RemoteIO Audio Unit configuration. Most iOS devices will support RemoteIO buffers as small as 256 samples (about 6 ms), with a callback for each buffer.
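To make that concrete, a minimal sketch (my own illustration, not the answer's code) of counting rendered frames with a render-notify callback on an already-configured RemoteIO unit could look like this; ioUnit, gFramesRendered, kSampleRate, and elapsedSeconds are names I am assuming:
#include <AudioToolbox/AudioToolbox.h>
#include <stdatomic.h>

static _Atomic UInt64 gFramesRendered = 0;
static const double kSampleRate = 44100.0;   // must match the unit's stream format

static OSStatus renderNotify(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData)
{
    // Count each buffer exactly once, after the unit has rendered it.
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender)
        atomic_fetch_add(&gFramesRendered, inNumberFrames);
    return noErr;
}

// Elapsed time = (buffers * frames-per-buffer + frames) / sample rate.
static double elapsedSeconds(void)
{
    return (double)atomic_load(&gFramesRendered) / kSampleRate;
}

// After creating and configuring the RemoteIO unit (here called ioUnit):
//   AudioUnitAddRenderNotify(ioUnit, renderNotify, NULL);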

Using MusicTrackNewMIDINoteEvent to add note while playing

I'm building a drum machine to learn how to use MIDI on iOS. I managed to get it working up to a point; however, I have the following problem. When the user taps a certain button I need to add a sound to my MIDI loop while the MIDI player is playing, and unfortunately I can't simply do:
MusicTrackNewMIDINoteEvent(track, 0, &message);
even though the track is looping and has a fixed length, so theoretically it should come back to 0 at some point. I also tried this:
MusicTrackNewMIDINoteEvent(track, noteTimestamp, &message);
where noteTimestamp is the timestamp I receive from the player. Finally, I managed to get it working with something like this:
MusicTrackNewMIDINoteEvent(track, noteTimestamp+.5, &message);
but needless to say, the .5 delay is not really what I want for my drum machine, which should be as responsive as possible.
So, how does one tackle this problem? How can you push a note on the track as soon as possible, without any delay?
You're laying down an event on the track, and by the time you lay the event down the "playhead" is already past the point where it can do anything with it.
So continue to do what you're doing (without shifting the time) as a means to "record" the event for the next time the loop "comes around", but you'll also need to fire off a MIDI message manually, apart from the track, like this:
UInt32 note = 60;
UInt32 velocity = 127;
UInt32 offset = 0; // sample-frame offset; 0 = sound immediately
// note-on (0x9) in the high nibble, channel 0 in the low nibble
MusicDeviceMIDIEvent(_yourSamplerUnit, kMIDIMessage_NoteOn << 4 | 0, note, velocity, offset);
Again, firing a manual MIDI event will allow the listener to hear the sound immediately, and laying the event down into the track will allow your track to "record" it for the next time 'round.
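Putting the two halves of that advice together, a rough sketch (mine, with placeholder names; player, track, and samplerUnit are assumed to already exist in your setup code) could look like:
#include <AudioToolbox/AudioToolbox.h>

static void recordAndPlayNote(MusicPlayer player, MusicTrack track,
                              AudioUnit samplerUnit, UInt8 noteNumber)
{
    // 1. Immediate sound: send a raw note-on straight to the sampler unit.
    MusicDeviceMIDIEvent(samplerUnit, 0x90 /* note-on, channel 0 */,
                         noteNumber, 127 /* velocity */, 0 /* no sample offset */);

    // 2. "Record" it for the next pass of the loop: add the note to the
    //    track at the player's current position. (For a looping track you
    //    may need to wrap this timestamp into the loop length.)
    MusicTimeStamp now = 0;
    MusicPlayerGetTime(player, &now);

    MIDINoteMessage message = { .channel = 0, .note = noteNumber,
                                .velocity = 127, .releaseVelocity = 0,
                                .duration = 0.25f };
    MusicTrackNewMIDINoteEvent(track, now, &message);
}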

PortAudio video/audio sync

I use FFmpeg to decode a video/audio stream and PortAudio to play the audio. I'm encountering a sync problem with PortAudio. I have a function like the one below:
double AudioPlayer::getPlaySec() const
{
    double const latency = Pa_GetStreamInfo( mPaStream )->outputLatency;
    double const bytesPerSec = mSampleRate * Pa_GetSampleSize( mSampleFormat ) * mChannel;
    double const playtime = mConsumedBytes / bytesPerSec;
    return playtime - latency;
}
mConsumedBytes is the byte count written to the audio device in the PortAudio callback function. I thought I could derive the playing time from that byte count. However, when I run another process that makes the CPU busy (like opening Firefox), the audio becomes intermittent, but the callback doesn't stop, so mConsumedBytes grows more than expected and getPlaySec returns a time larger than the actual playing time.
I have no idea why this happens. Any suggestion is welcome. Thanks!
Latency in PortAudio is defined a bit vaguely: something like the average time between when you put data into the buffer and when you can expect it to play. That's not something you want to use for this purpose.
Instead, to find the current playback time of the device, you can actually poll the device using the Pa_GetStreamTime function.
You may want to see this document for more detailed info.
I know this is old, but still: PortAudio v19+ can report its own sample rate. You should use that for audio sync, since the actual playback sample rate can differ between different hardware. PortAudio might try to compensate (depending on the implementation). If you have drift problems, try using that.
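As an illustration of the polling approach suggested above (not the original AudioPlayer code), a stripped-down version might look like the following, where startTime is assumed to hold the Pa_GetStreamTime value captured when playback of the clip began:
#include <portaudio.h>

// Current playback position in seconds, measured on the stream's own clock
// rather than by counting bytes handed to the callback.
double getPlaySec(PaStream *stream, PaTime startTime)
{
    return Pa_GetStreamTime(stream) - startTime;
}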
