iOS: Mute right channel volume

I'm trying to mute the right channel for all audio apart from an audio stream that I control.
I am using a number of audio libraries, including OpenEars for text-to-speech, and I would like all of them to play only through the left headphone speaker while I play something else through the right.
I know how to play through just the right speaker by creating an Audio Unit stream, but I am not the one creating the audio streams for the other libraries. Is there a way to change the default output channel? Or is it possible to create an effect or mixer, applied to all outgoing audio, that mutes the right channel?
Any help/hints would be much appreciated.

If you don't need the low-latency audio that Audio Units provide, you can try using Audio Queues to play your audio instead. Audio Queues give you a high degree of control over playback on iOS (synchronization, channel control, volume gain, buffering, etc.); see the sketch after the links below. Here are some links to Apple's documentation :-)
Audio Queue Services Programming Guide Introduction: http://bit.ly/10ikt4G
About Audio Queues: http://bit.ly/YfnVI8
Playing audio with AudioQueue: http://bit.ly/10ikAgT
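
For illustration, a minimal Swift sketch of that per-queue channel control (the stereo LPCM format, buffer count/size, and the silent fill stub are assumptions; the relevant call is AudioQueueSetParameter with kAudioQueueParam_Pan, which pans the whole queue hard left so nothing reaches the right channel):

```swift
import AudioToolbox
import Foundation

// Sketch only: 16-bit stereo LPCM at 44.1 kHz, with the buffer-filling
// left as a stub (it writes silence). The point is the per-queue
// parameters: kAudioQueueParam_Pan / kAudioQueueParam_Volume.
var format = AudioStreamBasicDescription(
    mSampleRate:       44_100,
    mFormatID:         kAudioFormatLinearPCM,
    mFormatFlags:      kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
    mBytesPerPacket:   4,              // 2 channels * 2 bytes per sample
    mFramesPerPacket:  1,
    mBytesPerFrame:    4,
    mChannelsPerFrame: 2,
    mBitsPerChannel:   16,
    mReserved:         0
)

// Output callback: refill the buffer with your PCM and re-enqueue it.
let outputCallback: AudioQueueOutputCallback = { _, queue, buffer in
    memset(buffer.pointee.mAudioData, 0, Int(buffer.pointee.mAudioDataBytesCapacity))
    buffer.pointee.mAudioDataByteSize = buffer.pointee.mAudioDataBytesCapacity
    AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
}

var queue: AudioQueueRef?
AudioQueueNewOutput(&format, outputCallback, nil, nil, nil, 0, &queue)

if let queue = queue {
    // Pan the whole queue hard left (-1.0 = left, 0.0 = center, 1.0 = right),
    // which silences the right channel for everything this queue plays.
    AudioQueueSetParameter(queue, kAudioQueueParam_Pan, -1.0)
    AudioQueueSetParameter(queue, kAudioQueueParam_Volume, 1.0)

    // Allocate and prime a few buffers, then start the queue.
    for _ in 0..<3 {
        var buffer: AudioQueueBufferRef?
        AudioQueueAllocateBuffer(queue, 4096, &buffer)
        if let buffer = buffer {
            outputCallback(nil, queue, buffer)
        }
    }
    AudioQueueStart(queue, nil)
}
```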

Related

How can I make Apple's mixer audio unit on iOS not do an audio fade?

I am using The Amazing Audio Engine to simply play an audio file, but I find that when the channel starts playing, there is some automatic fade in happening.
You can see that the top waveform is the output of my iPad, and the bottom waveform is the actual raw audio file. There is definitely a ~30 ms micro-fade being applied.
There is nothing doing that within The Amazing Audio Engine library itself, so it must be happening internally in Apple's mixer audio unit. Is there any way to turn off this behavior?
I suspect that the AudioFilePlayer (used by TAAE) uses Extended Audio File Services under the hood. ExtAudioFileRef will do that on the first read after a seek if there is any decoding or sample rate conversion involved. I had to use Audio File Services directly to get rid of the implicit fading.
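
For reference, a minimal Swift sketch of reading packets with Audio File Services directly (it assumes an LPCM file; compressed formats would also need packet descriptions and a decoder, which ExtAudioFile normally hides):

```swift
import AudioToolbox
import Foundation

// Sketch: pull raw packets straight from the file, so no implicit converter
// (and none of its post-seek priming) sits between the file and your code.
func readPackets(from url: URL) {
    var fileOpt: AudioFileID?
    guard AudioFileOpenURL(url as CFURL, .readPermission, 0, &fileOpt) == noErr,
          let file = fileOpt else { return }

    // Fetch the file's native data format.
    var format = AudioStreamBasicDescription()
    var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
    AudioFileGetProperty(file, kAudioFilePropertyDataFormat, &size, &format)

    // Read a block of packets starting at an arbitrary packet offset;
    // unlike ExtAudioFileSeek/ExtAudioFileRead, nothing gets re-primed here.
    var numPackets: UInt32 = 1024
    var numBytes = numPackets * format.mBytesPerPacket   // LPCM: constant packet size
    let buffer = UnsafeMutableRawPointer.allocate(byteCount: Int(numBytes), alignment: 4)
    defer { buffer.deallocate() }

    let startingPacket: Int64 = 0   // seek target, in packets
    AudioFileReadPacketData(file, false, &numBytes, nil,
                            startingPacket, &numPackets, buffer)

    // `buffer` now holds `numBytes` of untouched samples from the file.
    AudioFileClose(file)
}
```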

Using AVPlayer to make audio tinny (adjust playback EQ).

I would like music playback that allows me to make the audio tinny. Like an equalizer, I would like to alter the channel EQ of the audio with a slider, choosing whether the audio sounds full and rich or light and tinny during playback of the audio file. Is this possible?
Thanks,
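
One way to sketch this kind of slider-driven playback EQ is AVAudioEngine with AVAudioUnitEQ (rather than AVPlayer itself); the band types, frequencies, and gains below are illustrative assumptions, not tuned values:

```swift
import AVFoundation

// Sketch: play a file through a 2-band EQ that can thin the sound out.
// Band settings are illustrative; a UISlider would drive them at runtime.
func makeTinnyPlayer(fileURL: URL) throws -> (AVAudioEngine, AVAudioUnitEQ) {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let eq = AVAudioUnitEQ(numberOfBands: 2)

    // Cut the lows and boost the highs for a "light and tinny" character.
    let lowCut = eq.bands[0]
    lowCut.filterType = .highPass
    lowCut.frequency = 1_000        // Hz; raise for a thinner sound
    lowCut.bypass = false

    let highShelf = eq.bands[1]
    highShelf.filterType = .highShelf
    highShelf.frequency = 4_000     // Hz
    highShelf.gain = 6              // dB
    highShelf.bypass = false

    let file = try AVAudioFile(forReading: fileURL)
    engine.attach(player)
    engine.attach(eq)
    engine.connect(player, to: eq, format: file.processingFormat)
    engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil)
    try engine.start()
    player.play()

    // Bind a slider to lowCut.frequency (or highShelf.gain) to morph
    // between full/rich and light/tinny during playback.
    return (engine, eq)
}
```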

Which is the simplest way to capture audio from the mic while simultaneously playing it back, like an audio amplifier?

I have done some rough research into the audio APIs for iOS. There are several layers of APIs for capturing and playing audio.
My app needs a simple audio-amplifier-like function (with a delay of around 0.2 seconds). I don't need to save the recording to a file. I am not sure which approach is simpler to implement: Core Audio or AVFoundation?
How do I record audio on iPhone with AVAudioRecorder? I am not sure whether that approach works for my case or not.
While playing a sound does not stop recording Avcapture: that link is about playing other audio while recording, so it does not suit my case.
For buffered, near-simultaneous audio recording and playback, you will need to use either the Audio Queue API or an Audio Unit such as RemoteIO. Audio Units allow lower latency.
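
If the roughly 0.2-second delay is acceptable, a higher-level sketch than raw RemoteIO is AVAudioEngine with an AVAudioUnitDelay in the path (the session category, routing option, and delay settings below are assumptions):

```swift
import AVFoundation

// Sketch: microphone -> 0.2 s delay -> speaker, via AVAudioEngine as a
// higher-level wrapper around the underlying I/O audio unit. Assumes the
// app already has microphone permission. Keep a strong reference to the
// returned engine, or audio stops.
func startAmplifier() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)

    let engine = AVAudioEngine()
    let delay = AVAudioUnitDelay()
    delay.delayTime = 0.2          // seconds
    delay.feedback = 0             // no echo repeats, just a single delay
    delay.wetDryMix = 100          // output only the delayed signal

    let input = engine.inputNode
    let format = input.inputFormat(forBus: 0)

    engine.attach(delay)
    engine.connect(input, to: delay, format: format)
    engine.connect(delay, to: engine.mainMixerNode, format: format)

    try engine.start()
    return engine
}
```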

Capturing iPhone game audio

I'd like to capture the audio (music + sound effects) coming from my iPhone game. AVCaptureSession seems to have only the microphone as audio source. I'd like to capture the audio, put it into CMSampleBufferRefs and append these to an AVAssetWriterInput.
I'm currently looking into Audio Queues. Any other ideas?
There is no API to directly capture all the sound effects and music from your game.
The most common solution is for an app to generate all sound twice: once for audio output, plus a second identical copy in the form of PCM samples to feed a DSP or Audio Unit mixer. Then feed the mixer output to AVAssetWriter or another file output. This technique is much easier to implement if all the sounds produced by your app are raw PCM audio played via Audio Queue or the RemoteIO Audio Unit API, which may require significant rewrites to your music and game sound code.
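
As one concrete variant of "feed the mixer output to a file output," here is a sketch that assumes all of the app's sounds already play through a single AVAudioEngine graph; a tap on the main mixer hands back the mixed PCM, which is appended to an AVAudioFile (the buffer size and file settings are illustrative):

```swift
import AVFoundation

// Sketch: record everything the app plays by tapping the engine's main
// mixer. Only audio routed through this engine is captured; system sounds
// and other libraries' output are not.
func recordAppMix(engine: AVAudioEngine, to url: URL) throws -> AVAudioFile {
    let mixer = engine.mainMixerNode
    let format = mixer.outputFormat(forBus: 0)
    let file = try AVAudioFile(forWriting: url, settings: format.settings)

    mixer.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        // Called off the main thread as buffers are rendered; keep it cheap.
        try? file.write(from: buffer)
    }
    return file
}

// Stop capturing later with:
//   engine.mainMixerNode.removeTap(onBus: 0)
```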

Can AVAudioRecorder be used to record audio coming from your iOS app?

I'd like to record the audio coming from my iPhone app. So after the background music and sound effects are mixed I'd like to sample the audio before it's played from the device's speakers (or headphones).
I've been experimenting with RemoteIO Audio Units. These seem promising. However they're pretty low level. Can AVAudioRecorder (or other "high-level" object) be used to capture audio coming from an iOS device?
As far as I can tell, there's no way to do this with AVAudioRecorder. You must use Audio Units.

Record audio iOS

Is there a way to record device audio on the iPhone?
Record samples being played with OpenAL
Offline audio recording on iOS with OpenAL
