AudioKit and AUv3 Instrument

I currently have an audio app, built with AudioKit, that plays samples with AKSampler, and I'm now trying to make an AUv3 Audio Unit extension for it.
I have successfully created my AUv3 extension (thanks to the AudioKit AUv3 tutorial), but now I can't figure out how to connect my AudioKit AudioEngine to the AUv3 AudioUnit.
I have searched a lot, but I couldn't find anything about AudioKit and AUv3 with audio (the AudioKit AUv3 tutorial only covers MIDI).
Does anyone here know how to connect AudioKit to an AUv3 AudioUnit extension?
Thanks!
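For reference, one way this can be wired up is sketched below. It assumes AudioKit 5, where AudioEngine wraps an AVAudioEngine exposed as avEngine, and an AUAudioUnit subclass inside the extension. The idea is to put the AudioKit graph into manual rendering mode and pull its output from the AU's internalRenderBlock. The class and node names (SamplerAudioUnit, AppleSampler) are placeholders, not the extension template's actual names, so treat this as a starting point rather than a drop-in solution.

```swift
import AVFoundation
import AudioToolbox
import AudioKit

// Hypothetical AUv3 audio unit that hosts an AudioKit graph in manual
// rendering mode and feeds the host from internalRenderBlock.
class SamplerAudioUnit: AUAudioUnit {

    private let engine = AudioEngine()        // AudioKit 5 engine (wraps AVAudioEngine)
    private let sampler = AppleSampler()      // assumption: the app's sampler as the source node
    private var outputBus: AUAudioUnitBus!
    private var outputBusArray: AUAudioUnitBusArray!
    private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)

        engine.output = sampler

        outputBus = try AUAudioUnitBus(format: format)
        outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [outputBus])
    }

    override var outputBusses: AUAudioUnitBusArray { outputBusArray }

    override func allocateRenderResources() throws {
        try super.allocateRenderResources()
        // Detach the internal engine from the hardware so the host's render
        // thread can pull audio from it instead.
        try engine.avEngine.enableManualRenderingMode(.realtime,
                                                      format: format,
                                                      maximumFrameCount: maximumFramesToRender)
        try engine.avEngine.start()
    }

    override func deallocateRenderResources() {
        engine.avEngine.stop()
        engine.avEngine.disableManualRenderingMode()
        super.deallocateRenderResources()
    }

    override var internalRenderBlock: AUInternalRenderBlock {
        // Capture the manual rendering block once; avoid touching self on the render thread.
        let render = engine.avEngine.manualRenderingBlock
        return { _, _, frameCount, _, outputData, _, _ in
            var err: OSStatus = noErr
            let status = render(frameCount, outputData, &err)
            return status == .success ? noErr : err
        }
    }
}
```

In .realtime manual rendering mode the host's render thread drives the AudioKit graph directly, so the engine should not be reconfigured from other threads while rendering.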

Related

Can AudioKit and Core Audio interwork in the same iOS application?

I have a pre-existing iOS app written using the Core Audio framework. There is now a need to use the microphone and process its input, which my existing Core Audio code does not support, so I am trying to use AudioKit just for the microphone.
The program crashes with a simple statement like:
mic = AKMicrophone();
Is this interworking supported?
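For context, a typical minimal AudioKit 4.x microphone setup looks roughly like the sketch below (API names vary between AudioKit versions, so treat this as an assumption-laden outline rather than the exact fix). The crash often has less to do with AKMicrophone itself than with the shared AVAudioSession: the session must allow input (and the app needs an NSMicrophoneUsageDescription entry in Info.plist), and AudioKit's own engine will compete with an already-running Core Audio IO unit for that session.

```swift
import AudioKit

// Sketch of a minimal AudioKit 4.x microphone chain (names assume AudioKit 4;
// requires an NSMicrophoneUsageDescription entry in Info.plist).
AKSettings.audioInputEnabled = true      // configure the session for recording before creating the mic

let mic = AKMicrophone()
let monitor = AKBooster(mic, gain: 0)    // keep the mic in the graph without monitoring it out loud
AudioKit.output = monitor

do {
    try AudioKit.start()                 // conflicts with existing Core Audio code tend to surface here
} catch {
    print("AudioKit failed to start: \(error)")
}
```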

How do you modify an iOS Swift 3 app to use Inter-App Audio (IAA)?

I am writing a Swift 3 audio app that uses AVFoundation, AVPlayer and AVAudioSession. I've looked at the Apple documentation for Inter-App Audio (IAA), but all the sample code is in Objective-C. I'd like something similar for Swift 3, as some aspects seem too difficult to convert. Perhaps somebody has already achieved this in Swift 3. Thanks.
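The IAA calls are plain C functions from AudioToolbox and can be called from Swift directly. Below is a rough sketch of publishing an app's output unit as an IAA node, assuming an AVAudioEngine-based app; the four-character codes and display name are placeholders and must match the AudioComponents entry declared in the app's Info.plist. (Note that IAA has since been deprecated in favour of AUv3.)

```swift
import AVFoundation
import AudioToolbox

// Sketch: publish the app's remote IO unit for Inter-App Audio.
// The type/subtype/manufacturer codes below are hypothetical and must match
// the AudioComponents entry declared in the app's Info.plist.
func publishForIAA(using engine: AVAudioEngine) {
    var description = AudioComponentDescription(
        componentType: kAudioUnitType_RemoteGenerator,
        componentSubType: 0x69616173,         // 'iaas' – placeholder four-char code
        componentManufacturer: 0x64656d6f,    // 'demo' – placeholder manufacturer code
        componentFlags: 0,
        componentFlagsMask: 0)

    // AVAudioEngine exposes the underlying remote IO audio unit on its output node.
    guard let outputUnit = engine.outputNode.audioUnit else { return }

    let status = AudioOutputUnitPublish(&description,
                                        "My IAA Node" as CFString,
                                        1,
                                        outputUnit)
    if status != noErr {
        print("AudioOutputUnitPublish failed with status \(status)")
    }
}
```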

AUGraph/AudioUnit support in Rdio iOS SDK

I wanted to develop an iOS music player app using the Rdio streaming service. I have gone through the Rdio SDK documentation and was able to understand the audio playback APIs. However, along with playback, I would also like to add some of the default Apple-provided audio effects, such as equalizers, to the audio stream, and I could not find any way to do so.
Does the Rdio iOS SDK allow us to add audio effects to its pipeline, or is there another alternative?
The Rdio iOS SDK doesn't let you integrate with AUGraph/AudioUnit. The SDK is designed to work like a black box: you specify what to play and music comes out of the speakers. This is done to prevent developers from accessing and manipulating the audio.

How to record sound in cocos2d-x v3 for iOS?

I'm developing a game with cocos2d-x v3.x and I want to record the player's voice using the iPad's mic, but I cannot import AVFoundation.h; it leads to many errors because of the Objective-C and C++ language mix.
How can I integrate AVFoundation with cocos2d-x, or is there any sound engine that supports audio recording and playback for cocos2d-x?

AudioUnit Echo Cancellation Built-in Feature for iOS

Currently I am developing an iOS app that captures sound using the iPad mic. At the same time, a sound is being played through the iPad speakers. Since the objective is to process the isolated input sound, the speaker feedback should be removed (cancelled). I looked for documentation on the AudioToolbox possibilities and found out that an Audio Unit can provide built-in features such as echo cancellation. However, I did not manage to configure my audio graph correctly in order to achieve this.
I would appreciate any suggestions, sample code, references to tutorials or recommended bibliography.
Thank you very much,
Carles
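The built-in echo cancellation lives in the Voice Processing IO unit (kAudioUnitSubType_VoiceProcessingIO), which takes the place of the ordinary Remote IO unit at the head of the graph and cancels the speaker signal from the mic input. Below is a bare-bones sketch of instantiating it; the property setup is reduced to enabling the input bus, so treat it as a starting point rather than a complete graph. On newer systems, AVAudioEngine offers the same processing via setVoiceProcessingEnabled(_:) on its input node.

```swift
import AudioToolbox

// Sketch: create the Voice Processing IO unit, which performs built-in
// acoustic echo cancellation between the mic input and the speaker output.
func makeVoiceProcessingUnit() -> AudioUnit? {
    var description = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_VoiceProcessingIO,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    guard let component = AudioComponentFindNext(nil, &description) else { return nil }

    var unit: AudioUnit?
    AudioComponentInstanceNew(component, &unit)

    // Enable input on bus 1 (microphone); output on bus 0 is enabled by default.
    var enable: UInt32 = 1
    AudioUnitSetProperty(unit!,
                         kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input,
                         1,
                         &enable,
                         UInt32(MemoryLayout<UInt32>.size))

    // In a real graph, set stream formats and render callbacks here, then call
    // AudioUnitInitialize(unit) and AudioOutputUnitStart(unit).
    return unit
}
```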
