Audio Plugin Implementation - iOS

Does anyone know of any sample code that illustrates how to implement an audio plugin for iOS?

There was an AUPlugin.h header in iPhone SDK 3 that seemed to permit you to create your own audio units, which you could then place in an AUGraph. But it apparently never worked, and all of its functions (AudioComponentRegister(), etc.) were deprecated in iOS 4.
Basically, to do your own audio processing, you set a render callback as a property on a single audio unit (or on a node somewhere in an AUGraph) and do your work in that callback function.
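For example, here is a minimal Swift sketch of that pattern: create a RemoteIO unit and install a render callback on it. The callback body is a placeholder that writes silence; real code would also configure the stream format and the audio session.

```swift
import Foundation
import AudioToolbox

// Render callback: invoked by the audio unit on the real-time audio thread.
// Do your processing here; this placeholder just outputs silence.
let renderCallback: AURenderCallback = { _, _, _, _, _, ioData in
    guard let abl = ioData else { return noErr }
    for buffer in UnsafeMutableAudioBufferListPointer(abl) {
        memset(buffer.mData, 0, Int(buffer.mDataByteSize))
    }
    return noErr
}

// Find and instantiate the RemoteIO output unit.
var desc = AudioComponentDescription(componentType: kAudioUnitType_Output,
                                     componentSubType: kAudioUnitSubType_RemoteIO,
                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                     componentFlags: 0,
                                     componentFlagsMask: 0)
var ioUnit: AudioUnit?
if let component = AudioComponentFindNext(nil, &desc) {
    AudioComponentInstanceNew(component, &ioUnit)
}

// Attach the callback as a property on the unit's input scope, then start rendering.
if let unit = ioUnit {
    var callbackStruct = AURenderCallbackStruct(inputProc: renderCallback,
                                                inputProcRefCon: nil)
    AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &callbackStruct,
                         UInt32(MemoryLayout<AURenderCallbackStruct>.size))
    AudioUnitInitialize(unit)
    AudioOutputUnitStart(unit)
}
```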

Related

Creating binding wrappers for SuperPoweredSDK in Xamarin?

I have been trying to create a binding for the Superpowered SDK in Xamarin.iOS.
I have cloned this repository (https://bitbucket.org/bryonbaker/xamarin-spectrum-analyser) and have added a wrapper for the advanced audio player in both the Xcode wrapper and the Xamarin wrapper.
The binding appears to work in my Xamarin.iOS project; however, calling the AdvancedAudioPlayer's Play() method produces no sound.
I have created a git repo where the code has been pushed. I would appreciate it if someone could look into it and let me know what I have missed.
https://github.com/Dhruvbhagat/SuperPoweredBinding.git
I don't see any audio I/O in the repo. The player is a "DSP object" that outputs floating-point audio data. If that data is not consumed by an audio I/O, nothing will actually happen.
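To make that concrete, here is a rough Swift sketch of the missing shape: an I/O stage that pulls the player's float output every render cycle. `DSPPlayer` and its `process` method are hypothetical stand-ins for the bound Superpowered player (the real binding's API differs), and AVAudioSourceNode requires iOS 13+; the Superpowered SDK ships its own iOS audio I/O class (SuperpoweredIOSAudioIO, if I recall correctly) that serves the same role.

```swift
import AVFoundation

// Hypothetical stand-in for the bound Superpowered player; the real
// AdvancedAudioPlayer binding exposes a different processing method.
final class DSPPlayer {
    func process(into buffer: UnsafeMutablePointer<Float>, frameCount: Int) {
        for i in 0..<frameCount { buffer[i] = 0 } // would be the player's output
    }
}

let player = DSPPlayer()
let engine = AVAudioEngine()

// The I/O stage: without something like this pulling the player's output,
// the DSP object produces data that is never heard.
let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    for buffer in UnsafeMutableAudioBufferListPointer(audioBufferList) {
        if let data = buffer.mData?.assumingMemoryBound(to: Float.self) {
            player.process(into: data, frameCount: Int(frameCount))
        }
    }
    return noErr
}

engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: nil)
try? engine.start()
```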

Using the AVPlayer in Swift (Xcode 6), how can I implement automatic gain control (AGC) while playing remote files?

I absolutely need to play remote files in an iOS app I'm developing, so using AVPlayer seems to be my only option. (I don't want to download/load the files as NSData and then use AVAudioPlayer, because that doesn't allow playback to start immediately.) These remote files sometimes vary greatly in output volume from one to another. How can I implement some form of automatic gain control for the AVPlayer? It seems like it may not even be possible.
Also: I've explored the MTAudioProcessingTap, but couldn't get it to work using information from the following blogs:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
and
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I'm open to any ideas that involve the AVPlayer. Can it be done? (Thanks in advance - cheers!)
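Not a full answer, but here is a rough Swift sketch of the MTAudioProcessingTap route for reference. It applies a flat gain in the process callback; a real AGC would measure the signal level there and adapt the gain over time. The URL is a placeholder, and note that taps are widely reported not to fire for some streamed assets (HLS in particular), which may be why it seems impossible.

```swift
import AVFoundation
import MediaToolbox

// Process callback: pull the source audio, then scale it in place.
// A real AGC would track the measured level here and adapt the gain.
let tapProcess: MTAudioProcessingTapProcessCallback = {
    tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
    guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                             flagsOut, nil, numberFramesOut) == noErr
    else { return }
    let gain: Float = 0.5 // fixed gain, for illustration only
    for buffer in UnsafeMutableAudioBufferListPointer(bufferListInOut) {
        guard let data = buffer.mData?.assumingMemoryBound(to: Float.self) else { continue }
        let sampleCount = Int(buffer.mDataByteSize) / MemoryLayout<Float>.size
        for i in 0..<sampleCount { data[i] *= gain }
    }
}

var callbacks = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: nil, finalize: nil, prepare: nil, unprepare: nil,
    process: tapProcess)

var tapRef: Unmanaged<MTAudioProcessingTap>?
MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                           kMTAudioProcessingTapCreationFlag_PostEffects, &tapRef)

// Attach the tap to the item's audio track via an audio mix.
// (Placeholder URL; production code should load the asset's tracks
// asynchronously before building the mix.)
let asset = AVURLAsset(url: URL(string: "https://example.com/audio.mp3")!)
let item = AVPlayerItem(asset: asset)
if let track = asset.tracks(withMediaType: .audio).first,
   let tap = tapRef?.takeRetainedValue() {
    let params = AVMutableAudioMixInputParameters(track: track)
    params.audioTapProcessor = tap
    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    item.audioMix = mix
}
let player = AVPlayer(playerItem: item)
player.play()
```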

What is the simplest way to play a MIDI note for an indefinite duration in iOS?

I want to play instrument 49 in iOS at varying pitches [B2-E5] for varying durations.
I have been using the Load Preset Demo as a reference. The vibraphone.aupreset does not reference any files, so I had presumed that:
I would be able to find and change the instrument in the aupreset file (unsuccessful so far)
there is some way to tell the MIDI interface to turn notes on and off without generating *.mid files.
Here's what I did:
Duplicated the audio-related code from the project,
removed the trombone-related files and code (calling loadPresetTwo: in place of loadPresetOne:, in init as opposed to viewDidLoad),
added a note sequence and a timer to turn off the previous note and turn on the next one.
Build. Run. I hear sound on the simulator.
There is NO sound coming from my iPhone.
I have triple-checked the code that I copied as well as where the calls take place. It's all there. The difference is that the trombone-related files and code are absent. Perhaps there is some dependency I'm not aware of. Perhaps the problem is rooted in architectural differences between the simulator running on a remote Mac VM and the iPhone. At this point I can only speculate, because I don't know enough about the problem to know what questions to ask.
Any thoughts or suggested tests would be great!
Thanks.
MusicPlayer + MusicSequence + MusicTrack works. It was much easier than trying to guess what the code in the demo was doing.
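For anyone finding this later, a minimal Swift sketch of that MusicPlayer + MusicSequence + MusicTrack approach (these are AudioToolbox C APIs; by default the sequence plays through a built-in synth, and MusicSequenceSetAUGraph can point it at your own sampler graph instead):

```swift
import AudioToolbox

var sequence: MusicSequence?
NewMusicSequence(&sequence)

var track: MusicTrack?
MusicSequenceNewTrack(sequence!, &track)

// One note: B2 is MIDI note 47. Duration is in beats, so a long value
// approximates "indefinite" without ever generating a .mid file.
var note = MIDINoteMessage(channel: 0,
                           note: 47,
                           velocity: 100,
                           releaseVelocity: 0,
                           duration: 8.0)
MusicTrackNewMIDINoteEvent(track!, 0.0, &note)

var player: MusicPlayer?
NewMusicPlayer(&player)
MusicPlayerSetSequence(player!, sequence)
MusicPlayerPreroll(player!)
MusicPlayerStart(player!)
```

For a truly indefinite note, sending raw note-on/note-off messages directly to a sampler unit with MusicDeviceMIDIEvent() also works, again with no *.mid files involved.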

JUCE iOS audio processing

Could you point me to any examples where the JUCE library has been used to process audio on iOS? Thanks in advance.
Regards,
Waruna.
Look at the JUCE demo included with the library; it runs just fine on iOS. Just edit that code and register an AudioIODeviceCallback with your AudioDeviceManager object to do custom audio processing.

Samples/file extensions supported by iOS sampler

I'm writing an iOS app which can play MIDI and output its content using the AUSampler and AUGraph APIs. I know it supports SoundFont (.sf2) files, but that format seems quite antiquated.
Question: Are there any other files or sample types which this framework supports?
Thanks.
The AUSampler also supports the DLS format (.dls) and the AUPreset format (.aupreset).
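For reference, loading a SoundFont or DLS bank into the AUSampler goes through the kAUSamplerProperty_LoadInstrument property; an .aupreset is restored via kAudioUnitProperty_ClassInfo instead. A sketch in Swift, assuming an already-created sampler unit and a hypothetical bundled file named MySounds.sf2:

```swift
import Foundation
import AudioToolbox

// Assumes `samplerUnit` is an already-created AUSampler AudioUnit
// (kAudioUnitType_MusicDevice / kAudioUnitSubType_Sampler) in a running
// AUGraph, and that MySounds.sf2 is a (hypothetical) file in the app bundle.
func loadSoundFont(into samplerUnit: AudioUnit) -> OSStatus {
    guard let url = Bundle.main.url(forResource: "MySounds", withExtension: "sf2") else {
        return kAudioFileUnspecifiedError
    }
    var instrument = AUSamplerInstrumentData(
        fileURL: Unmanaged.passUnretained(url as CFURL),
        instrumentType: UInt8(kInstrumentType_SF2Preset), // kInstrumentType_DLSPreset for .dls
        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
        bankLSB: UInt8(kAUSampler_DefaultBankLSB),
        presetID: 0) // program number within the bank
    return AudioUnitSetProperty(samplerUnit,
                                kAUSamplerProperty_LoadInstrument,
                                kAudioUnitScope_Global, 0,
                                &instrument,
                                UInt32(MemoryLayout<AUSamplerInstrumentData>.size))
}
```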
