Creating binding wrappers for the Superpowered SDK in Xamarin?

I have been trying to create a binding for the Superpowered SDK in Xamarin.iOS.
I cloned this repository (https://bitbucket.org/bryonbaker/xamarin-spectrum-analyser) and added a wrapper for the advanced audio player in both the Xcode wrapper and the Xamarin wrapper.
The binding appears to work in my Xamarin.iOS project; however, calling the AdvancedAudioPlayer's Play() method produces no sound.
I have pushed the code to a git repo. I would appreciate it if someone could look into it and let me know what I have missed:
https://github.com/Dhruvbhagat/SuperPoweredBinding.git

I don't see any audio I/O in the repo. The player is a DSP object: it outputs floating-point audio data. If that data is never consumed by an audio I/O, nothing will actually be heard.
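For illustration, here is a rough sketch of the kind of audio-processing callback an audio I/O (such as SuperpoweredIOSAudioIO) would invoke to pull the player's output. It is untested, the exact callback signature differs between Superpowered SDK versions (newer ones hand you a single interleaved buffer), and the scratch-buffer size is an assumption, so check everything against your headers:

```cpp
#include "SuperpoweredAdvancedAudioPlayer.h"
#include "SuperpoweredSimple.h"

// Hypothetical audio I/O callback; clientdata is assumed to be the
// SuperpoweredAdvancedAudioPlayer instance registered with the I/O.
static bool audioProcessing(void *clientdata, float **buffers,
                            unsigned int inputChannels, unsigned int outputChannels,
                            unsigned int numberOfSamples, unsigned int samplerate,
                            uint64_t hostTime) {
    SuperpoweredAdvancedAudioPlayer *player =
        (SuperpoweredAdvancedAudioPlayer *)clientdata;
    player->setSamplerate(samplerate); // keep the player in sync with the I/O

    static float stereoBuffer[2048 * 2]; // interleaved scratch buffer (assumed max frames)
    bool hasAudio = player->process(stereoBuffer, false, numberOfSamples);
    if (hasAudio) {
        // Split the player's interleaved output into the I/O's per-channel buffers.
        SuperpoweredDeInterleave(stereoBuffer, buffers[0], buffers[1], numberOfSamples);
    }
    return hasAudio; // returning false tells the I/O to output silence
}
```

The Xamarin binding then needs to expose whatever Objective-C wrapper starts that I/O; binding Play() alone only changes the player's internal state.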

Related

How do I send messages from my Swift file to Unity (NOT from Unity to Swift)

I have been having the hardest time with tutorials and documentation on how to bridge Unity and Swift. Most people are looking for a way to have Unity send messages to the Swift file; I am looking for the exact opposite. Specifically, I would like to send screenshot-detection messages back to my Unity project.
So far, I have tried the NotificationCenter.addObserver methods.
Combined with the built-in UnitySendMessage function (in the Xcode main.mm file), I expected this to update my Unity project with no problem.
Has anybody else achieved this?
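The native-to-Unity direction is usually done exactly this way: call UnitySendMessage from the iOS side. A minimal sketch of the bridge, assuming a GameObject named ScreenshotListener exists in the scene (all names here are illustrative):

```cpp
// In a .mm file in the Unity-generated Xcode project. UnitySendMessage is a
// plain C function that the generated project already exports.
extern "C" void UnitySendMessage(const char *obj, const char *method, const char *msg);

// Hypothetical helper: call this from the Swift screenshot observer (e.g. the
// handler for UIApplicationUserDidTakeScreenshotNotification) via a bridging header.
extern "C" void NotifyUnityScreenshot() {
    // "ScreenshotListener" must be an active GameObject whose script defines
    // a public method OnScreenshot(string message).
    UnitySendMessage("ScreenshotListener", "OnScreenshot", "screenshot_detected");
}
```

Note that UnitySendMessage is asynchronous and only reaches active GameObjects with a matching one-string-argument method, which is a common reason the message appears to go nowhere.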

Using AVPlayer in Swift (Xcode 6), how can I implement automatic gain control (AGC) while playing remote files?

I absolutely need to play remote files in an iOS app I'm developing, so AVPlayer seems to be my only option. (I don't want to download the files as NSData and then use AVAudioPlayer, because that doesn't allow playback to start immediately.) These remote files sometimes vary greatly in output volume from one to another. How can I implement some form of automatic gain control for AVPlayer? It seems like it may not even be possible.
Also: I've explored MTAudioProcessingTap, but couldn't get it to work using information from the following blogs:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
and
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I'm open to any ideas that involve AVPlayer. Can it be done? (Thanks in advance - cheers!)
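MTAudioProcessingTap is indeed the standard route for touching AVPlayer's samples. As an untested sketch of where AGC-style code would live, here is a tap process callback that scales each buffer toward an assumed target RMS; the tap itself still has to be attached to the AVPlayerItem through an AVMutableAudioMix, as the linked posts describe:

```cpp
#include <MediaToolbox/MTAudioProcessingTap.h>
#include <math.h>

// Hypothetical tap process callback: pulls the source audio, then applies a
// crude per-buffer gain toward targetRMS. Assumes a Float32 stream format.
static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags,
                       AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut,
                       MTAudioProcessingTapFlags *flagsOut) {
    OSStatus status = MTAudioProcessingTapGetSourceAudio(
        tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    if (status != noErr) return;

    const float targetRMS = 0.2f; // assumed target level
    for (UInt32 b = 0; b < bufferListInOut->mNumberBuffers; b++) {
        float *samples = (float *)bufferListInOut->mBuffers[b].mData;
        UInt32 count = bufferListInOut->mBuffers[b].mDataByteSize / sizeof(float);
        if (count == 0) continue;
        float sum = 0.0f;
        for (UInt32 i = 0; i < count; i++) sum += samples[i] * samples[i];
        float rms = sqrtf(sum / (float)count);
        if (rms < 1.0e-6f) continue; // skip near-silence
        float gain = targetRMS / rms;
        if (gain > 4.0f) gain = 4.0f; // clamp the boost
        for (UInt32 i = 0; i < count; i++) samples[i] *= gain;
    }
}
```

A production AGC would smooth the gain across buffers instead of recomputing it per buffer, but the plumbing is the same.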

What is the simplest way to play a MIDI note for an indefinite duration in iOS?

I want to play instrument 49 in iOS at varying pitches (B2-E5) for varying durations.
I have been using the Load Preset Demo as a reference. The vibraphone.aupreset does not reference any files, so I had presumed that:
I would be able to find and change the instrument in the aupreset file (unsuccessful so far);
there is some way to tell the MIDI interface to turn notes on and off without generating *.mid files.
Here's what I did:
Duplicated the audio-related code from the project,
removed the trombone-related files and code (calling loadPresetTwo: in place of loadPresetOne: in init, as opposed to viewDidLoad),
added a note sequence and a timer to turn off the previous note and turn on the next one.
Build. Run. I hear sound on the simulator.
There is NO sound coming from my iPhone.
I have triple-checked the code that I copied as well as where the calls take place. It's all there; the difference is that the trombone-related files and code are absent. Perhaps there is some dependency I'm not aware of. Perhaps the problem is rooted in architectural differences between the simulator (running on a remote Mac VM) and the iPhone. I can only speculate, because I don't know enough about the problem to understand what questions to ask.
Any thoughts or suggested tests would be great!
Thanks.
Update: MusicPlayer + MusicSequence + MusicTrack works. It was much easier than trying to guess what the code in the demo was doing.
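For anyone landing here, a minimal sketch of that MusicPlayer/MusicSequence/MusicTrack route (untested; the note number, channel, and instrument mapping are assumptions, with GM instrument 49, "String Ensemble 1", being program 48 zero-based):

```cpp
#include <AudioToolbox/AudioToolbox.h>

// Sketch: schedule one note on a MusicTrack and play it with a MusicPlayer.
// Error handling omitted for brevity.
void playOneNote(void) {
    MusicSequence sequence;
    NewMusicSequence(&sequence);

    // Optionally route events into an existing sampler graph (e.g. the one the
    // Load Preset Demo builds): MusicSequenceSetAUGraph(sequence, graph);
    // what the default graph sounds like, if anything, varies by OS version.

    MusicTrack track;
    MusicSequenceNewTrack(sequence, &track);

    // Program change on channel 0: program 48 = GM instrument 49.
    MIDIChannelMessage program = { 0xC0, 48, 0, 0 };
    MusicTrackNewMIDIChannelEvent(track, 0.0, &program);

    // Note-on for B2 (MIDI note 47); the duration field (in beats) produces
    // the matching note-off, so no *.mid file is ever generated.
    MIDINoteMessage note = { 0, 47, 100, 0, 4.0f };
    MusicTrackNewMIDINoteEvent(track, 0.0, &note);

    MusicPlayer player;
    NewMusicPlayer(&player);
    MusicPlayerSetSequence(player, sequence);
    MusicPlayerPreroll(player);
    MusicPlayerStart(player);
}
```

For a truly indefinite duration, you can instead send raw note-on/note-off bytes straight to the sampler unit, e.g. MusicDeviceMIDIEvent(samplerUnit, 0x90, 47, 100, 0) to start the note and the same call with status 0x80 whenever you want it to stop.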

Creating XNA AudioEngine on windows game project

I'm reading the book "Learning XNA 4.0", and in chapter 6 it teaches how to play sounds using XACT audio files. It asks me to create an AudioEngine object, but I can't find that class.
I have the right using statement (Microsoft.Xna.Framework.Audio) and the right reference (Microsoft.Xna.Framework.dll).
Does anybody have any idea what's wrong?
This is an error in the documentation. AudioEngine is actually in the Microsoft.Xna.Framework.Xact assembly. Add a reference to that, and everything should work as expected.
Note that XACT is not available on Windows Phone 7 (this is why it is in a separate assembly). If you plan to go mobile later, use SoundEffect instead.

Audio Plugin Implementation

Does anyone know of sample code that illustrates how to implement an audio plugin for iOS?
There was an AUPlugin.h header in iPhone SDK 3 that seemed to permit creating your own audio units, which you could then put in an AUGraph. But it apparently never worked, and all of its functions (AudioComponentRegister(), etc.) were deprecated in iOS 4.
Basically, to do your own audio processing you need to set up a render callback, either as a property of a single audio unit or somewhere in an AUGraph, and do your work in that callback function.
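As a concrete (untested) sketch of what that looks like, here is a render callback that synthesizes a 440 Hz sine; any custom DSP goes in the same place. The wiring described in the comment, the mono Float32 stream format, and the hard-coded sample rate are all assumptions:

```cpp
#include <AudioToolbox/AudioToolbox.h>
#include <math.h>

// Assumed wiring: registered on (for example) the RemoteIO unit's input scope via
// AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
//                      kAudioUnitScope_Input, 0, &cb, sizeof(cb));
static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData) {
    static double phase = 0.0; // fine for a single-instance sketch, not thread-safe
    const double phaseStep = 2.0 * M_PI * 440.0 / 44100.0;

    float *out = (float *)ioData->mBuffers[0].mData;
    for (UInt32 i = 0; i < inNumberFrames; i++) {
        out[i] = 0.25f * (float)sin(phase);
        phase += phaseStep;
        if (phase > 2.0 * M_PI) phase -= 2.0 * M_PI;
    }
    return noErr;
}
```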
