Using AVPlayer in Swift (Xcode 6), how can I implement automatic gain control (AGC) while playing remote files?

I absolutely need to play remote files in an iOS app I'm developing, so using AVPlayer seems to be my only option. (I don't want to download the files as NSData and then use AVAudioPlayer, because that doesn't allow playback to start immediately.) These remote files sometimes vary greatly in output volume from one to the next. How can I implement some form of automatic gain control for AVPlayer? It's starting to seem like it isn't even possible.
Also: I've explored MTAudioProcessingTap, but couldn't get it to work using the information from the following blogs:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
and
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I'm open to any ideas that involve the AVPlayer. Can it be done? (Thanks in advance - cheers!)
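For reference, here is a minimal sketch of the MTAudioProcessingTap approach those posts describe, written in present-day Swift (the Swift 1.x syntax of the Xcode 6 era differs). It applies a flat gain to every sample; a real AGC would measure levels inside the process callback and adjust the gain from them. The gain value, the synchronous track lookup, and the assumption of deinterleaved Float32 samples are all illustrative, not taken from the question:

```swift
import AVFoundation
import MediaToolbox

// Global so the C-style tap callback can reach it (C function pointers
// cannot capture Swift context). A real AGC would update this value
// from levels measured inside the process callback below.
var currentGain: Float = 1.0

var callbacks = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: nil,
    finalize: nil,
    prepare: nil,
    unprepare: nil,
    process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
        // Pull the source audio through the tap, then scale every sample.
        guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                 flagsOut, nil, numberFramesOut) == noErr else { return }
        for buffer in UnsafeMutableAudioBufferListPointer(bufferListInOut) {
            guard let data = buffer.mData else { continue }
            // Assumes deinterleaved Float32 samples, the usual tap format.
            let samples = data.assumingMemoryBound(to: Float.self)
            for i in 0..<(Int(buffer.mDataByteSize) / MemoryLayout<Float>.size) {
                samples[i] *= currentGain
            }
        }
    })

func makePlayer(url: URL) -> AVPlayer? {
    let item = AVPlayerItem(url: url)
    // Note: for HTTP Live Streams the asset may expose no audio tracks,
    // which is one reason taps are hard to use with remote content.
    guard let track = item.asset.tracks(withMediaType: .audio).first else { return nil }

    var tap: Unmanaged<MTAudioProcessingTap>?
    guard MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                     kMTAudioProcessingTapCreationFlag_PostEffects,
                                     &tap) == noErr, let tap = tap else { return nil }

    // Attach the tap to the item's audio track via an audio mix.
    let params = AVMutableAudioMixInputParameters(track: track)
    params.audioTapProcessor = tap.takeRetainedValue()
    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    item.audioMix = mix
    return AVPlayer(playerItem: item)
}
```

Usage would be makePlayer(url:) with the remote file's URL, then play(). Whether the process callback actually fires for a given remote asset depends on how the item is delivered, which matches the trouble the question describes.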

Related

Can I access a VLC extension using python-vlc on a Raspberry Pi 4?

So I am running VLC on a Pi 4 and I have installed an extension to VLC that shows the elapsed seconds for the video. However, when I use python-vlc to launch VLC it does not enable the extension. Is there a way to do this using python-vlc?
I think you may be confusing VLC and LibVLC. By "extension", I believe you mean VLC app add-ons; I don't think you can easily get these running with standard LibVLC builds.
However, there is a way to achieve what you want using the LibVLC APIs. Look into the marquee API, a video filter that lets you display custom text at custom locations on the video.
Docs: https://www.videolan.org/developers/vlc/doc/doxygen/html/group__libvlc__video.html#ga53b271a8a125a351142a32447e243a96

Creating binding wrappers for the Superpowered SDK in Xamarin?

I have been trying to create a binding for the Superpowered SDK in Xamarin.iOS.
I have cloned this repository (https://bitbucket.org/bryonbaker/xamarin-spectrum-analyser) and added a wrapper for the advanced audio player to both the Xcode wrapper and the Xamarin wrapper.
The binding appears to work in my Xamarin.iOS project; however, calling the AdvancedAudioPlayer's Play() method doesn't produce any sound.
I have pushed the code to a git repo. I would appreciate it if someone could look into it and let me know what I've missed.
https://github.com/Dhruvbhagat/SuperPoweredBinding.git
I don't see any audio I/O in the repo. The player is a "DSP object" that outputs floating-point audio data; if that data is never consumed by an audio I/O, nothing will actually be heard.

What's wrong with this aupreset for AUSampler?

I created this aupreset to be loaded into an AUSampler in iOS. I followed the process outlined here and used the EPSSampler class from the same post. If I run my app in the iOS Simulator, on iOS 9, the aupreset loads and I can play notes. If I run the same app on a device running iOS 6, the preset loads but I get no sound. I have used the same process on the simulator and on a device in the past, but always by filtering the built-in sine wave generator, never with audio samples. Can someone spot what I'm doing wrong?
EDIT I have no way of testing the app on any device running iOS above 6, at least for now.
EDIT 2 To clarify further, this is how my project looks in Xcode, so you know that my files are going to the right places – i.e. the audio files are going to the Sounds folder within the app bundle (I double checked, just to be sure).
EDIT 3 So, I took the Trombone.aupreset from LoadPresetDemo and manually plugged my audio files into it. Magically, it worked. So I figured I'd load it into the AUSampler's GUI through AU Lab and make whatever changes I needed to make it sound right – i.e., increasing the release time. It stopped working. So I manually tweaked the working copy to roughly match what I needed (the docs on aupreset plists are surprisingly unhelpful) and I'm rolling with it. It would seem that AUSampler is messing up the preset, at least for iOS 6 on device, which is currently the only device I have to test on. Insights?
I didn't really look that deeply, but the file://localhost//Library/Audio/Sounds/C.caf in your file references looks a little fishy. Mine looks like this: /Users/dave/Library/Audio/Sounds/C6.wav. Maybe the file:// part is throwing it off.
Create a directory in your iOS project named Sounds (just like your image shows) and put your caf files in there. The URLs will be grokked by iOS.
Here is a post that goes into detail.
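For context, the loading side of this (the pattern Apple's LoadPresetDemo and the EPSSampler class follow) hands the preset plist to the sampler through the kAudioUnitProperty_ClassInfo property. A rough Swift sketch, where samplerUnit is assumed to be an already-initialized AUSampler and the preset name is illustrative:

```swift
import AudioToolbox
import Foundation

// Minimal sketch: load an .aupreset from the bundle into an AUSampler.
// "samplerUnit" and the preset name are assumptions for illustration.
func loadAUPreset(named name: String, into samplerUnit: AudioUnit) throws {
    guard let url = Bundle.main.url(forResource: name, withExtension: "aupreset") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let data = try Data(contentsOf: url)
    // The preset is a property list; the sampler takes it as its ClassInfo.
    var plist = try PropertyListSerialization.propertyList(from: data, options: [],
                                                           format: nil) as CFPropertyList
    let status = AudioUnitSetProperty(samplerUnit,
                                      kAudioUnitProperty_ClassInfo,
                                      kAudioUnitScope_Global,
                                      0,
                                      &plist,
                                      UInt32(MemoryLayout<CFPropertyList>.size))
    guard status == noErr else {
        throw NSError(domain: NSOSStatusErrorDomain, code: Int(status))
    }
}
```

If this call succeeds but the samples stay silent, the file-references paths inside the plist (as discussed above) are the usual suspect.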

How do I play a stream of an online M4A/AAC audio file in iOS?

Are there any libraries that exist that take in a URL as input, and starts playing the audio of the file?
I've tried AudioStreamer, but I ran into ARC problems. I've also tried a couple of other libraries, but they don't seem to allow AAC files (only MP3).
<AVFoundation/AVAudioPlayer.h> and AVPlayer can help you here.
AVPlayer is the best solution for playing files in a queue.
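A minimal sketch of that suggestion, assuming a placeholder URL; AVPlayer handles remote M4A/AAC natively, and AVQueuePlayer covers the queue case:

```swift
import AVFoundation

// Stream a remote M4A/AAC file directly; playback starts as data arrives.
// The URL is a placeholder, not from the question.
let url = URL(string: "https://example.com/audio.m4a")!
let player = AVPlayer(url: url)
player.play()

// For back-to-back playback of several files, use AVQueuePlayer.
let queue = AVQueuePlayer(items: [AVPlayerItem(url: url)])
queue.play()
```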

What is the simplest way to play a MIDI note for an indefinite duration in iOS?

I want to play instrument 49 in iOS at varying pitches [B2-E5] and for varying durations.
I have been using the Load Preset Demo as reference. The vibraphone.aupreset does not reference any files. So, I had presumed that:
I would be able to find and change the instrument in the aupreset file (unsuccessful so far)
there is some way to tell the MIDI interface to turn notes on and off without generating *.mid files.
Here's what I did:
Duplicated the audio-related code from the project,
removed the trombone-related files and code (calling loadPresetTwo: in place of loadPresetOne:, and in init rather than viewDidLoad),
added a note sequence, and a timer to turn off the previous note and turn on the next one.
Build. Run. I hear sound on the simulator.
There is NO sound coming from my iPhone.
I have triple-checked the code that I copied as well as where the calls take place. It's all there. The difference is that the trombone-related files and code are absent. Perhaps there is some dependency I'm not aware of. Perhaps this is a problem rooted in architectural differences between the simulator running on a remote Mac VM and the iPhone. Perhaps I can only speculate because I don't know enough about the problem to understand what questions to ask.
Any thoughts or suggested tests would be great!
Thanks.
MusicPlayer + MusicSequence + MusicTrack works (sketched below). It was much easier than trying to guess what the code in the demo was doing.
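A minimal sketch of that answer's approach. The MusicSequence route schedules notes with fixed durations; for truly indefinite durations you can instead send note-on/note-off events straight to the sampler unit whenever you choose. samplerUnit below is an assumption, standing in for the AUSampler from a LoadPresetDemo-style AUGraph setup:

```swift
import AudioToolbox

// Scheduled route: MusicPlayer + MusicSequence + MusicTrack.
var sequence: MusicSequence?
NewMusicSequence(&sequence)
var track: MusicTrack?
MusicSequenceNewTrack(sequence!, &track)
// To route through your own sampler: MusicSequenceSetAUGraph(sequence!, graph)

// One note: channel 0, middle C (60), velocity 64, two beats long.
var note = MIDINoteMessage(channel: 0, note: 60, velocity: 64,
                           releaseVelocity: 0, duration: 2.0)
MusicTrackNewMIDINoteEvent(track!, 0.0, &note)

var player: MusicPlayer?
NewMusicPlayer(&player)
MusicPlayerSetSequence(player!, sequence)
MusicPlayerStart(player!)

// Indefinite route: drive the sampler directly, stopping whenever you like.
// 0x90/0x80 are MIDI note-on/note-off status bytes for channel 0.
func startNote(_ samplerUnit: MusicDeviceComponent, pitch: UInt32) {
    MusicDeviceMIDIEvent(samplerUnit, 0x90, pitch, 64, 0) // note on
}
func stopNote(_ samplerUnit: MusicDeviceComponent, pitch: UInt32) {
    MusicDeviceMIDIEvent(samplerUnit, 0x80, pitch, 0, 0)  // note off
}
```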
