AUGraph and streaming audio (HTTP etc.) - iOS

I have a player based on AUGraph (I need the equalizer). The player can play local audio files. Now I want to add support for streaming audio. What can I use for it? Is there a standard class like AVAssetReader (AVAssetReader can't play a stream :( ), or does anyone know of an open-source library for it? Thanks.

As streaming is now supported natively in AVPlayer, there aren't too many libs that are actively supported. One that has been around for some time (I've used it myself in the past) is Matt Gallagher's AudioStreamer, https://github.com/mattgallagher/AudioStreamer; another is Thong Nguyen's StreamingKit, https://github.com/tumtumtum/StreamingKit.
An alternative approach you might want to consider is an implementation similar to the one demonstrated in Apple's AudioTapProcessor sample: essentially inserting an MTAudioProcessingTap into AVPlayer's pipeline to introduce a kAudioUnitSubType_BandPassFilter AudioUnit. A similar approach could be used to introduce a kAudioUnitSubType_NBandEQ AudioUnit instead.
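Below is a minimal Swift sketch of that tap approach, assuming a non-HLS source; the callbacks here just pull the source PCM through unchanged, and the processing body is where you would drive an EQ unit (function names like makeTap/attachTap are illustrative):

    import AVFoundation
    import MediaToolbox

    func makeTap() -> MTAudioProcessingTap? {
        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: nil,
            init: { _, clientInfo, tapStorageOut in tapStorageOut.pointee = clientInfo },
            finalize: { _ in },
            prepare: { _, _, _ in },
            unprepare: { _ in },
            process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
                // Pull the source PCM through the tap; samples land in bufferListInOut.
                let status = MTAudioProcessingTapGetSourceAudio(
                    tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
                guard status == noErr else { return }
                // Inspect or process the PCM here (e.g. run it through an EQ unit).
            })

        var tap: Unmanaged<MTAudioProcessingTap>?
        let status = MTAudioProcessingTapCreate(
            kCFAllocatorDefault, &callbacks,
            kMTAudioProcessingTapCreationFlag_PreEffects, &tap)
        return status == noErr ? tap?.takeRetainedValue() : nil
    }

    // Attach the tap to the first audio track of a player item.
    func attachTap(to item: AVPlayerItem, of asset: AVURLAsset) {
        guard let audioTrack = asset.tracks(withMediaType: .audio).first else { return }
        let params = AVMutableAudioMixInputParameters(track: audioTrack)
        params.audioTapProcessor = makeTap()
        let mix = AVMutableAudioMix()
        mix.inputParameters = [params]
        item.audioMix = mix
    }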

Related

AudioKit and Bass MIDI player

We are considering using the AudioKit framework along with the BASSMIDI sampler provided by http://www.un4seen.com/ in our iOS app.
Here is what we aim to implement:
Play a MIDI File using the AudioKit sequencer
Send the MIDI events (read by the AudioKit sequencer) to the BASSMIDI sampler.
Redirect the BASSMIDI sampler's audio output to an AudioKit Mixer instance
Our main concern is that it doesn't seem possible to access the BASSMIDI sampler's audio output.
Has anyone had experience doing this?
Is it even possible?
Thanks in advance for your insights!
For those interested in using the BASSMIDI player on iOS: we finally ended up implementing an AUv3 Audio Unit wrapped around the BASSMIDI library.
The main advantage is that this audio unit can be inserted into a graph of audio nodes handled by AVAudioEngine, just like you would do with the AVAudioUnitSampler; see the sketch after the repository link below.
The code is available on a public repository:
https://github.com/newzik/BassMidiAudioUnit
Feel free to use it!
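As an illustration, here is a hedged sketch of inserting such an AUv3 into AVAudioEngine. The four-character subtype and manufacturer codes below are placeholders; the real values are defined by the BassMidiAudioUnit project:

    import AVFoundation
    import AudioToolbox

    // Helper: pack a 4-character code into a FourCharCode.
    func fourCC(_ s: String) -> FourCharCode {
        s.utf8.reduce(0) { ($0 << 8) | FourCharCode($1) }
    }

    // Placeholder component description — substitute the codes the
    // BassMidiAudioUnit project actually registers.
    let description = AudioComponentDescription(
        componentType: kAudioUnitType_MusicDevice,
        componentSubType: fourCC("bsmd"),       // placeholder
        componentManufacturer: fourCC("Demo"),  // placeholder
        componentFlags: 0,
        componentFlagsMask: 0)

    let engine = AVAudioEngine()

    AVAudioUnit.instantiate(with: description, options: []) { unit, _ in
        guard let sampler = unit else { return }
        engine.attach(sampler)
        // Route the unit's output into the engine's mixer, just as you
        // would with an AVAudioUnitSampler.
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        try? engine.start()
    }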

iOS: How to increase the bass and treble of an AVAudioPlayer in Swift?

Does anyone know how to increase the bass and treble of a track?
Within the same track, if I split it into 3 sections, can I adjust and have, say, 3 different levels of reverb, i.e. one in each section?
Thanks
I don't think it is possible to use EQ effects with an AVAudioPlayer.
A quick search gave me answers like this from StackOverflow:
can I use AVAudioPlayer to make an equalizer player?
Or this sadly unanswered question from Apple Developer Forums:
https://forums.developer.apple.com/thread/46998
Instead
What you can do instead is use AVAudioEngine (https://developer.apple.com/reference/avfoundation/avaudioengine), which gives you the opportunity to add an EQ node (or other effect nodes) to your AVAudioPlayerNode.
AVAudioEngine may seem daunting at first, but think of it as a mixer. You have some input nodes that generate sound (AVAudioPlayerNode instances, for example), and you can then attach and connect those nodes to your AVAudioEngine. The AVAudioEngine has an AVAudioMixerNode, so you can control things like volume and so forth.
Between your input nodes and your mixer you can attach effect nodes, like an EQ node for instance, and you can add a "tap" and record the final output to a file if so desired.
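Here is a minimal sketch of that chain, using AVAudioUnitEQ with a low-shelf band for bass and a high-shelf band for treble; the frequencies, gains, and file name are illustrative:

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()

    // Two-band EQ: a low shelf for bass and a high shelf for treble.
    let eq = AVAudioUnitEQ(numberOfBands: 2)

    let bass = eq.bands[0]
    bass.filterType = .lowShelf
    bass.frequency = 100      // Hz — assumed shelf corner
    bass.gain = 6             // dB boost
    bass.bypass = false

    let treble = eq.bands[1]
    treble.filterType = .highShelf
    treble.frequency = 8000   // Hz — assumed shelf corner
    treble.gain = 4           // dB boost
    treble.bypass = false

    // Wire player -> EQ -> mixer.
    engine.attach(player)
    engine.attach(eq)
    engine.connect(player, to: eq, format: nil)
    engine.connect(eq, to: engine.mainMixerNode, format: nil)

    // "track.m4a" is a placeholder bundle resource.
    if let url = Bundle.main.url(forResource: "track", withExtension: "m4a"),
       let file = try? AVAudioFile(forReading: url) {
        player.scheduleFile(file, at: nil)
        try? engine.start()
        player.play()
    }

For the per-section reverb question, one option would be to schedule the three sections as separate files or buffers and change an attached AVAudioUnitReverb's wetDryMix between them.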
Reading material
This SlideShare introduction helped me a great deal in understanding what AVAudioEngine is (the code is Objective-C, but it should be understandable).
The "AVAudioEngine in Practice" session from WWDC 2014 is a great introduction too.
So I hope you are not frightened by the above; as said, it may seem daunting at first, but once you get it wired together it works fine, and you have the option to add effects other than EQ (pitch shifting, slowing down a file, and so on).
Hope that helps you.
Unfortunately, AVAudioEngine doesn't allow you to stream remote URLs. The only way around that is to download the track, convert it from MP4 or M4A to LPCM format using the audio services APIs, and then schedule a buffer to run through the audio engine. AVPlayer, on the other hand, allows you to stream remote media, but it's extremely hard to attach an EQ to it... you may be able to look into MTAudioProcessingTap, but that only works with local files and non-HLS streams.
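A minimal sketch of that download-then-schedule path, assuming the track has already been saved to a local URL. AVAudioFile decodes compressed audio to LPCM on read, so no separate conversion pass is shown here:

    import AVFoundation

    // "localURL" is assumed to point at an already-downloaded .m4a/.mp4 file.
    func scheduleDownloadedTrack(at localURL: URL,
                                 on player: AVAudioPlayerNode) throws {
        let file = try AVAudioFile(forReading: localURL)
        guard let buffer = AVAudioPCMBuffer(
            pcmFormat: file.processingFormat,               // LPCM
            frameCapacity: AVAudioFrameCount(file.length)) else { return }
        try file.read(into: buffer)
        // The buffer now holds the decoded track; hand it to the engine.
        player.scheduleBuffer(buffer, completionHandler: nil)
    }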
There is a good write-up on doing this through AVAudioEngine here.

Capturing PCM data from AVPlayer playback of HLS

We are trying to capture the PCM data from an HLS stream for processing, ideally just before it is played, though just after is acceptable. We want to do all this while still using AVPlayer.
Has anyone done this? For non-HLS streams, as well as local files, this seems to be possible with the MPAudioProcessingTap, but not with HLS. This issue discusses doing it with non-HLS:
AVFoundation audio processing using AVPlayer's MTAudioProcessingTap with remote URLs
Thanks!
Unfortunately, this has been confirmed to be unsupported, at least for the time being.
From an Apple engineer:
The MTAudioProcessingTap is not available with HTTP live streaming. I suggest filing an enhancement if this feature is important to you - and it's usually helpful to describe the type of app you're trying to design and how this feature would be used.
Source: https://forums.developer.apple.com/thread/45966
Our best bet is to file enhancement radars to try to get them to devote some development time towards it. I am in the same unfortunate boat as you.

iOS process audio stream while playing video

I am trying to create a video player for iOS, but with some additional audio track reading. I have been checking out MPMoviePlayerController and also AVPlayer in AV Foundation, but it's all kind of vague.
What I am trying to do is play a video (from a local .mp4), and while the movie is playing, get the current audio buffer/frames so I can do some calculations and other (not video/audio related) actions that depend on the currently played audio. This means that the video should keep playing, with its audio tracks, but I also want the live raw audio data for calculations (e.g. getting the amplitude at certain frequencies).
Does anyone have an example or hints on how to do this? Of course I checked out Apple's AV Foundation documentation, but it was not clear enough for me.
After a really (really) long time of Googling, I found a blog post that describes MTAudioProcessingTap. Introduced in iOS 6.0, it solves my problem perfectly.
The how-to/blog post can be found here: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I hope it helps someone else now... The only thing popping up for me when Googling (with a lot of different terms) was my own post here. And as long as you don't know MTAudioProcessingTap exists, you don't know how to Google for it :-)

How can I generate musical notes on iOS and play them?

I am creating a musical app which generates some music. I have already used MIDI functions on the Mac to create a MIDI file with MIDI events (unfortunately, I don't remember the names of those functions).
I am looking for a way to create instrumental notes (MIDI or anything else) programmatically in order to play them. I would also like to have multiple channels playing those notes at the same time.
I have already tried 'SoundBankPlayer', but apparently it can't play multiple instruments at the same time.
Have you got an idea?
This answer might be a bit more work than you intended, but you can use Pure Data (Pd) on iOS to do this. More precisely, you can use libpd for iOS for the synthesis, and then use any number of community-donated patches for the sound you're looking for.
In iOS 5:
MusicSequence, MusicTrack, MusicPlayer will do what you want.
http://developer.apple.com/library/ios/#documentation/AudioToolbox/Reference/MusicSequence_Reference/Reference/reference.html#//apple_ref/doc/uid/TP40009331
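A minimal sketch of that approach; the note values and channels are illustrative. With no AUGraph attached, the sequence plays through the system's default synthesizer:

    import AudioToolbox

    var sequence: MusicSequence?
    NewMusicSequence(&sequence)

    var track: MusicTrack?
    MusicSequenceNewTrack(sequence!, &track)

    // Middle C on channel 0 at beat 0, lasting one beat.
    var note = MIDINoteMessage(channel: 0, note: 60, velocity: 100,
                               releaseVelocity: 0, duration: 1.0)
    MusicTrackNewMIDINoteEvent(track!, 0.0, &note)

    // A second note on another channel, starting at the same beat,
    // plays concurrently.
    var note2 = MIDINoteMessage(channel: 1, note: 64, velocity: 100,
                                releaseVelocity: 0, duration: 1.0)
    MusicTrackNewMIDINoteEvent(track!, 0.0, &note2)

    var player: MusicPlayer?
    NewMusicPlayer(&player)
    MusicPlayerSetSequence(player!, sequence)
    MusicPlayerPreroll(player!)
    MusicPlayerStart(player!)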
Check out the AUSampler AudioUnit for iOS; you'll probably have to delve into Core Audio, which has a bit of a learning curve. ;)
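For reference, a hedged sketch using AVAudioUnitSampler, the AVAudioEngine wrapper around AUSampler; the SoundFont file name is a placeholder:

    import AVFoundation
    import AudioToolbox

    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()
    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)

    // "GeneralUser.sf2" is a placeholder SoundFont bundled with the app.
    if let sf2 = Bundle.main.url(forResource: "GeneralUser", withExtension: "sf2") {
        try? sampler.loadSoundBankInstrument(
            at: sf2, program: 0,
            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
            bankLSB: UInt8(kAUSampler_DefaultBankLSB))
    }

    try? engine.start()
    sampler.startNote(60, withVelocity: 100, onChannel: 0)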
