iOS - Streaming Multitrack Audio

I'm building an app in which I need to stream multiple tracks of audio that make up a song. They all need to be synchronized so the song plays back naturally.
I've been able to play back local multitrack audio very well with the solution on this thread: multi track mp3 playback for iOS application, but it looks like AVAudioPlayer isn't able to stream.
I've been looking into DOUAudioStreamer because I've read that it's the best option for streaming audio on iOS without dropping down to a lower-level API, but it seems to lack an equivalent of the `-playAtTime:` method, which is how the tracks were synced up using `AVAudioPlayer`.
Does anyone know a workaround for this using DOUAudioStreamer, or have any advice on another way I should approach this? Thanks.
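For reference, here's the `-playAtTime:` technique from that thread as I'm using it for local playback: every player is scheduled against the shared device clock slightly in the future, so they all start on the same sample. A simplified Swift sketch of my working code; the file names are placeholders:

```swift
import AVFoundation

// Sync several local tracks by scheduling them on the shared hardware clock.
// "drums" and "bass" are hypothetical bundled files.
func playSynchronized() throws {
    let urls = ["drums", "bass"].map {
        Bundle.main.url(forResource: $0, withExtension: "mp3")!
    }
    let players = try urls.map { try AVAudioPlayer(contentsOf: $0) }

    players.forEach { $0.prepareToPlay() }  // preload buffers so starting is cheap

    // deviceCurrentTime is the same clock for all players; start half a
    // second in the future so every player has time to spin up.
    let startTime = players[0].deviceCurrentTime + 0.5
    players.forEach { $0.play(atTime: startTime) }
}
```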

Related

Swift - play sounds in parallel with Spotify

Is it possible to play a sound in Swift while another one is already playing? My client wants to make an app that emits sounds while Spotify is playing songs. Is that possible?
Yes. By default your app's audio interrupts other apps, but if you configure your AVAudioSession with the .mixWithOthers category option (or use the .ambient category, which mixes by default), your sounds will play on top of audio from another app such as Spotify.
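A minimal sketch of that session setup (assuming you just need your own short sounds mixed over Spotify's playback):

```swift
import AVFoundation

// Ask iOS to mix this app's audio with other apps' audio instead of
// interrupting it. .ambient would also work for non-essential UI sounds.
func configureMixingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
}
```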

Audio pitch shift using Objective-C - iOS

I'm making an app where I need to record sound through the microphone, change its pitch, save it to the Documents directory, and play it back.
I have accomplished the recording and playing parts successfully via AVAudioRecorder and AVAudioPlayer, but I'm unable to change the pitch of the recorded sound.
I've searched all over Stack Overflow and Google, but none of the methods work: DIRAC doesn't seem to be available anymore, SoundTouch integration is going over my head, and the other solutions are in Objective-C++, which I don't understand.
I also tried OpenAL, but it only lets me play the pitch-shifted sound, not save it.
Please tell me if it's possible using Objective-C, or point me to any other example/tutorial.
Thank you.
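One approach that stays entirely inside AVFoundation (no DIRAC or SoundTouch) is AVAudioUnitTimePitch combined with AVAudioEngine's offline manual rendering mode, which requires iOS 11+. A sketch, with hypothetical input/output file URLs:

```swift
import AVFoundation

// Pitch-shift a recorded file and save the result, rendering offline
// (to a buffer) instead of to the speaker.
func writePitchShifted(input inputURL: URL, output outputURL: URL, cents: Float) throws {
    let sourceFile = try AVAudioFile(forReading: inputURL)
    let format = sourceFile.processingFormat

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    pitch.pitch = cents  // +/- 2400 cents, i.e. up to two octaves either way

    engine.attach(player)
    engine.attach(pitch)
    engine.connect(player, to: pitch, format: format)
    engine.connect(pitch, to: engine.mainMixerNode, format: format)

    // Switch the engine from live output to offline rendering.
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: 4096)
    try engine.start()
    player.scheduleFile(sourceFile, at: nil, completionHandler: nil)
    player.play()

    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    let outputFile = try AVAudioFile(forWriting: outputURL,
                                     settings: sourceFile.fileFormat.settings)

    // Pitch-only shifting keeps the duration, so render as many frames
    // as the source file has.
    while engine.manualRenderingSampleTime < sourceFile.length {
        let framesLeft = sourceFile.length - engine.manualRenderingSampleTime
        let frames = min(AVAudioFrameCount(framesLeft), buffer.frameCapacity)
        if try engine.renderOffline(frames, to: buffer) == .success {
            try outputFile.write(from: buffer)
        }
    }
    player.stop()
    engine.stop()
}
```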

Record audio streamed from AVPlayer

I'm trying to record/capture audio that's being streamed via an AVPlayer. AVAudioRecorder only records from the microphone, which may work if the audio is played over the speaker (although quality will suffer), but it definitely won't work if headphones are plugged in.
I've looked everywhere but haven't found a solution that works for me. Would I need to grab the audio buffer? Is there another way to capture what's being played?
You can grab audio buffers by adding an MTAudioProcessingTap to your AVPlayer.
The process is a little convoluted, but there is some information out there.
The easiest approach nowadays is to play and record using AVAudioEngine.
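A sketch of that AVAudioEngine approach, assuming the asset is available as a local file you can play through the engine yourself (the URLs are hypothetical; a live network stream would still need the MTAudioProcessingTap route):

```swift
import AVFoundation

// Play a local file through AVAudioEngine and capture everything that
// reaches the mixer with a tap. captureURL should end in .caf so
// AVAudioFile can infer the container from the extension.
func playAndCapture(localURL: URL, captureURL: URL) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    let file = try AVAudioFile(forReading: localURL)
    engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

    // Every buffer that reaches the mixer is also written to disk.
    // (A production app would hand buffers off a queue rather than
    // write to disk inside the tap block.)
    let tapFormat = engine.mainMixerNode.outputFormat(forBus: 0)
    let outFile = try AVAudioFile(forWriting: captureURL, settings: tapFormat.settings)
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: tapFormat) { buffer, _ in
        try? outFile.write(from: buffer)
    }

    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
    return engine  // keep a strong reference; call removeTap(onBus: 0) when done
}
```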

Problems Recording and Playing Back Audio Simultaneously

I'm having some trouble working with the iOS audio frameworks to create a simple app. I would like to record audio through the microphone and play it back to the user while recording.
I have tried each of the audio framework layers (AVFoundation, Audio Queue API, and RemoteIO), but have only found old documentation and broken examples. It seems like a simple request that AVFoundation should handle, but I have explored the following other SO questions and still find myself circling for hours trying to get the hang of this. Here is what I have reviewed:
iOS: Sample code for simultaneous record and playback (other SO users also state the accepted answer is not concrete and is difficult to implement, even with a delay of ~70 ms)
Record and play audio Simultaneously (from 2010 and very high level; I downloaded the sample code and couldn't find a working example that does simultaneous playback and recording)
Adjust the length of an AudioUnit Buffer (RemoteIO is very confusing to me right now; is it really required?)
I have also downloaded and reviewed both the SpeakHere and AurioTouch sample projects from Apple. I promise I wouldn't post without hours of googling and struggling first. You can see that "record audio and playback iOS simultaneously" returns many dated and non-working examples. I know myself and the community could really benefit from some updated documentation and examples in the audio section. RemoteIO seems too advanced for such a simple task. Thanks again for your help and consideration.
The appropriate way to do this is via the Audio Unit APIs, even though it seems like a common scenario that should be handled by higher-level APIs.
I wrote a small demo app using Audio Units. You're free to try it out and modify it to suit your purpose. The demo app does record audio and play it back simultaneously, but it's recommended to use earphones to hear the effect without feedback.
The RemoteIO Audio Unit is the only way to play back what is being recorded with low latency. RemoteIO runs its audio callbacks on a dedicated real-time thread, which is why it is fast, but also why it is a bit more complex to code for. All the other iOS audio APIs are built on top of RemoteIO and thus add latency.
You will also need to configure the app's audio session to request low latency, using the appropriate session category and a short preferred I/O buffer duration. A foreground app can request, and usually get, audio input and output latencies as low as about 5.6 milliseconds on most iOS devices.
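A sketch of that session setup, together with the simplest possible monitoring graph in AVAudioEngine, whose input/output nodes wrap RemoteIO so you stay on the low-latency path without writing render callbacks by hand. Wear headphones to avoid feedback:

```swift
import AVFoundation

// Route the microphone straight to the output with a low-latency session.
// Requires microphone permission (NSMicrophoneUsageDescription in Info.plist).
func startMonitoring() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setPreferredIOBufferDuration(0.005)  // request ~5 ms I/O; hardware may grant more
    try session.setActive(true)

    let engine = AVAudioEngine()
    let format = engine.inputNode.outputFormat(forBus: 0)
    // mainMixerNode is implicitly connected to the output node.
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
    try engine.start()
    return engine  // keep a strong reference while monitoring
}
```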

iOS - Can I use MPMoviePlayerController to play an .mp3 URL?

My app needs to play some music files, like .mp3. I would like to use MPMoviePlayerController because it implements all the UI for me, i.e. I do not want to bother building a progress slider and things like that.
I tested it with a .mp3 file and it worked fine, but I do not know if this is an acceptable use, since its name says "movie player" and it seems to be meant for playing movies. Would Apple reject this? Thank you.
For playing audio from a file or memory, AVAudioPlayer is your best option, but unfortunately it doesn't support network streams, while MPMoviePlayerController does.
From the documentation:
An instance of the AVAudioPlayer class, called an audio player, provides playback of audio data from a file or memory. Apple recommends that you use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.
As for Apple's validation, I don't think your application can be rejected for using the Media Player framework to play an audio file. In fact, here they explicitly say that you can do just that:
Choose the right technology for your needs: To play the audio items in a user's iPod library, or to play local or streamed movies, use the Media Player framework. Classes in this framework automatically support sending audio and video to AirPlay devices such as Apple TV.
Not sure about performance and memory issues though!
Best of luck.
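For what it's worth, the era-appropriate usage looks roughly like this (MPMoviePlayerController was later deprecated in iOS 9 in favor of AVPlayerViewController, and the stream URL here is a placeholder):

```swift
import UIKit
import MediaPlayer

// Stream an .mp3 URL with MPMoviePlayerController and get its built-in
// transport UI (play/pause, progress slider) for free.
func playStream(from streamURL: URL, in containerView: UIView) -> MPMoviePlayerController {
    let player = MPMoviePlayerController(contentURL: streamURL)!
    player.movieSourceType = .streaming   // skip local-file sniffing for network streams
    player.view.frame = containerView.bounds
    containerView.addSubview(player.view) // shows the standard playback controls
    player.prepareToPlay()
    player.play()
    return player  // keep a strong reference or playback stops
}
```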
