Best strategy for managing audio instances to allow cross-fades, etc. - iOS

I'm working on an iOS application that works with different pieces of audio. Each piece of audio is tied to a separate button (similar to the kind of functionality you'd see in a soundboard app).
In a simple soundboard application, you've got an instance of a player object (AVAudioPlayer or an AVAudioEngine player node) that gets fired whenever a button is pressed.
If you push buttonOne, then sound1 plays.
If you push buttonOne again while sound1 is still playing, then the current instance is interrupted and "replaced" with a new instance of sound1 that starts over at the beginning.
If you push buttonOne, THEN push buttonTwo before sound1 is finished, the instance of sound1 is interrupted and replaced with sound2, again, played from the beginning.
Suppose you're trying to enable cross-fading between the two sounds. You can just create two player instances, load the first sound into player1 and the second into player2, and cross fade between them.
Building on that, suppose you're trying to allow different combinations of sounds to play at the same time: either a large number of sounds (maybe the whole soundboard) all playing without interrupting each other, or multiple sounds playing at once, e.g. sound1 is a music bed, and sound2...soundXX are sound effects that should be able to play over the music bed without interrupting it.
QUESTIONS: what is the best design strategy for managing your player instances in this situation? Suppose you have a 5 x 5 grid of buttons in your soundboard. If, hypothetically, you should be able to play all 25 sounds at the same time, would that require you to init 25 player instances at setup? (this seems very anti-DRY and not particularly efficient). Or is there some way to dynamically manage the number of instances you need (maybe with a lazy variable?) so that additional instances are only generated as needed, e.g. when you have x number of sounds playing and you start another, an additional instance is generated to contain the newly added sound?

what is the best design strategy for managing your player instances in this situation?
None. You should be using AVAudioEngine. That's exactly what it is: a sound board / patch kit.
Or is there some way to dynamically manage the number of instances you need
If I have 25 ordered instances of a thing where one possibility is no instance at all, that sounds like an array of Optionals.
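A minimal sketch of both suggestions combined, assuming one shared AVAudioEngine and a 25-slot array of optional AVAudioPlayerNodes filled in lazily (the class and method names are illustrative, not from any framework):

import AVFoundation

// Illustrative sketch: one engine, one mixer, and a lazily populated
// array of optional player nodes -- one slot per soundboard button.
final class SoundBoard {
    private let engine = AVAudioEngine()
    private var players = [AVAudioPlayerNode?](repeating: nil, count: 25)

    func play(_ file: AVAudioFile, slot: Int) throws {
        let player: AVAudioPlayerNode
        if let existing = players[slot] {
            player = existing
            player.stop()                  // pressing the same button again restarts the sound
        } else {
            player = AVAudioPlayerNode()   // a node is created only when its slot is first used
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
            players[slot] = player
        }
        if !engine.isRunning { try engine.start() }
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }

    // Each node mixes independently, so ramping one slot's volume
    // (for a cross-fade) leaves every other playing slot untouched.
    func setVolume(_ volume: Float, forSlot slot: Int) {
        players[slot]?.volume = volume
    }
}

Because every node feeds the same mixer, all 25 slots can sound at once without interrupting each other, and nothing is allocated for a slot until its button is first pressed.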

Related

Best way to play silence using AVAudioPlayer on iOS

I found myself in a situation where I need to simulate audio playback to trick OS controls and MPNowPlayingInfoCenter into thinking that audio is being played. This is because I am building a player that plays multiple audio tracks, with pauses in between, creating one continuous "audio" track. I already have everything set up inside the app itself, and the lock screen controls are working correctly. The only problem I am facing is that while the actual audio is stopped and a pause is being "played", the lock screen info center stops the timer, and it only continues showing the correct time and overall state once another audio track starts playing.
Here is an example of my audio track built from audio files and pause items:
let items: [AudioItem] = [
    .audio("part-1.mp3"),
    .pause(duration: 5), // value of type: TimeInterval
    .audio("part-2.mp3"),
    .pause(duration: 3),
    ... // the list goes on
]
Then, in my custom player, once AVAudioPlayer finishes its job with the current item, I get the next one from the array and play either a .pause with a scheduled Timer or another .audio with AVAudioPlayer.
extension Player: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNextItem()
    }
}
And here lies the problem: once the AVAudioPlayer stops, the Now Playing info center automatically stops too, even though I keep feeding it fresh nowPlayingInfo. Then when it hits another .audio item, it resumes correctly and shows the current time, etc.
And here lies the question:
how do I trick the MPNowPlayingInfoCenter into thinking that audio is being played while I "play" my .pause item?
I realise that what I am trying to achieve may still not be clear, but I am happy to share more insight if needed. Thanks!
Some solutions I am currently thinking about:
A. Keeping a one-second empty audio track that would play on loop for as long as the pause needs to last.
B. Programmatically creating an empty audio track of the appropriate length and playing it instead of using a Timer to keep track of the pause duration/progress, relying completely on AVAudioPlayer for both .audio and .pause items. Not sure this is possible, though.
C. Maybe there is a way to tell the MPNowPlayingInfoCenter that the audio keeps playing without using AVAudioPlayer at all, through some API I am not familiar with?
AVAudioPlayer is probably the wrong tool here. You want AVAudioPlayerNode, which is slightly lower-level. Create an AVAudioEngine, and attach an AVAudioPlayerNode. You can then call scheduleFile(_:at:completionHandler:) to play the audio at the times you want.
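A minimal sketch of that wiring, reusing the file names from the question; the sample-time arithmetic for the 5-second gap (and the assumption that both files share a processing format) is mine:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

func schedulePartsWithPause() throws {
    let part1 = try AVAudioFile(forReading: URL(fileURLWithPath: "part-1.mp3"))
    let part2 = try AVAudioFile(forReading: URL(fileURLWithPath: "part-2.mp3"))

    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: part1.processingFormat)
    try engine.start()

    // part-1 plays immediately; part-2 is scheduled on the player's own
    // timeline at (length of part-1 + 5 seconds), so the gap is rendered
    // as silence by the engine rather than simulated with a Timer.
    let sampleRate = part1.processingFormat.sampleRate
    let startOfPart2 = AVAudioTime(sampleTime: part1.length + AVAudioFramePosition(5 * sampleRate),
                                   atRate: sampleRate)
    player.scheduleFile(part1, at: nil, completionHandler: nil)
    player.scheduleFile(part2, at: startOfPart2, completionHandler: nil)
    player.play()
}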
Much of the Apple documentation on AVAudioEngine appears broken right this moment, but the links will hopefully be available again shortly under Audio Engine Building Blocks. (If it stays down and you have trouble finding docs, leave a comment and I'll hunt down the WWDC videos and other tutorials on using AVAudioEngine. It's not particularly difficult for simple problems.)
If you know in advance how you want to compose these items (and it looks like you may), see also AVMutableComposition, which lets you glue together assets very efficiently, including adding empty segments of silence. See Media Composition and Editing for the various tools in that space.
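For completeness, a sketch of that composition route; the helper name and the fixed pause length are illustrative:

import AVFoundation

// Illustrative sketch: splice audio files and explicit empty (silent)
// ranges into a single asset, so one player sees one continuous item.
func makeComposition(urls: [URL], pause: CMTime) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) else {
        return composition
    }
    var cursor = CMTime.zero
    for url in urls {
        let asset = AVURLAsset(url: url)
        if let source = asset.tracks(withMediaType: .audio).first {
            try track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                      of: source, at: cursor)
            cursor = cursor + asset.duration
        }
        // The silent gap after each part is real timeline, not a Timer.
        track.insertEmptyTimeRange(CMTimeRange(start: cursor, duration: pause))
        cursor = cursor + pause
    }
    return composition
}

Playing the result through a single AVPlayer(playerItem: AVPlayerItem(asset: composition)) should keep the Now Playing timeline continuous across the gaps.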

Playing Two AVPlayers with two remote videos in sync [iOS]

As titled, I'm currently working on an app that needs to play two videos from a server. The thing is, I need to play these two videos in sync: both AVPlayers must start the video at exactly the same time, and whenever one of them is paused due to buffering, the other needs to be paused as well, then resumed once buffering finishes, and vice versa.
I tried to look around but can't really seem to find a solution to this case.
I need to play them via streaming; my business requirements won't allow me to download both videos first.
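One AVFoundation primitive that addresses the "start at exactly the same time" half is AVPlayer's setRate(_:time:atHostTime:). The sketch below is an assumption about how you might use it, with the half-second lead time chosen arbitrarily; mirroring buffering stalls would additionally require observing each player's timeControlStatus:

import AVFoundation

// Sketch: start two players against the same host clock time.
// Precise scheduling requires opting out of automatic stall handling.
func startInSync(_ a: AVPlayer, _ b: AVPlayer) {
    a.automaticallyWaitsToMinimizeStalling = false
    b.automaticallyWaitsToMinimizeStalling = false
    // Aim half a second into the future so both players can prepare.
    let start = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                          CMTime(seconds: 0.5, preferredTimescale: 600))
    a.setRate(1.0, time: .invalid, atHostTime: start)
    b.setRate(1.0, time: .invalid, atHostTime: start)
}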

Is it possible to use multiple 'MTAudioProcessingTap's?

In my iOS app, I'm using AVFoundation's AVComposition to create a video with multiple audio tracks. I am trying to let the user see the volume/power level for each audio track. I've successfully implemented this for one track, but as soon as I try to use a second MTAudioProcessingTap, it fails with OSStatus error -12780. In fact, if I use a processingTap and then go 'back' (deallocating the entire view controller) and re-open that particular window, it won't even attach the processingTap again, even if the first AVPlayer playing the composition has been deallocated. To solve this, I found out from searching that I have to manually release it and clear out the audioMix for the player, but that's not my problem now. Now I can't clear out the other processingTap; I need them both!
I'm not completely sure what the MTAudioProcessingTap actually is, as it is the single least documented piece of code ever to come out of the Apple dev team, by far. I've watched the WWDC session and gone through the iPad sample project they made, but I can't figure out how to have two taps running.
I figured I might not actually need two taps and could maybe use one to handle more than one audio track; however, if I somehow managed to use the same tap for two audio tracks, I wouldn't know how to tell the taps apart in the static callbacks. Is there perhaps another way to monitor audio levels from an AVComposition playing in an AVPlayer, or is there a way like this?
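On the "tell the taps apart" point specifically: the MTAudioProcessingTapCallbacks struct carries a clientInfo pointer, which the init callback can stash in the tap's storage, so one tap per track with a per-track context is at least expressible. A rough sketch (TapContext and the level computation are placeholders, and this doesn't claim to explain the -12780 failure):

import AVFoundation
import MediaToolbox

// Per-track context so the C-style callbacks can identify their tap.
final class TapContext {
    let trackIndex: Int
    init(trackIndex: Int) { self.trackIndex = trackIndex }
}

let tapInit: MTAudioProcessingTapInitCallback = { tap, clientInfo, tapStorageOut in
    tapStorageOut.pointee = clientInfo     // stash the context pointer in tap storage
}

let tapProcess: MTAudioProcessingTapProcessCallback = { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
    let context = Unmanaged<TapContext>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
    _ = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
    // Compute power levels from bufferListInOut here, keyed by context.trackIndex.
}

func inputParameters(for track: AVAssetTrack, index: Int) -> AVMutableAudioMixInputParameters {
    let context = TapContext(trackIndex: index)
    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: UnsafeMutableRawPointer(Unmanaged.passRetained(context).toOpaque()),
        init: tapInit,
        finalize: { tap in
            // Balance the passRetained above when the tap is torn down.
            Unmanaged<TapContext>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
        },
        prepare: nil,
        unprepare: nil,
        process: tapProcess)

    var tap: Unmanaged<MTAudioProcessingTap>?
    let parameters = AVMutableAudioMixInputParameters(track: track)
    if MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr {
        parameters.audioTapProcessor = tap?.takeRetainedValue()
    }
    return parameters
}

Each track would then contribute its own parameters object to the audio mix's inputParameters array.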

Multiple AVPlayers on Separate UIViewControllers

I have multiple AVPlayers, each on separate UIViewControllers playing different songs.
I need to pause one AVPlayer whenever I play another one (otherwise the audio overlaps).
I would like to let the user traverse through the app while the music plays in the background, so pausing the AVPlayer in viewDidDisappear:(BOOL)animated would not work.
What is the best way to access the controls of each separate AVPlayer?
In my opinion a singleton with only 1 AVPlayer solves this issue well. This way you guarantee that to play another song you have to stop the previous one. Then, in that AVPlayerSingleton you have a private property called avPlayer. You can define two methods:
- (void)createPlayerWithSong:(NSString *)currentSong;
- (void)destroyPlayer;
Then, in createPlayerWithSong: you can check whether avPlayer has already been created, destroy it, and create a new one for each new song.
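A compact Swift rendering of that suggestion, as a sketch (the class name and URL handling are illustrative):

import AVFoundation

// Sketch of the one-player singleton: creating a player for a new song
// implicitly tears down whichever one was playing before.
final class PlayerSingleton {
    static let shared = PlayerSingleton()
    private var avPlayer: AVPlayer?
    private init() {}

    func createPlayer(withSong currentSong: String) {
        destroyPlayer()                   // guarantee the previous song stops
        guard let url = URL(string: currentSong) else { return }
        avPlayer = AVPlayer(url: url)
        avPlayer?.play()
    }

    func destroyPlayer() {
        avPlayer?.pause()
        avPlayer = nil
    }
}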
A couple of ways you could do it:
Create a shared singleton object that has weak properties referencing each AVPlayer. That way you can control them from anywhere.
OR
Use NSNotificationCenter to send notifications when you want to control an AVPlayer on a different view controller. This might get cumbersome if you have a lot of AVPlayers you want to control.

Dim volume of iPodMusicPlayer to 50% in iOS

I know that you can easily set the volume property of the music player, but I want to do it smoothly like Google Maps does when they use the voiceover for navigation instructions.
I was wondering what the best way to do this is.
Thanks!
I would try using a repeating NSTimer. Every time the timer fires, you lower the volume a bit. When the volume reaches the target value, you invalidate the timer.
Other ways of getting a repeated event (so that you can do something in stages gradually over time) are DISPATCH_SOURCE_TYPE_TIMER and CADisplayLink. But I think a timer is probably the simplest way to get started.
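A sketch of that timer loop in Swift, shown against AVAudioPlayer's settable volume for illustration (MPMusicPlayerController's volume property has long been deprecated); the step size and firing interval are arbitrary:

import AVFoundation

// Sketch: nudge the volume toward the target on every tick, then
// invalidate the timer once the target is reached.
func fade(_ player: AVAudioPlayer, to target: Float, step: Float = 0.05) {
    Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in
        if abs(player.volume - target) <= step {
            player.volume = target
            timer.invalidate()
        } else {
            player.volume += player.volume < target ? step : -step
        }
    }
}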
If you have a pre-existing sound that you're playing, a completely different solution is to apply a fadeout to it before you start playing it (and then just play it all at the same volume, because the sound itself fades out, do you see). AVFoundation gives you the tools to do that (e.g. setVolumeRampFromStartVolume:toEndVolume:timeRange:).
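A sketch of that pre-baked fade, assuming a two-second fade-out at the end of the asset:

import AVFoundation

// Sketch: bake a fade-out into the item's audio mix, then play the
// item at full volume -- the ramp lives in the sound, not the player.
func fadedItem(for asset: AVAsset) -> AVPlayerItem {
    let item = AVPlayerItem(asset: asset)
    guard let track = asset.tracks(withMediaType: .audio).first else { return item }
    let parameters = AVMutableAudioMixInputParameters(track: track)
    let fade = CMTimeRange(start: CMTime(seconds: max(0, asset.duration.seconds - 2), preferredTimescale: 600),
                           duration: CMTime(seconds: 2, preferredTimescale: 600))
    parameters.setVolumeRamp(fromStartVolume: 1.0, toEndVolume: 0.0, timeRange: fade)
    let mix = AVMutableAudioMix()
    mix.inputParameters = [parameters]
    item.audioMix = mix
    return item
}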
