Playing random audio files in sequence with AKPlayer - audiokit

I am working on a sort of multiple-audio-playback project. I have 10 mp3 files in a folder, and I want AKPlayer to play one of these audio files randomly, but in sequence - one after the other. Playing a random file after another random file turns out to be tricky. Here's what I've written:
let file = try? AKAudioFile(readFileName: String(arc4random_uniform(10) + 1) + ".mp3")
let player = AKPlayer(audioFile: file!)
player.isLooping = true
player.buffering = .always
AudioKit.output = player
try? AudioKit.start()
player.play()
This code loops the first randomly chosen file forever - but I simply want to play each random file once. Is there any way I can reload 'file' so the player starts again with a new file when it's done playing? I've tried creating multiple AKPlayers (but creating 10 players must be wrong), checking if player.isPlaying == false, a sequencer, etc., but couldn't figure it out. Apologies for such a newbie question. Thank you so much.

AKPlayer has a completion handler
to be called when Audio is done playing. The handler won’t be called
if stop() is called while playing or when looping from a buffer.
The completion handler type is AKCallback, which is a typealias for () -> Void. If you have some good reason not to use 10 AKPlayers, you could probably use the completion handler to change the file and restart the player. But you could also create an array of 10 AKPlayers, each loaded with a different file, and have a function that selects a player at random for playback (or that cycles through a pre-shuffled array). The completion handler for each player in the array could call this function when appropriate. As per the doc quoted above, make sure the AKPlayer is not looping, or else the completion handler won't be called.
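For example, a rough sketch of the array approach (untested; assumes AudioKit 4, files named "1.mp3" through "10.mp3" in the sounds folder, and that an AKMixer can combine the ten players into one output):

var players: [AKPlayer] = []

func playRandom() {
    players[Int(arc4random_uniform(10))].play()
}

for n in 1...10 {
    let file = try! AKAudioFile(readFileName: "\(n).mp3")
    let player = AKPlayer(audioFile: file)
    player.isLooping = false                  // looping would suppress the completion handler
    player.completionHandler = playRandom     // chain to the next random file
    players.append(player)
}

AudioKit.output = AKMixer(players)
try? AudioKit.start()
playRandom()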

Yes, you can use the completionHandler of the player to load a new file into the same player when playback finishes. In your completion block:
try? player.load(url: nextFile)
player.play()
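Fleshed out, that might look like this (a sketch; it assumes the same "1.mp3"..."10.mp3" naming as the question, with the files in the app bundle):

player.completionHandler = {
    let name = String(arc4random_uniform(10) + 1)
    if let nextFile = Bundle.main.url(forResource: name, withExtension: "mp3") {
        try? player.load(url: nextFile)   // load(url:) throws, hence try?
        player.play()
    }
}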
Another approach is to use AKClipPlayer with 10 clips in a predetermined random order and schedule them in sequence. This method will be the most seamless (if that matters).

Related

Why aren't the sounds in my Sound Group working?

I was trying to make a music player for my game; however, when I tried to get my sound to play, it refused to work. The game's output works before and after the sound, but I can't hear anything. I tried using both a folder and a sound group (what I'm using currently) and neither worked. How would I fix this? I presume it has something to do with client-server, but I am not sure.
local ss = game:WaitForChild("SoundService")
local rp = game:WaitForChild("ReplicatedStorage")
local list = ss.Music:GetChildren()

rp.SongOn.OnServerEvent:Connect(function(plr)
    repeat
        local num = math.random(1, #list)
        print(num)
        local track = list[num]
        local name = track.Name
        print(name)
        plr.PlayerGui.Overhead.Notch.SongTitle.Text = track.Name
        local song = ss.Music:WaitForChild(name)
        print("played")
        wait(track.TimeLength)
        print("waited length")
    until rp.SongOff.OnServerEvent
end)
You never play anything - you never actually start the Sound. To play a Sound, call Sound:Play() on it (e.g. song:Play() right after you fetch it):
https://create.roblox.com/docs/reference/engine/classes/Sound
Your repeat ... until condition is also wrong. Firing the event will not stop the music, and in fact it doesn't matter at all: when the loop body finishes running once, the condition checks whether rp.SongOff.OnServerEvent evaluates to true. OnServerEvent is the event object itself, which is truthy because it is neither false nor nil, so the loop stops after a single pass.
Instead, you likely want to make a function that plays the music, and run this function whenever:
Playing music is requested with the remote
The playing music naturally ends (https://create.roblox.com/docs/reference/engine/classes/Sound#Ended)
And then, bind to that stop music remote a function that stops the sound.

AudioKit AKPlayer stop(at:) method with AVAudioTime

In AudioKit there is this method for AKPlayer:
@objc dynamic public func play(at audioTime: AVAudioTime?)
I want the same for the stop method, because I want to be able to stop the player at any time when the user hits the stop button. I am making a music app and I need to stop the sound at time X, which is calculated based on BPM, etc.
Here is how I start my AKPlayer:
drums.play(at: AVAudioTime.now() + timeToClosestBeatGrid)
I want the same API with stop:
drums.stop(at: AVAudioTime.now() + timeToClosestBeatGrid) // this API doesn't exist :(((
I tried setting the endTime property, but it does not seem to do anything...
How can I accomplish this?
PS: I am not looking for a Timer solution, because a timer is not 100% accurate. I want my stop to be 100% accurate, just like the play method.
The most accurate way to schedule events in AudioKit is by using AKSequencer. The sequencer can be connected to a callback instrument, which is a node that passes the events to a user-defined function.
In your case, you would add an event at the time where you want the player to stop. In your callback function, you would stop the player as a response to that event.
This is an outline of what should be done:
Create a track to contain the stop event, using AKSequencer's addTrack method. Connect this track to an AKCallbackInstrument. Please see this answer on how to connect an AKCallbackInstrument to an AKSequencer track.
Add the stop event to the track, at the time position where you want the music to stop. As you will be interpreting the event yourself with a callback function, it doesn't really matter what type of event you use. You could simply use a Note On.
In the callback function, stop the player when that event is received.
This is what your callback function would look like:
func stopCallback(status: UInt8, note: MIDINoteNumber, vel: MIDIVelocity) -> () {
    guard let status = AKMIDIStatus(byte: status),
          let type = status.type,
          type == .noteOn else { return }
    drums.stop()
}
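And a rough sketch of the wiring the steps above describe (hedged: written against AudioKit 4's AKSequencer/AKCallbackInstrument, whose exact signatures vary between AudioKit versions; stopBeat and bpm are placeholders you compute from your BPM math):

let callbackInst = AKCallbackInstrument()
callbackInst.callback = stopCallback          // the function defined above

let sequencer = AKSequencer()
let stopTrack = sequencer.addTrack(for: callbackInst)

// any event type works, since we interpret it ourselves; a note on is simplest
stopTrack.add(noteNumber: 60, velocity: 127, position: stopBeat, duration: 0.1)

sequencer.tempo = bpm
sequencer.playFromStart()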
According to AudioKit documentation, you can try using the schedule(at:) method:
You can call this to schedule playback in the future or the player will call it when play() is called to load the audio data
After calling play(), call schedule(at:) with an offset of AVAudioTime.now() + timeToClosestBeatGrid and specify .dataPlayedBack as the completion callback type, because this completion is called when (from the docs)...
The buffer or file has finished playing
and now (in the completion block) you can call drums.stop()
But... if .stop() should be called whenever the button is pressed, why not use some form of delay (Timer or DispatchQueue) with timeToClosestBeatGrid as the offset?
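That dispatch-based variant would be just a few lines (a sketch; simpler, but not sample-accurate, which is why the question rules it out):

let delay = timeToClosestBeatGrid   // seconds until the target beat, derived from BPM
DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
    drums.stop()
}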

Get a callback from AKPlayer at a user specified time

I’m trying to get a callback at a given point in an AKPlayer’s file playback (currently, just before the end). I see the Apple docs on addBoundaryTimeObserver(), which would work, but it doesn’t seem to be accessible from AKPlayer (I guess an AVAudioPlayerNode vs AVPlayer thing). Any suggestions? I see a few callbacks in AVAudioPlayerNode… maybe I could determine the buffer based on the desired time and use dataConsumed?
The goal is to trigger another event just before the file finishes playing (there is a callback on completion, but obviously that's too late).
If anybody has done something similar, or knows of something similar (a gist, etc), that would be great.
There's an AudioKit playground called AKPlaygroundLoop that shows you how to call an arbitrary handler periodically, based on CADisplayLink. In your handler you could check the AKPlayer's currentTime and if it's close to the end (say 1 second before) you could trigger whatever event you want.
This is a rough outline:
var player: AKPlayer!
var loop: AKPlaygroundLoop!

func play() {
    // ...
    loop = AKPlaygroundLoop(frequency: 10.0, handler: myHandler)
}

func myHandler() {
    if player.currentTime >= player.duration - 1.0 {
        // trigger some event
    }
}
See also this answer for advice on how to synchronize events with AudioKit.

Play a sound a specific number of times with AVAudioPlayerNode in iOS

There is an option numberOfLoops in AVAudioPlayer which repeats a sound file a specified number of times.
I have to implement this type of function with AVAudioPlayerNode, but I have only found an option that loops a sound file an infinite number of times, using the following code:
audioPlayerNode.scheduleBuffer(audioFileBuffer, atTime: nil,
                               options: .Loops, completionHandler: nil)
Is it possible to repeatedly play a file in a fixed number of times like as AVAudioPlayer? Is there any code sample to achieve this?
I'm not sure if there is a built-in option, but you can achieve this with a simple function like the one below. AVAudioPlayerNode queues scheduled buffers, so scheduling the same buffer n times plays it back-to-back n times:
func playBuffer(_ buffer: AVAudioPCMBuffer, times: Int) {
    for _ in 1...times {
        audioPlayerNode.scheduleBuffer(buffer, completionHandler: nil)
    }
}
and then you call the function like this
playBuffer(buffer, times: 3) // to play three times
// or
playBuffer(buffer, times: 1) // to play once
P.S.: This just shows the logic - adapt it to your own player node and buffer setup.
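If you also need to know when the last repeat has finished, one option (a sketch, not part of the original answer) is to attach a completion handler to the final scheduled buffer only:

func playBuffer(_ buffer: AVAudioPCMBuffer, times: Int, done: (() -> Void)? = nil) {
    for i in 1...times {
        // only the last repeat carries the completion handler
        audioPlayerNode.scheduleBuffer(buffer, completionHandler: i == times ? done : nil)
    }
}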

iOS/AVFoundation: Design pattern for asynch handlers when turning arrays of images into tracks and then into a single video?

Can you point me to design pattern guides to adapt my style to AVFoundation's asynch approach?
I'm working on an app where you create an image and place audio onto hotspots on it. I'm implementing export to a movie that is the image, with effects (glowing hotspots), playing under the audio.
I can reliably create the video and audio tracks, and can correctly get audio into an AVMutableComposition and play it back. The problem is with the video. I've narrowed it down to my having written a synchronous solution to a problem that requires AVFoundation's async writing methods.
The current approach, and where it fails (each step is its own method):
Create an array of dictionaries, with 2 objects per dictionary. One object is an image representing a keyframe; the other is the URL of the audio that ends on that keyframe. The first dictionary has the start keyframe but no audio URL.
For each dictionary in the array, replace the UIImage with an array of start image -> animation tween images -> end-state image, with the proper count for the FPS and duration of the audio.
For each dictionary in the array, convert the image array into a soundless mp4 and save it using [AVAssetWriter finishWritingWithCompletionHandler:], then replace the image array in the dictionary with the URL of the mp4. Each dictionary of mp4 and audio URL represents a segment of the final movie, and the order of the dictionaries in the array dictates the insert order for the final movie.
-- all of the above works; stuff gets made and ordered right, and the video and audio play back --
For each dictionary with an mp4 and audio URL, load them into AVAssets and insert them into an AVMutableComposition track, one track for audio and one for video. The audio load and insert works and plays back. But the video fails, and it appears to fail because step 4 starts before step 3's finishWritingWithCompletionHandler: finishes for all mp4 tracks.
One approach would be to pause via a while loop and wait for the AVAssetWriter's status to say done. This smacks of working against the framework. In practice it also leads to ugly and sometimes seemingly infinite waits for loops to end.
But simply making step 4 the completion handler for finishWritingWithCompletionHandler: is non-trivial, because I am writing multiple tracks and I want step 4 to launch only after the last track is written. Because step 3 is basically a for-each processor, I think all the completion handlers would need to be the same. I guess I could use bools or counters to change up the completion handler, but it feels like a kludge.
If any of the above made any sense, can someone give me/point to a primer on design patterns for asynch handling like this? TIA.
You can use GCD dispatch groups for that sort of problem.
From the docs:
Grouping blocks allows for aggregate synchronization. Your application
can submit multiple blocks and track when they all complete, even
though they might run on different queues. This behavior can be
helpful when progress can’t be made until all of the specified tasks
are complete.
The basic idea is, that you call dispatch_group_enter for each of your async tasks. In the completion handler of your tasks, you call dispatch_group_leave.
Dispatch groups work similarly to counting semaphores: you increment a counter (using dispatch_group_enter) when you start a task, and you decrement it (using dispatch_group_leave) when a task finishes.
dispatch_group_notify lets you install a completion handler block for your group. This block gets executed when the counter reaches 0.
This blog post provides a good overview and a complete code sample: http://amro.co/post/48248949039/using-gcd-to-wait-on-many-tasks
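In Swift, the pattern is only a few lines (a sketch; assetWriters stands in for your collection of per-segment AVAssetWriters):

let group = DispatchGroup()

for writer in assetWriters {
    group.enter()                      // one enter per async task...
    writer.finishWriting {
        group.leave()                  // ...balanced by one leave in its completion
    }
}

group.notify(queue: .main) {
    // all segment writers have finished; safe to build the AVMutableComposition
}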
@weichsel Thank you very much. That seems like it should work. But I'm using dispatch_group_wait and it seems to not wait. I've been banging against it for several hours since you first replied, but no luck. Here's what I've done:
Added a property that is a dispatch group, called videoProcessingGroup, and called dispatch_group_create in the init of the class doing the video processing
In the method that creates the video tracks, used dispatch_group_async(videoProcessingGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{ [videoWriter finishWritingWithCompletionHandler:^{ ... }]; });
The track-writing method is called from a method chaining together the various steps. In that method, after the call to write the tracks, I call dispatch_group_wait(videoProcessingGroup, DISPATCH_TIME_FOREVER);
In dealloc, called dispatch_release(videoProcessingGroup)
That's all elided a bit, but essentially the call to dispatch_group_wait doesn't seem to be waiting. My guess is it has something to do with the dispatch_group_async call, but I'm not sure exactly what.
I've found another means of handling this, using my own int counter decremented in the finishWritingWithCompletionHandler: handler. But I'd really like to up my skills by understanding GCD better.
Here's the code - dispatch_group_wait never seems to return, but the movies themselves are made. The code is elided a bit for brevity, but nothing relevant to the GCD behavior was removed.
@implementation MovieMaker

// This is the dispatch group
@synthesize videoProcessingGroup = _videoProcessingGroup;

- (id)init {
    self = [super init];
    if (self) {
        _videoProcessingGroup = dispatch_group_create();
    }
    return self;
}

- (void)dealloc {
    dispatch_release(self.videoProcessingGroup);
}

- (id)convert:(MTCanvasElementViewController *)sourceObject {
    // code fails in same way with or without this line
    dispatch_group_enter(self.videoProcessingGroup);

    // This method works its way down to writeImageArrayToMovie
    _tracksData = [self collectTracks:sourceObject];

    NSString *fileName = @"";

    // The following seems to never stop waiting, the movies themselves get made though
    // Wait until dispatch group finishes processing temp tracks
    dispatch_group_wait(self.videoProcessingGroup, DISPATCH_TIME_FOREVER);

    // never gets to here
    fileName = [self writeTracksToMovie:_tracksData];

    // Wait until dispatch group finishes processing final track
    dispatch_group_wait(self.videoProcessingGroup, DISPATCH_TIME_FOREVER);

    return fileName;
}

// @param videoFrames should be NSArray of UIImage, all of same size
// @return path to temp file
- (NSString *)writeImageArrayToMovie:(NSArray *)videoFrames usingDispatchGroup:(dispatch_group_t)dispatchGroup {
    // elided a bunch of stuff, but it all works
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:result]
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
    // elided stuff

    // Finish the session:
    [writerInput markAsFinished];
    dispatch_group_async(dispatchGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [videoWriter finishWritingWithCompletionHandler:^{
            dispatch_group_leave(dispatchGroup);
            // not sure I ever get here? NSLogs don't write out.
            CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
        }];
    });
    return result;
}
