Playing sound without lag in Swift - ios

As many developers know, using AVAudioPlayer for playing sound in games can result in jerky animation/movement, because of a tiny delay each time a sound is played.
I used to overcome this in Objective-C, by using OpenAL through a wrapper class (also in Obj-C).
I now use Swift for all new projects, but I can't figure out how to use my wrapper class from Swift. I can import the class (through a bridging header), but when I need to create ALCdevice and ALCcontext objects in my Swift file, Xcode won't accept it.
Does anyone have or know of a working example of playing a sound using OpenAL from Swift? Or maybe sound without lag can be achieved in some other way in Swift?
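In case it helps anyone hitting the same bridging wall: in Swift, the C pointer types ALCdevice* and ALCcontext* import as OpaquePointer, so you hold them as optional opaque pointers rather than declaring ALCdevice values directly. A minimal, untested sketch of opening a device and context:

```swift
import OpenAL

// Hedged sketch: ALCdevice* / ALCcontext* bridge to Swift as OpaquePointer?.
let device: OpaquePointer? = alcOpenDevice(nil)           // nil = default output device
let context: OpaquePointer? = alcCreateContext(device, nil)
alcMakeContextCurrent(context)

// ... generate sources/buffers with alGenSources/alGenBuffers as usual ...

// Tear down in reverse order when done.
alcMakeContextCurrent(nil)
alcDestroyContext(context)
alcCloseDevice(device)
```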

I ran into a delay problem like this once; I hope your problem is the same one I encountered.
In my situation, I was using Sprite-Kit to play my sounds, using SKAction.playSoundFileNamed:. It would always lag half a second behind where I wanted it to play.
This is because each SKAction.playSoundFileNamed: call takes time to allocate memory. To solve this, store the sound action in a variable so you can reuse it later without instantiating new objects. That eliminated the delay for me. The same technique would probably work for AVAudioPlayer too.
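A minimal sketch of that reuse pattern in SpriteKit (the file name is a placeholder):

```swift
import SpriteKit

class GameScene: SKScene {
    // Created once; reused for every shot. "laser.wav" is a placeholder name.
    private let laserSound = SKAction.playSoundFileNamed("laser.wav",
                                                         waitForCompletion: false)

    func fireLaser() {
        // No new SKAction allocation here, so no first-play hitch.
        run(laserSound)
    }
}
```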

Related

Swift change Pitch and Speed of recorded audio

I have an app whose foundation is essentially based on https://blckbirds.com/post/voice-recorder-app-in-swiftui-1/.
It's Swift / Xcode 12.5.1 and works great. I call the audio using self.audioPlayer.startPlayback(audio: self.audioURL), which plays the recording perfectly.
Now I want to add the ability for the user to adjust the pitch and speed of the recorded audio. It doesn't have to save the changes, just apply the changes while playing the file on the fly.
I found https://www.hackingwithswift.com/example-code/media/how-to-control-the-pitch-and-speed-of-audio-using-avaudioengine which simplifies the process of applying pitch changes. I'm able to change the startPlayback above to
self.audioPlayer.speedControl.rate = 0.5
do {
    try self.audioPlayer.play(self.audioURL)
} catch let error as NSError {
    print(error.localizedDescription)
}
after adding HWS's code into the AudioPlayer class, which proves it works, but it's not a full implementation: it breaks some of the other capabilities (like updating and using the stopPlayback function), which I think is due to switching between the AVAudioPlayer and the AVAudioPlayerNode. I'm trying to figure out whether I need to rewrite AudioPlayer.swift from the blckbirds tutorial, or whether there's a friendlier way to incorporate HWS's code into the project.
For example, I suppose I could create a toggle that uses AVAudioPlayer playback when no effects are applied, and switches to AVAudioPlayerNode when the toggle enables one of the effects.. but that seems inefficient. I'd appreciate any thoughts here!
Turns out this was simpler than I had thought: using @AppStorage and conditionals to integrate the desired player. Thanks!
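For anyone curious, a rough sketch of that toggle approach. The names (useEffects, the AudioPlayer properties, the flag key) are assumptions based on the question, not the tutorial's actual API:

```swift
import SwiftUI

struct PlaybackControls: View {
    // Hypothetical persisted flag deciding which playback path to use.
    @AppStorage("useEffects") private var useEffects = false
    @ObservedObject var audioPlayer: AudioPlayer   // the tutorial's player class
    let audioURL: URL

    var body: some View {
        VStack {
            Toggle("Pitch/speed effects", isOn: $useEffects)
            Button("Play") {
                if useEffects {
                    // AVAudioEngine/AVAudioPlayerNode path from the HWS example
                    audioPlayer.speedControl.rate = 0.5
                    try? audioPlayer.play(audioURL)
                } else {
                    // Plain AVAudioPlayer path from the blckbirds tutorial
                    audioPlayer.startPlayback(audio: audioURL)
                }
            }
        }
    }
}
```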

multiple MPMoviePlayerController on a tableview

I have multiple MPMoviePlayerController on a UITableView (on different sections).
I know that only one can play at a given time, but the problem is that if a different player was paused, it gets stuck and I need to re-initialize it.
I could do a sophisticated [tableView reloadData] on everything except the current player, but that seems cycle-consuming and clumsy (and it's not that simple to reload everything but the current cell).
Is there a better way? Maybe a 3rd party OS package that does handle this nicely?
OK, first of all, why do you need multiple MPMoviePlayerController instances if you only play one video at a time? You could create a single MPMoviePlayerController instance and play all the videos one by one.
Also, I think AVPlayer with AVPlayerLayer would be a more extensible solution for playing videos on iOS. Take a look at the AVFoundation framework reference for more information about AVPlayer.
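A sketch of that single-player idea with AVPlayer; the coordinator type and the cell/layer wiring are illustrative assumptions:

```swift
import AVFoundation
import UIKit

final class VideoPlaybackCoordinator {
    // One player shared by the whole table view; rows swap items in and
    // out instead of each row owning its own player.
    private let player = AVPlayer()

    func play(url: URL, in layer: AVPlayerLayer) {
        layer.player = player
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }

    func stop() {
        player.pause()
        player.replaceCurrentItem(with: nil)
    }
}
```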
Good Luck!

Is it possible to use multiple 'MTAudioProcessingTap's?

In my iOS app, I'm using AVFoundation's AVComposition to create a video with multiple audio tracks. I'm trying to let the user see the volume/power level for each audio track. I've successfully implemented this for one track, but as soon as I try to use a second MTAudioProcessingTap, it fails with OSStatus error -12780. In fact, if I use a processing tap, go 'back' (deallocating the entire view controller), and then re-open that particular screen, it won't even attach the processing tap again, even though the first AVPlayer playing the composition has been deallocated. From searching, I found that I have to manually release the tap and clear out the player's audioMix, but that's not my problem now: I can't clear out the other processing tap, I need them both!
I'm not completely sure what the MTAudioProcessingTap actually is, as it is the single least documented piece of code ever to come out of the Apple dev-team, by far. I've watched the WWDC, and gone through the iPad-sample-project they have made, but I can't figure out how to have two taps running.
I figured I might not actually need two taps; maybe I could use one tap to handle more than one audio track. But if I somehow managed to use the same tap for two audio tracks, I wouldn't know how to tell them apart in the static callbacks. Is there perhaps another way to monitor audio levels from an AVComposition playing in an AVPlayer, or a way to make this approach work?
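On the "telling taps apart" point: each tap is created with its own clientInfo pointer, which the init callback can stash in the tap's storage so the static process callback can identify its track. A hedged, untested sketch (one tap per track, each attached to that track's AVMutableAudioMixInputParameters):

```swift
import AVFoundation
import MediaToolbox

// Passed to each tap so the static callbacks can tell tracks apart.
final class TapContext {
    let trackIndex: Int
    init(_ index: Int) { trackIndex = index }
}

func makeTap(forTrack trackIndex: Int) -> MTAudioProcessingTap? {
    let clientInfo = Unmanaged.passRetained(TapContext(trackIndex)).toOpaque()
    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: clientInfo,
        init: { _, clientInfo, tapStorageOut in
            // Stash clientInfo so process() can read it later.
            tapStorageOut.pointee = clientInfo
        },
        finalize: { tap in
            Unmanaged<TapContext>
                .fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
        },
        prepare: nil,
        unprepare: nil,
        process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
            let ctx = Unmanaged<TapContext>
                .fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
            // ctx.trackIndex identifies which audio track this tap belongs to;
            // compute your power levels per track here.
            MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                               flagsOut, nil, numberFramesOut)
        })

    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                            kMTAudioProcessingTapCreationFlag_PostEffects,
                                            &tap)
    return status == noErr ? tap?.takeRetainedValue() : nil
}
```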

sound design for spritekit with swift

I'm trying to figure out how to add some sound design elements to my game. For example, I want an engine sound to change pitch or grow louder as my sprite moves faster. Obviously this is outside the scope of SKAction. I've tried AVAudioPlayer; this works, but it seems better suited to playing music. Even running a short loop with AVAudioPlayer produces popping sounds between each loop.
How can I control things like pitch, volume, playback speed programmatically?
This seems useful:
http://kstenerud.github.io/ObjectAL-for-iPhone/index.html
But is there a Swift version, or can I bridge this over to Swift somehow?
ObjectAL can be added as a Pod into a Swift project; I am using it right now in my own Swift projects.
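If you'd rather stay inside Apple's frameworks, AVAudioEngine can also do this kind of thing: a looped buffer plays gaplessly (avoiding the pops that looping AVAudioPlayer can produce), and an AVAudioUnitTimePitch node gives you pitch and rate control. A hedged sketch; the file URL and the speed-to-pitch mapping are placeholders:

```swift
import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let pitch = AVAudioUnitTimePitch()

func startEngineSound(url: URL) throws {
    let file = try AVAudioFile(forReading: url)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length))
    else { return }
    try file.read(into: buffer)

    engine.attach(playerNode)
    engine.attach(pitch)
    engine.connect(playerNode, to: pitch, format: buffer.format)
    engine.connect(pitch, to: engine.mainMixerNode, format: buffer.format)
    try engine.start()

    // .loops repeats the buffer seamlessly, with no gap between iterations.
    playerNode.scheduleBuffer(buffer, at: nil, options: .loops)
    playerNode.play()
}

func updateEngineSound(spriteSpeed: Float) {
    // Placeholder mapping from sprite speed to pitch (in cents) and volume.
    pitch.pitch = 400 * spriteSpeed
    playerNode.volume = min(1, 0.3 + spriteSpeed)
}
```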

AVAudioPlayer initialization & memory management

I am trying to figure out something -
Every time I use AVAudioPlayer, I need to initialize a new AVAudioPlayer object.
I have 10 Sentence objects in a view, and I want to add a "playSound" method to each Sentence object, so that when the user taps a Sentence the app plays its sound file.
I need this behavior in many views, so I thought of adding the method to the object's class so I can simply call -
[Sentence playSound];
Since AVAudioPlayer is initialized anew every time I use it anyway, I can't see why this would be a more expensive operation.
Am I right? Is this a good approach for this need, and why?
Thanks
Shani
So if I understand you right, you want your Sentence object have a playSound method which sets up an AVAudioPlayer and plays the sound.
You can definitely do it like that, but be aware that if you have a lot of Sentence objects, you will end up creating a lot of AVAudioPlayer objects. You could keep the high-water mark down by releasing each player after its file has finished playing.
Another way you could do this is to have a method on Sentence to return the URL of the file to play and then just have a single AVAudioPlayer instance in your view controller where you want to play the sounds and set it up each time for the correct file. This would be my personal suggested way of doing it.
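A Swift sketch of that suggested design, with Sentence exposing only the URL and the view controller owning a single player (names are illustrative):

```swift
import AVFoundation
import UIKit

class Sentence {
    let soundURL: URL          // the sentence's audio file
    init(soundURL: URL) { self.soundURL = soundURL }
}

class SentencesViewController: UIViewController {
    // One player, reconfigured per tap, instead of one player per Sentence.
    private var player: AVAudioPlayer?

    func sentenceTapped(_ sentence: Sentence) {
        do {
            player = try AVAudioPlayer(contentsOf: sentence.soundURL)
            player?.play()
        } catch {
            print("Couldn't play \(sentence.soundURL): \(error)")
        }
    }
}
```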
I think the best solution for you is to use SimpleAudioEngine from the Cocos2D library.
It's easy to use and integrates well. As far as I know, it uses OpenAL and AVAudioPlayer under the hood. It has easy interfaces for controlling background music and effects, and it works with mp3, wav, m4a and other formats and codecs.
Using it is very easy:
// It's better to preload all effects during app start so they play
// immediately when you call the "playEffect" method.
[[SimpleAudioEngine sharedEngine] preloadEffect:@"sentence.m4a"];
...
// Play the effect where you need it.
[[SimpleAudioEngine sharedEngine] playEffect:soundName];
You can find it inside the Cocos2D library, where you'll also find all the useful information about it.
