Sound design for SpriteKit with Swift - iOS

I'm trying to figure out how to add some sound design elements to my game. For example, I want an engine sound to change pitch or grow louder as my sprite moves faster. Obviously this is outside the scope of SKAction. I've tried AVAudioPlayer; it works, but it seems more suited to playing music. Even running a short loop with AVAudioPlayer produces popping sounds between each loop.
How can I control things like pitch, volume, playback speed programmatically?
This seems useful: http://kstenerud.github.io/ObjectAL-for-iPhone/index.html
but is there a Swift version, or can I bridge this over to Swift somehow?

ObjectAL can be added as a pod to a Swift project; I am using it right now in my own Swift projects.
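If you'd rather stay first-party, the same idea can be sketched with AVAudioEngine: AVAudioUnitTimePitch gives you programmatic pitch, rate, and volume control, and re-scheduling the file from its completion handler gives a gapless loop instead of AVAudioPlayer's pop. This is a hedged sketch, not a drop-in API; the `EngineSound` class and the speed-to-rate mapping (0.5x to 2x over an assumed 600 pt/s top speed) are illustrative.

```swift
import Foundation
#if canImport(AVFoundation)
import AVFoundation
#endif

// Map a sprite's speed (points/sec, capped at maxSpeed) onto a playback
// rate in [0.5, 2.0]; tune the range to taste.
func playbackRate(forSpeed speed: Double, maxSpeed: Double = 600) -> Float {
    let t = max(0, min(1, speed / maxSpeed))
    return Float(0.5 + 1.5 * t)
}

#if canImport(AVFoundation)
final class EngineSound {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let timePitch = AVAudioUnitTimePitch()

    init(fileURL: URL) throws {
        let file = try AVAudioFile(forReading: fileURL)
        engine.attach(player)
        engine.attach(timePitch)
        engine.connect(player, to: timePitch, format: file.processingFormat)
        engine.connect(timePitch, to: engine.mainMixerNode,
                       format: file.processingFormat)
        try engine.start()
        scheduleLoop(file)
        player.play()
    }

    // Re-queue the file from its completion handler: a gapless loop,
    // unlike AVAudioPlayer's numberOfLoops, which can pop between passes.
    private func scheduleLoop(_ file: AVAudioFile) {
        player.scheduleFile(file, at: nil) { [weak self] in
            self?.scheduleLoop(file)
        }
    }

    // Call from the scene's update loop with the sprite's current speed.
    func update(speed: Double) {
        let rate = playbackRate(forSpeed: speed)
        timePitch.rate = rate      // faster sprite -> higher, faster engine
        player.volume = rate / 2   // and a little louder too
    }
}
#endif
```

Keeping the mapping as a standalone function makes it easy to tweak the feel of the engine without touching the audio plumbing.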

Related

Swift change Pitch and Speed of recorded audio

I have an app whose foundation is essentially based on https://blckbirds.com/post/voice-recorder-app-in-swiftui-1/.
It's Swift / Xcode 12.5.1 and works great. I call the audio using self.audioPlayer.startPlayback(audio: self.audioURL), which plays the recording perfectly.
Now I want to add the ability for the user to adjust the pitch and speed of the recorded audio. It doesn't have to save the changes, just apply them on the fly while the file plays.
I found https://www.hackingwithswift.com/example-code/media/how-to-control-the-pitch-and-speed-of-audio-using-avaudioengine which simplifies the process of applying pitch changes. I'm able to change the startPlayback above to
self.audioPlayer.speedControl.rate = 0.5
do {
    try self.audioPlayer.play(self.audioURL)
} catch let error as NSError {
    print(error.localizedDescription)
}
after adding HWS's code into the AudioPlayer class, which proves it's working, but it's not a full implementation: it breaks some of the other capabilities (like updating and using the stopPlayback function), which I think is due to switching between the AVAudioPlayer and the AVAudioPlayerNode. I'm trying to figure out if I need to rewrite the AudioPlayer.swift from the blckbirds tutorial, or if there's a friendlier way to incorporate HWS's approach into the project.
For example, I suppose I could create a toggle that would use the AVAudioPlayer playback if no effects are being used, then if the toggle enables one of the effects, have it use AVAudioPlayerNode instead.. but that seems inefficient. I'd appreciate any thoughts here!
Turns out this was simpler than I had thought: using @AppStorage and conditionals to integrate the desired player. Thanks!
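For anyone landing here, a minimal sketch of the engine-based side of that toggle, in the spirit of the HWS article. The names (`EffectsPlayer`, `speedControl`, `pitchControl`) are assumptions, not the tutorial's exact code, and the clamp ranges are an illustrative UI choice:

```swift
import Foundation
#if canImport(AVFoundation)
import AVFoundation
#endif

// AVAudioUnitTimePitch takes pitch in cents (-2400...2400);
// AVAudioUnitVarispeed takes a rate multiplier. Clamp slider input
// before applying it (the 0.25x-4x rate window is just a sane UI range).
func clampedPitch(_ cents: Float) -> Float { min(2400, max(-2400, cents)) }
func clampedRate(_ rate: Float) -> Float { min(4, max(0.25, rate)) }

#if canImport(AVFoundation)
final class EffectsPlayer {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    let speedControl = AVAudioUnitVarispeed()
    let pitchControl = AVAudioUnitTimePitch()

    init() {
        // Wire the graph once: player -> speed -> pitch -> output.
        engine.attach(playerNode)
        engine.attach(speedControl)
        engine.attach(pitchControl)
        engine.connect(playerNode, to: speedControl, format: nil)
        engine.connect(speedControl, to: pitchControl, format: nil)
        engine.connect(pitchControl, to: engine.mainMixerNode, format: nil)
    }

    func play(_ url: URL) throws {
        let file = try AVAudioFile(forReading: url)
        playerNode.scheduleFile(file, at: nil)
        try engine.start()
        playerNode.play()
    }

    // A matching stop keeps a stopPlayback-style flow working without
    // juggling two player types.
    func stop() {
        playerNode.stop()
        engine.stop()
    }
}
#endif
```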

iOS and SpriteKit camera background

I am developing a game for iOS using SceneKit and maybe GameplayKit. What I want is for the background of the main gameplay to be set to a live camera feed. That should be simple enough. A little googling brought me to this question. My question, however, is: will a technique like the one in the linked question work well with SpriteKit? The question deals with setting the background with UIKit. Can I use this option with SceneKit too, or do I have to use some sort of SceneKit-specific way to set the background like so?

Playing sound without lag in Swift

As many developers know, using AVAudioPlayer for playing sound in games can result in jerky animation/movement, because of a tiny delay each time a sound is played.
I used to overcome this in Objective-C, by using OpenAL through a wrapper class (also in Obj-C).
I now use Swift for all new projects, but I can't figure out how to use my wrapper class from Swift. I can import the class (through a bridging header), but when I need to create ALCdevice and ALCcontext objects in my Swift file, Xcode won't accept it.
Does anyone have or know of a working example of playing a sound using OpenAL from Swift? Or maybe sound without lag can be achieved in some other way in Swift?
I ran into a delay-type problem once; I hope your problem is the same one I encountered.
In my situation, I was using SpriteKit to play my sounds, using SKAction.playSoundFileNamed(_:waitForCompletion:). It would always lag half a second behind where I wanted it to play.
This is because it takes time to allocate memory for each SKAction call. To solve this, store the sound action in a variable so you can reuse the sound later without instantiating new objects. It saved me from the delay. This technique would probably work for AVAudioPlayer too.
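A minimal sketch of that fix (the scene and the "laser.wav" asset name are placeholders), plus the same build-once-reuse idea in framework-free form:

```swift
#if canImport(SpriteKit)
import SpriteKit

final class GameScene: SKScene {
    // Created once when the scene loads, not inside the touch handler,
    // so the allocation cost is paid up front.
    private let laserSound = SKAction.playSoundFileNamed("laser.wav",
                                                         waitForCompletion: false)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        run(laserSound)   // reuses the preloaded action, no per-call setup
    }
}
#endif

// The same reuse idea without SpriteKit: build the expensive value once,
// hand back the cached copy on every later access.
final class Preloaded<Value> {
    private var cached: Value?
    private let build: () -> Value
    private(set) var buildCount = 0

    init(_ build: @escaping () -> Value) { self.build = build }

    var value: Value {
        if let v = cached { return v }
        buildCount += 1
        let v = build()
        cached = v
        return v
    }
}
```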

Dim volume of iPodMusicPlayer to 50% in iOS

I know that you can easily set the volume property of the music player, but I want to do it smoothly, like Google Maps does when playing voice-over navigation instructions.
I was wondering what the best way to do this is.
Thanks!
I would try using a repeating NSTimer. Every time the timer fires, you lower the volume a bit; when it reaches the target value, you invalidate the timer.
Other ways of getting a repeated event (so that you can do something in stages gradually over time) are DISPATCH_SOURCE_TYPE_TIMER and CADisplayLink. But I think a timer is probably the simplest way to get started.
If you have a pre-existing sound that you're playing, a completely different solution is to apply a fadeout to it before you start playing it (and then just play it all at the same volume, because the sound itself fades out). AVFoundation gives you the tools to do that (e.g. setVolumeRampFromStartVolume:toEndVolume:timeRange:).
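A sketch of the repeating-timer approach. The fade curve is kept as a standalone helper so you can tune it separately; note that MPMusicPlayerController's volume property (the iPodMusicPlayer era API) has since been deprecated in favor of MPVolumeView, so this is era-appropriate rather than current best practice:

```swift
import Foundation
#if canImport(MediaPlayer)
import MediaPlayer
#endif

// Linear fade levels from `start` to `target` over `steps` ticks;
// each timer fire applies the next value.
func fadeLevels(from start: Float, to target: Float, steps: Int) -> [Float] {
    guard steps > 0 else { return [target] }
    return (1...steps).map { start + (target - start) * Float($0) / Float(steps) }
}

#if canImport(MediaPlayer)
// systemMusicPlayer is the modern name for what was iPodMusicPlayer.
func dimMusicPlayer(to target: Float = 0.5, over duration: TimeInterval = 0.5) {
    let player = MPMusicPlayerController.systemMusicPlayer
    let levels = fadeLevels(from: player.volume, to: target, steps: 10)
    var index = 0
    Timer.scheduledTimer(withTimeInterval: duration / 10, repeats: true) { timer in
        player.volume = levels[index]   // deprecated API, kept for illustration
        index += 1
        if index == levels.count { timer.invalidate() }  // reached target
    }
}
#endif
```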

Examples on Drum Sim Build for iOS

I'm trying to build a drum simulator in Xcode for an app for my brother-in-law's band. I'm having a little trouble doing this. What I want to do is use a picture of the drummer's drum set and use that as the drum kit you can play on the iPhone. I was thinking it could work kind of like an image map in HTML, where the buttons would play the sound of the drum underneath them. If anyone has any ideas or how-tos on this, it would be greatly appreciated.
Thanks
A very basic approach:
Use a UIImageView with an image of the drumkit as background.
Create a transparent UIView in front of each drum and attach a UITapGestureRecognizer to each one. Each gesture recognizer should call a different method for its view. Use AVAudioPlayer to play the sound.
That's just a very basic app, but not a bad start. Once you've done that, your next tasks could be:
optimize performance: sound should play the moment you touch the display
multitouch
use core motion to detect the strength of your tap and adjust the sound
A UIImageView with some nice invisible buttons (e.g. UIButton's tintColor property, as in button.tintColor = [UIColor clearColor];) over the skins would be perfect. Then take a look at how to play some audio over here: http://blogs.x2line.com/al/archive/2011/05/19/3831.aspx
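The image-map idea above boils down to a hit test: which drum region, if any, contains the tap? A sketch in Swift, where the region names and rectangles are placeholders you would trace from the actual drumkit photo, and the view controller wiring assumes one preloaded AVAudioPlayer per drum:

```swift
import Foundation

// Regions are in the image's coordinate space.
struct DrumRegion {
    let name: String
    let rect: (x: Double, y: Double, width: Double, height: Double)

    func contains(_ point: (x: Double, y: Double)) -> Bool {
        point.x >= rect.x && point.x < rect.x + rect.width &&
        point.y >= rect.y && point.y < rect.y + rect.height
    }
}

// First matching region wins, like an HTML image map.
func drumHit(at point: (x: Double, y: Double),
             in regions: [DrumRegion]) -> String? {
    regions.first { $0.contains(point) }?.name
}

#if canImport(UIKit) && canImport(AVFoundation)
import UIKit
import AVFoundation

final class DrumViewController: UIViewController {
    @IBOutlet var kitImageView: UIImageView!
    // Placeholder region; trace these from the photo of the kit.
    let regions = [DrumRegion(name: "snare",
                              rect: (x: 100, y: 200, width: 80, height: 80))]
    var players: [String: AVAudioPlayer] = [:]   // preloaded, one per drum

    override func viewDidLoad() {
        super.viewDidLoad()
        kitImageView.isUserInteractionEnabled = true
        kitImageView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(tapped)))
    }

    @objc func tapped(_ g: UITapGestureRecognizer) {
        let p = g.location(in: kitImageView)
        if let name = drumHit(at: (Double(p.x), Double(p.y)), in: regions) {
            players[name]?.currentTime = 0   // restart so fast taps retrigger
            players[name]?.play()
        }
    }
}
#endif
```

Keeping the hit test separate from the view code also makes it trivial to support multitouch later: run each touch point through drumHit independently.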
