Playing a sound while drawing a line with Sprite Kit - ios

I am looking into making a drawing app with Sprite Kit on iOS, using either Swift or Objective-C.
There is a tutorial here that shows how to draw a line with Sprite Kit. This is great, but for my app I want more: I want the app to play sound effects while the line is being drawn. The tone of the sound will depend on the speed at which the user draws the line; the faster the user moves their finger, the higher the pitch. All I have found regarding playing sounds in Sprite Kit is background music and playing a single sound. Can someone point me in the right direction to accomplish my goal?

You'll probably want a chromatic set of samples, ranging from the lowest to the highest tone you want to play, each at a very short duration (maybe 0.25 seconds? You'll need to experiment, but they will all have to be the same duration).
Let's say you have 30 different samples, line-0.wav .. line-29.wav. Then all you need to do is calculate the velocity of the user's drag and use some function to map the possible range of drag velocities onto the integer range of sample indices. As long as the drag is in progress, play the appropriate sound repeatedly, as sketched below.
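A minimal Swift sketch of that mapping, assuming the hypothetical sample names above and velocity bounds (minVelocity, maxVelocity) that you would tune by experiment:

```swift
import SpriteKit

class DrawingScene: SKScene {
    // Hypothetical sample set: line-0.wav (lowest) ... line-29.wav (highest).
    private let sampleCount = 30
    // Assumed velocity bounds in points/second; tune to the drags you observe.
    private let minVelocity: CGFloat = 50
    private let maxVelocity: CGFloat = 2000

    private var lastPoint: CGPoint = .zero
    private var lastTime: TimeInterval = 0

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        lastPoint = touch.location(in: self)
        lastTime = touch.timestamp
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)
        let dt = touch.timestamp - lastTime
        guard dt > 0 else { return }

        // Drag velocity in points per second.
        let distance = hypot(point.x - lastPoint.x, point.y - lastPoint.y)
        let velocity = distance / CGFloat(dt)

        // Map the clamped velocity linearly onto the sample index range.
        let clamped = min(max(velocity, minVelocity), maxVelocity)
        let t = (clamped - minVelocity) / (maxVelocity - minVelocity)
        let index = Int(t * CGFloat(sampleCount - 1))

        // Fire-and-forget playback; the short samples overlap into a tone.
        run(SKAction.playSoundFileNamed("line-\(index).wav", waitForCompletion: false))

        lastPoint = point
        lastTime = touch.timestamp
    }
}
```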

Related

How To Increase Callback Frequency of AVAudioEngine.installTap()

I'm trying to create an audio visualizer, and I'm using the Accelerate framework functions to compute the root mean square value of the buffer data to get a uniform scalar for my Metal shader.
I have an onscreen MIDI controller where users press buttons to make a sound.
The sounds I'm playing are very short. For example, I have sound buttons the user presses to make a beat; each sound lasts only about 0.2-0.4 s, and I only get about 3-4 callbacks during playback.
My visualizer looks quite awful and choppy, as it just snaps to 4 different sizes per button press rather than transitioning smoothly.
I'm going for a look like this:
Do I install a second tap? Should I try and interpolate the points to "fake" the transitions?
Ideally, I'd like something along the lines of 10-15 frames/second.
Since there is no way to increase the frequency, I resorted to interpolating the previous value with the current value to smooth the transition.
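A sketch of that interpolation in Swift (the LevelSmoother type is illustrative, not from any library): the tap updates a target level, and the render loop eases the displayed value toward it each frame:

```swift
import AVFoundation
import Accelerate

/// Smooths the sparse RMS values coming out of installTap so the
/// visualizer can animate at render frame rate.
final class LevelSmoother {
    private(set) var displayLevel: Float = 0
    private var targetLevel: Float = 0

    /// Call from the tap block (only ~3-4 times per short sound).
    func update(with buffer: AVAudioPCMBuffer) {
        guard let samples = buffer.floatChannelData?[0] else { return }
        var rms: Float = 0
        vDSP_rmsqv(samples, 1, &rms, vDSP_Length(buffer.frameLength))
        targetLevel = rms
    }

    /// Call once per render frame (e.g. from a CADisplayLink or the
    /// Metal draw loop); `factor` controls how fast we chase the target.
    func step(factor: Float = 0.2) -> Float {
        displayLevel += (targetLevel - displayLevel) * factor
        return displayLevel
    }
}
```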

Hit Animations in SpriteKit

I've been playing around with Xcode using Swift for a while. I recently got into using SpriteKit.
I wanted to experiment with animations: when the player shoots a bullet at an enemy, to show that the sprite took damage (or was contacted) it would switch the sprite image for half a second and then switch back to the regular image.
I guess an example of this would be Super Mario RPG: when Mario hits an enemy, their eyes pop out.
Is this a good way to approach the hit animation by switching the images back and forth or is there an easier way?
Switching the texture would work, but gives a sudden change. Animating the sprite through a short series of textures (e.g. eyes beginning to pop, eyes halfway popped, eyes popped) would look better and can be implemented easily using SKActions. See Apple's Sprite Kit Programming Guide, "Changing a Sprite's Texture".
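A minimal sketch of that frame sequence with SKActions; the texture names are placeholders for your own hit-reaction frames:

```swift
import SpriteKit

// Assumed assets: three "pop" frames plus the enemy's normal texture.
let hitFrames = [
    SKTexture(imageNamed: "enemy-eyes-popping-1"),
    SKTexture(imageNamed: "enemy-eyes-popping-2"),
    SKTexture(imageNamed: "enemy-eyes-popped")
]
let normalTexture = SKTexture(imageNamed: "enemy-normal")

// Play the pop frames, hold briefly, then restore the regular image.
let hitAnimation = SKAction.sequence([
    SKAction.animate(with: hitFrames, timePerFrame: 0.1),
    SKAction.wait(forDuration: 0.2),
    SKAction.setTexture(normalTexture)
])
enemy.run(hitAnimation)  // `enemy` is your SKSpriteNode
```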

iOS record audio and draw waveform like Voice Memos

I'm going to ask this at the risk of being too vague or asking too many things in one question, but I'm really just looking for a point in the right direction.
In my app I want to record audio, show a waveform while recording, and scroll through the waveform to record and playback from a specified time. For example, if I have 3 minutes of audio, I should be able to scroll back to 2:00 and start recording from there to fix a mistake.
In Voice Memos, this is accomplished instantaneously, without any delay or loading time. I'm trying to figure out how they did this, if anyone has a clue.
What I've tried:
EZAudio - This library is great, but doesn't do what I want. You can't scroll through the waveform; it deletes the waveform data at the beginning and begins appending to the end once it reaches a certain length.
SCWaveformView - This waveform is nice, but it uses images. Once the waveform is too long, putting it in a scroll view causes really jittery scrolling. Also, you can't build the waveform while recording, only afterward.
As far as appending, I've used this method: https://stackoverflow.com/a/11520553/1391672
But there is significant processing time, even when appending two very short clips of audio together (in my experience).
How does Voice Memos do what it does? Do you think the waveform is drawn in OpenGL or CoreGraphics? Are they using Core Audio or AVAudioRecorder? Has anyone built anything like this that can point me in the right direction?
When zoomed in, a scroll view only needs to draw the small portion of the waveform that is visible. When zoomed out, a graph view might draw only every Nth point of the audio buffer, or use some other DSP down-sampling algorithm on the data before rendering. This likely has to be done using your own custom drawing or graphics rendering code inside a UIScrollView or similar custom controller. The waveform rendering code used during recording doesn't have to be the same as the one used afterward.
The recording API and the drawing API you use can be completely independent, and can be almost anything, from OpenGL to Metal to Core Graphics (on newer faster devices). On the audio end, Core Audio will help provide the lowest latency, but Audio Queues and the AVAudioEngine might also be suitable.
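As a sketch of the down-sampling step in Swift (the function name and peak-per-pixel bucketing are just one possible choice; vDSP_maxmgv computes the magnitude peak of each bucket):

```swift
import Accelerate

/// Reduces a raw sample buffer to one peak value per on-screen pixel
/// column, so the scroll view only draws what is visible.
func downsample(_ samples: [Float], toPixelCount pixelCount: Int) -> [Float] {
    guard pixelCount > 0, !samples.isEmpty else { return [] }
    let samplesPerPixel = max(1, samples.count / pixelCount)
    var peaks = [Float](repeating: 0, count: pixelCount)
    for pixel in 0..<pixelCount {
        let start = pixel * samplesPerPixel
        let end = min(start + samplesPerPixel, samples.count)
        guard start < end else { break }
        var peak: Float = 0
        // Maximum magnitude (peak of |x|) over this bucket.
        samples[start..<end].withUnsafeBufferPointer { ptr in
            vDSP_maxmgv(ptr.baseAddress!, 1, &peak, vDSP_Length(end - start))
        }
        peaks[pixel] = peak
    }
    return peaks
}
```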

What logic is used for creating an Equalizer meter

Basically I'm going to be working on an iOS music app which, when a song is being played, shows a fancy equalizer meter, something like this but with all the animation of bars going up and down:
After looking into this and not finding enough resources, I really want to take this on as a project, perhaps also making a web version using jQuery.
I'm not really asking for specific code; I just want to know how the animation works in general.
Thanks a million!
Check out the Cocoa Waveform Audio Player Control project. It's a Cocoa audio player component which displays the waveform of the audio file.
Also, there are already a lot of questions on this topic:
iOS FFT Accerelate.framework draw spectrum during playback
Using the Apple FFT and Accelerate Framework
iOS FFT Draw spectrum
The animation would be pretty straightforward: it's just animating changes in the heights of rectangles, as sketched below.
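A rough Swift sketch of that idea, assuming the bars are plain UIViews laid out side by side and the levels (normalized to 0...1) come from whatever FFT/level analysis you use:

```swift
import UIKit

/// Animates each bar of the meter toward its new level.
/// `bars` and `levels` are assumed to be the same length.
func updateMeter(bars: [UIView], levels: [Float], maxHeight: CGFloat) {
    UIView.animate(withDuration: 0.05) {
        for (bar, level) in zip(bars, levels) {
            let height = maxHeight * CGFloat(level)
            // Grow upward from the bottom edge of the meter.
            bar.frame.size.height = height
            bar.frame.origin.y = maxHeight - height
        }
    }
}
```

Call this each time a new set of levels arrives; the short animation duration keeps the bars moving smoothly between updates.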

Transferring billiard game state from player to player with box2d

I'm using Union server for my iOS (Starling) billiard game.
So far the connections work great.
My question is:
How would you handle transferring the ball positions from the opponent?
Let's say I make the break and I want to replicate that shot on the other player's device.
Do you think it's a good idea to send a message over Union every frame (x, y)?
Will this cause latency problems?
First, about your question:
The game is installed on both devices and the rules are the same, so on a shot, send the white ball's force and any other properties you modify. On the receiving device, apply that force, etc., and the ball will repeat the action performed by the sending player.
Now the sad part:
I'll be the first to disappoint you: even if you solve your problem and send messages from player to player without issues, the result won't be pleasant. Box2D calculations are optimized for performance and computed with approximate accuracy, so even on the same device the balls will end up in different locations on different runs. You won't notice it after one or two hits, but after playing for a minute you'll end up with different ball locations.
You can also send a message with every ball's position once all the balls stop moving, and relocate the remote user's balls accordingly. After that "correcting" message is received, return control to the player. A sketch of both message types follows.
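As a sketch of what those two messages could look like (shown in Swift for illustration, though the game layer here is Starling/ActionScript; all type names are hypothetical):

```swift
import Foundation

// Sent when the player shoots: only the cue ball's impulse plus any
// other properties you modify, so the opponent's simulation replays it.
struct ShotMessage: Codable {
    let impulseX: Float
    let impulseY: Float
}

// Sent after all balls stop moving: authoritative positions used to
// correct the drift from Box2D's non-reproducible results.
struct BallState: Codable {
    let id: Int
    let x: Float
    let y: Float
}

struct SnapshotMessage: Codable {
    let balls: [BallState]
}

// Serialize for whatever transport Union gives you (string or bytes).
func encode<T: Encodable>(_ message: T) throws -> Data {
    try JSONEncoder().encode(message)
}

// Receiver side: relocate the local balls to the authoritative
// positions, then return control to the player.
func apply(_ data: Data, relocate: (BallState) -> Void) throws {
    let snapshot = try JSONDecoder().decode(SnapshotMessage.self, from: data)
    snapshot.balls.forEach(relocate)
}
```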
We had a similar game, and I just wrote my own 2D engine. If you're only dealing with ball-to-ball and ball-to-rectangle collisions, it's easy to write your own.
