Animated audio play button on iPhone - iOS

I am playing audio in my app through FMOD. My media player currently just has a play button with a slider that tracks your progress through the audio clip. What I want is basically a slider that wraps around the play button, showing the progress there.
For example, the play button in jplayer: http://www.jplayer.org/
I have looked around but have not found any libraries out there or any tutorials on how to implement this in Objective-C.
Any help/suggestions?

Here are some pages you might find useful.
Intro to Quartz - explains the basics of drawing with Quartz
Drawing Pie Charts - explains how to draw arcs of circles
Advanced Drawing Guide - for making it look pretty
Basically, using these tutorials, you should be able to draw the arc of a circle behind your play button, extending outside the button itself. The angle of the arc should be the progress through the audio file as a fraction (i.e. 0.0 to 1.0), multiplied by 360 degrees.
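For instance, here is a minimal sketch in Swift (the question asks about Objective-C, but the drawing calls are the same) of a view that strokes such an arc; the view name and line width are illustrative, and you would place the play button inside this view and update progress as playback advances:

    import UIKit

    class ProgressRingView: UIView {
        // Fraction of the clip that has played, 0.0 ... 1.0
        var progress: CGFloat = 0 { didSet { setNeedsDisplay() } }

        override func draw(_ rect: CGRect) {
            let center = CGPoint(x: bounds.midX, y: bounds.midY)
            let radius = min(bounds.width, bounds.height) / 2 - 4
            // Start at 12 o'clock and sweep progress * 360 degrees clockwise
            let start = -CGFloat.pi / 2
            let end = start + progress * 2 * .pi
            let path = UIBezierPath(arcCenter: center, radius: radius,
                                    startAngle: start, endAngle: end, clockwise: true)
            path.lineWidth = 4
            tintColor.setStroke()
            path.stroke()
        }
    }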
Do keep in mind the user interface implications of this design. Having a circular progress bar will make it very difficult for the user to seek to different parts of the audio file, if that's a feature you support.

Related

How to play video over SKScene

Basically I have an SKScene, and I want to play a video over the scene. The video is confetti falling with an alpha background; it will play when the player gets a high score. I am using an SKScene with shapes and images drawn with shape nodes and image nodes. I was just wondering if anyone could please tell me how to play the video over the scene while still seeing the game behind it, and still being able to touch the buttons through the video. It is supposed to look like an animation playing.
I am using a video because I was just thinking that playing a video would be more processor efficient than having the game generate particles.
There is no built-in iOS solution. You can play 24BPP (fully opaque) movies under iOS, but the only built-in way to display alpha channel video would be to load a series of PNG images with alpha. The downside is that this takes up a huge amount of memory and bloats the app download. If you want to look at some working examples of this kind of functionality with a 3rd party app, see Alpha Channel Examples. You might also be interested in this blog post, which shows example code for how alpha channel textures in OpenGL could be implemented on top of SpriteKit too. The cube example shows rendering an alpha channel movie onto a cube; it was adapted from a Ray Wenderlich tutorial.
Here is an answer showing how to do that with GPUImageView. There is also a project on GitHub here and a similar question from Stack Overflow.
The video stack doesn't yet support formats with alpha. For confetti, you should use SKEmitterNode. Size it to the area you envisioned for your video, and see Creating Particle Effects, i.e., its link to Add a particle emitter to your project and try out the "Snow" effect. It looks more like confetti when you give it a different color than white. Click the dot under "Color Ramp" to set the color.
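If you would rather configure the emitter in code than in the editor, here is a rough sketch (assuming a particle file named Confetti.sks saved from the Snow template, run inside your SKScene subclass; the color and ranges are illustrative):

    if let emitter = SKEmitterNode(fileNamed: "Confetti.sks") {
        // Emit across the full scene width, from just above the top edge
        emitter.position = CGPoint(x: frame.midX, y: frame.maxY)
        emitter.particlePositionRange = CGVector(dx: frame.width, dy: 0)
        // Clear the template's color ramp so particleColor takes effect
        emitter.particleColorSequence = nil
        emitter.particleColor = .red
        emitter.particleColorBlendFactor = 1.0
        addChild(emitter)
    }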

Playing a sound while drawing a line with Sprite Kit

I am looking into making a drawing app with Sprite Kit in iOS, with either Swift or Objective-C.
There is a tutorial here that shows how to draw a line with Sprite Kit. This is great, but for my app I want more: I want the app to play sound effects while the line is being drawn. The pitch of the sound will depend on the speed at which the user is drawing the line; the faster the user moves their finger, the higher the pitch. All I have found regarding playing sounds in Sprite Kit is background music and playing a single sound. Can someone point me in the right direction to accomplish my goal?
You'll probably want to have a chromatic set of samples, ranging from the lowest to the highest possible tone you want to play, at a very short duration (maybe 0.25 seconds? You'll need to experiment, but they will all have to be the same duration).
Let's say you have 30 different samples, line-0.wav .. line-29.wav. Then all you need to do is calculate the velocity of the user's drag appropriately, and use some function to map the possible range of drag velocity to the integer range of sample indices. As long as the drag is in progress, play the appropriate sound repeatedly.
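A minimal sketch of that mapping in Swift, assuming the samples are in the app bundle and touch handling lives in your SKScene subclass (the 2000 points/second velocity ceiling is a guess you would tune):

    import SpriteKit

    class DrawingScene: SKScene {
        private var lastLocation = CGPoint.zero
        private var lastTimestamp: TimeInterval = 0
        private let sampleCount = 30
        private let maxVelocity: CGFloat = 2000   // points per second

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            lastLocation = touch.location(in: self)
            lastTimestamp = touch.timestamp
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            let location = touch.location(in: self)
            let dt = touch.timestamp - lastTimestamp
            guard dt > 0 else { return }
            // Drag speed, clamped and mapped onto the sample indices 0...29
            let speed = hypot(location.x - lastLocation.x,
                              location.y - lastLocation.y) / CGFloat(dt)
            let index = Int(min(speed / maxVelocity, 1) * CGFloat(sampleCount - 1))
            run(SKAction.playSoundFileNamed("line-\(index).wav", waitForCompletion: false))
            lastLocation = location
            lastTimestamp = touch.timestamp
        }
    }

In practice you would also throttle this so a new sample only starts after the previous one finishes, rather than triggering on every touch event.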

What logic is used for creating an Equalizer meter

Basically I'm going to be working on an iOS music app which, when a song is playing, shows a fancy equalizer meter, something like this but with all the animation of bars going up and down:
After looking into this and not finding enough resources, I really want to take this on as a project, perhaps making a web version using jQuery.
I'm not really asking for specific code, I just want to know how the animation works in general?
Thanks a million!
Check out the Cocoa Waveform Audio Player Control project. It's a Cocoa audio player component which displays the waveform of the audio file.
Also, there are already a lot of questions on this topic:
iOS FFT Accerelate.framework draw spectrum during playback
Using the Apple FFT and Accelerate Framework
iOS FFT Draw spectrum
Animation would be pretty straightforward: it is just animating changes to the heights of rectangles.
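As a hedged sketch of that idea in Swift: once your FFT code hands you one level per band (normalized to 0...1), set each bar layer's frame inside a CATransaction and let Core Animation interpolate the height change (the names and the 0.05 s duration are illustrative):

    import QuartzCore

    func updateBars(_ barLayers: [CALayer], levels: [Float], containerHeight: CGFloat) {
        CATransaction.begin()
        CATransaction.setAnimationDuration(0.05)   // short, so the bars track the audio
        for (layer, level) in zip(barLayers, levels) {
            var frame = layer.frame
            let newHeight = CGFloat(level) * containerHeight
            frame.origin.y = containerHeight - newHeight   // grow upward from the baseline
            frame.size.height = newHeight
            layer.frame = frame   // standalone sublayers animate this change implicitly
        }
        CATransaction.commit()
    }

Call it each time a new buffer of FFT magnitudes arrives; the implicit animations smooth the bars between updates.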

Adding cross dissolve effect - iOS video creation

I would like to create a video from a number of images and add a cross dissolve effect between the images.
How can this be done? I know images can be written into a video file, but I don't see where to apply an effect. Should each image be turned into a video, and then the videos written into the final video with the transition effect?
I have searched around and cannot find much information on how this can be done, e.g. how to use AVMutableComposition and if it is viable to create videos consisting of individual images, to then apply the cross dissolve effect.
Any information will be greatly appreciated.
If you want to dig around in the bowels of AVFoundation for this, I strongly suggest you take a look at this presentation, especially starting at slide 74. Be prepared to do a large amount of work to pull this off...
If you'd like to get down to business several orders of magnitude faster, and don't mind incorporating a 3rd party library, I'd highly recommend you try GPUImage
You'll find it quite simple to push images into a video and swap them out at will, as well as apply any number of blend filters to the transitions, simply by varying a single mix property of your blend filter over the time your transition happens.
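For a feel of the API, here is a rough Swift sketch of dissolving two still images with GPUImage's GPUImageDissolveBlendFilter (imageA and imageB are placeholders, the movie-writing side is omitted, and the exact calls should be checked against the GPUImage documentation):

    let fromPicture = GPUImagePicture(image: imageA)!
    let toPicture = GPUImagePicture(image: imageB)!
    let dissolve = GPUImageDissolveBlendFilter()

    fromPicture.addTarget(dissolve)
    toPicture.addTarget(dissolve)

    // 0.0 shows the first image, 1.0 the second; step this per output frame
    dissolve.mix = 0.5
    dissolve.useNextFrameForImageCapture()
    fromPicture.processImage()
    toPicture.processImage()
    let blendedFrame = dissolve.imageFromCurrentFramebuffer()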
I'm doing this right now. To make things short: you need an AVPlayer into which you put an AVPlayerItem that can be empty. You then need to set the forwardPlaybackEndTime of your AVPlayerItem to the duration of your animation. You then create an AVPlayerLayer that you initialize with your AVPlayer (actually, you may not need the AVPlayerLayer if you will not put video in your animation).
Then the important part: you create an AVSynchronizedLayer that you initialize with your previous AVPlayerItem. This AVSynchronizedLayer and any sublayer it holds will be synchronized with your AVPlayerItem. You can then create a simple CALayer holding your image (through the contents property) and add your CAKeyframeAnimation to the layer on the opacity property. Now any animation on those sublayers will follow the time of your AVPlayerItem. To start the animation, simply call play on your AVPlayer.
That's the theory for playback. If you want to export this animation to an mp4, you will need to use AVVideoCompositionCoreAnimationTool, but it's pretty similar.
For a code example, see code snippet to create animation.
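In the meantime, here is a minimal sketch of the playback setup described above (containerView, myImage, and the 5-second duration are placeholders):

    import AVFoundation
    import UIKit

    // An empty composition stands in for the "empty" AVPlayerItem above
    let playerItem = AVPlayerItem(asset: AVMutableComposition())
    playerItem.forwardPlaybackEndTime = CMTime(seconds: 5, preferredTimescale: 600)
    let player = AVPlayer(playerItem: playerItem)

    // Everything under the synchronized layer follows the item's timeline
    let syncLayer = AVSynchronizedLayer(playerItem: playerItem)
    syncLayer.frame = containerView.bounds

    let imageLayer = CALayer()
    imageLayer.frame = syncLayer.bounds
    imageLayer.contents = myImage.cgImage

    let fade = CABasicAnimation(keyPath: "opacity")
    fade.fromValue = 1.0
    fade.toValue = 0.0
    fade.duration = 2.0
    fade.beginTime = AVCoreAnimationBeginTimeAtZero   // a literal 0 would mean "now"
    fade.isRemovedOnCompletion = false
    imageLayer.add(fade, forKey: "fade")

    syncLayer.addSublayer(imageLayer)
    containerView.layer.addSublayer(syncLayer)
    player.play()   // starts the animation on the item's clock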

iOS: Draw on top of AV video, then save the drawing in the video file

I'm working on an iPad app that records and plays videos using AVFoundation classes. I have all of the code for basic record/playback in place, and now I would like to add a feature that allows the user to draw and make annotations on the video, something I believe will not be too difficult. The harder part, and something that I have not been able to find any examples of, will be to combine the drawing and annotations into the video file itself. I suspect this part is accomplished with AVComposition but have no idea exactly how. Your help would be greatly appreciated.
Mark
I do not think that you can actually save a drawing into a video file in iOS. You could however consider using a separate view to save the drawing and synchronize the overlay onto the video using a transparent view. In other words, the user circled something at time 3 mins 42 secs in the video. Then when the video is played back you overlay the saved drawing onto the video at the 3:42 mark. It's not what you want but I think it is as close as you can get right now.
EDIT: Actually there might be a way after all. Take a look at this tutorial. I have not read the whole thing but it seems to incorporate the overlay function you need.
http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
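The core trick from that tutorial, sketched in Swift (asset and outputURL are placeholders): render the video into one layer and your drawing into a sibling layer, then hand both to AVVideoCompositionCoreAnimationTool at export time:

    import AVFoundation

    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)

    let parentLayer = CALayer()
    let videoLayer = CALayer()
    let overlayLayer = CALayer()   // your annotations get drawn into this layer
    parentLayer.frame = CGRect(origin: .zero, size: videoComposition.renderSize)
    videoLayer.frame = parentLayer.frame
    overlayLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let export = AVAssetExportSession(asset: asset,
                                      presetName: AVAssetExportPresetHighestQuality)
    export?.videoComposition = videoComposition
    export?.outputURL = outputURL
    export?.outputFileType = .mp4
    export?.exportAsynchronously {
        // inspect export?.status and export?.error here
    }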
