Basically, I have an SKScene and I want to play a video over it. The video is confetti falling over an alpha background, and it plays when the player gets a high score. The scene is built from shapes and images drawn with shape nodes and image nodes. Could anyone tell me how to play the video over the scene so that the game is still visible behind it and the buttons can still be touched through the video? It's supposed to look like an animation playing.
I am using a video because I figured playing one would be more processor-efficient than having the game generate particles.
There is no built-in iOS solution. You can play 24BPP (fully opaque) movies under iOS, but the only built-in way to display alpha-channel video would be to load a series of PNG images with alpha. The downside is that this takes up a huge amount of memory and bloats the app download. If you want to see some working examples of this kind of functionality in a 3rd party app, see Alpha Channel Examples. You might also be interested in this blog post, which shows example code for how alpha-channel textures in OpenGL could be implemented on top of SpriteKit too. The cube example shows rendering an alpha-channel movie onto a cube; it was adapted from a Ray Wenderlich tutorial.
Here is an answer showing how to do that with GPUImageView. There is also a project on GitHub here and a similar question on Stack Overflow.
The video stack doesn't yet support formats with alpha. For confetti, you should use SKEmitterNode. Size it to the area you envisioned for your video, and see Creating Particle Effects, specifically its link to Add a particle emitter to your project, and try out the "Snow" effect. It looks more like confetti when you give it a color other than white. Click the dot under "Color Ramp" to set the color.
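If it helps, here is a rough Swift sketch of the same idea built in code rather than from an .sks file; the texture name and particle values below are placeholders to tune, not the settings from Xcode's "Snow" template.

```swift
import SpriteKit

// Minimal confetti sketch; "spark" is a placeholder texture name and the
// numbers are illustrative values, not tuned settings.
func showConfetti(in scene: SKScene) {
    let emitter = SKEmitterNode()
    emitter.particleTexture = SKTexture(imageNamed: "spark")   // any small square image
    emitter.particleBirthRate = 150
    emitter.particleLifetime = 6
    emitter.particlePositionRange = CGVector(dx: scene.size.width, dy: 0)
    emitter.emissionAngle = -CGFloat.pi / 2                    // fall downward
    emitter.particleSpeed = 180
    emitter.particleSpeedRange = 80
    emitter.particleRotationRange = CGFloat.pi * 2
    emitter.particleColorBlendFactor = 1
    emitter.particleColorSequence = SKKeyframeSequence(
        keyframeValues: [SKColor.red, SKColor.yellow, SKColor.green, SKColor.cyan],
        times: [0, 0.33, 0.66, 1]
    )
    emitter.position = CGPoint(x: scene.size.width / 2, y: scene.size.height)
    emitter.zPosition = 100                                    // above the game nodes

    scene.addChild(emitter)

    // Stop emitting after a few seconds so it reads like a one-shot animation.
    emitter.run(.sequence([.wait(forDuration: 4),
                           SKAction.run { emitter.particleBirthRate = 0 },
                           .wait(forDuration: 6),
                           .removeFromParent()]))
}
```

Because the emitter is just another node in the scene (and not user-interactive by default), the buttons underneath keep receiving touches, which is the behavior the question asks for.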
Related
I'm currently working on an app that records video of the user and binds particle emitters to hand landmarks on the preview. As a result, the effects are shown on the camera preview, but they aren't captured in the recorded video.
I saw a great tutorial on AVVideoCompositionCoreAnimationTool https://www.raywenderlich.com/6236502-avfoundation-tutorial-adding-overlays-and-animations-to-videos, but that way it's only possible to render animations onto already recorded video.
I wonder if there is any way to use AVVideoCompositionCoreAnimationTool to record video from the camera and the animations in real time, or if you know another method to do it without diving deep into Metal and so on.
Thanks in advance!
I want to experiment with 360° videos and I have a little idea, but I don't know how to solve it. I want to embed a video of a street, display the street names, and draw colored lines on the roads. When I move the view of the video (for example, turning 45°), I want the lines and street names to move with the video and always stay in place.
Do you have an approach I could take?
I understand that you want a 360° video with street names? It will be easiest to just render everything on top of that video. You will need to render the AR layer as an equirectangular 360x180 video in software like Blender and then merge it with the video you have in something like Adobe After Effects.
An alternative would be to do the AR layer in a 3d engine like Unity and put the video in the background.
It's hard to suggest anything more specific, as what you're asking is very broad.
Basically, I'm going to be working on an iOS music app which, when a song is playing, shows a fancy equalizer meter, something like this but with the bars animating up and down:
After looking into this and not finding enough resources, I really want to take this on as a project, perhaps making a web version using jQuery.
I'm not really asking for specific code; I just want to know how the animation works in general.
Thanks a million !!!
Check out the Cocoa Waveform Audio Player Control project. It's a Cocoa audio player component that displays the waveform of the audio file.
Also, there are already a lot of questions on this topic:
iOS FFT Accerelate.framework draw spectrum during playback
Using the Apple FFT and Accelerate Framework
iOS FFT Draw spectrum
The animation would be pretty straightforward: it's just animating changes to the heights of rectangles.
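For illustration, here is a minimal Swift sketch of just that bar animation, using plain UIViews and random levels standing in for the real FFT magnitudes you'd compute with the links above.

```swift
import UIKit

// Bar-animation sketch only; the random `level` stands in for a per-band
// FFT magnitude from the audio you're playing.
final class EqualizerView: UIView {
    private var bars: [UIView] = []
    private var timer: Timer?

    func startAnimating(barCount: Int = 10) {
        bars.forEach { $0.removeFromSuperview() }
        let spacing = bounds.width / CGFloat(barCount)
        bars = (0..<barCount).map { index in
            let bar = UIView()
            bar.backgroundColor = .systemGreen
            bar.frame = CGRect(x: CGFloat(index) * spacing, y: bounds.height,
                               width: spacing * 0.7, height: 0)
            addSubview(bar)
            return bar
        }

        // Re-target every bar's height a few times per second and let UIKit
        // animate the change.
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            UIView.animate(withDuration: 0.1) {
                for bar in self.bars {
                    let level = CGFloat.random(in: 0.1...1.0)   // replace with FFT output
                    let height = self.bounds.height * level
                    bar.frame.origin.y = self.bounds.height - height
                    bar.frame.size.height = height
                }
            }
        }
    }

    func stopAnimating() {
        timer?.invalidate()
        timer = nil
    }
}
```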
I am playing audio in my app through FMOD. In my media player I currently just have a play button with a slider that tracks your progress through the audio clip. What I want is basically a slider that wraps around the play button, showing the progress there.
For example, the play button in jplayer: http://www.jplayer.org/
I have looked around but have not found any libraries out there or any tutorials on how to implement this in Objective-C.
Any help/suggestions?
Here are some pages you might find useful.
Intro to Quartz - explains the basics of drawing with Quartz
Drawing Pie Charts - explains how to draw arcs of circles
Advanced Drawing Guide - for making it look pretty
Basically, using these tutorials, you should be able to draw the arc of a circle behind your play button, which extends outside the button itself. The angle of the arc should be the percentage of the progress through the audio file as a float (i.e. 0.0 to 1.0), multiplied by 360 degrees.
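As a sketch of that in Swift (the question mentions Objective-C, but the same approach translates directly), a CAShapeLayer with a trimmed stroke gives you the arc without custom Quartz drawing; the button, line width, and color below are placeholders.

```swift
import UIKit

// Ring-style progress indicator drawn around an existing play button.
final class ProgressRing {
    private let ringLayer = CAShapeLayer()

    init(around playButton: UIView, lineWidth: CGFloat = 4) {
        let radius = max(playButton.bounds.width, playButton.bounds.height) / 2 + lineWidth
        let center = CGPoint(x: playButton.bounds.midX, y: playButton.bounds.midY)

        // Full circle starting at 12 o'clock; strokeEnd trims it to the progress
        // fraction, equivalent to multiplying the 0.0–1.0 progress by 360 degrees.
        let path = UIBezierPath(arcCenter: center,
                                radius: radius,
                                startAngle: -CGFloat.pi / 2,
                                endAngle: 1.5 * CGFloat.pi,
                                clockwise: true)
        ringLayer.path = path.cgPath
        ringLayer.fillColor = UIColor.clear.cgColor
        ringLayer.strokeColor = UIColor.systemBlue.cgColor
        ringLayer.lineWidth = lineWidth
        ringLayer.strokeEnd = 0
        playButton.layer.addSublayer(ringLayer)
    }

    /// progress is 0.0...1.0, e.g. current position / length reported by your FMOD channel.
    func setProgress(_ progress: CGFloat) {
        ringLayer.strokeEnd = max(0, min(1, progress))
    }
}
```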
Do keep in mind the user interface implications of this design. Having a circular progress bar will make it very difficult for the user to seek to different parts of the audio file, if that's a feature you support.
I'm trying to figure out the best approach to display a video on a GL texture while preserving the transparency of the alpha channel.
Information about video as GL texture is here: Is it possible using video as texture for GL in iOS? and iOS4: how do I use video file as an OpenGL texture?.
Using ffmpeg to help with alpha transparency (though it isn't App Store friendly) is covered here:
iPhone: Display a semi-transparent video on top of a UIView?
The video source would be filmed in front of a green screen for chroma keying. The video could be left untouched, keeping the green screen, or processed in a video editing suite and exported to QuickTime Animation or Apple ProRes 4444 with alpha.
There are multiple approaches that I think could potentially work, but I haven't found a full solution.
Realtime threshold processing of the video looking for green to remove
Figure out how to use the above mentioned Quicktime codecs to preserve the alpha channel
Blending two videos together: 1) Main video with RGB 2) separate video with alpha mask
I would love to get your thoughts on the best approach for iOS and OpenGL ES 2.0.
Thanks.
The easiest way to do chroma keying for simple blending of a movie and another scene would be to use the GPUImageChromaKeyBlendFilter from my GPUImage framework. You can supply the movie source as a GPUImageMovie, and then blend that with your background content. The chroma key filter allows you to specify a color, a proximity to that color, and a smoothness of blending to use in the replacement operation. All of this is GPU-accelerated via tuned shaders.
Images, movies, and the live cameras can be used as sources, but if you wish to render this with OpenGL ES content behind your movie, I'd recommend rendering your OpenGL ES content to a texture-backed FBO and passing that texture in via a GPUImageTextureInput.
You could possibly use this to output a texture containing your movie frames with the keyed color replaced by a constant color with a 0 alpha channel, as well. This texture could be extracted using a GPUImageTextureOutput for later use in your OpenGL ES scene.
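As a rough Swift sketch of that pipeline, assuming GPUImage 1.x bridged into Swift (the exact initializer labels can differ between GPUImage and Swift versions, so treat the names as approximate):

```swift
import UIKit
import GPUImage

// Blend a green-screen movie over a background image and show it in a
// GPUImageView. Error handling and lifetime management are trimmed.
func setUpChromaKeyPlayback(movieURL: URL,
                            backgroundImage: UIImage,
                            outputView: GPUImageView) -> (movie: GPUImageMovie, background: GPUImagePicture)? {
    guard let movie = GPUImageMovie(url: movieURL),
          let background = GPUImagePicture(image: backgroundImage) else { return nil }

    let chromaKey = GPUImageChromaKeyBlendFilter()
    // The color to replace, how close a pixel must be to it, and how softly
    // to blend at the edges.
    chromaKey.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0)
    chromaKey.thresholdSensitivity = 0.4
    chromaKey.smoothing = 0.1

    movie.addTarget(chromaKey)
    background.addTarget(chromaKey)
    chromaKey.addTarget(outputView)

    background.processImage()
    movie.startProcessing()

    // Keep strong references to the returned sources or processing will stop.
    return (movie, background)
}
```

To render over OpenGL ES content instead of a still image, you'd swap the GPUImagePicture for a GPUImageTextureInput fed from your FBO texture, as described above.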
Apple showed a sample app at WWDC 2011 called ChromaKey that demonstrates how frames of video can be passed to an OpenGL texture, manipulated, and optionally written out to a video file, in a very performant way.
It's written to use a feed from the video camera, and uses a very crude chromakey algorithm.
As the other poster said, you'll probably want to skip the chromakey code and do the color knockout yourself beforehand.
It shouldn't be that hard to rewrite the ChromaKey sample app to use a video file as input instead of a camera feed, and it's quite easy to disable the chroma key code.
You'd need to modify the setup on the video input to expect RGBA data instead of RGB or Y/UV. The sample app is set up to use RGB, but I've seen other example apps from Apple that use Y/UV instead.
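As an illustration of that change, here is a Swift sketch of reading frames from a movie file as 32-bit BGRA (so an alpha channel survives) with AVAssetReader, which is the kind of input setup you would swap in for the camera feed; error handling is trimmed.

```swift
import AVFoundation
import CoreVideo

// Read decoded video frames from a file as BGRA pixel buffers.
func makeVideoFrameReader(for url: URL) throws -> (reader: AVAssetReader,
                                                   output: AVAssetReaderTrackOutput)? {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }

    let reader = try AVAssetReader(asset: asset)
    // Request 32-bit BGRA so each decoded frame arrives with an alpha
    // channel (for sources such as ProRes 4444 that actually carry one).
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    guard reader.startReading() else { return nil }
    return (reader, output)
}

// Usage: pull frames in your decode loop or display link callback.
// while let sample = output.copyNextSampleBuffer() {
//     let pixelBuffer = CMSampleBufferGetImageBuffer(sample)
//     // hand pixelBuffer to the texture upload path here
// }
```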
Have a look at the free "APNG" app on the App Store. It shows how an animated PNG (.apng) can be rendered directly into an iOS view. The key is that APNG supports an alpha channel in the file format, so you don't need to mess around with chroma tricks that won't really work for all your video content. This approach is also more efficient than multiple layers or chroma tricks, since another round of processing is not needed each time a texture is displayed in a loop.
If you want to have a look at a small example xcode project that displays an alpha channel animation on the side of a spinning cube with OpenGL ES2, it can be found at Load OpenGL textures with alpha channel on iOS. The example code shows a simple call to glTexImage2D() that uploads a texture to the graphics card once for each display link callback.
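For reference, here is a small Swift sketch of that per-frame upload, assuming you already have an RGBA byte buffer for the current frame (the example project does its own decoding); blending is set up for premultiplied alpha so transparent areas show the scene behind.

```swift
import OpenGLES

// Push one RGBA frame into an existing texture; call once per display link tick.
func uploadRGBAFrame(textureID: GLuint, pixels: UnsafeRawPointer,
                     width: Int, height: Int) {
    glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)

    // Re-upload the whole frame, as the linked example does in its callback.
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
                 GLsizei(width), GLsizei(height), 0,
                 GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), pixels)

    // Blend as premultiplied alpha so the transparent parts of the movie
    // reveal whatever is rendered behind it.
    glEnable(GLenum(GL_BLEND))
    glBlendFunc(GLenum(GL_ONE), GLenum(GL_ONE_MINUS_SRC_ALPHA))
}
```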