iOS: Video as GL texture with alpha transparency

I'm trying to figure out the best approach to display a video on a GL texture while preserving the transparency of the alpha channel.
Information about using video as a GL texture is here: Is it possible using video as texture for GL in iOS? and iOS4: how do I use video file as an OpenGL texture?.
An approach that uses ffmpeg to handle alpha transparency (but is not App Store friendly) is here:
iPhone: Display a semi-transparent video on top of a UIView?
The video source would be filmed in front of a green screen for chroma keying. The video could be left untouched, keeping the green screen, or processed in a video editing suite and exported to QuickTime Animation or Apple ProRes 4444 with alpha.
There are multiple approaches that I think could potentially work, but I haven't found a full solution.
Real-time threshold processing of the video to detect and remove the green
Figuring out how to use the above-mentioned QuickTime codecs to preserve the alpha channel
Blending two videos together: 1) a main video with the RGB content, 2) a separate video serving as the alpha mask
I would love to get your thoughts on the best approach for iOS and OpenGL ES 2.0.
Thanks.

The easiest way to do chroma keying for simple blending of a movie and another scene would be to use the GPUImageChromaKeyBlendFilter from my GPUImage framework. You can supply the movie source as a GPUImageMovie, and then blend that with your background content. The chroma key filter allows you to specify a color, a proximity to that color, and a smoothness of blending to use in the replacement operation. All of this is GPU-accelerated via tuned shaders.
Images, movies, and the live camera can be used as sources, but if you wish to render this with OpenGL ES content behind your movie, I'd recommend rendering your OpenGL ES content to a texture-backed FBO and passing that texture in via a GPUImageTextureInput.
You could possibly use this to output a texture containing your movie frames with the keyed color replaced by a constant color with a 0 alpha channel, as well. This texture could be extracted using a GPUImageTextureOutput for later use in your OpenGL ES scene.
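In code, the wiring is just a small source-filter-target chain. A minimal sketch, assuming GPUImage 1.x is linked into the project; the "greenscreen.m4v" movie and "Background" image names are placeholders for your own assets:

```swift
import GPUImage
import UIKit

// Minimal sketch: key green out of a bundled movie and blend it over a still background.
let movieURL = Bundle.main.url(forResource: "greenscreen", withExtension: "m4v")!
let gpuImageView = GPUImageView(frame: UIScreen.main.bounds)

let movie = GPUImageMovie(url: movieURL)
movie.playAtActualSpeed = true

let background = GPUImagePicture(image: UIImage(named: "Background")!)

let chromaKeyBlend = GPUImageChromaKeyBlendFilter()
chromaKeyBlend.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0) // the color to key out
chromaKeyBlend.thresholdSensitivity = 0.4                       // proximity to that color
chromaKeyBlend.smoothing = 0.1                                  // softness of the edge blend

movie.addTarget(chromaKeyBlend)        // first input: the green-screen footage
background.addTarget(chromaKeyBlend)   // second input: the replacement background
chromaKeyBlend.addTarget(gpuImageView) // or a GPUImageTextureOutput for your own GL scene

background.processImage()
movie.startProcessing()
```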

Apple showed a sample app at WWDC 2011 called ChromaKey that demonstrates how frames of video can be passed to an OpenGL texture, manipulated, and optionally written out to a video file, all in a very performant way.
It's written to use a feed from the video camera, and uses a very crude chromakey algorithm.
As the other poster said, you'll probably want to skip the chromakey code and do the color knockout yourself beforehand.
It shouldn't be that hard to rewrite the ChromaKey sample app to use a video file as input instead of a camera feed, and it's quite easy to disable the chromakey code.
You'd need to modify the setup on the video input to expect RGBA data instead of RGB or Y/UV. The sample app is set up to use RGB, but I've seen other example apps from Apple that use Y/UV instead.
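The input-side change is mostly a matter of swapping the capture session for an AVAssetReader and asking for BGRA output. A rough sketch (not the actual sample code), with the movie URL supplied by the caller:

```swift
import AVFoundation

// Rough sketch: read decoded frames from a movie file as BGRA so any alpha the codec
// carries survives decoding. The caller pulls frames on its render timer instead of
// the camera capture callback.
func makeFrameReader(for movieURL: URL) throws -> (AVAssetReader, AVAssetReaderTrackOutput) {
    let asset = AVAsset(url: movieURL)
    let reader = try AVAssetReader(asset: asset)

    guard let videoTrack = asset.tracks(withMediaType: .video).first else {
        throw NSError(domain: "FrameReader", code: -1) // no video track in the file
    }
    let output = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    reader.add(output)
    reader.startReading()
    return (reader, output)
}

// Per display-link tick, grab the next decoded frame and hand it to the same texture
// upload path the sample uses for camera frames.
func nextPixelBuffer(from output: AVAssetReaderTrackOutput) -> CVPixelBuffer? {
    guard let sample = output.copyNextSampleBuffer() else { return nil }
    return CMSampleBufferGetImageBuffer(sample)
}
```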

Have a look at the free "APNG" app on the App Store. It shows how an animated PNG (.apng) can be rendered directly into an iOS view. The key is that APNG supports an alpha channel in the file format, so you don't need to mess around with chroma tricks that will not really work for all your video content. This approach is also more efficient than multiple layers or chroma tricks, since another round of processing is not needed each time a texture is displayed in a loop.
If you want to have a look at a small example xcode project that displays an alpha channel animation on the side of a spinning cube with OpenGL ES2, it can be found at Load OpenGL textures with alpha channel on iOS. The example code shows a simple call to glTexImage2D() that uploads a texture to the graphics card once for each display link callback.
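For reference, the per-callback upload that example describes boils down to something like this; a rough sketch, where nextRGBAFrame is a placeholder closure for whatever decodes the animation into raw RGBA bytes:

```swift
import OpenGLES
import QuartzCore
import Foundation

// Rough sketch: one glTexImage2D() upload per display link callback, alpha included.
// Assumes the GL context is current on the thread the display link fires on.
final class AnimationTextureUploader: NSObject {
    // Placeholder: supplies (width, height, RGBA bytes) for the next frame, or nil when done.
    var nextRGBAFrame: (() -> (width: Int, height: Int, bytes: Data)?)?

    private var textureID: GLuint = 0

    override init() {
        super.init()
        glGenTextures(1, &textureID)
        glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(uploadNextFrame))
        link.add(to: .main, forMode: .common)
    }

    @objc private func uploadNextFrame() {
        guard let frame = nextRGBAFrame?() else { return }
        glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
        frame.bytes.withUnsafeBytes { buffer in
            // Upload the new frame to the bound texture, preserving the alpha channel.
            glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
                         GLsizei(frame.width), GLsizei(frame.height), 0,
                         GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), buffer.baseAddress)
        }
        // Then draw the textured quad with blending enabled (glEnable(GL_BLEND)).
    }
}
```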

Related

Is there a way to get transparency from an uploaded video?

https://github.com/vimeo/vimeo-unity-sdk/issues/102
Steps to reproduce the problem:
Open Player scene
Log in using auth token
Choose a video stream that contains alpha (using a color code)
Hit play and confirm that the stream plays with the color code visible
How can I swap this color code out for alpha? Is there perhaps another way to handle alpha?
Could you specify the data format of the "color code"?
Is it a float between 0 and 1, or a value between 0 and 255?
In general, if you have the data in any format in a pixel[] array or a texture2D, you can customise a simple shader with transparent pass support.
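As an illustration of that suggestion, here is a minimal chroma-key fragment shader in GLSL ES, embedded as a Swift string for use with glShaderSource; in a Unity project the same logic would live in a custom ShaderLab shader instead. The keyColor and tolerance uniform names are assumptions you would tune to your color code:

```swift
// Minimal sketch: turn pixels near the key color transparent.
let chromaKeyFragmentShader = """
precision mediump float;

varying highp vec2 textureCoordinate;
uniform sampler2D videoTexture;
uniform vec3 keyColor;   // the "color code", normalised to 0.0-1.0 (divide by 255.0 if needed)
uniform float tolerance; // how close to the key color a pixel must be to become transparent

void main()
{
    vec4 color = texture2D(videoTexture, textureCoordinate);
    float distanceToKey = distance(color.rgb, keyColor);
    // Alpha ramps from 0.0 at the key color up to 1.0 once the pixel is clearly different.
    float alpha = smoothstep(tolerance * 0.5, tolerance, distanceToKey);
    gl_FragColor = vec4(color.rgb, color.a * alpha);
}
"""
```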

What's the best way to composite frame-based animated stickers over recorded video?

We want to allow the user to place animated "stickers" over video that they record in the app and are considering different ways to composite these stickers.
Create a video in code from the frame-based animated stickers (which can be rotated and have translations applied to them) using AVAssetWriter. The problem is that AVAssetWriter only writes to a file and doesn't keep transparency. This would prevent us from being able to overlay it on the video using AVMutableComposition.
Create .mov files ahead of time for our frame-based stickers and composite them using AVMutableComposition and layer instructions with transformations. The problem with this is that there are no tools for easily converting our PNG-based frames to a .mov while maintaining an alpha channel, so we'd have to write our own.
Creating separate CALayers for each frame in the sticker animations (a rough sketch of this layer route is below). This could potentially create a very large number of layers, given the frame rate of the video.
Or any better ideas?
Thanks.
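For reference, the CALayer route (option 3 above) is typically wired up at export time with AVVideoCompositionCoreAnimationTool. A rough sketch, with the URLs and the "sticker_N" frame image names as placeholders:

```swift
import AVFoundation
import UIKit

// Rough sketch: composite a frame-based sticker (PNG frames with alpha) over a recorded
// video at export time using a keyframe animation on the layer's `contents`.
func exportVideo(recordedVideoURL: URL, exportURL: URL, completion: @escaping () -> Void) {
    let asset = AVAsset(url: recordedVideoURL)
    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    let renderSize = videoComposition.renderSize

    // Layer tree: parent contains the video layer and, above it, the animated sticker.
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: renderSize)
    videoLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)

    let stickerLayer = CALayer()
    stickerLayer.frame = CGRect(x: 40, y: 40, width: 200, height: 200)
    parentLayer.addSublayer(stickerLayer)

    // Drive the sticker's PNG frames as a keyframe animation on `contents`.
    let frames: [CGImage] = (0..<24).compactMap { UIImage(named: "sticker_\($0)")?.cgImage }
    let animation = CAKeyframeAnimation(keyPath: "contents")
    animation.values = frames
    animation.calculationMode = .discrete
    animation.duration = 1.0
    animation.repeatCount = .infinity
    animation.beginTime = AVCoreAnimationBeginTimeAtZero  // required for video export
    animation.isRemovedOnCompletion = false
    stickerLayer.add(animation, forKey: "stickerFrames")

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetHighestQuality) else { return }
    export.outputURL = exportURL
    export.outputFileType = .mov
    export.videoComposition = videoComposition
    export.exportAsynchronously(completionHandler: completion)
}
```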
I would suggest that you take a look at my blog post on this specific subject. Basically, this example shows how RGBA video data can be loaded from a file attached to the app resources. The data is imported from a .mov that contains Animation-codec RGBA data on the desktop. A conversion step is required to get the data from the desktop into iOS, since plain H.264 cannot support an alpha channel directly (as you have discovered). Note that older hardware may have issues decoding an H.264 user-recorded video and then another one on top of that, so this approach of using the CPU instead of the H.264 hardware for the sticker is actually better.

How to play video over SKScene

Basically I have an SKScene, and I want to play a video over the scene. The video is confetti falling with an alpha background. It will play when the player gets a high score. I am using an SKScene with shapes and images drawn with shape nodes and image nodes. I just was wondering if anyone could please tell me how to play the video over the screen and still see the game in the back, and be able to touch the buttons through the video. It is supposed to look like an animation playing.
I am using a video because I was just thinking that playing a video would be more processor efficient than having the game generate particles.
There is no built-in iOS solution. You can play 24BPP (fully opaque) movies under iOS, but the only built-in way to display alpha channel video would be to load a series of PNG images with alpha. The downside is that this takes up a huge amount of memory and bloats the app download. If you want to have a look at some working examples of this kind of functionality in a 3rd party app, see Alpha Channel Examples. You might also be interested in this blog post, which shows example code for how alpha channel textures in OpenGL could be implemented on top of SpriteKit too. The cube example shows rendering an alpha channel movie onto a cube; it was adapted from a Ray Wenderlich tutorial.
Here is an answer on how to do that with a GPUImageView. There is also a project on GitHub here and a similar question on Stack Overflow.
The video stack doesn't yet support formats with alpha. For confetti, you should use SKEmitterNode. Size it to the area you envisioned for your video, and see Creating Particle Effects, i.e., its link to Add a particle emitter to your project and try out the "Snow" effect. It looks more like confetti when you give it a different color than white. Click the dot under "Color Ramp" to set the color.
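A rough sketch of that emitter setup, assuming a "Confetti.sks" particle file (for example, one made from the Snow template and recolored) has been added to the project:

```swift
import SpriteKit

// Rough sketch: a one-shot confetti burst sized to the area the video would have covered.
func showConfetti(in scene: SKScene) {
    guard let emitter = SKEmitterNode(fileNamed: "Confetti.sks") else { return }
    emitter.position = CGPoint(x: scene.size.width / 2, y: scene.size.height)
    emitter.particlePositionRange = CGVector(dx: scene.size.width, dy: 0) // emit across the top
    emitter.particleColor = .systemPink         // anything but white reads more like confetti
    emitter.particleColorBlendFactor = 1.0
    emitter.zPosition = 100                     // above the game nodes; touches still pass through
    scene.addChild(emitter)

    // Remove it after a few seconds so it behaves like a celebration animation.
    emitter.run(.sequence([.wait(forDuration: 4), .removeFromParent()]))
}
```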

How do you use Open GL ES 2.0 (shaders) for video processing?

This question is about iOS. On Android, it is very easy to use OpenGL ES 2.0 to render a texture on a view (for previewing) or to send it to an encoder (for file writing). I haven't been able to find any tutorial for iOS on achieving video playback (previewing a video effect from a file) or video recording (saving a video with an effect) using shader effects. Is this possible on iOS?
I've come across a demo about shaders called GLCameraRipple, but I have no clue about how to use it more generically, e.g. with AVFoundation.
[EDIT]
I stumbled upon this tutorial about OpenGL ES, AVFoundation and video merging on iOS while searching for a snippet. That's another interesting entry point.
It's all very low-level stuff over in iOS land, with a whole bunch of pieces to connect.
The main thing you're likely to be interested in is CVOpenGLESTextureCache. As the CV prefix implies, it's part of Core Video; the primary point of interest here is CVOpenGLESTextureCacheCreateTextureFromImage, which "creates a live binding between the image buffer and the underlying texture object". The documentation further provides explicit advice on using such an image as a GL_COLOR_ATTACHMENT, i.e. the texture ID returned is usable both as a source and as a destination for OpenGL.
The texture is bound to a CVImageBuffer, one concrete type of which is CVPixelBuffer. You can supply pixel buffers to an AVAssetWriterInputPixelBufferAdaptor wired to an AVAssetWriter in order to write out a video.
In the other direction, an AVAssetReaderOutput attached to an AVAssetReader will vend CMSampleBuffers, which can be queried for attached image buffers (if you've got video coming in and not just audio, there will be some) that can then be mapped into OpenGL via the texture cache.
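A rough sketch of the Core Video piece of that pipeline: the texture cache is created once against your EAGLContext, and each decoded sample buffer (from an AVAssetReaderTrackOutput configured for BGRA, for example) is mapped straight to a GL texture. Names here are placeholders, not code from any particular sample:

```swift
import AVFoundation
import CoreVideo
import OpenGLES

// Create the cache once for the context you render with.
func makeTextureCache(for context: EAGLContext) -> CVOpenGLESTextureCache? {
    var cache: CVOpenGLESTextureCache?
    guard CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &cache)
            == kCVReturnSuccess else { return nil }
    return cache
}

// Map one decoded frame to a GL texture via the "live binding" described above.
func bindTexture(for sampleBuffer: CMSampleBuffer,
                 using cache: CVOpenGLESTextureCache) -> CVOpenGLESTexture? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    var cvTexture: CVOpenGLESTexture?
    let status = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil,
        GLenum(GL_TEXTURE_2D), GL_RGBA,
        GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
        GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
        GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &cvTexture)
    guard status == kCVReturnSuccess, let texture = cvTexture else { return nil }

    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture))
    // The texture is now usable as a shader input, or as a GL_COLOR_ATTACHMENT on an FBO
    // whose pixel buffer you feed to an AVAssetWriterInputPixelBufferAdaptor for output.
    // After drawing each frame, call CVOpenGLESTextureCacheFlush(cache, 0) to recycle textures.
    return texture
}
```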

Rendering Video to an OpenGL texture in iOS with scrubbing

I have used the method from iOS4: how do I use video file as an OpenGL texture? to get video frames rendering in OpenGL successfully.
This method however seems to fall down when you want to scrub (jump to a certain point in the playback) as it only supplies you with video frames sequentially.
Does anyone know a way this behaviour can successfully be achieved?
One easy way to implement this is to export the video to a series of frames, store each frame as a PNG, and then "scrub" by seeking to the PNG at a specific offset. That gives you random access into the image stream at the cost of decoding the entire video first and holding all the data on disk. This also involves decoding each frame as it is accessed, which will eat up CPU, but modern iPhones and iPads can handle it as long as you are not doing too much else.
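A rough sketch of that export-then-seek approach using AVAssetImageGenerator, with movieURL and framesDirectory as placeholders; a real app would run the export off the main thread:

```swift
import AVFoundation
import UIKit

// One-time export: decode the whole movie to numbered PNGs (alpha preserved by PNG).
func exportFrames(movieURL: URL, framesDirectory: URL, framesPerSecond: Double = 30) throws {
    let asset = AVAsset(url: movieURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.requestedTimeToleranceBefore = .zero   // exact frames, not keyframe-snapped
    generator.requestedTimeToleranceAfter = .zero

    let frameCount = Int(asset.duration.seconds * framesPerSecond)
    for index in 0..<frameCount {
        let time = CMTime(seconds: Double(index) / framesPerSecond, preferredTimescale: 600)
        let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
        guard let pngData = UIImage(cgImage: cgImage).pngData() else { continue }
        try pngData.write(to: framesDirectory.appendingPathComponent("frame_\(index).png"))
    }
}

// Scrubbing is then just random access into the numbered files.
func frame(at time: Double, framesDirectory: URL, framesPerSecond: Double = 30) -> UIImage? {
    let index = Int(time * framesPerSecond)
    return UIImage(contentsOfFile:
        framesDirectory.appendingPathComponent("frame_\(index).png").path)
}
```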
