I'm working on an app where I have to overlay an alpha-channel video on a video recorded from the device camera. Is this possible using the GPUImage framework?
If YES how can I implement it using GPUImage?
And if NO what alternative do I have?
Yes, you can implement it with GPUImage.
You need to use GPUImage's blend filters.
You can find an implementation here:
http://indieambitions.com/idevblogaday/mixing-videos-gpuimage/
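Building on that link, here is a minimal sketch of the blend-filter approach, assuming GPUImage's Objective-C API as called from Swift. The asset name `overlay.mov` and the preview setup are illustrative, not from the linked post:

```swift
import UIKit
import AVFoundation
import GPUImage

// Live camera feed as the base layer.
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
                                 cameraPosition: .back)
camera.outputImageOrientation = .portrait

// Hypothetical bundled asset containing the alpha-channel overlay video.
let overlayURL = Bundle.main.url(forResource: "overlay", withExtension: "mov")!
let overlayMovie = GPUImageMovie(url: overlayURL)
overlayMovie.playAtActualSpeed = true

// Blend the two sources; the overlay's alpha channel controls what shows through.
let blendFilter = GPUImageAlphaBlendFilter()
blendFilter.mix = 1.0

camera.addTarget(blendFilter)
overlayMovie.addTarget(blendFilter)

// Render the blended result on screen.
let previewView = GPUImageView(frame: UIScreen.main.bounds)
blendFilter.addTarget(previewView)

camera.startCapture()
overlayMovie.startProcessing()
```

To record the composite instead of just previewing it, you would also add a `GPUImageMovieWriter` as a second target of the blend filter.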
Related
I am looking for ways to apply visual effects like pixellate, ghost, twirl, X-ray, etc. over a currently playing video for a few seconds, not over the whole video duration, using the GPUImage library.
You can change the filter values while video processing is running, so you can use an NSTimer to schedule the effect changes.
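As a sketch of that idea, assuming a `GPUImageMovie` (`movie`) and `GPUImageView` (`view`) are already configured, and using illustrative timing and intensity values:

```swift
import Foundation
import GPUImage

// Keep the pixellate filter in the chain the whole time, but leave it
// effectively invisible until the scheduled window.
let pixellate = GPUImagePixellateFilter()
pixellate.fractionalWidthOfAPixel = 0.0  // no visible effect initially

movie.addTarget(pixellate)
pixellate.addTarget(view)

// Turn the effect on 5 seconds into playback, and off again 3 seconds later.
Timer.scheduledTimer(withTimeInterval: 5.0, repeats: false) { _ in
    pixellate.fractionalWidthOfAPixel = 0.05
    Timer.scheduledTimer(withTimeInterval: 3.0, repeats: false) { _ in
        pixellate.fractionalWidthOfAPixel = 0.0
    }
}
```

Driving the filter's parameter to zero, rather than removing and re-adding the filter, avoids rebuilding the processing chain mid-playback.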
How is it possible to set multiple color effects on a video while it is playing?
Just like Instagram. I've used the Core Image framework to apply effects to images, but I have no idea how to do the same for videos. I'd prefer not to use any third-party SDK.
Thanks in advance.
The Core Image framework can also process video input.
This Apple sample project even shows how to save a filtered video stream while displaying the preview: Core Image Filters with Photos and Video for iOS
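The core of that approach is filtering each captured frame. Here is a minimal per-frame sketch, assuming frames arrive as `CMSampleBuffer`s from an `AVCaptureVideoDataOutput` delegate; the sepia filter and intensity are just examples:

```swift
import CoreImage
import AVFoundation

// Create the context and filter once and reuse them; creating a CIContext
// per frame is expensive.
let ciContext = CIContext()
let sepia = CIFilter(name: "CISepiaTone")!
sepia.setValue(0.8, forKey: kCIInputIntensityKey)

// Wrap a captured frame in a CIImage, run it through the filter,
// and hand back the filtered image for rendering or encoding.
func filteredImage(from sampleBuffer: CMSampleBuffer) -> CIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let source = CIImage(cvPixelBuffer: pixelBuffer)
    sepia.setValue(source, forKey: kCIInputImageKey)
    return sepia.outputImage
}
```

Chaining several `CIFilter`s (each one's `outputImage` feeding the next one's input image) gives you the stacked, Instagram-style effects the question asks about.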
In my app I have to play an alpha-channel video as an overlay over the current view (I'm planning to achieve this using GPUImageAlphaBlendFilter or GPUImageChromaKeyBlendFilter), so I wanted to know whether the output video, after applying these filters, can be played using GPUImage. If it can, could I get some sample code for this?
I know AVAnimator is an option, but I want to apply filters like brightness and saturation to these overlay videos, and those filters have to be visible while the video is playing, which is why I can't use AVAnimator. That is the next step, though; for now I want to know how to play a video using GPUImage.
Thanks in advance! :]
Well, even though I like telling people about AVAnimator, Brad Larson's GPUImage is specifically designed to be a GPU-based filtering framework for iOS apps. Applying real-time effects like the ones you describe is exactly what GPUImage was designed for. See GPUImage Chroma Key filter
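For reference, a chroma-key sketch along those lines might look like the following. This assumes `foregroundMovie` (the green-screen clip), `camera`, and `previewView` are configured elsewhere; the key color, sensitivity, and smoothing values are illustrative starting points, and the input order follows GPUImage's two-input convention (first input is keyed, second shows through):

```swift
import GPUImage

// Key out pure green from the foreground and let the background show through.
let chromaKey = GPUImageChromaKeyBlendFilter()
chromaKey.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0)
chromaKey.thresholdSensitivity = 0.4
chromaKey.smoothing = 0.1

foregroundMovie.addTarget(chromaKey)  // first input: the clip being keyed
camera.addTarget(chromaKey)           // second input: the background
chromaKey.addTarget(previewView)
```

Once the chain is running, brightness or saturation filters can be inserted before the blend and their properties adjusted live, which is what makes this preferable to AVAnimator for your use case.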
I have a main video and I want to overlay it with another animated video with an alpha channel, like the "Action Movie FX" application. How can I do this with AVFoundation, or can you suggest a third-party framework?
Thanks
GPUImage by Brad Larson is a great third-party framework for this kind of thing. It has many different blending algorithms you can choose from. This thread has code similar to what you want to do.
I would suggest that you take a look at my third-party framework for this sort of task under iOS. You can find example Xcode projects named AVRender and Fireworks that show the exact kind of composition you describe. The composition can be done either offline or online, and can be implemented fully losslessly or with lossy encoding to H.264 as the final output. The main thing is that you will want to use a technology with a full alpha channel, as H.264 does not support an alpha channel by default.
I'm hoping to use the iOS 5 AV Foundation framework, with or without OpenGL, to record video from the camera and overlay/merge another video clip on top using some form of alpha-channel compositing / foreground matting.
A sample use case of the combined output may be a video of an animated character interacting with the user's recorded video clip from the iPhone/iPad camera.
Is this possible right now with iOS 5, or potentially with Brad Larson's GPUImage framework? Can the alpha channels of the two video sources be combined easily?
If anyone has any sample code they could share, or offer any guidance I'd be really appreciative.
The Apple AVEditDemo (and the accompanying WWDC 2010 video) would be a good start. It doesn't show video overlays with alpha, but if you haven't worked with AVFoundation before, this is an excellent intro.
Here's another good walkthrough: video-composition-with-ios
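To give a flavor of the AVFoundation side, here is a minimal composition sketch in the spirit of AVEditDemo: two video tracks stacked in one `AVMutableComposition`. Note that plain AVFoundation layer instructions only control opacity and transforms, so true per-pixel alpha blending needs a custom video compositor or a GPU approach like GPUImage; the function below only builds the two-track composition:

```swift
import AVFoundation

// Build a composition containing the base clip and the overlay clip
// as separate video tracks, both starting at time zero.
func makeOverlayComposition(base: AVAsset, overlay: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    for asset in [base, overlay] {
        guard let srcTrack = asset.tracks(withMediaType: .video).first,
              let dstTrack = composition.addMutableTrack(withMediaType: .video,
                                                         preferredTrackID: kCMPersistentTrackID_Invalid)
        else { continue }
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try dstTrack.insertTimeRange(range, of: srcTrack, at: .zero)
    }
    return composition
}
```

From there, an `AVMutableVideoComposition` with layer instructions (or a custom compositor for real alpha) controls how the two tracks are combined on export.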