Play video using GPUImage - iOS

In my app I have to play an alpha channel video as an overlay over the current view (I'm planning to achieve this alpha channel video using GPUImageAlphaBlendFilter or GPUImageChromaKeyBlendFilter), so I wanted to know whether the output video, after applying these filters, can be played using GPUImage. If it can, could I get some sample code for this?
I know AVAnimator is an option, but I want to apply filters such as brightness and saturation to these overlay videos, and they have to be visible while the video is playing, which is why I can't use AVAnimator. That's the next step, though; for now I want to know how to play video using GPUImage.
Thanks in advance! :]

Well, even though I like telling people about AVAnimator, Brad Larson's GPUImage is specifically designed to be a GPU-based filtering framework for iOS apps. Applying real-time effects like the ones you describe is exactly what GPUImage was designed for. See the GPUImage chroma key filter.
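A minimal sketch of that pipeline, assuming Brad Larson's Objective-C GPUImage framework bridged into Swift (the exact bridged initializer and selector names can vary by GPUImage release and Swift version); "base.mp4" and "overlay.mp4" are placeholder assets:

```swift
import UIKit
import GPUImage

// Sketch: play a base clip and a green-screen overlay clip through a
// chroma key blend, run the result through a brightness filter so the
// adjustment stays visible during playback, and display it live.
class BlendedPlaybackViewController: UIViewController {
    // Keep strong references so processing continues after viewDidLoad.
    var baseMovie: GPUImageMovie!
    var overlayMovie: GPUImageMovie!
    let chromaKeyBlend = GPUImageChromaKeyBlendFilter()
    let brightness = GPUImageBrightnessFilter()

    override func viewDidLoad() {
        super.viewDidLoad()

        let playbackView = GPUImageView(frame: view.bounds)
        view.addSubview(playbackView)

        baseMovie = GPUImageMovie(url: Bundle.main.url(forResource: "base", withExtension: "mp4")!)
        overlayMovie = GPUImageMovie(url: Bundle.main.url(forResource: "overlay", withExtension: "mp4")!)
        baseMovie.playAtActualSpeed = true
        overlayMovie.playAtActualSpeed = true

        // Replace pure green in the first input with the second input.
        chromaKeyBlend.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0)
        chromaKeyBlend.thresholdSensitivity = 0.4

        // The order of addTarget calls decides which source is which input:
        // the chroma key blend keys the color out of its first input and
        // lets the second input show through.
        overlayMovie.addTarget(chromaKeyBlend)  // first input: green-screen overlay
        baseMovie.addTarget(chromaKeyBlend)     // second input: base video

        brightness.brightness = 0.1             // adjustable while the video plays
        chromaKeyBlend.addTarget(brightness)
        brightness.addTarget(playbackView)

        baseMovie.startProcessing()
        overlayMovie.startProcessing()
    }
}
```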

Related

Animated overlay for videos on iOS

What's the common modern standard for animated video overlays? (e.g. if you want to add an animated logo to video recorded from the camera)
During research, I've found the following options:
GIF - seems to be pretty outdated technology.
FLV - supports an alpha channel, but is no longer supported by Adobe and requires FFmpeg.
PNG sequence - the downside of this is having a separate file for each frame.
What's the right format/technology to use?
Ideally something natively supported on iOS (that doesn't require FFmpeg).
If you want to overlay your custom video animation on video the user records, I suggest using the GPUImage framework, which allows a lot of video/photo customization and different graphic effects. For example, this nice article shows how to mix two videos. I also suggest reading about chroma keying, which is something of a standard for video/photo mixing (as I understand it, you just want to make something like a watermark?). GPUImage also has a chroma key filter which you can use for this purpose.
By default, Apple supports the H.264 codec in an MP4 container, so your video should use this codec.
Hope I fully answered your question.
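As a rough illustration of this approach (again Brad Larson's Objective-C GPUImage bridged into Swift, so the exact bridged names may differ; "animation.mp4" is a placeholder asset), the animation's green background is keyed out and the live camera feed shows through it:

```swift
import UIKit
import AVFoundation
import GPUImage

// Sketch: preview the camera with a green-screen animation keyed on top.
// The same filter output could also feed a GPUImageMovieWriter if you
// want to record the composited result.
class WatermarkPreviewController: UIViewController {
    var camera: GPUImageVideoCamera!
    var animationMovie: GPUImageMovie!
    let chromaKeyBlend = GPUImageChromaKeyBlendFilter()

    override func viewDidLoad() {
        super.viewDidLoad()

        let previewView = GPUImageView(frame: view.bounds)
        view.addSubview(previewView)

        camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
                                     cameraPosition: .back)
        camera.outputImageOrientation = .portrait

        let animationURL = Bundle.main.url(forResource: "animation", withExtension: "mp4")!
        animationMovie = GPUImageMovie(url: animationURL)
        animationMovie.playAtActualSpeed = true

        // Treat pure green in the animation as transparent.
        chromaKeyBlend.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0)
        chromaKeyBlend.thresholdSensitivity = 0.4

        animationMovie.addTarget(chromaKeyBlend)  // first input: keyed animation
        camera.addTarget(chromaKeyBlend)          // second input: camera feed
        chromaKeyBlend.addTarget(previewView)

        camera.startCameraCapture()
        animationMovie.startProcessing()
    }
}
```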
The best way to add overlays is with the AVFoundation framework supplied by Apple itself. As for the other approaches such as GIF and FLV, they are not supported natively by Apple, which puts you out of luck.
Apple provides tools such as AVVideoCompositionCoreAnimationTool that let you stitch Core Animation layers and video together.
Here is a link that explains how to add various effects such as:
Colored borders with custom sizes.
Multiple overlays.
Text for subtitles or captions.
Tilt effects.
Twinkle, rotate, and fade animation effects!
I am not sure how much of this applies if you want to add animations while recording; maybe someone else can help with that. I hope this helps with the native way to add animations to recorded videos on iOS.
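For reference, here is a minimal, hedged sketch of the AVVideoCompositionCoreAnimationTool route during export; the caption text, sizes, and output settings are placeholders, and a real implementation would also account for the track's preferredTransform:

```swift
import UIKit
import AVFoundation

// Sketch: burn a CATextLayer caption into an asset on export.
func exportWithTextOverlay(asset: AVAsset, to outputURL: URL,
                           completion: @escaping (Error?) -> Void) {
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }
    let size = videoTrack.naturalSize

    // Layer tree: the video layer plus any overlay layers inside one parent.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)

    let textLayer = CATextLayer()
    textLayer.string = "Caption"
    textLayer.fontSize = 48
    textLayer.alignmentMode = .center
    textLayer.frame = CGRect(x: 0, y: 40, width: size.width, height: 60)
    parentLayer.addSublayer(textLayer)

    // Pass-through instruction covering the whole clip.
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    instruction.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = size
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.videoComposition = videoComposition
    export.exportAsynchronously {
        completion(export.error)
    }
}
```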

Color effect on video in iOS

How is it possible to apply multiple color effects to a video while it is playing?
Just like Instagram. I've used the Core Image framework to apply effects to images, but I have no idea how to do the same for videos. I'd prefer not to use any third-party SDK.
Thanks in advance.
The Core Image framework can also process video input.
This Apple sample project will even show you how to save a filtered video stream while displaying the preview: Core Image Filters with Photos and Video for iOS
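A short sketch of that route without any third-party SDK, assuming iOS 9 or later: AVMutableVideoComposition(asset:applyingCIFiltersWithHandler:) runs a Core Image filter over every frame, and the same composition works for playback or export. The filter name and videoURL below are just examples.

```swift
import AVFoundation
import CoreImage

// Sketch: apply a CIFilter to each frame of an asset during playback.
func makeFilteredPlayerItem(for asset: AVAsset) -> AVPlayerItem {
    let filter = CIFilter(name: "CIPhotoEffectNoir")!  // any Core Image filter

    let composition = AVMutableVideoComposition(asset: asset) { request in
        filter.setValue(request.sourceImage.clampedToExtent(), forKey: kCIInputImageKey)
        let output = filter.outputImage!.cropped(to: request.sourceImage.extent)
        request.finish(with: output, context: nil)
    }

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    return item
}

// Usage (videoURL is assumed):
// let asset = AVAsset(url: videoURL)
// let player = AVPlayer(playerItem: makeFilteredPlayerItem(for: asset))
```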

Video Overlapping in iOS

I'm working on an app where I have to overlay an alpha channel video over a video recorded from the device camera. Is this possible using the GPUImage framework?
If yes, how can I implement it using GPUImage?
And if no, what alternative do I have?
Yes, you can implement it with GPUImage.
You need to use GPUImage's blend filters.
You can find an implementation here:
http://indieambitions.com/idevblogaday/mixing-videos-gpuimage/
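To give a flavour of what that looks like in code, here is a hedged sketch (Objective-C GPUImage bridged into Swift, so bridged names may vary) that blends a recorded clip with an overlay clip and writes the composite to disk:

```swift
import AVFoundation
import GPUImage

// Sketch: blend two clips offline and write the result to a file.
// In a real app, keep strong references to these objects (e.g. as
// properties) so they outlive this function while processing runs.
func blendAndWrite(baseURL: URL, overlayURL: URL, outputURL: URL) {
    let baseMovie = GPUImageMovie(url: baseURL)
    let overlayMovie = GPUImageMovie(url: overlayURL)

    // The alpha blend composites its second input over its first; mix sets
    // the overlay opacity. Note that a plain H.264 overlay carries no alpha
    // channel, so for true transparency use a chroma key blend instead, or
    // a source format that stores alpha.
    let blend = GPUImageAlphaBlendFilter()
    blend.mix = 0.8

    baseMovie.addTarget(blend)     // first input: recorded camera clip
    overlayMovie.addTarget(blend)  // second input: overlay clip

    // Match the writer size to the source video.
    let writer = GPUImageMovieWriter(movieURL: outputURL,
                                     size: CGSize(width: 1280, height: 720))
    blend.addTarget(writer)

    // Pass the base clip's audio straight through to the output file.
    writer.shouldPassthroughAudio = true
    baseMovie.audioEncodingTarget = writer
    // (For tighter A/V sync, GPUImage also offers the Objective-C selector
    // enableSynchronizedEncodingUsingMovieWriter: on GPUImageMovie.)

    writer.startRecording()
    baseMovie.startProcessing()
    overlayMovie.startProcessing()

    writer.completionBlock = {
        blend.removeTarget(writer)
        writer.finishRecording()
    }
}
```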

iOS AVFoundation overlay video with video

I have a main video and I want to overlay it with another animated video with an alpha channel, like the "Action Movie FX" application. How can I do this with AVFoundation, or can you suggest a third-party framework?
Thanks
GPUImage by Brad Larson is a great third-party framework for this kind of thing. It has many different blending algorithms you can choose from. This thread has code similar to what you want to do.
I would suggest that you take a look at my third-party framework to do this sort of task under iOS. You can find example Xcode projects named AVRender and Fireworks that show the exact kind of composition you describe. The composition can be done either offline or online and can be implemented fully lossless or with a lossy encoding to H.264 as the final output. The main thing is that you will want to use a technology with a full alpha channel, as H.264 does not support an alpha channel by default.

AVFoundation to overlay two video clips with alpha compositing in iOS 5?

I'm hoping to use iOS 5 AVFoundation, with or without OpenGL, to record video from the camera and overlay/merge another video clip on top using some form of alpha channel compositing / foreground matting.
A sample use case of the combined output might be a video of an animated character interacting with the user's recorded video clip from the iPhone/iPad camera.
Is this possible right now with iOS 5, or potentially with Brad Larson's GPUImage framework? Can the alpha channels of the two video sources be combined easily?
If anyone has any sample code they could share, or offer any guidance I'd be really appreciative.
The Apple AVEditDemo (plus the accompanying WWDC 2010 video) would be a good start. It doesn't show video overlays with alpha, but if you haven't worked with AVFoundation before, it's an excellent intro.
Here's another good walkthrough: video-composition-with-ios
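For a pure AVFoundation sketch of stacking two clips, something along these lines can be used; note that layer instructions only give whole-layer opacity, not per-pixel alpha matting, which is why GPUImage or an alpha-capable source format is needed for real alpha compositing. Track indices and the 30 fps frame duration below are assumptions.

```swift
import AVFoundation

// Sketch: put two clips on separate composition tracks and dim the top
// one with a layer-instruction opacity setting.
func overlayComposition(base baseAsset: AVAsset,
                        overlay overlayAsset: AVAsset) throws -> (AVMutableComposition, AVMutableVideoComposition) {
    let composition = AVMutableComposition()
    let baseTrack = composition.addMutableTrack(withMediaType: .video,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)!
    let overlayTrack = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid)!

    let baseSource = baseAsset.tracks(withMediaType: .video)[0]
    let overlaySource = overlayAsset.tracks(withMediaType: .video)[0]
    let fullRange = CMTimeRange(start: .zero, duration: baseAsset.duration)

    try baseTrack.insertTimeRange(fullRange, of: baseSource, at: .zero)
    try overlayTrack.insertTimeRange(CMTimeRange(start: .zero, duration: overlayAsset.duration),
                                     of: overlaySource, at: .zero)

    // The first layer instruction renders on top.
    let overlayInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: overlayTrack)
    overlayInstruction.setOpacity(0.7, at: .zero)   // constant 70% opacity for the overlay
    let baseInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: baseTrack)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = fullRange
    instruction.layerInstructions = [overlayInstruction, baseInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = baseSource.naturalSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]

    return (composition, videoComposition)
}
```

The returned pair can be handed to an AVPlayerItem (asset plus videoComposition) for preview, or to an AVAssetExportSession for export.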
