I have been trying to create a video template that overlays alpha channel video on top of MP4 videos and images.
Here is an example of the kind of video I need to create: http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov
For overlaying alpha channel video on another video, I have used AVAnimator, and I succeeded in playing a preview using AVFoundation, AVSynchronizedLayer and AVAnimator.
When rendering the video from the composition, however, the frames of the alpha channel video render very slowly.
I need to create a video with alpha channel video on top of another video.
Can anyone suggest possible ways to render a video like http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov ?
You mention that you have looked at AVAnimator; did you download the KittyBoom example project and try it out? The specifics of how it works are detailed in this post. One thing to note is that when you build and run on the device, you need to turn Debug mode off, otherwise it will not execute quickly, because a number of extra checks are done in debug mode. Also, make sure to test on an actual device; the simulator is not a good measure of performance on real hardware. Performance is a key problem with video that contains an alpha channel, as iOS does not support alpha channel video out of the box.
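For the export side of the question, here is a minimal sketch of one common approach: rendering a CALayer-based overlay into the exported video with AVVideoCompositionCoreAnimationTool. The function and layer names are placeholders, and `overlayLayer` stands in for whatever layer draws your alpha channel animation:

```swift
import AVFoundation
import QuartzCore

// Sketch only: `composition`/`videoComposition` are assumed to be built already,
// and `overlayLayer` is the CALayer that draws the alpha channel animation.
func exportWithOverlay(composition: AVMutableComposition,
                       videoComposition: AVMutableVideoComposition,
                       overlayLayer: CALayer,
                       to outputURL: URL) {
    let frame = CGRect(origin: .zero, size: videoComposition.renderSize)

    // A parent layer hosts the composition's video frames plus the overlay.
    let videoLayer = CALayer()
    let parentLayer = CALayer()
    parentLayer.frame = frame
    videoLayer.frame = frame
    overlayLayer.frame = frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    // The animation tool composites the layer tree over each rendered frame.
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: AVAssetExportPresetHighestQuality) else { return }
    session.videoComposition = videoComposition
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        // Inspect session.status / session.error here.
    }
}
```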
Related
Is it possible to have a real-time preview of an AVMutableComposition which has some layer instructions applied to its assets?
The only class I found that connects AVMutableComposition with AVVideoComposition (which holds the instructions) is AVAssetExportSession. Does that mean I must export it first in order to play a preview?
If so, how do apps like Final Cut Pro provide a real-time preview when I edit part of the video? Do they cut the whole video into multiple chunks, export only what has changed, and keep everything else unchanged?
This sounds like a difficult problem; is there any library that would help with cutting video into small chunks for export and keeping an eye on cache invalidation?
Cheers,
M.
I don't know if this is still relevant, but you can always extract each frame from the video, manipulate it accordingly, and then render it to the screen.
If it comes from an AVCaptureSession, you can get CMSampleBuffers from the delegate callbacks; if it's a file, I think AVAssetReader is your best bet. You can then use either Core Image or Metal to manipulate the frames and render them in real time.
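Here is a minimal sketch of the capture path, assuming the session is already configured; the `CIPhotoEffectNoir` filter is just a placeholder effect:

```swift
import AVFoundation
import CoreImage

final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let context = CIContext()                        // reuse; creating one per frame is expensive
    let filter = CIFilter(name: "CIPhotoEffectNoir") // placeholder effect

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let frame = CIImage(cvPixelBuffer: pixelBuffer)
        filter?.setValue(frame, forKey: kCIInputImageKey)
        guard let result = filter?.outputImage else { return }
        // Render `result` into a Metal/GL-backed view, or back into a pixel buffer.
        _ = context.createCGImage(result, from: result.extent)
    }
}
```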
There is no real-time preview with AVMutableComposition. They may create a time slot for every change and manage its visibility when you drag the timeline slider.
In my app I have to play an alpha channel video as an overlay over the current view (I'm planning to achieve the alpha effect using GPUImageAlphaBlendFilter or GPUImageChromaKeyBlendFilter), so I wanted to know whether the output video, after applying these filters, can be played using GPUImage. If it can, could I get some sample code for this?
I know AVAnimator is an option, but I want to apply filters such as brightness and saturation to these overlay videos, and they have to be visible while the video is playing, which is why I can't use AVAnimator. That is the next step, though; for now I want to know how to play video using GPUImage.
Thanks in advance! :]
Well, even though I like telling people about AVAnimator, Brad Larson's GPUImage is specifically designed to be a GPU-based filtering framework for iOS apps. Applying real-time effects like the ones you describe is exactly what GPUImage was built for. See the GPUImage chroma key filter.
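A rough sketch of wiring this up from Swift, assuming GPUImage's Objective-C API is bridged in; the URLs and dimensions are placeholders, and the exact initializer spellings can vary between GPUImage versions:

```swift
import GPUImage

// backgroundURL / overlayURL are placeholders for your two local video files.
let background = GPUImageMovie(url: backgroundURL)  // base video
let overlay = GPUImageMovie(url: overlayURL)        // green-screen overlay video

let chromaKey = GPUImageChromaKeyBlendFilter()
chromaKey.thresholdSensitivity = 0.4
chromaKey.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0) // key out pure green

// Input order matters: the keyed color in the first input is replaced by the
// second input. Check the docs for the GPUImage version you link against.
overlay?.addTarget(chromaKey)
background?.addTarget(chromaKey)

let preview = GPUImageView(frame: CGRect(x: 0, y: 0, width: 640, height: 360))
chromaKey.addTarget(preview)

overlay?.startProcessing()
background?.startProcessing()
```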
I'm looking for tips on developing an application for iPhone/iPad that can process video (let's consider only local files stored on the device, for simplicity) and play it back in real time. For example, you could choose any movie, apply an "Old movie" filter, and have it look like it's playing on an old tube TV.
In order to make this idea real I need to implement two key features:
1) Grab frames and the audio stream from a movie file and get access to individual frames (I'm interested in raw pixel buffers in BGRA or at least YUV color space).
2) Display the processed frames somehow. I know it's possible to render a processed frame to an OpenGL texture, but I would like a more capable component with playback controls. Is there any media player class that supports playing custom image and audio buffers?
The processing function is done and it's fast (it takes less than the duration of one frame).
I'm not asking for a ready-made solution, but any tips are welcome!
Answer
Frame grabbing.
It seems the only way to grab video and audio frames is to use the AVAssetReader class. Although it's not recommended for real-time grabbing, it does the job. In my tests on an iPad 2, grabbing a single frame takes about 7-8 ms. Seeking across the video is tricky, though. Maybe someone can point to a more efficient solution?
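A minimal sketch of sequential frame grabbing with AVAssetReader, requesting BGRA pixel buffers (`videoURL` is a placeholder, and error handling is elided):

```swift
import AVFoundation

let asset = AVURLAsset(url: videoURL)
guard let track = asset.tracks(withMediaType: .video).first else { fatalError("no video track") }

let reader = try AVAssetReader(asset: asset)
let output = AVAssetReaderTrackOutput(
    track: track,
    outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
reader.add(output)
reader.startReading()

// Samples come back strictly in decode order.
while let sample = output.copyNextSampleBuffer() {
    // CMSampleBufferGetPresentationTimeStamp(sample) gives the frame's timestamp.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    // ... process the raw BGRA bytes here ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
}
```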
Video playback. I've done this with a custom view and GLES, rendering a rectangular texture with a video frame inside it. As far as I know it's the fastest way to draw bitmaps.
Problems
You need to play the audio samples manually.
AVAssetReader grabbing has to be synchronized with the movie's frame rate, otherwise the movie will play too fast or too slow.
AVAssetReader only allows sequential frame access; you can't seek forward and backward. The only workaround I've found is to delete the old reader and create a new one with a trimmed time range (see the sketch after this list).
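A sketch of that seek-by-recreating workaround; the helper name is hypothetical:

```swift
import AVFoundation

// "Seek" by building a fresh reader that starts decoding at an arbitrary time.
func makeReader(for asset: AVAsset, startingAt time: CMTime) throws -> (AVAssetReader, AVAssetReaderTrackOutput) {
    guard let track = asset.tracks(withMediaType: .video).first else {
        throw NSError(domain: "FrameGrabber", code: -1)
    }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    // Trim the time range so decoding begins at `time` instead of zero.
    reader.timeRange = CMTimeRange(start: time, duration: .positiveInfinity)
    reader.startReading()
    return (reader, output)
}
```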
This is how you would load a video from the camera roll.
This is a way to start processing video; Brad Larson did a great job here.
How to grab video frames.
You can use AVPlayer + AVPlayerItem; AVPlayerItem gives you a chance to apply a filter to the displayed image.
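One way to do that, as a sketch: attach an AVVideoComposition that runs a Core Image filter over every frame during playback. `videoURL` and the sepia filter are placeholders:

```swift
import AVFoundation
import CoreImage

let asset = AVURLAsset(url: videoURL)
let sepia = CIFilter(name: "CISepiaTone")!

// The handler is called once per frame while the player is running.
let filtered = AVVideoComposition(asset: asset) { request in
    sepia.setValue(request.sourceImage.clampedToExtent(), forKey: kCIInputImageKey)
    let output = sepia.outputImage!.cropped(to: request.sourceImage.extent)
    request.finish(with: output, context: nil)
}

let item = AVPlayerItem(asset: asset)
item.videoComposition = filtered
let player = AVPlayer(playerItem: item)
// Attach `player` to an AVPlayerLayer and call player.play().
```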
Well, I have a burning issue with the iPad, more specifically with Safari on the iPad.
The <video> tag doesn't play .mov files at all, and converting to .m4v with QuickTime loses my alpha layer.
I'm currently working with a PNG sequence, but it's horrible. Is there any way I can keep my alpha channel and have the <video> tag work?
Many thanks!
Dave
HTML5 <video> is able to play .mov files! Usually this happens if the server isn't configured with the correct MIME type for the video. Check my answer here: Playing a movie/DVD on a website. Maybe that solves your .mov problem and you don't have to use .m4v at all.
It turned out that the iPad doesn't support the alpha channel (or at least every method and format I tried wouldn't work), and it wouldn't play the video because Apple has disabled the autoplay attribute.
But I did make a workaround: there appears to be a window at page load in which you can start the play() function on a video and then stop it, meaning it's ready to play. But you can only get one video in that way...
Bummer.
Is it possible to play a video with an alpha channel in your iPhone app?
I'm thinking of a UIView with a subview (the movie view), and playing a movie with an alpha channel in that view.
Is this possible?
Take a peek at the "APNG" app on the App Store. This app is a free demo that shows how animations with an alpha channel can be implemented in an iOS app using the .apng file format. One can also take a peek at h.264 with an alpha channel for a detailed description of how a pair of h.264 videos can be used to implement a video with an alpha channel.
APNG app
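As a sketch of the pair-of-videos idea: decode matching frames from the color video and the grayscale alpha video (for example with AVAssetReader), then let Core Image's CIBlendWithMask composite them over whatever sits behind. The function below assumes you already have the three CIImages for one frame:

```swift
import CoreImage

// One frame of the dual-h.264 technique: `rgb` comes from the color video,
// `mask` from the grayscale alpha video, `background` is whatever sits behind.
func compositeFrame(rgb: CIImage, mask: CIImage, over background: CIImage) -> CIImage? {
    let blend = CIFilter(name: "CIBlendWithMask")!
    blend.setValue(rgb, forKey: kCIInputImageKey)
    blend.setValue(background, forKey: kCIInputBackgroundImageKey)
    blend.setValue(mask, forKey: kCIInputMaskImageKey)  // white = opaque, black = transparent
    return blend.outputImage
}
```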
It looks like there's no way to do this with iOS-provided solutions. But according to this answer to a similar question, you might succeed with ffmpeg. The problem with ffmpeg is that the GPL/LGPL is incompatible with Apple's terms, so you can't use it in an app for the App Store.