How to add a dynamic subtitle overlay to an output video using GPUImage on iOS?

I am currently working on a video processing application. I record video using GPUImage for iOS. The video is generated fine, but I also want to add dynamic subtitles to the video, i.e. text that changes over time. I have no idea how to add subtitles to the exported video using GPUImage, so I am having difficulty getting the required output. Kindly help me with this.
Thanks
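
One approach GPUImage users commonly take for overlays like this (a sketch, not a drop-in answer) is to render the subtitle into a UILabel, feed it in through GPUImageUIElement, and alpha-blend it over the camera frames before they reach the movie writer; updating the label in a filter's frameProcessingCompletionBlock lets the text change over time. In the sketch below the output URL, preview view, and the per-timestamp subtitle lookup are placeholder assumptions, and the class names are from the Objective-C GPUImage framework:

```swift
import AVFoundation
import GPUImage
import UIKit

// Camera -> passthrough filter -> alpha blend <- GPUImageUIElement(UILabel)
//                                      |
//                        GPUImageView + GPUImageMovieWriter
// (This would normally live in a view controller that keeps strong references
//  to these objects for the lifetime of the recording.)

let camera: GPUImageVideoCamera = GPUImageVideoCamera(
    sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
    cameraPosition: .back)
camera.outputImageOrientation = .portrait

// The subtitle is just a transparent UILabel rendered into the filter chain.
let subtitleLabel = UILabel(frame: CGRect(x: 0, y: 0, width: 720, height: 1280))
subtitleLabel.textAlignment = .center
subtitleLabel.textColor = .white
subtitleLabel.backgroundColor = .clear
let subtitleElement: GPUImageUIElement = GPUImageUIElement(view: subtitleLabel)

// A pass-through filter on the camera gives us a per-frame hook to refresh the label.
let passthrough = GPUImageFilter()
let blend = GPUImageAlphaBlendFilter()
blend.mix = 1.0

camera.addTarget(passthrough)
passthrough.addTarget(blend)        // first blend input: camera frames
subtitleElement.addTarget(blend)    // second blend input: the label's alpha image

// Everything downstream of the blend (preview and file) gets the subtitled frames.
let previewView = GPUImageView(frame: UIScreen.main.bounds)        // placeholder preview
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("subtitled.mov")                       // placeholder output path
let movieWriter: GPUImageMovieWriter = GPUImageMovieWriter(
    movieURL: outputURL, size: CGSize(width: 720, height: 1280))
blend.addTarget(previewView)
blend.addTarget(movieWriter)
camera.audioEncodingTarget = movieWriter

// Refresh the subtitle once per camera frame; replace the formatted time
// below with your own timestamp -> subtitle-string lookup.
passthrough.frameProcessingCompletionBlock = { _, time in
    subtitleLabel.text = String(format: "%.1f s", CMTimeGetSeconds(time))
    subtitleElement.update()
}

movieWriter.startRecording()
camera.startCameraCapture()
// Later: camera.stopCameraCapture(); movieWriter.finishRecording()
```

The same GPUImageUIElement-plus-blend pattern appears in GPUImage's bundled example projects, which are a good reference for the exact wiring.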

Related

Capture multiple frames and prepare a video like boomerang

I am trying to build a feature where I capture multiple frames, let's say 3 frames per second, and at the end combine all captured frames into a video and upload that video to a server.
You can think of the same functionality as in Boomerang. I have searched a lot for the most effective way to do this but didn't find anything helpful.
Any guidance is appreciated.
Video from an image array
To combine your image array into a video, make use of the widely accepted answer here,
which uses AVAssetWriter and CVPixelBufferRef.
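
A rough sketch of that AVAssetWriter / CVPixelBufferRef approach (the exact code in the linked answer may differ; the function and helper names here are my own): draw each UIImage into a CVPixelBuffer and append the buffers through an AVAssetWriterInputPixelBufferAdaptor at the capture frame rate.

```swift
import AVFoundation
import UIKit

// Writes `images` to an H.264 movie at `fps` frames per second.
// A sketch only: error handling is minimal.
func writeVideo(from images: [UIImage], size: CGSize, fps: Int32,
                to outputURL: URL, completion: @escaping () -> Void) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB),
            kCVPixelBufferWidthKey as String: Int(size.width),
            kCVPixelBufferHeightKey as String: Int(size.height)
        ])
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    var frameIndex = 0
    input.requestMediaDataWhenReady(on: DispatchQueue(label: "frame.writing")) {
        while input.isReadyForMoreMediaData && frameIndex < images.count {
            // Frame i is shown at i / fps seconds.
            let time = CMTime(value: CMTimeValue(frameIndex), timescale: fps)
            if let buffer = pixelBuffer(from: images[frameIndex], size: size,
                                        pool: adaptor.pixelBufferPool) {
                adaptor.append(buffer, withPresentationTime: time)
            }
            frameIndex += 1
        }
        if frameIndex >= images.count {
            input.markAsFinished()
            writer.finishWriting(completionHandler: completion)
        }
    }
}

// Draws a UIImage into a CVPixelBuffer the adaptor can append.
private func pixelBuffer(from image: UIImage, size: CGSize,
                         pool: CVPixelBufferPool?) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    if let pool = pool {
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer)
    } else {
        CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                            kCVPixelFormatType_32ARGB, nil, &buffer)
    }
    guard let pixelBuffer = buffer, let cgImage = image.cgImage else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                            width: Int(size.width), height: Int(size.height),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return pixelBuffer
}
```

With 3 frames per second, a call such as writeVideo(from: frames, size: CGSize(width: 720, height: 1280), fps: 3, to: outputURL) { ... } gives each captured frame a one-third-second slot.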
To make the video play in reverse order
This can be done effectively using AVFoundation.
Use AVAssetReader to read the video and AVAssetWriter to write it back out in reverse order.
Refer to the tutorial Reverse Video AVFoundation.
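
For short clips, that reader/writer combination usually boils down to: decode every frame, then append the frames in reverse while keeping the forward timestamps. A sketch of that idea (my own, not the linked tutorial's code; it keeps all decoded frames in memory, so it only suits short clips):

```swift
import AVFoundation

func reverse(asset: AVAsset, to outputURL: URL, completion: @escaping () -> Void) throws {
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    // 1. Decode every video frame into memory.
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ])
    reader.add(readerOutput)
    reader.startReading()

    var samples: [CMSampleBuffer] = []
    while let sample = readerOutput.copyNextSampleBuffer() {
        samples.append(sample)
    }
    guard let firstSample = samples.first else { return }

    // 2. Re-encode the frames in reverse order, reusing the forward timestamps
    //    so frame durations stay the same.
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(track.naturalSize.width),
        AVVideoHeightKey: Int(track.naturalSize.height)
    ])
    writerInput.transform = track.preferredTransform
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(writerInput)
    writer.startWriting()
    writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(firstSample))

    var index = 0
    writerInput.requestMediaDataWhenReady(on: DispatchQueue(label: "reverse.writing")) {
        while writerInput.isReadyForMoreMediaData && index < samples.count {
            // The i-th output frame shows the (last - i)-th input image
            // at the i-th input frame's presentation time.
            let time = CMSampleBufferGetPresentationTimeStamp(samples[index])
            if let imageBuffer = CMSampleBufferGetImageBuffer(samples[samples.count - 1 - index]) {
                adaptor.append(imageBuffer, withPresentationTime: time)
            }
            index += 1
        }
        if index >= samples.count {
            writerInput.markAsFinished()
            writer.finishWriting(completionHandler: completion)
        }
    }
}
```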

Color effect on video in iOS

How is it possible to apply multiple color effects to a video while it is playing?
Just like Instagram. I've used the Core Image framework to apply effects to images, but I have no idea how to do the same for videos. I'd prefer not to use any third-party SDK.
Thanks in advance.
The Core Image framework can also process video input.
This Apple sample project even shows you how to save a filtered video stream while displaying the preview: Core Image Filters with Photos and Video for iOS
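
For playback, the key piece is AVVideoComposition's Core Image handler, which runs a CIFilter over every frame of an AVPlayerItem. A minimal sketch, where the filter name and the bundled clip are just stand-ins:

```swift
import AVFoundation
import CoreImage

// Stand-in for whatever clip you are playing.
let videoURL = Bundle.main.url(forResource: "clip", withExtension: "mp4")!
let asset = AVAsset(url: videoURL)

// Any CIFilter (or chain of filters) works here; swap filters to change the look live.
let sepia = CIFilter(name: "CISepiaTone")!
sepia.setValue(0.8, forKey: kCIInputIntensityKey)

let filteredComposition = AVVideoComposition(asset: asset) { request in
    sepia.setValue(request.sourceImage, forKey: kCIInputImageKey)
    request.finish(with: sepia.outputImage ?? request.sourceImage, context: nil)
}

let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = filteredComposition
let player = AVPlayer(playerItem: playerItem)
player.play()
```

The same AVVideoComposition can be assigned to an AVAssetExportSession's videoComposition if you want to save the filtered result instead of only playing it.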

Play video using GPUImage

In my app I have to play an alpha-channel video as an overlay over the current view (I'm planning to achieve this alpha-channel effect using GPUImageAlphaBlendFilter or GPUImageChromaKeyBlendFilter), so I wanted to know whether the output video, after applying these filters, can be played using GPUImage. If it can, could I get some sample code for this?
I know AVAnimator is an option, but I want to apply filters such as brightness and saturation to these overlay videos, and they have to be visible while the video is playing, which is why I can't use AVAnimator. That being the next step, for now I want to know how to play a video using GPUImage.
Thanks in advance! :]
Well, even though I like telling people about AVAnimator, Brad Larson's GPUImage is specifically designed as a GPU-based filtering framework for iOS apps. Applying real-time effects like the ones you describe is exactly what GPUImage was designed for. See the GPUImage chroma key filter.
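
For reference, playback through GPUImage usually looks like this: one GPUImageMovie per clip, a two-input blend filter, and a GPUImageView as the final target. This is only a sketch assuming the Objective-C GPUImage framework; the file names are placeholders, and keeping two movie sources in sync is something you would still need to handle.

```swift
import GPUImage
import UIKit

// Placeholder clips: a green-screen overlay and the clip it should sit on top of.
let overlayURL = Bundle.main.url(forResource: "overlay", withExtension: "mov")!
let backgroundURL = Bundle.main.url(forResource: "background", withExtension: "mov")!

let overlayMovie: GPUImageMovie = GPUImageMovie(url: overlayURL)
let backgroundMovie: GPUImageMovie = GPUImageMovie(url: backgroundURL)
overlayMovie.playAtActualSpeed = true
backgroundMovie.playAtActualSpeed = true

// Key out green from the first input; the second input shows through those areas.
let chromaKeyBlend = GPUImageChromaKeyBlendFilter()
chromaKeyBlend.thresholdSensitivity = 0.4
chromaKeyBlend.setColorToReplaceRed(0.0, green: 1.0, blue: 0.0)

overlayMovie.addTarget(chromaKeyBlend)      // first input: the clip being keyed
backgroundMovie.addTarget(chromaKeyBlend)   // second input: what shows through

// Further filters (brightness, saturation, ...) can be inserted before the view.
let playbackView = GPUImageView(frame: UIScreen.main.bounds)
chromaKeyBlend.addTarget(playbackView)

backgroundMovie.startProcessing()
overlayMovie.startProcessing()
```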

Effects on audio and video files

In my app I want to apply effects to audio and video files, like the Pheed application does. I want to mix recorded audio with predefined clips, and I also want to give the sound different effects such as echo, surround, and normal. With video files I want to add different effects such as sepia. I don't know which class or library I have to use to implement this. Can anyone suggest a library for this, or a way to do it?
Here are a pair of libraries you can try:
For audio filters and effects, try The Amazing Audio Engine. It is built on top of Core Audio.
For video effects, try GPUImage. I haven't tried it, but it is supposed to handle video better than Core Image does.
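
For the video half, the usual GPUImage pipeline for filtering an existing file is GPUImageMovie -> filter -> GPUImageMovieWriter. A sketch assuming the Objective-C GPUImage framework; the URLs, output size, and the sepia filter are just examples:

```swift
import GPUImage
import UIKit

// Placeholder input/output locations.
let inputURL = Bundle.main.url(forResource: "recording", withExtension: "mov")!
let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("sepia.mov")

let movie: GPUImageMovie = GPUImageMovie(url: inputURL)
let sepia = GPUImageSepiaFilter()   // swap in any other GPUImage filter here

let writer: GPUImageMovieWriter = GPUImageMovieWriter(
    movieURL: outputURL, size: CGSize(width: 640, height: 480))
writer.shouldPassthroughAudio = true
movie.audioEncodingTarget = writer

movie.addTarget(sepia)
sepia.addTarget(writer)

writer.startRecording()
movie.startProcessing()

writer.completionBlock = {
    sepia.removeTarget(writer)
    writer.finishRecording()
    // The filtered clip now sits at outputURL; the audio track was passed through untouched.
}
```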

Add an arbitrary image above a live camera feed using GPUImage

I'm currently working on an augmented reality app, which is why I would like to add an image above the live video feed (GPUImageVideoCamera) AND be able to record the whole thing to an output file.
Getting the live video preview and recording it works so far, but I can't manage to add the image so that it shows on screen the same way it is recorded to the output file.
What's the best way to achieve this (I mean a GPUImage-compliant way)?
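
One GPUImage-compliant option (a sketch only, close in spirit to the subtitle example at the top of this page) is to feed the still image in as a GPUImagePicture, alpha-blend it with the camera, and point both the preview GPUImageView and the GPUImageMovieWriter at the blend filter, so the recording matches what is on screen. The image name, preview view, and output path below are placeholders:

```swift
import AVFoundation
import GPUImage
import UIKit

let camera: GPUImageVideoCamera = GPUImageVideoCamera(
    sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
    cameraPosition: .back)
camera.outputImageOrientation = .portrait

// A PNG with transparency works well as the overlay.
let overlayImage = UIImage(named: "overlay.png")!
let overlayPicture: GPUImagePicture = GPUImagePicture(image: overlayImage)

let blend = GPUImageAlphaBlendFilter()
blend.mix = 1.0

camera.addTarget(blend)           // first input: live camera
overlayPicture.addTarget(blend)   // second input: the still image

// Both the on-screen preview and the recorded file come from the blended output.
let previewView = GPUImageView(frame: UIScreen.main.bounds)
let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("overlay.mov")
let movieWriter: GPUImageMovieWriter = GPUImageMovieWriter(
    movieURL: outputURL, size: CGSize(width: 720, height: 1280))
blend.addTarget(previewView)
blend.addTarget(movieWriter)
camera.audioEncodingTarget = movieWriter

overlayPicture.processImage()     // pushes the still image into the blend once
movieWriter.startRecording()
camera.startCameraCapture()
// Later: camera.stopCameraCapture(); movieWriter.finishRecording()
```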
