GPUImage Green Screen - iOS

I am trying to do a green screen effect using GPUImage. The effect I am trying to achieve is to play a movie of curtains opening and replace the white part of the movie with an image, so the curtains are displayed and then open to reveal the image.
I have the movie displaying correctly, and the white part of the movie renders as black, but the image does not display when the curtains open. What am I doing wrong?
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"CurtainsOpening" withExtension:@"m4v"];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.playAtActualSpeed = YES;
NSLog(#"movie file = %#", movieFile);
GPUImageChromaKeyBlendFilter *filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[(GPUImageChromaKeyBlendFilter *)filter setColorToReplaceRed:1.0 green:1.0 blue:1.0];
[(GPUImageChromaKeyBlendFilter *)filter setThresholdSensitivity:0.0]; //was 0.4
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"curtains.jpg"];
NSLog(@"inputImage = %@", inputImage);
GPUImagePicture *overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
NSLog(#"overlayPicture = %#", overlayPicture);
[overlayPicture processImage];
[overlayPicture addTarget:filter];
//[movieFile addTarget:overlayPicture];
GPUImageView *view0 = [[GPUImageView alloc] initWithFrame:self.view.frame];
[view0 setFillMode:kGPUImageFillModeStretch];
NSLog(#"view0 = %#", view0);
[filter addTarget:view0];
[self.view addSubview:view0];
[view0 bringSubviewToFront:self.view];
NSLog(#"frame = %f %f", self.view.frame.size.width, self.view.frame.size.height);
[movieFile startProcessing];

I figured it out. In case anyone else runs into this: you need to make the GPUImagePicture variable an instance variable, so that ARC does not deallocate it when execution exits the method.
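For reference, a minimal sketch of that fix (assuming the setup code above lives in a view controller; names match the question's code):
@interface ViewController : UIViewController
{
GPUImagePicture *overlayPicture; // instance variable, so the picture outlives the setup method
}
@end
Then, in the setup method, assign to the instance variable instead of declaring a local:
overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[overlayPicture processImage];
[overlayPicture addTarget:filter];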

Related

Export Video With Watermark using GPUImage iOS

I have tried most of the questions related to GPUImage (GPUImageMovieWriter) for exporting video. I wrote the code below to create and export a video, but it gives a black screen and does not export the video.
In the ViewController.h file:
{
GPUImageMovie *movieFile;
GPUImageOutput<GPUImageInput> *filter;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;
}
@property (nonatomic,strong) IBOutlet UIView *vwVideo;
In the ViewController.m file:
- (void) editVideo {
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sophie" withExtension:@"mov"];
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"iphone-icon-180.png"] smoothlyScaleOutput:YES];
[overlay processImage];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter*)filter setBrightness:0.0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"iphone-icon-180.png"];
[contentView addSubview:ivTemp];
UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(0, 100, 100, 30)];
lblDemo.text = @"Did";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)vwVideo;
[filter addTarget:filterView];
[movieFile addTarget:filter];
[blendFilter addTarget:filterView];
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSLog(#"%#",pathToMovie);
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
NSLog(#"Saved");
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];}
This code does not export the video to the documents directory or save it. Please help me: how do I add a watermark to the video?
Note: I also tried AVAssetWriter and AVMutableComposition to create the video, but that works in iOS 7 and not in iOS 8.
I tried your code and found that you were using the wrong target. I have modified it; you can try it again.
NSURL *sampleURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"]];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
filter = [[GPUImageSepiaFilter alloc] init];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0,
0,
[[UIScreen mainScreen] bounds].size.width,
[[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"db"];
[contentView addSubview:ivTemp];
UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(0, 100, 200, 30)];
lblDemo.text = @"Did---啊啊啊";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:self.view.frame];
[filterView setBackgroundColor:[UIColor redColor]];
[movieFile addTarget:filter];
[blendFilter addTarget:filterView];
[self.view addSubview:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[weakUIElementInput update];
}];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSLog(#"%#",pathToMovie);
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[blendFilter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
NSLog(#"Saved");
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
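The key change is the movie writer's input: it is now a target of blendFilter, the end of the chain, so the composited frames (video plus UI overlay) are what get encoded, and only the blended output drives the on-screen view:
[blendFilter addTarget:movieWriter]; // was: [filter addTarget:movieWriter];
[blendFilter addTarget:filterView]; // the view no longer receives frames from filter directly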

GPUImageGaussianBlurFilter doesn't seem to work with layered filters

I have layered filters that all look great with the images I am using, but if I change the Gaussian blur parameters either higher or lower there is no visible difference in the blurring effect. What am I doing wrong?
Here is my code:
UIImageView *finalImageView;
- (void)viewDidLoad
{
[super viewDidLoad];
UIImage *topLayer = [UIImage imageNamed:@"Glass.png"];
UIImage *baseLayer = [UIImage imageNamed:@"BasePhoto.png"];
GPUImagePicture *stillImageSourceTop = [[GPUImagePicture alloc] initWithImage:topLayer];
GPUImagePicture *stillImageSourceBottom = [[GPUImagePicture alloc] initWithImage:baseLayer];
GPUImageScreenBlendFilter *screenBlendFilter = [[GPUImageScreenBlendFilter alloc] init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
GPUImageColorMatrixFilter *colorMatrixFilter = [[GPUImageColorMatrixFilter alloc] init];
GPUImageOpacityFilter *opacityFilter = [[GPUImageOpacityFilter alloc] init];
opacityFilter.opacity = 0;
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc]init];
blurFilter.texelSpacingMultiplier = 4.0;
blurFilter.blurRadiusInPixels = 200.0;
blurFilter.blurPasses = 4;
[stillImageSourceTop addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:colorMatrixFilter];
[colorMatrixFilter addTarget:blurFilter];
[blurFilter addTarget:opacityFilter];
[stillImageSourceTop processImage];
[opacityFilter useNextFrameForImageCapture];
UIImage *topLayerImage = [opacityFilter imageFromCurrentFramebuffer];
GPUImagePicture *stillImageSourceTopWithFilters = [[GPUImagePicture alloc] initWithImage:topLayerImage];
[stillImageSourceBottom addTarget:screenBlendFilter];
[stillImageSourceTopWithFilters addTarget:screenBlendFilter];
[screenBlendFilter useNextFrameForImageCapture];
[stillImageSourceBottom processImage];
UIImage *mergedlayeredimage = [screenBlendFilter imageFromCurrentFramebuffer];
[finalImageView setImage:mergedlayeredimage];
}
Well, that's because most of the filters in the code above don't actually do anything. The only filter you've wired up is the screenBlendFilter: you have both of your source images going into it, and then you pull the one image out of it. You never actually use the blur for anything there, so of course it won't affect the output.
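A minimal sketch of one way to fix it (same filter instances as in the question; the filtered top layer feeds straight into the blend instead of round-tripping through a UIImage, so the blur parameters actually reach the output):
[stillImageSourceTop addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:colorMatrixFilter];
[colorMatrixFilter addTarget:blurFilter];
[blurFilter addTarget:screenBlendFilter]; // filtered top layer into the blend
[stillImageSourceBottom addTarget:screenBlendFilter]; // base layer
[screenBlendFilter useNextFrameForImageCapture];
[stillImageSourceTop processImage];
[stillImageSourceBottom processImage];
UIImage *mergedImage = [screenBlendFilter imageFromCurrentFramebuffer];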

Add a GPUImagePicture on a video

I'm trying to add a GPUImagePicture and a GPUImageUIElement on a video.
GPUImageUIElement is working, but I have a problem with the GPUImagePicture: I only see it on the first frame, and then it disappears.
Here's my code:
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter *)filter setBrightness:0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];
GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
[transformFilter forceProcessingAtSize:CGSizeMake(73, 83)];
[transformFilter setAffineTransform:CGAffineTransformMakeScale(0.7, 0.7)];
[overlay addTarget:transformFilter];
[overlay processImage];
UIView *subview1 = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 400)];
UILabel *temperaturaText = [[UILabel alloc]initWithFrame:CGRectMake(77, 100, 105, 60)];
temperaturaText.text = @"test";
[subview1 addSubview:temperaturaText];
uiElementInput = [[GPUImageUIElement alloc] initWithView:subview1];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:filterView];
[overlay addTarget:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
[weakUIElementInput update];
}];
[blendFilter addTarget:movieWriter];
Your problem is that you define your input picture, overlay, as a local variable within your setup method. You aren't holding on to a strong reference to it, so it will be deallocated the instant you finish this method, which also removes the image texture and its output from your processing pipeline.
If you want to hold on to an input image, you need to make overlay an instance variable of your class, like you did for your camera or movie input in the code above. Then it will persist to be used by the framework as input.
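A minimal sketch of that change, using a property this time (a plain instance variable, as in the green-screen question above, works just as well):
@interface ViewController ()
@property (nonatomic, strong) GPUImagePicture *overlay;
@end
Then, in the setup method:
self.overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];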

Add UIImage Element using GPUImage Framework

I am using Brad Larson's GPUImage framework to add a UIImage element. I have successfully added the image, but the main issue is that the image gets stretched to the video's aspect ratio.
Here is my code:
GPUImageView *filterView = (GPUImageView *)self.view;
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
transformFilter=[[GPUImageTransformFilter alloc]init];
CGAffineTransform t=CGAffineTransformMakeScale(0.5, 0.5);
[(GPUImageTransformFilter *)filter setAffineTransform:t];
[videoCamera addTarget:transformFilter];
filter = [[GPUImageOverlayBlendFilter alloc] init];
[videoCamera addTarget:filter];
inputImage = [UIImage imageNamed:@"eye.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture forceProcessingAtSize:CGSizeMake(50, 50)];
[sourcePicture processImage];
[sourcePicture addTarget:filter];
[sourcePicture addTarget:transformFilter];
[filter addTarget:filterView];
[videoCamera startCameraCapture];
I have tried to use the transform filter before blending the image, but it isn't getting scaled.
I want the image to appear at the center. How do I do it?
Thanks
You are on the right track; you just have a few things out of place.
The following code will load an overlay image and apply a transformation to keep it at actual size. By default it will be centered over the video.
GPUImageView *filterView = (GPUImageView *)self.view;
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
filter = [[GPUImageOverlayBlendFilter alloc] init];
transformFilter = [[GPUImageTransformFilter alloc]init];
[videoCamera addTarget:filter];
[transformFilter addTarget:filter];
// setup overlay image
inputImage = [UIImage imageNamed:@"eye.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
// determine the necessary scaling to keep image at actual size
CGFloat tx = inputImage.size.width / 480.0; // 480/640: based on video camera preset
CGFloat ty = inputImage.size.height / 640.0;
// apply transform to filter
CGAffineTransform t = CGAffineTransformMakeScale(tx, ty);
[(GPUImageTransformFilter *)transformFilter setAffineTransform:t];
//
[sourcePicture addTarget:filter];
[sourcePicture addTarget:transformFilter];
[sourcePicture processImage];
[filter addTarget:filterView];
[videoCamera startCameraCapture];
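As a worked example of the scaling math, suppose a hypothetical 96x64 overlay image with the same 640x480 preset in portrait orientation:
CGFloat tx = 96.0 / 480.0; // 0.2: the overlay spans 20% of the frame width
CGFloat ty = 64.0 / 640.0; // 0.1: the overlay spans 10% of the frame height
The transform shrinks the full-frame overlay quad down to the image's native pixel size, and it stays centered because the scale is applied about the center of the frame.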

GPUImage group filters

I am trying to implement brightness, contrast, and exposure filters in a single view, the same as you see in the iPhoto app. I tried to set up a group filter to do this, but it shows a white screen instead of the modified picture. Here is the code I applied.
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc]init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc]init];
[brightnessFilter setBrightness:brightnessValue];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc]init];
[contrastFilter setContrast:contrastValue];
GPUImageExposureFilter *exposureFilter =[[GPUImageExposureFilter alloc]init];
[exposureFilter setExposure:exposureValue];
[groupFilter addFilter:brightnessFilter];
[groupFilter addFilter:contrastFilter];
[groupFilter addFilter:exposureFilter];
GPUImagePicture *stillImage= [[GPUImagePicture alloc]initWithImage:self.imageToModify];
[stillImage addTarget:groupFilter];
[stillImage processImage];
previewPicture.image = [groupFilter imageFromCurrentlyProcessedOutputWithOrientation:self.imageToModify.imageOrientation];
I even tried applying each filter individually, but it still shows a white image. Is the above code correct?
I have also tried using GPUImageFilterPipeline instead of GPUImageFilterGroup, but I still have the same issue.
For the record, the image is a still image, not a live feed.
You have missed some code statements needed to make this work, shown below.
[brightnessFilter addTarget: contrastFilter];
[contrastFilter addTarget: exposureFilter];
[(GPUImageFilterGroup *) groupFilter setInitialFilters:[NSArray arrayWithObject: brightnessFilter]];
[(GPUImageFilterGroup *) groupFilter setTerminalFilter:exposureFilter];
Thanks
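Putting the missing statements together with the question's code, a minimal sketch of the corrected setup (same instances and values as above):
[groupFilter addFilter:brightnessFilter];
[groupFilter addFilter:contrastFilter];
[groupFilter addFilter:exposureFilter];
[brightnessFilter addTarget:contrastFilter]; // chain the filters inside the group
[contrastFilter addTarget:exposureFilter];
[groupFilter setInitialFilters:@[brightnessFilter]]; // entry point of the chain
[groupFilter setTerminalFilter:exposureFilter]; // exit point of the chain
[stillImage addTarget:groupFilter];
[stillImage processImage];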
Here is the full code to apply brightness, contrast, and saturation to a video.
I got the reference code from this link and made changes to it.
Note: this code uses the GPUImage framework.
1) In the .h file:
#import "GPUImage.h"
@interface ViewController : UIViewController
{
GPUImageMovie *movieFile;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;
}
@property (nonatomic,strong) IBOutlet UIView *vwVideo;
2) In the .m file:
- (void)editVideo
{
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"Sample Video" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = YES;
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc]init]; //1
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init]; //2
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init]; //3
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init]; //4
[(GPUImageBrightnessFilter*)brightnessFilter setBrightness:0.10]; // change value between -1.00 to 1.00
[(GPUImageContrastFilter*)contrastFilter setContrast:1.48]; // change value between 0.00 to 4.00
[(GPUImageSaturationFilter*)saturationFilter setSaturation:2.00]; //change value between 0.00 to 2.00
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init]; //5
blendFilter.mix = 0.0;
/* ************************************************** */
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[(GPUImageFilterGroup *) groupFilter setInitialFilters:[NSArray arrayWithObject: brightnessFilter]];
[(GPUImageFilterGroup *) groupFilter setTerminalFilter:blendFilter];
[movieFile addTarget:brightnessFilter]; // feed only the first filter in the chain; the others receive frames through it
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)vwVideo;
[blendFilter addTarget:filterView]; // display only the final blended output
[saturationFilter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update]; // redraw the UI element once per processed frame
}];
//In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[blendFilter addTarget:movieWriter]; // record only the final blended output; the movie writer should have a single input
// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
}
Note: Please run this demo on a device, not in the simulator, for correct results.
After processing completes you will find the processed video on your device; for a different effect, change the values passed to setBrightness, setContrast, and setSaturation (see the code).
