I am trying to implement brightness, contrast, and exposure filters in a single view, like the adjustments in the iPhoto app. I tried using a filter group for this, but it shows a white screen instead of the modified picture. Here is the code I used:
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc]init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc]init];
[brightnessFilter setBrightness:brightnessValue];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc]init];
[contrastFilter setContrast:contrastValue];
GPUImageExposureFilter *exposureFilter =[[GPUImageExposureFilter alloc]init];
[exposureFilter setExposure:exposureValue];
[groupFilter addFilter:brightnessFilter];
[groupFilter addFilter:contrastFilter];
[groupFilter addFilter:exposureFilter];
GPUImagePicture *stillImage= [[GPUImagePicture alloc]initWithImage:self.imageToModify];
[stillImage addTarget:groupFilter];
[stillImage processImage];
previewPicture.image = [groupFilter imageFromCurrentlyProcessedOutputWithOrientation:self.imageToModify.imageOrientation];
I even tried applying each filter individually, but it still shows a white image. Is the code above correct?
I have also tried using GPUImageFilterPipeline instead of GPUImageFilterGroup, but I still have the same issue.
For the record, the image is a still image and not a live feed.
You have missed a few statements. A GPUImageFilterGroup needs its filters chained to one another, and it needs its initial and terminal filters set:
[brightnessFilter addTarget: contrastFilter];
[contrastFilter addTarget: exposureFilter];
[(GPUImageFilterGroup *) groupFilter setInitialFilters:[NSArray arrayWithObject: brightnessFilter]];
[(GPUImageFilterGroup *) groupFilter setTerminalFilter:exposureFilter];
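Putting it all together for your still image, the complete pipeline would look like the sketch below. It only combines your own code with the missing statements above (same variable names and values), so treat it as illustrative rather than tested:
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc] init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
[brightnessFilter setBrightness:brightnessValue];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
[contrastFilter setContrast:contrastValue];
GPUImageExposureFilter *exposureFilter = [[GPUImageExposureFilter alloc] init];
[exposureFilter setExposure:exposureValue];
// Register the filters with the group (the group manages their lifetime)...
[groupFilter addFilter:brightnessFilter];
[groupFilter addFilter:contrastFilter];
[groupFilter addFilter:exposureFilter];
// ...chain them so each filter feeds the next...
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:exposureFilter];
// ...and tell the group where the chain starts and ends.
[groupFilter setInitialFilters:[NSArray arrayWithObject:brightnessFilter]];
[groupFilter setTerminalFilter:exposureFilter];
GPUImagePicture *stillImage = [[GPUImagePicture alloc] initWithImage:self.imageToModify];
[stillImage addTarget:groupFilter];
[stillImage processImage];
previewPicture.image = [groupFilter imageFromCurrentlyProcessedOutputWithOrientation:self.imageToModify.imageOrientation];
The point is that addFilter: only registers a filter with the group; the actual rendering chain is defined by the addTarget: calls plus setInitialFilters: and setTerminalFilter:.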
Thanks
Here is full code to apply brightness, contrast, and saturation to a video.
I got the reference code from this link and made changes to it.
Note: this code uses the GPUImage framework.
1) In the .h file:
#import "GPUImage.h"
@interface ViewController : UIViewController
{
GPUImageMovie *movieFile;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;
}
@property (nonatomic, strong) IBOutlet UIView *vwVideo; // the view used for display in editVideo below
@end
2) In the .m file:
- (void)editVideo
{
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"Sample Video" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = YES;
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc]init]; //1
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init]; //2
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init]; //3
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init]; //4
[(GPUImageBrightnessFilter*)brightnessFilter setBrightness:0.10]; // value ranges from -1.00 to 1.00
[(GPUImageContrastFilter*)contrastFilter setContrast:1.48]; // value ranges from 0.00 to 4.00
[(GPUImageSaturationFilter*)saturationFilter setSaturation:2.00]; // value ranges from 0.00 to 2.00
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init]; //5
blendFilter.mix = 0.0;
/* ************************************************** */
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[(GPUImageFilterGroup *) groupFilter setInitialFilters:[NSArray arrayWithObject: brightnessFilter]];
[(GPUImageFilterGroup *) groupFilter setTerminalFilter:blendFilter];
[movieFile addTarget:brightnessFilter]; // the movie feeds only the first filter in the chain
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)vwVideo;
[blendFilter addTarget:filterView]; // only the terminal blend filter feeds the view
// A single completion block on the last filter in the chain is enough to drive the UI element
[saturationFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
//In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[blendFilter addTarget:movieWriter]; // likewise, only the terminal blend filter feeds the writer
// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
}
Note: run this demo on a device, not in the simulator, for correct results.
After processing finishes you will get the processed video on your device. For a different effect, change the values passed to setBrightness, setContrast, and setSaturation (see the comments in the code).
Related
I keep getting this error on and off. I've seen some solutions that recommend using the GPUImageNormalBlendFilter in the filter chain however doing so has resulted in a solid grey colored output.
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionFront];
filter = [[GPUImageFilter alloc] init];
[videoCamera addTarget:filter];
blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
animatedImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
[contentView addSubview:animatedImageView];
[contentView addSubview:[self watermark]];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:filteredVideoView];
(filteredVideoView is a GPUImageView.)
You must use setFrameProcessingCompletionBlock and invoke the UI element's update method from it. A GPUImageUIElement only re-renders its view into a texture when update is called, so without this the blend filter never receives fresh UI frames:
__weak typeof(self) weakSelf = self;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
[weakSelf.uiElementInput update];
}];
[videoCamera startCameraCapture];
I have tried most of the questions related to GPUImage (GPUImageMovieWriter) for exporting video. I wrote this code to create and export a video, but it gives a black screen and does not export the video.
In the ViewController.h file:
@interface ViewController : UIViewController
{
GPUImageMovie *movieFile;
GPUImageOutput<GPUImageInput> *filter;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;
}
@property (nonatomic, strong) IBOutlet UIView *vwVideo;
@end
In the ViewController.m file:
- (void) editVideo {
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sophie" withExtension:@"mov"];
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"iphone-icon-180.png"] smoothlyScaleOutput:YES];
[overlay processImage];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter*)filter setBrightness:0.0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"iphone-icon-180.png"];
[contentView addSubview:ivTemp];
UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(0, 100, 100, 30)];
lblDemo.text = @"Did";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)vwVideo;
[filter addTarget:filterView];
[movieFile addTarget:filter];
[blendFilter addTarget:filterView];
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSLog(@"%@", pathToMovie);
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
NSLog(@"Saved");
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
}
This code does not export the video to the Documents directory or save it. Please help me. How can I add a watermark to the video?
Note: I also tried AVAssetWriter and AVMutableComposition to create the video, but that works in iOS 7 and not in iOS 8.
I tried your code and found that you were using the wrong target: the movie writer must be fed by the blend filter at the end of the chain, not by the first filter. I have modified your code; try it again.
NSURL *sampleURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"]];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
filter = [[GPUImageSepiaFilter alloc] init];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0,
0,
[[UIScreen mainScreen] bounds].size.width,
[[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"db"];
[contentView addSubview:ivTemp];
UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(0, 100, 200, 30)];
lblDemo.text = @"Did---啊啊啊";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:self.view.frame];
[filterView setBackgroundColor:[UIColor redColor]];
[movieFile addTarget:filter];
[blendFilter addTarget:filterView];
[self.view addSubview:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[weakUIElementInput update];
}];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSLog(@"%@", pathToMovie);
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[blendFilter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
NSLog(@"Saved");
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
I'm trying to add a GPUImagePicture and a GPUImageUIElement on top of a video.
The GPUImageUIElement is working, but I have a problem with the GPUImagePicture: I only see it on the first frame, and then it disappears.
Here's my code:
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter *)filter setBrightness:0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];
GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
[transformFilter forceProcessingAtSize:CGSizeMake(73, 83)];
[transformFilter setAffineTransform:CGAffineTransformMakeScale(0.7, 0.7)];
[overlay addTarget:transformFilter];
[overlay processImage];
UIView *subview1 = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 400)];
UILabel *temperaturaText = [[UILabel alloc]initWithFrame:CGRectMake(77, 100, 105, 60)];
temperaturaText.text = @"test";
[subview1 addSubview:temperaturaText];
uiElementInput = [[GPUImageUIElement alloc] initWithView:subview1];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:filterView];
[overlay addTarget:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
[weakUIElementInput update];
}];
[blendFilter addTarget:movieWriter];
Your problem is that you define your input picture, overlay, as a local variable within your setup method. You aren't holding a strong reference to it, so it is deallocated the instant the method finishes, which also removes the image texture and its output from your processing pipeline.
If you want to hold on to an input image, you need to make overlay an instance variable of your class, as you did for your camera or movie input above. Then it will persist for the framework to use as an input.
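A minimal sketch of that fix (the class name here is just a placeholder for whatever class owns this pipeline):
// In the class extension or interface: hold a strong reference so the
// picture outlives the setup method.
@interface MyFilterViewController () // hypothetical class name
{
GPUImagePicture *overlay; // instance variable instead of a local
}
@end
// In the setup method, assign to the ivar; everything else stays the same:
overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];
[overlay addTarget:transformFilter];
[overlay processImage];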
I am working on a project that requires a group of effects.
I am successfully using filterGroup as per the example in the FilterShowcase as follows:
filter = [[GPUImageFilterGroup alloc] init];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:sepiaFilter];
GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];
[(GPUImageFilterGroup *)filter addFilter:pixellateFilter];
[sepiaFilter addTarget:pixellateFilter];
[(GPUImageFilterGroup *)filter setInitialFilters:[NSArray arrayWithObject:sepiaFilter]];
[(GPUImageFilterGroup *)filter setTerminalFilter:pixellateFilter];
But now I would like to add a new filter to the group, such as GPUIMAGE_HARRISCORNERDETECTION; this filter also requires a blend.
Here is the filter initialization:
filter = [[GPUImageHarrisCornerDetectionFilter alloc] init];
[(GPUImageHarrisCornerDetectionFilter *)filter setThreshold:0.20];
and then it requires the blending as follows:
GPUImageCrosshairGenerator *crosshairGenerator = [[GPUImageCrosshairGenerator alloc] init];
crosshairGenerator.crosshairWidth = 15.0;
[crosshairGenerator forceProcessingAtSize:CGSizeMake(480.0, 640.0)];
[(GPUImageHarrisCornerDetectionFilter *)filter setCornersDetectedBlock:^(GLfloat* cornerArray, NSUInteger cornersDetected, CMTime frameTime) {
[crosshairGenerator renderCrosshairsFromArray:cornerArray count:cornersDetected frameTime:frameTime];
}];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
[blendFilter forceProcessingAtSize:CGSizeMake(480.0, 640.0)];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[videoCamera addTarget:gammaFilter];
[gammaFilter addTarget:blendFilter];
[crosshairGenerator addTarget:blendFilter];
[blendFilter addTarget:filterView];
Is there a way to add the GPUImageCrosshairGenerator, GPUImageAlphaBlendFilter, & GPUImageGammaFilter to the filter group?
Thank you
More specific detail follows:
=============================================
Code that works based on FilterShowcase example:
The test class GPUImageDrawTriangleTest simply draws random triangles over the live video source.
self.title = @"DRAWING TRIANGLES TESTING";
triangleFilter = [[GPUImageDrawTriangleTest alloc] init];
[((GPUImageDrawTriangleTest *)triangleFilter) setDrawColorRed:1.0 green:0.0 blue:1.0];
filter = [[GPUImageContrastFilter alloc] init];
__unsafe_unretained GPUImageDrawTriangleTest *weakGPUImageTestCust = (GPUImageDrawTriangleTest *)triangleFilter;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
[weakGPUImageTestCust update:frameTime];
}];
blendingFilters = TRUE;
blendFilter = nil;
blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
[blendFilter forceProcessingAtSize:CGSizeMake(640.0, 480.0)];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[videoCamera addTarget:gammaFilter];
[gammaFilter addTarget:blendFilter];
blendFilter.mix = 1.0;
[triangleFilter addTarget:blendFilter];
[filter addTarget:blendFilter];
[blendFilter addTarget:filterView];
[filter addTarget:filterView];
[videoCamera addTarget:filter];
Based on the FilterShowcase example and the template of the GPUImageUnsharpMaskFilter group, I created GPUImageParticleGroupTest:
#import "GPUImageParticleGroupTest.h"
#import "GPUImageFilter.h"
#import "GPUImageGammaFilter.h"
#import "GPUImageDrawTriangleTest.h"
#import "GPUImageContrastFilter.h"
#import "GPUImageAlphaBlendFilter.h"
#import "GPUImageDrawTriangleTest.h"
@implementation GPUImageParticleGroupTest
- (id)init;
{
if (!(self = [super init]))
{
return nil;
}
contrastFilter = [[GPUImageContrastFilter alloc] init];
[self addFilter:contrastFilter];
gammaFilter = [[GPUImageGammaFilter alloc] init];
[self addFilter:gammaFilter];
triangleFilter = [[GPUImageDrawTriangleTest alloc] init];
[((GPUImageDrawTriangleTest *)triangleFilter) setDrawColorRed:1.0 green:0.0 blue:1.0];
//[self addFilter:triangleFilter];
__unsafe_unretained GPUImageDrawTriangleTest *weakGPUImageTestCust = (GPUImageDrawTriangleTest *)triangleFilter;
[ contrastFilter setFrameProcessingCompletionBlock:^(GPUImageOutput * contrastfilter, CMTime frameTime){
[weakGPUImageTestCust update:frameTime];
}];
theblendFilter = [[GPUImageAlphaBlendFilter alloc] init];
theblendFilter.mix = 1.0;
[self addFilter:theblendFilter];
[gammaFilter addTarget:theblendFilter atTextureLocation:1];
[triangleFilter addTarget:theblendFilter atTextureLocation:1];
[contrastFilter addTarget:theblendFilter atTextureLocation:1];
self.initialFilters = [NSArray arrayWithObjects:contrastFilter,gammaFilter, nil];
self.terminalFilter = theblendFilter;
return self;
}
@end
The intent is that when this group class is instantiated as follows:
filter= [[GPUImageParticleGroupTest alloc] init];
[filter addTarget:filterView];
[videoCamera addTarget:filter];
I would get the same result and have the same random triangles drawn over live video. The app does not crash but I no longer get any live video or triangles.
Where did I go wrong?
When dealing with a GPUImageFilterGroup that needs to blend the input image with something generated inside the group, there's only one other thing you need to deal with, and that's making sure targets get added to the blend in the right order.
Look at the GPUImageUnsharpMaskFilter as an example. It takes in input frames to that group, passes them through a blur filter, and then blends the output of that blur filter with the input image. To set this up, it uses the following code:
// First pass: apply a variable Gaussian blur
blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
[self addFilter:blurFilter];
// Second pass: combine the blurred image with the original sharp one
unsharpMaskFilter = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:kGPUImageUnsharpMaskFragmentShaderString];
[self addFilter:unsharpMaskFilter];
// Texture location 0 needs to be the sharp image for both the blur and the second stage processing
[blurFilter addTarget:unsharpMaskFilter atTextureLocation:1];
self.initialFilters = [NSArray arrayWithObjects:blurFilter, unsharpMaskFilter, nil];
self.terminalFilter = unsharpMaskFilter;
The one new method used here is -addTarget:atTextureLocation:, which makes sure that the input image is added as the first input for both the blur filter and the later blend. Also note that there are two initialFilters for this group, to make sure the input image goes to those two filters.
You'd need to do something similar with the above code, feeding input into the Harris corner detector as well as your blend. It should be reasonably straightforward to do based on the code you already have from the FilterShowcase example and the template of the GPUImageUnsharpMaskFilter group.
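A rough, untested sketch of what that could look like for your corner detector, following the unsharp-mask template above (the class name is hypothetical, and gammaFilter, cornerFilter, and blendFilter are assumed to be declared as instance variables in the header):
@implementation GPUImageHarrisCornerGroupTest
- (id)init;
{
if (!(self = [super init]))
{
return nil;
}
// One branch passes the input image straight through to the blend
gammaFilter = [[GPUImageGammaFilter alloc] init];
[self addFilter:gammaFilter];
// The other branch detects corners on the same input image
cornerFilter = [[GPUImageHarrisCornerDetectionFilter alloc] init];
[(GPUImageHarrisCornerDetectionFilter *)cornerFilter setThreshold:0.20];
[self addFilter:cornerFilter];
// The crosshair generator is not part of the filter chain; it is driven
// by the detector's callback and is kept alive by the block capture
GPUImageCrosshairGenerator *crosshairGenerator = [[GPUImageCrosshairGenerator alloc] init];
crosshairGenerator.crosshairWidth = 15.0;
[crosshairGenerator forceProcessingAtSize:CGSizeMake(480.0, 640.0)];
[(GPUImageHarrisCornerDetectionFilter *)cornerFilter setCornersDetectedBlock:^(GLfloat *cornerArray, NSUInteger cornersDetected, CMTime frameTime) {
[crosshairGenerator renderCrosshairsFromArray:cornerArray count:cornersDetected frameTime:frameTime];
}];
// Blend the pass-through image (texture 0) with the crosshairs (texture 1)
blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
[blendFilter forceProcessingAtSize:CGSizeMake(480.0, 640.0)];
[self addFilter:blendFilter];
[gammaFilter addTarget:blendFilter];
[crosshairGenerator addTarget:blendFilter atTextureLocation:1];
// Both branches receive the group's input image
self.initialFilters = [NSArray arrayWithObjects:gammaFilter, cornerFilter, nil];
self.terminalFilter = blendFilter;
return self;
}
@end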
I am trying to do a green-screen effect using GPUImage. The effect I am trying to achieve is to play a movie of curtains opening and replace the white part of the movie with an image: the curtains are displayed, then they open to reveal the image.
I have the movie displaying correctly, and the white part of the movie shows as black, but the image does not display when the curtains open. What am I doing wrong?
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"CurtainsOpening" withExtension:@"m4v"];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.playAtActualSpeed = YES;
NSLog(@"movie file = %@", movieFile);
GPUImageChromaKeyBlendFilter *filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[(GPUImageChromaKeyBlendFilter *)filter setColorToReplaceRed:1.0 green:1.0 blue:1.0];
[(GPUImageChromaKeyBlendFilter *)filter setThresholdSensitivity:0.0]; //was 0.4
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"curtains.jpg"];
NSLog(@"inputImage = %@", inputImage);
GPUImagePicture *overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
NSLog(@"overlayPicture = %@", overlayPicture);
[overlayPicture processImage];
[overlayPicture addTarget:filter];
//[movieFile addTarget:overlayPicture];
GPUImageView *view0 = [[GPUImageView alloc] initWithFrame:self.view.frame];
[view0 setFillMode:kGPUImageFillModeStretch];
NSLog(#"view0 = %#", view0);
[filter addTarget:view0];
[self.view addSubview:view0];
[view0 bringSubviewToFront:self.view];
NSLog(@"frame = %f %f", self.view.frame.size.width, self.view.frame.size.height);
[movieFile startProcessing];
I figured it out. In case anyone else hits this: you need to make the GPUImagePicture variable an instance variable (as described in the GPUImagePicture answer above) so that it is not deallocated when the method exits.