Add UIImage Element using GPUImage Framework - ios

I am using Brad Larson's GPUImage framework to add a UIImage element. I have successfully added the image, but the main issue is that the image is getting stretched to the video's aspect ratio.
Here is my code:
GPUImageView *filterView = (GPUImageView *)self.view;
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
transformFilter = [[GPUImageTransformFilter alloc] init];
CGAffineTransform t = CGAffineTransformMakeScale(0.5, 0.5);
[(GPUImageTransformFilter *)filter setAffineTransform:t];
[videoCamera addTarget:transformFilter];
filter = [[GPUImageOverlayBlendFilter alloc] init];
[videoCamera addTarget:filter];
inputImage = [UIImage imageNamed:@"eye.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture forceProcessingAtSize:CGSizeMake(50, 50)];
[sourcePicture processImage];
[sourcePicture addTarget:filter];
[sourcePicture addTarget:transformFilter];
[filter addTarget:filterView];
[videoCamera startCameraCapture];
I have tried applying a transform filter before blending the image, but it isn't getting scaled. I want the image to appear at the center. How do I do it?
Thanks

You are on the right track; you just have a few things out of place.
The following code will load an overlay image and apply a transformation to keep it at actual size. By default it will be centered over the video.
GPUImageView *filterView = (GPUImageView *)self.view;
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
filter = [[GPUImageOverlayBlendFilter alloc] init];
transformFilter = [[GPUImageTransformFilter alloc] init];
[videoCamera addTarget:filter];
[transformFilter addTarget:filter];
// setup overlay image
inputImage = [UIImage imageNamed:@"eye.png"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
// determine the necessary scaling to keep image at actual size
CGFloat tx = inputImage.size.width / 480.0;  // the 640x480 preset outputs 480x640 in portrait
CGFloat ty = inputImage.size.height / 640.0;
// apply transform to filter
CGAffineTransform t = CGAffineTransformMakeScale(tx, ty);
[(GPUImageTransformFilter *)transformFilter setAffineTransform:t];
//
[sourcePicture addTarget:filter];
[sourcePicture addTarget:transformFilter];
[sourcePicture processImage];
[filter addTarget:filterView];
[videoCamera startCameraCapture];
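For reference, the scaling works because GPUImageTransformFilter operates in normalized coordinates: dividing the overlay's pixel dimensions by the video output's dimensions (480x640 in portrait) shrinks the full-frame quad down to the image's native size, and with no translation applied it stays centered. A 100x100 image, for example, gives tx = 100/480 ≈ 0.21 and ty = 100/640 ≈ 0.16.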

Related

Applying GPUImage filters with a UISlider

I'm using GPUImage filters to create a simple brightness, contrast, and saturation adjustment tool. The view has one image view and three sliders, one per filter. At first I thought I would apply each filter as its slider's value changed, but then I realized the filters had to be chained, so I wrote the code below. The problem with this code is that only the brightness filter is applied to the image.
- (IBAction)brightnessSlider:(UISlider *)sender {
brightnessValue = sender.value;
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
[brightnessFilter setBrightness:brightnessValue];
[contrastFilter setContrast:contrastValue];
[saturationFilter setSaturation:saturationValue];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[brightnessFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [brightnessFilter imageFromCurrentFramebuffer];
imageView.image = currentFilteredVideoFrame;
}
- (IBAction)contrastSlider:(UISlider *)sender
{
contrastValue = sender.value;
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
[brightnessFilter setBrightness:brightnessValue];
[contrastFilter setContrast:contrastValue];
[saturationFilter setSaturation:saturationValue];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[brightnessFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [brightnessFilter imageFromCurrentFramebuffer];
imageView.image = currentFilteredVideoFrame;
}
- (IBAction)saturationSlider:(UISlider *)sender
{
saturationValue = sender.value;
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
[brightnessFilter setBrightness:brightnessValue];
[contrastFilter setContrast:contrastValue];
[saturationFilter setSaturation:saturationValue];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[brightnessFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [brightnessFilter imageFromCurrentFramebuffer];
imageView.image = currentFilteredVideoFrame;
}
Now I have decided to implement the whole thing in a much simpler way, but I am still getting nothing:
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
//Set Brightness to 60
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
[brightnessFilter setBrightness:60.0];
//Set Contrast to 12
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
[contrastFilter setContrast:12];
[brightnessFilter addTarget:contrastFilter];
[stillImageSource addTarget:contrastFilter];
[contrastFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *outputImage1 = [contrastFilter imageFromCurrentFramebuffer];
imageView.image = outputImage1;
You don't need to repeat the same addTarget setup on every slider change. Build the filter chain once, wherever you load the image, and have each slider action only change the value of its own filter. That will make your code much easier to maintain (and would have made this question shorter).
Now to the actual problem (see the related GPUImage GitHub issue). You have chained the filters in this order:
Still Image > Brightness > Contrast > Saturation
but you are capturing the output from the brightness filter. You need to capture it from the saturation filter at the end, so the chain becomes:
Still Image > Brightness > Contrast > Saturation > Output Image
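A corrected handler might look like this sketch, reusing your variables (ideally you would build the chain once, e.g. in viewDidLoad, and only update the changed value per slider):
- (IBAction)brightnessSlider:(UISlider *)sender {
    brightnessValue = sender.value;
    UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
    GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
    GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
    [brightnessFilter setBrightness:brightnessValue];
    [contrastFilter setContrast:contrastValue];
    [saturationFilter setSaturation:saturationValue];
    [stillImageSource addTarget:brightnessFilter];
    [brightnessFilter addTarget:contrastFilter];
    [contrastFilter addTarget:saturationFilter];
    // capture from the last filter in the chain, not the first
    [saturationFilter useNextFrameForImageCapture];
    [stillImageSource processImage];
    imageView.image = [saturationFilter imageFromCurrentFramebuffer];
}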

GPUImageGaussianBlurFilter doesn't seem to work with layered filters

I have layered filters that all look great with the images I am using, but if I change the Gaussian blur parameters higher or lower there is no visible difference in the blurring effect. What am I doing wrong?
Here is my code:
GPUImageView *finalView;
- (void)viewDidLoad
{
[super viewDidLoad];
UIImage *topLayer = [UIImage imageNamed:@"Glass.png"];
UIImage *baseLayer = [UIImage imageNamed:@"BasePhoto.png"];
GPUImagePicture *stillImageSourceTop = [[GPUImagePicture alloc] initWithImage:topLayer];
GPUImagePicture *stillImageSourceBottom = [[GPUImagePicture alloc] initWithImage:baseLayer];
GPUImageScreenBlendFilter *screenBlendFilter = [[GPUImageScreenBlendFilter alloc] init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
GPUImageColorMatrixFilter *colorMatrixFilter = [[GPUImageColorMatrixFilter alloc] init];
GPUImageOpacityFilter *opacityFilter = [[GPUImageOpacityFilter alloc] init];
opacityFilter.opacity = 0;
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
blurFilter.texelSpacingMultiplier = 4.0;
blurFilter.blurRadiusInPixels = 200.0;
blurFilter.blurPasses = 4;
[stillImageSourceTop addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:colorMatrixFilter];
[colorMatrixFilter addTarget:blurFilter];
[blurFilter addTarget:opacityFilter];
[stillImageSourceTop processImage];
[opacityFilter useNextFrameForImageCapture];
UIImage *topLayerImage = [opacityFilter imageFromCurrentFramebuffer];
GPUImagePicture *stillImageSourceTopWithFilters = [[GPUImagePicture alloc] initWithImage:topLayerImage];
[stillImageSourceBottom addTarget:screenBlendFilter];
[stillImageSourceTopWithFilters addTarget:screenBlendFilter];
[screenBlendFilter useNextFrameForImageCapture];
[stillImageSourceBottom processImage];
UIImage *mergedlayeredimage = [screenBlendFilter imageFromCurrentFramebuffer];
[finalImageView setImage:mergedlayeredimage];
}
Well, that's because most of the filters above don't actually do anything. The only filter you've effectively wired up is the screenBlendFilter: both of your source images go into it, and you pull the single blended image out of it. You never actually use the blur for anything there, so of course it won't affect the output.
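One way to restructure it, as a sketch using your existing variables: keep the filters, but feed the end of the chain straight into the blend instead of capturing an intermediate UIImage.
[stillImageSourceTop addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:colorMatrixFilter];
[colorMatrixFilter addTarget:blurFilter];
[blurFilter addTarget:opacityFilter];
// the fully filtered top layer becomes the blend's second input
[stillImageSourceBottom addTarget:screenBlendFilter];
[opacityFilter addTarget:screenBlendFilter];
[screenBlendFilter useNextFrameForImageCapture];
[stillImageSourceTop processImage];
[stillImageSourceBottom processImage];
[finalImageView setImage:[screenBlendFilter imageFromCurrentFramebuffer]];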

Add a GPUImagePicture on a video

I'm trying to add a GPUImagePicture and a GPUImageUIElement on a video.
GPUImageUIElement is working, but I've a problem with the GPUImagePicture because I only see it on the first frame and then it disappears.
Here's my code:
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter *)filter setBrightness:0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];
GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
[transformFilter forceProcessingAtSize:CGSizeMake(73, 83)];
[transformFilter setAffineTransform:CGAffineTransformMakeScale(0.7, 0.7)];
[overlay addTarget:transformFilter];
[overlay processImage];
UIView *subview1 = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 400)];
UILabel *temperaturaText = [[UILabel alloc] initWithFrame:CGRectMake(77, 100, 105, 60)];
temperaturaText.text = @"test";
[subview1 addSubview:temperaturaText];
uiElementInput = [[GPUImageUIElement alloc] initWithView:subview1];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:filterView];
[overlay addTarget:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
[weakUIElementInput update];
}];
[blendFilter addTarget:movieWriter];
Your problem is that you define overlay, your input picture, as a local variable within your setup method. You aren't holding a strong reference to it, so it is deallocated the instant the method returns, which also removes the image texture and its output from your processing pipeline.
If you want to hold on to an input image, make overlay an instance variable of your class, as you did for your camera or movie input above. It will then persist and remain available to the framework as input.
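A minimal sketch (the class name here is illustrative):
@interface MyViewController () {
    GPUImagePicture *overlay; // strong ivar keeps the picture, and its texture, alive
}
@end

// then in the setup method, assign the ivar instead of declaring a local:
overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];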

GPUImageHistogramFilter for a still image giving zero data

Very similar to this answer, except I want to generate a histogram for a still image.
Below is what I'm doing, and it's giving a histogram with all 0 data. Is there some trick to getting this working?
GPUImageFilter *filter = [[GPUImageHistogramFilter alloc] initWithHistogramType:kGPUImageHistogramRGB];
GPUImagePicture *original = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[original addTarget:gammaFilter];
[gammaFilter addTarget:filter];
GPUImageHistogramGenerator *histogramGraph = [[GPUImageHistogramGenerator alloc] init];
[histogramGraph forceProcessingAtSize:CGSizeMake(256.0, 330.0)];
[filter addTarget:histogramGraph];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 0.75;
[blendFilter forceProcessingAtSize:CGSizeMake(256.0, 330.0)];
[original addTarget:blendFilter];
[histogramGraph addTarget:blendFilter];
[blendFilter addTarget:gpuImageView];
[original processImage];
Brad changed some of GPUImage's internals in recent releases to improve memory management (and it does improve it). For still images, you now have to tell the filter to hold onto its frame with -useNextFrameForImageCapture before processing:
UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:stillImageFilter];
[stillImageFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];

GPUImage Green Screen

I am trying to do a green-screen-style effect using GPUImage. The effect I am trying to achieve is to play a movie of curtains opening and replace the white part of the movie with an image, so that the curtains open to reveal the image.
I have the movie displaying correctly, and the white part of the movie is keyed out (it shows as black), but the image does not display when the curtains open. What am I doing wrong?
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"CurtainsOpening" withExtension:@"m4v"];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.playAtActualSpeed = YES;
NSLog(#"movie file = %#", movieFile);
GPUImageChromaKeyBlendFilter *filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[(GPUImageChromaKeyBlendFilter *)filter setColorToReplaceRed:1.0 green:1.0 blue:1.0];
[(GPUImageChromaKeyBlendFilter *)filter setThresholdSensitivity:0.0]; //was 0.4
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"curtains.jpg"];
NSLog(@"inputImage = %@", inputImage);
GPUImagePicture *overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
NSLog(@"overlayPicture = %@", overlayPicture);
[overlayPicture processImage];
[overlayPicture addTarget:filter];
//[movieFile addTarget:overlayPicture];
GPUImageView *view0 = [[GPUImageView alloc] initWithFrame:self.view.frame];
[view0 setFillMode:kGPUImageFillModeStretch];
NSLog(#"view0 = %#", view0);
[filter addTarget:view0];
[self.view addSubview:view0];
[view0 bringSubviewToFront:self.view];
NSLog(#"frame = %f %f", self.view.frame.size.width, self.view.frame.size.height);
[movieFile startProcessing];
I figured it out. In case anyone else runs into this: you need to make the GPUImagePicture an instance variable so it is not deallocated when the setup method exits.
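For example (a sketch; the property name is illustrative):
@property (nonatomic, strong) GPUImagePicture *overlayPicture;

// in the setup method, use the property instead of a local variable:
self.overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[self.overlayPicture addTarget:filter];
[self.overlayPicture processImage];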
