GPUImage pencil sketch filter optimisation - image-processing

I have implemented a pencil sketch filter using the GPUImage framework, but I think the filter is overly complex for the output it produces.
Pencil sketch filter function:
- (void)pencilSketch
{
//UIImage *inputImage = [UIImage imageNamed:@"portrait.jpg"];
UIImage *inputImage = [UIImage imageNamed:@"selfie4.jpg"];
GPUImageGrayscaleFilter *grayFilter = [[GPUImageGrayscaleFilter alloc] init];
GPUImageColorInvertFilter *edgeFilter = [[GPUImageColorInvertFilter alloc] init];
GPUImageGaussianBlurFilter *blurr = [[GPUImageGaussianBlurFilter alloc] init];
blurr.blurRadiusInPixels = 8.0; // Edge Sensitivity
blurr.texelSpacingMultiplier = 8.0; // Edge Strength
GPUImageLinearBurnBlendFilter *filter = [[GPUImageLinearBurnBlendFilter alloc] init];
UIImage *invImg = [edgeFilter imageByFilteringImage: inputImage]; // Color Invert
UIImage *blurrImg = [blurr imageByFilteringImage: inputImage]; // Gaussian Blurr
// Linear Burn Blend
GPUImagePicture *mainPicture = [[GPUImagePicture alloc] initWithImage: invImg];
GPUImagePicture *topPicture = [[GPUImagePicture alloc] initWithImage: blurrImg];
[mainPicture addTarget: filter];
[topPicture addTarget: filter];
[filter useNextFrameForImageCapture];
[mainPicture processImage];
[topPicture processImage];
UIImage *resultedimage = [filter imageFromCurrentFramebuffer];
resultedimage = [UIImage imageWithCGImage:[resultedimage CGImage] scale:1.0 orientation: invImg.imageOrientation];
GPUImageColorInvertFilter *bf = [[GPUImageColorInvertFilter alloc] init];
resultedimage = [bf imageByFilteringImage: resultedimage]; // Color Invert
resultedimage = [grayFilter imageByFilteringImage: resultedimage]; // Gray scale
UIImage *grainImage = [UIImage imageNamed:@"stroke2.jpg"];
GPUImageGaussianBlurFilter *gBlurr = [[GPUImageGaussianBlurFilter alloc] init];
gBlurr.blurRadiusInPixels = 5.0;
UIImage *blurrGrainImg = [gBlurr imageByFilteringImage: grainImage];
GPUImagePicture *blurrPic = [[GPUImagePicture alloc] initWithImage: resultedimage];
GPUImagePicture *topPic = [[GPUImagePicture alloc] initWithImage: blurrGrainImg];
GPUImageSoftLightBlendFilter *maskedFilter = [[GPUImageSoftLightBlendFilter alloc] init];
[blurrPic addTarget: maskedFilter];
[topPic addTarget: maskedFilter];
[maskedFilter useNextFrameForImageCapture];
[blurrPic processImage];
[topPic processImage];
resultedimage = [maskedFilter imageFromCurrentFramebuffer];
resultedimage = [UIImage imageWithCGImage:[resultedimage CGImage] scale:1.0 orientation: inputImage.imageOrientation];
GPUImageColorBurnFilter *bFilter = [[GPUImageColorBurnFilter alloc] init];
bFilter.brightness = 5.0; // Edge Darkness
resultedimage = [bFilter imageByFilteringImage: resultedimage];
baseImage.image = resultedimage;
}
Output image produced by the filter:
Can anyone suggest how to optimise this filter?
I would also appreciate guidance on writing a shader program for this; I can create a custom GPUImageFilter and write the shader for it.
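If you do go the custom-shader route, a single-pass GPUImageFilter created with initWithFragmentShaderFromString: can fold the grayscale and invert steps into one draw (the Gaussian blur still needs its own multi-pass filter). A minimal sketch, not the filter from the question; the name kSketchPrepShaderString is made up for illustration:

// Hypothetical one-pass "grayscale + invert" shader to replace the separate
// GPUImageGrayscaleFilter and GPUImageColorInvertFilter stages.
NSString *const kSketchPrepShaderString = SHADER_STRING
(
 varying highp vec2 textureCoordinate;

 uniform sampler2D inputImageTexture;

 void main()
 {
     lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
     lowp float luminance = dot(color.rgb, vec3(0.2125, 0.7154, 0.0721));
     gl_FragColor = vec4(vec3(1.0 - luminance), color.a);
 }
);

GPUImageFilter *sketchPrepFilter =
    [[GPUImageFilter alloc] initWithFragmentShaderFromString:kSketchPrepShaderString];
// sketchPrepFilter can then be chained like any other GPUImage filter.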

// Classic pencil-sketch approach: desaturate, invert, blur, then colour-dodge
// blend the blurred inverse with the desaturated original.
UIImage *imgCropped = image;
stillImageSource = [[GPUImagePicture alloc] initWithImage:imgCropped];
saturationFilter=[[GPUImageSaturationFilter alloc]init];
[saturationFilter setSaturation:0];
[stillImageSource addTarget:saturationFilter];
[saturationFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *copyImage=[saturationFilter imageFromCurrentFramebuffer];
stillImageSource=[[GPUImagePicture alloc] initWithImage:copyImage];
copyGPUimage=[[GPUImagePicture alloc] initWithImage:copyImage];
invertFilter=[[GPUImageColorInvertFilter alloc]init];
[copyGPUimage addTarget:invertFilter];
[invertFilter useNextFrameForImageCapture];
[copyGPUimage processImage];
UIImage *imgTempr=[invertFilter imageFromCurrentFramebuffer];
copyGPUimage= [[GPUImagePicture alloc]initWithImage:imgTempr];
gaussianBlur=[[GPUImageGaussianBlurFilter alloc]init];
[gaussianBlur setBlurRadiusInPixels:val];
[copyGPUimage addTarget:gaussianBlur];
[gaussianBlur useNextFrameForImageCapture];
[copyGPUimage processImage];
UIImage *imgTempa=[gaussianBlur imageFromCurrentFramebuffer];
copyGPUimage= [[GPUImagePicture alloc]initWithImage:imgTempa];
colorDodge=[[GPUImageColorDodgeBlendFilter alloc]init];
[stillImageSource addTarget:colorDodge];
[copyGPUimage addTarget:colorDodge];
[colorDodge useNextFrameForImageCapture];
[stillImageSource processImage];
[copyGPUimage processImage];
imageWithAppliedThreshold=[colorDodge imageFromCurrentFramebuffer];
// Here you have your sketch; now you can set its darkness.
stillImageSource = [[GPUImagePicture alloc] initWithImage:imageWithAppliedThreshold];
monochromeFilter=[[GPUImageMonochromeFilter alloc]init];
[monochromeFilter setIntensity:intenVal];
[monochromeFilter setColorRed:0.0 green:0.0 blue:0.0];
[monochromeFilter setBackgroundColorRed:1.0 green:1.0 blue:1.0 alpha:1.0];
[stillImageSource addTarget: monochromeFilter];
[monochromeFilter useNextFrameForImageCapture];
[stillImageSource processImage];
imageWithAppliedThreshold=[monochromeFilter imageFromCurrentFramebuffer];
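Regarding the optimisation question: the biggest cost in both listings above is the round trip through UIImage between every stage (each imageByFilteringImage: / imageFromCurrentFramebuffer forces a GPU-to-CPU readback and a new GPUImagePicture upload). A rough sketch of the same colour-dodge idea kept entirely on the GPU, with one source picture, chained filters, and a single capture at the end; variable names and parameter values here are illustrative, not taken from the code above:

// Sketch only: inputImage is the UIImage to be sketched.
GPUImagePicture *sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage];

GPUImageSaturationFilter *desaturate = [[GPUImageSaturationFilter alloc] init];
desaturate.saturation = 0.0;

GPUImageColorInvertFilter *invert = [[GPUImageColorInvertFilter alloc] init];

GPUImageGaussianBlurFilter *sketchBlur = [[GPUImageGaussianBlurFilter alloc] init];
sketchBlur.blurRadiusInPixels = 8.0;

GPUImageColorDodgeBlendFilter *dodge = [[GPUImageColorDodgeBlendFilter alloc] init];

[sourcePicture addTarget:desaturate];
// The desaturated image is the first blend input...
[desaturate addTarget:dodge atTextureLocation:0];
// ...and also feeds the invert -> blur branch that becomes the second input.
[desaturate addTarget:invert];
[invert addTarget:sketchBlur];
[sketchBlur addTarget:dodge atTextureLocation:1];

[dodge useNextFrameForImageCapture];
[sourcePicture processImage];
UIImage *sketch = [dodge imageFromCurrentFramebuffer];

This is the same branching pattern GPUImage itself uses inside GPUImageUnsharpMaskFilter (the source feeds a blend both directly and through a blur), so a custom shader isn't strictly required just to get rid of the intermediate UIImages.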

Related

Applying GPUImage filters with a UISlider

I'm using GPUImage filters to create a simple image brightness, contrast and saturation adjustment tool. The view has one image view and three sliders, one for each filter. At first I thought I would apply each filter as the value of each slider changed, but then I realized that the filters had to be chained, so I wrote the code below. The problem with this code is that only the brightness filter is applied to the image.
- (IBAction)brightnessSlider:(UISlider *)sender {
brightnessValue = sender.value;
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
[brightnessFilter setBrightness:brightnessValue];
[contrastFilter setContrast:contrastValue];
[saturationFilter setSaturation:saturationValue];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[brightnessFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [brightnessFilter imageFromCurrentFramebuffer];
imageView.image = currentFilteredVideoFrame;
}
- (IBAction)contrastSlider:(UISlider *)sender
{
contrastValue = sender.value;
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
[brightnessFilter setBrightness:brightnessValue];
[contrastFilter setContrast:contrastValue];
[saturationFilter setSaturation:saturationValue];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[brightnessFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [brightnessFilter imageFromCurrentFramebuffer];
imageView.image = currentFilteredVideoFrame;
}
- (IBAction)saturationSlider:(UISlider *)sender
{
saturationValue = sender.value;
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
[brightnessFilter setBrightness:brightnessValue];
[contrastFilter setContrast:contrastValue];
[saturationFilter setSaturation:saturationValue];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[brightnessFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [brightnessFilter imageFromCurrentFramebuffer];
imageView.image = currentFilteredVideoFrame;
}
Now I decided to implement the whole thing in a much simpler way, but I am still getting nothing:
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
//Set Brightness to 60
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
[brightnessFilter setBrightness:60.0];
//Set Contrast to 12
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
[contrastFilter setContrast:12];
[brightnessFilter addTarget:contrastFilter];
[stillImageSource addTarget:contrastFilter];
[contrastFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *outputImage1 = [contrastFilter imageFromCurrentFramebuffer];
imageView.image = outputImage1;
You don't need to repeat the addTarget setup on every slider change. Set up all the filters once, where you eventually load the image, and only change the value of the relevant filter in each slider callback. This will make your code much easier to maintain (and would have made this question shorter).
Now to the actual problem (see also the related GPUImage GitHub issue). You have chained the filters in the following order:
Still Image > Brightness > Contrast > Saturation
but you are capturing the output from the Brightness filter. You need to get it through the Saturation filter, so your chain needs to be:
Still Image > Brightness > Contrast > Saturation > Output Image
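A minimal sketch of the corrected capture, assuming the three filters are created once (for example as ivars) and the slider callbacks only update the corresponding value:

[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];

// Capture from the LAST filter in the chain, and request the capture
// before the image is processed.
[saturationFilter useNextFrameForImageCapture];
[stillImageSource processImage];
imageView.image = [saturationFilter imageFromCurrentFramebuffer];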

Chaining GPUImage filters only applies the last filter

I am trying to chain two filters (contrast and brightness) and apply them to an image, but only the contrast filter is applied.
Here is the code:
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
//Set Brightness to 60
GPUImageBrightnessFilter *brightnessFilter = [GPUImageBrightnessFilter new];
[brightnessFilter setBrightness:0.5];
//Set Contrast to 12
GPUImageContrastFilter *contrastFilter = [GPUImageContrastFilter new];
[contrastFilter setContrast:1.0];
[contrastFilter addTarget:brightnessFilter];
[stillImageSource addTarget:contrastFilter];
[contrastFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *outputImage1 = [contrastFilter imageFromCurrentFramebuffer];
imageView.image = outputImage1;
I hope this solves your problem.
GPUImageView *imageView = [[GPUImageView alloc] initWithFrame:CGRectMake(x, y, width, height)];
[self.view addSubview:imageView];
UIImage *inputImage = [UIImage imageNamed:@"2.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
//Set brightness to 0.5
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc]init];
[brightnessFilter setBrightness:0.5];
//Set contrast to 1.0
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc]init];
[contrastFilter setContrast:1.0];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:imageView];
[contrastFilter useNextFrameForImageCapture]; // request the capture before processing
[stillImageSource processImage];
// The result is rendered straight into the GPUImageView by the addTarget: chain
// above; imageFromCurrentFramebuffer is only needed if you also want a UIImage copy.
UIImage *outputImage1 = [contrastFilter imageFromCurrentFramebuffer];
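Two details matter in the snippet above: the output has to come from (or be rendered by) the last filter in the chain, here contrastFilter, and -useNextFrameForImageCapture has to be called before -processImage, otherwise the framebuffer you want to read back may already have been returned to GPUImage's framebuffer cache.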

How to add multiple filters to UIImage GPUImage?

I noticed a lot of people asking questions about chaining filters with GPUImage, and I couldn't quite figure out how to do it succinctly. I finally got it working tonight, so I just wanted to share my code so people can link to the solution.
UIImage *faceImage = [UIImage imageNamed:@"469453586_640.jpg"];
UIImageView *face = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, faceImage.size.width/2.0, faceImage.size.height/2.0)];
[face setImage:faceImage];
[self.view addSubview:face];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:faceImage];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
[brightnessFilter setBrightness:.15];
GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
GPUImagePosterizeFilter *posterizeFilter = [[GPUImagePosterizeFilter alloc] init];
[posterizeFilter setColorLevels:1];
[stillImageSource addTarget:brightnessFilter];
[brightnessFilter addTarget:grayscaleFilter];
[grayscaleFilter addTarget:posterizeFilter];
// these need to be changed if you change the order of your filters
// [brightnessFilter useNextFrameForImageCapture];
// [grayscaleFilter useNextFrameForImageCapture];
[posterizeFilter useNextFrameForImageCapture];
[stillImageSource processImage];
[face setImage: [posterizeFilter imageFromCurrentFramebuffer]];

GPUImageGaussianBlurFilter doesn't seem to work with layered filters

I have layered filters that all look great with the images I am using, but if I change the Gaussian blur parameters, either higher or lower, there is no visible difference in the blurring effect. What am I doing wrong?
Here is my code :
GPUImageView *finalView;
- (void)viewDidLoad
{
[super viewDidLoad];
UIImage *topLayer = [UIImage imageNamed:@"Glass.png"];
UIImage *baseLayer = [UIImage imageNamed:@"BasePhoto.png"];
GPUImagePicture *stillImageSourceTop = [[GPUImagePicture alloc] initWithImage:topLayer];
GPUImagePicture *stillImageSourceBottom = [[GPUImagePicture alloc] initWithImage:baseLayer];
GPUImageScreenBlendFilter *screenBlendFilter = [[GPUImageScreenBlendFilter alloc] init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
GPUImageColorMatrixFilter *colorMatrixFilter = [[GPUImageColorMatrixFilter alloc] init];
GPUImageOpacityFilter *opacityFilter = [[GPUImageOpacityFilter alloc] init];
opacityFilter.opacity = 0;
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc]init];
blurFilter.texelSpacingMultiplier = 4.0;
blurFilter.blurRadiusInPixels = 200.0;
blurFilter.blurPasses = 4;
[stillImageSourceTop addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:colorMatrixFilter];
[colorMatrixFilter addTarget:blurFilter];
[blurFilter addTarget:opacityFilter];
[stillImageSourceTop processImage];
[opacityFilter useNextFrameForImageCapture];
UIImage *topLayerImage = [opacityFilter imageFromCurrentFramebuffer];
GPUImagePicture *stillImageSourceTopWithFilters = [[GPUImagePicture alloc] initWithImage:topLayerImage];
[stillImageSourceBottom addTarget:screenBlendFilter];
[stillImageSourceTopWithFilters addTarget:screenBlendFilter];
[screenBlendFilter useNextFrameForImageCapture];
[stillImageSourceBottom processImage];
UIImage *mergedlayeredimage = [screenBlendFilter imageFromCurrentFramebuffer];
[finalImageView setImage:mergedlayeredimage];
}
Well, that's because most of the filters above don't actually do anything. The only filter you've wired up is the screenBlendFilter: you have your source images both going into it, and then you pull the one image out of it. You never actually use the blur for anything there, so of course it won't affect the output.
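For what it's worth, a minimal sketch of one way to wire the whole top-layer chain straight into the blend, reusing the names from the question (parameter values left aside, and assuming finalImageView is a UIImageView):

// Top layer: adjustments and blur feed the blend directly, no intermediate UIImage.
[stillImageSourceTop addTarget:brightnessFilter];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:colorMatrixFilter];
[colorMatrixFilter addTarget:blurFilter];
[blurFilter addTarget:opacityFilter];

// Base photo is the first blend input, the filtered top layer the second.
[stillImageSourceBottom addTarget:screenBlendFilter atTextureLocation:0];
[opacityFilter addTarget:screenBlendFilter atTextureLocation:1];

[screenBlendFilter useNextFrameForImageCapture];
[stillImageSourceBottom processImage];
[stillImageSourceTop processImage];

UIImage *mergedlayeredimage = [screenBlendFilter imageFromCurrentFramebuffer];
[finalImageView setImage:mergedlayeredimage];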

GPUImageHistogramFilter for a still image giving zero data

Very similar to this answer, except I want to generate a histogram for a still image.
Below is what I'm doing, and it's giving a histogram with all 0 data. Is there some trick to getting this working?
GPUImageFilter *filter = [[GPUImageHistogramFilter alloc] initWithHistogramType:kGPUImageHistogramRGB];
GPUImagePicture *original = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[original addTarget:gammaFilter];
[gammaFilter addTarget:filter];
GPUImageHistogramGenerator *histogramGraph = [[GPUImageHistogramGenerator alloc] init];
[histogramGraph forceProcessingAtSize:CGSizeMake(256.0, 330.0)];
[filter addTarget:histogramGraph];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 0.75;
[blendFilter forceProcessingAtSize:CGSizeMake(256.0, 330.0)];
[original addTarget:blendFilter];
[histogramGraph addTarget:blendFilter];
[blendFilter addTarget:gpuImageView];
[original processImage];
Brad has changed some of GPUImage's internals to improve memory management (and it does improve it) in recent releases; you now need to tell the filter to keep the frame for still images with -useNextFrameForImageCapture:
UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:stillImageFilter];
[stillImageFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];
