GPUImage crashing in iOS 8

I have implemented a filter tool mechanism that offers many filters. Each of these is built from 2-3 underlying filters, i.e. I am using GPUImageFilterGroup. When I updated the GPUImage library for iOS 8 compatibility, it reports "Instance method prepareForImageCapture not found" and the app crashes.
I also tried the following code:
GPUImageFilterGroup *filter = [[GPUImageFilterGroup alloc] init];
GPUImageRGBFilter *stillImageFilter1 = [[GPUImageRGBFilter alloc] init];
// [stillImageFilter1 prepareForImageCapture];
stillImageFilter1.red = 0.2;
stillImageFilter1.green = 0.8;
[stillImageFilter1 useNextFrameForImageCapture];
[(GPUImageFilterGroup *)filter addFilter:stillImageFilter1];
GPUImageVignetteFilter *stillImageFilter2 = [[GPUImageVignetteFilter alloc] init];
// [stillImageFilter1 prepareForImageCapture];
stillImageFilter2.vignetteStart = 0.32;
[stillImageFilter1 useNextFrameForImageCapture];
[(GPUImageFilterGroup *)filter addFilter:stillImageFilter2];
[stillImageFilter1 addTarget:stillImageFilter2];
[(GPUImageFilterGroup *)filter setInitialFilters:[NSArray arrayWithObject:stillImageFilter1]];
[(GPUImageFilterGroup *)filter setTerminalFilter:stillImageFilter2];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];
[stillImageSource addTarget:(GPUImageFilterGroup *)filter];
[stillImageSource processImage];
UIImage *img = [(GPUImageFilterGroup *)filter imageFromCurrentFramebuffer];
It's returning a nil image. Can anyone tell me the correct way?
Thanks in advance.

First, that wasn't a crash for iOS 8. You haven't updated your copy of GPUImage in a while, and that method was removed months ago in an update unrelated to any iOS compatibility. The reasons for this are explained here and I'll once again quote the relevant paragraph:
This does add one slight wrinkle to the interface, though, and I've changed some method names to make this clear to anyone updating their code. Because framebuffers are now transient, if you want to capture an image from one of them, you have to tag it before processing. You do this by using the -useNextFrameForImageCapture method on the filter to indicate that the next time an image is passed down the filter chain, you're going to want to hold on to that framebuffer for a little longer to grab an image out of it. -imageByFilteringImage: automatically does this for you now, and I've added another convenience method in -processImageUpToFilter:withCompletionHandler: to do this in an asynchronous manner.
As you can see, -prepareForImageCapture was removed because it was useless in the new caching system.
The reason why your updated code is returning nil is that you've called -useNextFrameForImageCapture on the wrong filter. It needs to be called on your terminal filter in the group (stillImageFilter2) and only needs to be called once, right before you call -processImage. That signifies that this particular framebuffer needs to hang around long enough to have an image captured from it.
You honestly don't need a GPUImageFilterGroup in the above, as it only complicates your filter chaining.
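A minimal sketch of that chain without the group, reusing the names from the question (untested, but following the rules above: tag the terminal filter once, right before processing):

```objectivec
GPUImageRGBFilter *stillImageFilter1 = [[GPUImageRGBFilter alloc] init];
stillImageFilter1.red = 0.2;
stillImageFilter1.green = 0.8;

GPUImageVignetteFilter *stillImageFilter2 = [[GPUImageVignetteFilter alloc] init];
stillImageFilter2.vignetteStart = 0.32;

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];
[stillImageSource addTarget:stillImageFilter1];
[stillImageFilter1 addTarget:stillImageFilter2];

// Tag the terminal filter so its framebuffer survives long enough to capture.
[stillImageFilter2 useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *img = [stillImageFilter2 imageFromCurrentFramebuffer];
```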

Related

How to use ChromaKey and Sepia filter with GPUImage at the same time?

I'm using Brad Larson's GPUImage framework for the first time.
I don't know if it's possible, but I would like to use GPUImageChromaKeyFilter and GPUImageSepiaFilter together. I can use them separately, but at the same time it doesn't work.
The sepia tone works, but the chroma key doesn't seem to.
EDIT 2: WORKING
Here is my code:
- (void)setupCameraAndFilters:(AVCaptureDevicePosition)cameraPostion {
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:cameraPostion];
videoCamera.outputImageOrientation = UIInterfaceOrientationLandscapeRight;
// ChromaKey
chromaKeyFilter = [[GPUImageChromaKeyBlendFilter alloc] init];
[(GPUImageChromaKeyBlendFilter *)chromaKeyFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0];
[videoCamera addTarget:chromaKeyFilter];
// Input image (replace the green background)
UIImage *inputImage;
inputImage = [UIImage imageNamed:@"chromaBackground.jpg"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture processImage];
[sourcePicture addTarget:chromaKeyFilter];
// Sepia filter
sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[chromaKeyFilter addTarget:sepiaFilter];
[sepiaFilter addTarget:self.filteredVideoView];
[videoCamera startCameraCapture];
}
Your problem is that the above code doesn't really make sense. In the first example, you have your still image going into the single-input GPUImageChromaKeyFilter, then you try to target both that source image and your video feed to the single-input GPUImageSepiaFilter. One of those two inputs will be overridden by the other.
GPUImageFilterGroups are merely convenience classes for grouping sequences of filters together in an easy-to-reuse package, and won't solve anything here.
If you're trying to blend video with a chroma-keyed image, you need to use a GPUImageChromaKeyBlendFilter, which takes two inputs and blends them together based on the keying. You can then send that single output image to the sepia tone filter, or however you want to sequence that.
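As a hedged sketch, that chain might look like the following, reusing variable names from the posted code (videoCamera, sourcePicture, and filteredVideoView are assumed to exist as in the question):

```objectivec
// Two-input blend: foreground (the feed with the green to replace) is added
// first, the background picture second.
GPUImageChromaKeyBlendFilter *chromaKeyBlend = [[GPUImageChromaKeyBlendFilter alloc] init];
[chromaKeyBlend setColorToReplaceRed:0.0 green:1.0 blue:0.0];

[videoCamera addTarget:chromaKeyBlend];

sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage
                                    smoothlyScaleOutput:YES];
[sourcePicture addTarget:chromaKeyBlend];
[sourcePicture processImage];

// The blend produces a single output, which can then be sequenced as usual.
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[chromaKeyBlend addTarget:sepiaFilter];
[sepiaFilter addTarget:self.filteredVideoView];

[videoCamera startCameraCapture];
```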
You have to use a GPUImageFilterGroup to accomplish what you want. You can find how to achieve this in the GPUImage examples. Good luck!

GPUImage blend filters

I'm trying to apply a blend filters to 2 images.
I've recently updated GPUImage to the latest version.
To make things simple I've modified the example SimpleImageFilter.
Here is the code:
UIImage *image1 = [UIImage imageNamed:@"PGSImage_0000.jpg"];
UIImage *image2 = [UIImage imageNamed:@"PGSImage_0001.jpg"];
twoinputFilter = [[GPUImageColorBurnBlendFilter alloc] init];
sourcePicture1 = [[GPUImagePicture alloc] initWithImage:image1];
sourcePicture2 = [[GPUImagePicture alloc] initWithImage:image2];
[sourcePicture1 addTarget:twoinputFilter];
[sourcePicture1 processImage];
[sourcePicture2 addTarget:twoinputFilter];
[sourcePicture2 processImage];
UIImage * image = [twoinputFilter imageFromCurrentFramebuffer];
The returned image is nil. Setting some breakpoints, I can see that the filter fails inside the method -(CGImageRef)newCGImageFromCurrentlyProcessedOutput; the problem is that framebufferForOutput is nil. I'm using the simulator.
I don't get why it isn't working.
It seems that I was missing this command, as written in the documentation for still image processing:
Note that for a manual capture of an image from a filter, you need to set -useNextFrameForImageCapture in order to tell the filter that you'll be needing to capture from it later. By default, GPUImage reuses framebuffers within filters to conserve memory, so if you need to hold on to a filter's framebuffer for manual image capture, you need to let it know ahead of time.
[twoinputFilter useNextFrameForImageCapture];
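Folded into the code above, the capture sequence would look like this (a sketch; the key point is tagging the filter before triggering processing):

```objectivec
[sourcePicture1 addTarget:twoinputFilter];
[sourcePicture2 addTarget:twoinputFilter];

// Tag the blend filter first, then process both inputs.
[twoinputFilter useNextFrameForImageCapture];
[sourcePicture1 processImage];
[sourcePicture2 processImage];

UIImage *image = [twoinputFilter imageFromCurrentFramebuffer];
```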

Multiple Filters using GPUImage Library

I am trying to apply 3 filters to an image.
One RGB filter whose values are constant, plus a brightness filter and a saturation filter, both of which should be modifiable, with the image updating as they change.
I have followed the advice here.
I have set up a UIView using IB and set its class to GPUImageView. For some reason the image doesn't show.
My steps are as follows:
self.gpuImagePicture = [[GPUImagePicture alloc] initWithImage:image];
[self.gpuImagePicture addTarget:self.brightnessFilter];
[self.brightnessFilter addTarget:self.contrastFilter];
[self.contrastFilter addTarget:self.imageView];
and then I call this, which sets the constant values on the RGB filter:
[self setRGBFilterValues];
I setup my filters before this using:
- (void) setupFilters
{
self.brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
self.contrastFilter = [[GPUImageContrastFilter alloc] init];
self.rgbFilter = [[GPUImageRGBFilter alloc] init];
}
Am I missing a step, or why is the image displaying nothing?
You're missing one step. You need to call -processImage on your GPUImagePicture instance to get it to propagate through the filter chain.
You also need to call this anytime you change values within your filter chain and wish to update the final output.
For my first time using this GPUImage library, it took me way too long to figure out how to simply apply multiple filters to a single image. The link provided by the OP does help explain why the API is relatively complex (one reason: you must specify the order in which the filters are applied).
For future reference, here's my code to apply two filters:
UIImage *initialImage = ...
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:initialImage];
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
saturationFilter.saturation = 0.5;
[stillImageSource addTarget:saturationFilter];
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 10;
[saturationFilter addTarget:blurFilter];
GPUImageFilter *lastFilter = blurFilter;
[lastFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *processedImage = [lastFilter imageFromCurrentFramebuffer];

GPUImage GPUImageChromaKeyBlendFilter With Still Image

I would like to create a GPUImageView to display a filter in real time (as opposed to repeatedly reading imageFromCurrentlyProcessedOutput).
Is it possible to use GPUImage's GPUImageChromaKeyBlendFilter with a still source image automatically updating a GPUImageView?
Here is my code reading this into a UIImage:
UIImage *inputImage = [UIImage imageNamed:@"1.JPG"];
UIImage *backgroundImage = [UIImage imageNamed:@"2.JPG"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageChromaKeyBlendFilter *stillImageFilter = [[GPUImageChromaKeyBlendFilter alloc] init];
[stillImageFilter setThresholdSensitivity:0.5];
[stillImageFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0];
[stillImageSource addTarget:stillImageFilter];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [stillImageFilter imageByFilteringImage: backgroundImage ];
Everything I have tried so far requires you to add the backgroundImage as a target to the filter (as you would if you were using the StillCamera). If you add the backgroundImage as a target, GPUImage just uses this new image as its base image.
Can anyone help?
Thanks,
Don't use -imageByFilteringImage: with a two-input filter, like blends. It's a convenience method to quickly set up a small filter chain based on a UIImage and grab a UIImage out. You're not going to want it for something targeting a GPUImageView, anyway.
For the chroma key blend, you'll need to target your input image (the one with the color to be replaced) and background image to the blend, in that order using -addTarget, with GPUImagePicture instances for both. You then target your blend to the GPUImageView.
One note: you'll need to maintain strong references to your GPUImagePicture instances past the setup method if you want to keep updating the filter after that point, so you may need to make them instance variables on your controller class.
Once you've set things up in this way, the result will go to your GPUImageView. Every time you call -processImage on one of the two images, the display in your GPUImageView will be updated. Therefore, you can call that after every change in filter settings, like if you had a slider to update filter values, and the image will be updated in realtime.
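A minimal sketch of that setup, assuming a GPUImageView outlet named gpuImageView and instance variables for the pictures and filter (names are placeholders, not from the original code):

```objectivec
// Instance variables on the controller, so the pictures outlive this method:
// GPUImagePicture *foregroundPicture, *backgroundPicture;
// GPUImageChromaKeyBlendFilter *blendFilter;

foregroundPicture = [[GPUImagePicture alloc] initWithImage:inputImage];
backgroundPicture = [[GPUImagePicture alloc] initWithImage:backgroundImage];

blendFilter = [[GPUImageChromaKeyBlendFilter alloc] init];
[blendFilter setThresholdSensitivity:0.5];
[blendFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0];

// Order matters: the image with the color to replace first, background second.
[foregroundPicture addTarget:blendFilter];
[backgroundPicture addTarget:blendFilter];
[blendFilter addTarget:self.gpuImageView];

// Each -processImage call re-renders into the view; call again after
// changing any filter setting (e.g. from a slider) to update the display.
[foregroundPicture processImage];
[backgroundPicture processImage];
```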

GPUImageAlphaBlendFilter realtime processing from GPUImageStillCamera source

I am using the GPUImage library and I'm trying to blend two images in realtime, and display them on a GPUImageView. I am trying to alpha-blend plain camera input, with a filtered version of it. Here is what I'm trying to do:
--camera--+------------------------------v
          |                         alpha blend ----> image view
          +-----> color filter ----------^
I've found some posts about using the blend filters, but they don't seem to cover realtime processing. I've found https://github.com/BradLarson/GPUImage/issues/319, GPUImage: blending two images, and https://github.com/BradLarson/GPUImage/issues/751, but they either aren't about realtime processing (the first and the second) or don't work (the third).
I've tried almost everything, but all I'm getting is a white image in the GPUImageView. If I don't use the alpha blend filter, say, just use a false color filter or something similar, it works perfectly. Here is my code:
blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 0.5;
[blendFilter prepareForImageCapture];
[blendFilter addTarget:imageView];
passThrough = [[GPUImageFilter alloc] init];
[passThrough prepareForImageCapture];
[passThrough addTarget:blendFilter];
selectedFilter = [[GPUImageFalseColorFilter alloc] init];
[selectedFilter prepareForImageCapture];
[selectedFilter addTarget:blendFilter];
stillCamera = [[GPUImageStillCamera alloc] init];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
[stillCamera addTarget:passThrough];
[stillCamera addTarget:selectedFilter];
[stillCamera startCameraCapture];
All I'm getting is a white, blank screen. If I change [selectedFilter addTarget:blendFilter]; to [selectedFilter addTarget:imageView]; then false color filter gets displayed on the image.
There seems to be something wrong with the alpha blend filter. I've read in some posts that I need to call processImage on the inputs, but those posts are all about non-realtime inputs, as far as I understand. How can I get GPUImageAlphaBlendFilter to work in realtime?
OK, after investigating the issue further on the internet and on the project's issue list (https://github.com/BradLarson/GPUImage/issues), I found a workaround. When setting the blend filter as the target, I needed to specify the texture index explicitly. For some reason (probably a bug), adding the blend filter as a target twice doesn't add the second texture correctly at the next index. Setting the texture indices explicitly as 0 and 1 did work:
[passThrough addTarget:blendFilter atTextureLocation:0];
[selectedFilter addTarget:blendFilter atTextureLocation:1];
For filters that are targets of a single source, addTarget: is enough, though, as in [stillCamera addTarget:selectedFilter];.
