Rotation With Animation in GPUImage - iOS

I am using GPUImage.
I want to rotate an image (by 90, 180, 270, or 360 degrees) with animation using a GPUImageFilter.
Please help. Thanks in advance.
Here is what I did for the rotation so far (int tag is declared in the .h file):
-(IBAction)btnRotate_Clicked:(id)sender
{
    [self hideFilterView];
    [self hideFontView];
    [self hideEnhanceView];

    if (tag == 0)
    {
        staticPictureOriginalOrientation = UIImageOrientationRight;
        tag = 1;
    }
    else if (tag == 1)
    {
        staticPictureOriginalOrientation = UIImageOrientationDown;
        tag = 2;
    }
    else if (tag == 2)
    {
        staticPictureOriginalOrientation = UIImageOrientationLeft;
        tag = 3;
    }
    else
    {
        staticPictureOriginalOrientation = UIImageOrientationUp;
        tag = 0;
    }

    UIImageOrientation orientation = staticPictureOriginalOrientation;
    switch (orientation) {
        case UIImageOrientationUp:
        case UIImageOrientationDown:
            [self.MimageView setFillMode:kGPUImageFillModePreserveAspectRatioAndFill];
            break;
        case UIImageOrientationLeft:
        case UIImageOrientationRight:
            [self.MimageView setFillMode:kGPUImageFillModePreserveAspectRatio];
            break;
        default:
            break;
    }
    [self prepareStaticFilter];
}
-(void)prepareStaticFilter
{
    isImageProcessed = TRUE;
    [staticPicture addTarget:filter];

    if (hasBlur)
    {
        [filter addTarget:blurFilter];
        [blurFilter addTarget:self.MimageView];
    }
    else
    {
        [filter addTarget:self.MimageView];
    }

    GPUImageRotationMode imageViewRotationMode = kGPUImageNoRotation;
    switch (staticPictureOriginalOrientation)
    {
        case UIImageOrientationLeft:
            imageViewRotationMode = kGPUImageRotateLeft;
            break;
        case UIImageOrientationRight:
            imageViewRotationMode = kGPUImageRotateRight;
            break;
        case UIImageOrientationDown:
            imageViewRotationMode = kGPUImageRotate180;
            break;
        default:
            imageViewRotationMode = kGPUImageNoRotation;
            break;
    }
    [filter setInputRotation:imageViewRotationMode atIndex:0];
    [staticPicture processImage];
}
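
For the animation part, one approach (a sketch on my part, not from the thread): since GPUImageView is a UIView, you can animate its transform with a plain UIView animation for visual feedback, then let setInputRotation: do the actual pixel rotation once the animation completes. This assumes the MimageView outlet and the prepareStaticFilter method shown above.

- (void)rotateAnimated
{
    [UIView animateWithDuration:0.3 animations:^{
        // Spin the view itself by 90 degrees as a visual transition.
        self.MimageView.transform = CGAffineTransformRotate(self.MimageView.transform, M_PI_2);
    } completion:^(BOOL finished) {
        // Reset the view transform and let GPUImage render the rotated texture.
        self.MimageView.transform = CGAffineTransformIdentity;
        [self prepareStaticFilter];
    }];
}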

Related

iOS - CIImage autoAdjustmentFilters leak

I'm using the auto-enhance feature to improve a lot of images, but it seems there is a big leak. Here's the code I'm using:
#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>

@implementation UIImage (AutoEnhanced)

- (UIImage *)autoEnhancedImage
{
    @autoreleasepool {
        CIImage *ciOriginal = self.CIImage;
        if (!ciOriginal) {
            ciOriginal = [[CIImage alloc] initWithCGImage:self.CGImage];
        }
        NSDictionary *options = @{ CIDetectorImageOrientation : @(self.CGImagePropertyOrientation) };
        NSArray *adjustments = [ciOriginal autoAdjustmentFiltersWithOptions:options];
        for (CIFilter *filter in adjustments) {
            [filter setValue:ciOriginal forKey:kCIInputImageKey];
            ciOriginal = filter.outputImage;
        }
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:ciOriginal fromRect:ciOriginal.extent];
        UIImage *enhancedImage = [[UIImage alloc] initWithCGImage:cgImage];
        CGImageRelease(cgImage);
        return enhancedImage;
    }
}
- (CGImagePropertyOrientation)CGImagePropertyOrientation
{
    switch (self.imageOrientation) {
        case UIImageOrientationUp:            return kCGImagePropertyOrientationUp;
        case UIImageOrientationUpMirrored:    return kCGImagePropertyOrientationUpMirrored;
        case UIImageOrientationDown:          return kCGImagePropertyOrientationDown;
        case UIImageOrientationDownMirrored:  return kCGImagePropertyOrientationDownMirrored;
        case UIImageOrientationLeftMirrored:  return kCGImagePropertyOrientationLeftMirrored;
        case UIImageOrientationRight:         return kCGImagePropertyOrientationRight;
        case UIImageOrientationRightMirrored: return kCGImagePropertyOrientationRightMirrored;
        case UIImageOrientationLeft:          return kCGImagePropertyOrientationLeft;
    }
}

@end
And here's the log from Instruments (screenshot not included). Even with an autorelease pool, I can't fix it. Any idea would be very much appreciated! Thanks!
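
One thing worth checking (an assumption on my part, not a confirmed fix for this report): creating a CIContext is expensive and it caches GPU resources, so allocating a fresh one per image inside autoEnhancedImage can show up as ever-growing memory in Instruments. Reusing a single context across calls is a common mitigation:

// Sketch: share one CIContext instead of creating one per image.
+ (CIContext *)sharedEnhancementContext
{
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}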

Camera is not showing save/cancel button after capturing?

I am using UIImagePickerController to open the camera in my app, and it is not showing the save and cancel buttons along with the capture button. I am implementing this in Objective-C.
- (void)viewDidLoad {
    [super viewDidLoad];
    pickernew = [[UIImagePickerController alloc] init];
    pickernew.sourceType = UIImagePickerControllerSourceTypeCamera;
    pickernew.delegate = self;
    pickernew.allowsEditing = YES;
    [self.view addSubview:pickernew.view];
    deviceOrientationCameraLoad = [UIApplication sharedApplication].statusBarOrientation;
    [pickernew setShowsCameraControls:YES];
}
-(void)cameraRotetion
{
    if (IS_OS_8_OR_LATER) {
        switch (deviceOrientationCameraLoad) {
            case UIInterfaceOrientationLandscapeLeft:
                [self dipslandscapLeft];
                break;
            case UIInterfaceOrientationLandscapeRight:
                [self dipslandscapRight];
                break;
            case UIInterfaceOrientationPortraitUpsideDown:
                [self dipsPortaitdown];
                break;
            case UIInterfaceOrientationPortrait:
                [self dipsPortait];
                break;
            case UIInterfaceOrientationUnknown:
                [self dipsPortait];
                break;
            default:
                break;
        }
    }
}
-(void)dipsPortait
{
    CGFloat scaleFactor = 1.3f;
    UIInterfaceOrientation deviceOrientationCamera = [UIApplication sharedApplication].statusBarOrientation;
    NSLog(@"before %ld", (long)deviceOrientationCamera);
    switch (deviceOrientationCamera) {
        case UIInterfaceOrientationLandscapeLeft:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * 90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationLandscapeRight:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -180 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortrait:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationUnknown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        default:
            break;
    }
}
-(void)dipsPortaitdown
{
    CGFloat scaleFactor = 1.3f;
    UIInterfaceOrientation deviceOrientationCamera = [UIApplication sharedApplication].statusBarOrientation;
    NSLog(@"before %ld", (long)deviceOrientationCamera);
    switch (deviceOrientationCamera) {
        case UIInterfaceOrientationLandscapeLeft:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationLandscapeRight:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * 90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortrait:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -180 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationUnknown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        default:
            break;
    }
}
-(void)dipslandscapLeft
{
    CGFloat scaleFactor = 1.3f;
    UIInterfaceOrientation deviceOrientationCamera = [UIApplication sharedApplication].statusBarOrientation;
    NSLog(@"before %ld", (long)deviceOrientationCamera);
    switch (deviceOrientationCamera) {
        case UIInterfaceOrientationLandscapeLeft:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationLandscapeRight:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -180 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * 90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortrait:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationUnknown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        default:
            break;
    }
}
-(void)dipslandscapRight
{
    CGFloat scaleFactor = 1.3f;
    UIInterfaceOrientation deviceOrientationCamera = [UIApplication sharedApplication].statusBarOrientation;
    NSLog(@"before %ld", (long)deviceOrientationCamera);
    switch (deviceOrientationCamera) {
        case UIInterfaceOrientationLandscapeLeft:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -180 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationLandscapeRight:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * -90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationPortrait:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI * 90 / 180.0), scaleFactor, scaleFactor);
            break;
        case UIInterfaceOrientationUnknown:
            pickernew.cameraViewTransform = CGAffineTransformScale(CGAffineTransformMakeRotation(0), scaleFactor, scaleFactor);
            break;
        default:
            break;
    }
}
-(void)objectCurrent1:(UIPopoverController *)pop currentTag:(int)tagCurr lineId:(NSString *)lineId headerId:(NSString *)headerId userDbId:(int)userDBId {
    popup = pop;
    tagC = tagCurr;
    header_Id = headerId;
    line_Id = lineId;
    User_Db_Id = userDBId;
}
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
- (void)viewWillAppear:(BOOL)animated
{
    if (IS_OS_8_OR_LATER)
    {
        if ([[UIDevice currentDevice] orientation] == UIDeviceOrientationFaceUp)
        {
            if ([UIApplication sharedApplication].statusBarOrientation == UIInterfaceOrientationLandscapeLeft)
            {
                [[UIDevice currentDevice] setValue:[NSNumber numberWithInteger:UIDeviceOrientationLandscapeRight] forKey:@"orientation"];
            }
            else
            {
                [[UIDevice currentDevice] setValue:[NSNumber numberWithInteger:UIDeviceOrientationLandscapeLeft] forKey:@"orientation"];
            }
        }
    }
    deviceOrientation = [[UIApplication sharedApplication] statusBarOrientation];
    if (deviceOrientation == UIInterfaceOrientationPortrait || deviceOrientation == UIInterfaceOrientationPortraitUpsideDown) {
        [self portraitView];
    }
    if (deviceOrientation == UIInterfaceOrientationLandscapeRight || deviceOrientation == UIInterfaceOrientationLandscapeLeft) {
        [self landscapeView];
    }
    //[self cameraRotetion];
    [super viewWillAppear:animated];
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo
{
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *imgTaken;
    NSString *docsDir, *imgNewPath;
    NSData *imgData;
    docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    imgTaken = info[UIImagePickerControllerOriginalImage];
    imgNewPath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"image%@%@%d.png", line_Id, header_Id, User_Db_Id]];
    UIImage *image1 = [self resizeImage:imgTaken];
    //imgData = UIImageJPEGRepresentation(imgTaken, 0.25);
    imgData = UIImagePNGRepresentation(image1);
    [imgData writeToFile:imgNewPath atomically:YES];
    [popup dismissPopoverAnimated:YES];
    pickernew = nil;
}
-(UIImage *)resizeImage:(UIImage *)theImage {
    CGSize theNewSize = {400, 400}; // {width, height}
    UIGraphicsBeginImageContextWithOptions(theNewSize, NO, 1.0);
    [theImage drawInRect:CGRectMake(0, 0, theNewSize.width, theNewSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    // [picker dismissViewControllerAnimated:YES completion:nil];
    // [pickernew dismissViewControllerAnimated:YES completion:nil];
    // [picker removeFromParentViewController];
    // [pickernew removeFromParentViewController];
    [popup dismissPopoverAnimated:YES];
    //[pickernew removeFromSubView];
    pickernew = nil;
}
- (void)cancelButtonPressed:(id)sender {
    // [self.captureSession stopRunning]; // stop the capture session
    [self.presentingViewController dismissViewControllerAnimated:YES completion:nil]; // dismiss the current view controller
}
- (void)viewDidAppear:(BOOL)animated
{
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(onOrientationChange:) name:UIDeviceOrientationDidChangeNotification object:nil];
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [super viewDidAppear:animated];
}
- (void)viewWillDisappear:(BOOL)animated
{
    [[NSNotificationCenter defaultCenter] removeObserver:self name:UIDeviceOrientationDidChangeNotification object:nil];
    [super viewWillDisappear:animated];
}
-(void)portraitView {
    if (pickernew) {
        [self cameraRotetion];
    }
}
-(void)landscapeView {
    if (pickernew) {
        [self cameraRotetion];
    }
}
The save/cancel buttons are part of allowsEditing; they only appear post-capture. Also, you should not add the picker's view with [self.view addSubview:], but rather present the controller:
[self presentViewController:pickernew animated:YES completion:nil];
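
For example (a sketch based on the snippet above):

// Present the picker modally instead of adding its view as a subview.
pickernew = [[UIImagePickerController alloc] init];
pickernew.sourceType = UIImagePickerControllerSourceTypeCamera;
pickernew.delegate = self;
pickernew.allowsEditing = YES; // the post-capture save/cancel UI comes from this
[self presentViewController:pickernew animated:YES completion:nil];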

How to increase Brightness of GPUImage?

I am using BradLarson's GPUImage library for filtering images. I am using this code in my project:
GPUImagePicture *staticPicture;
Filters are applied like this:
-(void)setFilter:(int)index {
    switch (index) {
        case 1: {
            filter = [[GPUImageContrastFilter alloc] init];
            [(GPUImageContrastFilter *)filter setContrast:1.75];
        } break;
        case 2: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"crossprocess"];
        } break;
        case 3: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"02"];
        } break;
        case 4: {
            filter = [[GrayscaleContrastFilter alloc] init];
        } break;
        case 5: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"17"];
        } break;
        case 6: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"aqua"];
        } break;
        case 7: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"yellow-red"];
        } break;
        case 8: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"06"];
        } break;
        case 9: {
            filter = [[GPUImageToneCurveFilter alloc] initWithACV:@"purple-green"];
        } break;
        default:
            filter = [[GPUImageFilter alloc] init];
            break;
    }
}
Now I want to increase the brightness of the GPUImage without changing the currently applied filter. How can I do this?
Include the methods below:
-(UIImage *)applyFilterForImage:(UIImage *)img value:(NSNumber *)filterValue
{
    switch (type) {
        case kPhotoBrightness:
            self.filterImage = [self applyBrightnessForImage:img value:filterValue];
            break;
        case kPhotoContrast:
            self.filterImage = [self applyContrastForImage:img value:filterValue];
            break;
        case kPhotoSaturation:
            self.filterImage = [self applySaturationForimage:img value:filterValue];
            break;
        case kPhotoHue:
            self.filterImage = [self applyHueForImage:img value:filterValue];
            break;
        case kPhotoSharpness:
            self.filterImage = [self applySharpnessForImage:img value:filterValue];
            break;
        default:
            break;
    }
    return self.filterImage;
}
-(UIImage *)applyBrightnessForImage:(UIImage *)img value:(NSNumber *)filterValue {
    GPUImageBrightnessFilter *brightnessFilter = (GPUImageBrightnessFilter *)self.filter;
    [brightnessFilter setBrightness:[filterValue floatValue]];
    return [self outputImageForFilter:brightnessFilter andImage:img];
}
// Output the filtered image.
-(UIImage *)outputImageForFilter:(GPUImageOutput<GPUImageInput> *)_filter andImage:(UIImage *)_image {
    GPUImagePicture *filteredImage = [[GPUImagePicture alloc] initWithImage:_image];
    [filteredImage addTarget:_filter];
    [filteredImage processImage];
    return [_filter imageFromCurrentlyProcessedOutputWithOrientation:_image.imageOrientation];
}
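
Alternatively, a sketch that keeps the currently selected filter and simply chains a GPUImageBrightnessFilter after it (staticPicture and filter come from the question; imageView is a stand-in for whatever GPUImageView you render into):

// Chain brightness on top of the existing filter instead of replacing it.
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
[brightnessFilter setBrightness:0.2]; // range -1.0 to 1.0; 0.0 leaves the image unchanged
[staticPicture removeAllTargets];
[staticPicture addTarget:filter];
[filter addTarget:brightnessFilter];
[brightnessFilter addTarget:imageView];
[staticPicture processImage];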

Calculator divide by zero

I created a simple calculator. Everything works great; however, if I divide by zero, I would like to show an error message. I know how to do alert popups, but I don't know how to implement it so it comes up when I divide by zero. Here is a snippet of my calculator code:
- (IBAction)buttonOperationPressed:(id)sender {
    if (currentOperation == 0) result = currentNumber;
    else {
        switch (currentOperation) {
            case 1:
                result = result + currentNumber;
                break;
            case 2:
                result = result - currentNumber;
                break;
            case 3:
                result = result * currentNumber;
                break;
            case 4:
                result = result / currentNumber;
                break;
            case 5:
                currentOperation = 0;
                break;
            default:
                break;
        }
    }
    currentNumber = 0;
    CalcDisplay.text = [NSString stringWithFormat:@"%g", result];
    if ([sender tag] == 0) result = 0;
    currentOperation = [sender tag];
    userInTheMiddleOfEnteringDecimal = NO;
}
You can just add a test prior to doing the division, e.g. change:
case 4:
    result = result / currentNumber;
    break;
to:
case 4:
    if (currentNumber == 0) {
        // ... do alert here ...
    } else {
        result = result / currentNumber;
    }
    break;
You have to check whether the second division operand is zero and, if so, print an error message. Don't forget that you can't just compare a double with ==; you have to use a precision threshold, like this:
case 4:
    if (ABS(currentNumber) < 1e-12) { // most probably it's zero
        // ... your message ...
        return;
    }
    result = result / currentNumber;
    break;
- (IBAction)buttonOperationPressed:(id)sender {
    if (currentOperation == 0) result = currentNumber;
    else {
        switch (currentOperation) {
            case 1:
                result = result + currentNumber;
                break;
            case 2:
                result = result - currentNumber;
                break;
            case 3:
                result = result * currentNumber;
                break;
            case 4:
                if (currentNumber == 0) {
                    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Title" message:@"Message" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
                    [alert show];
                } else {
                    result = result / currentNumber;
                }
                break;
            case 5:
                currentOperation = 0;
                break;
            default:
                break;
        }
    }
    currentNumber = 0;
    CalcDisplay.text = [NSString stringWithFormat:@"%g", result];
    if ([sender tag] == 0) result = 0;
    currentOperation = [sender tag];
    userInTheMiddleOfEnteringDecimal = NO;
}
Please try this code. I have copied and pasted the code you gave and added some necessary lines, which I feel should solve your issue.
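
As a side note (not part of the original answer): UIAlertView is deprecated as of iOS 8, so on newer targets the same alert would look like this, assuming the code runs inside a view controller:

UIAlertController *alert = [UIAlertController alertControllerWithTitle:@"Error"
                                                               message:@"Cannot divide by zero"
                                                        preferredStyle:UIAlertControllerStyleAlert];
[alert addAction:[UIAlertAction actionWithTitle:@"OK"
                                          style:UIAlertActionStyleDefault
                                        handler:nil]];
[self presentViewController:alert animated:YES completion:nil];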

I want to make a filter with more than two input textures in GPUImage, but I get a black output

I want to make a new filter like GPUImage's GPUImageTwoInputFilter.
Here is my code. The base class, named IFFourInputFilter, is modeled on GPUImageTwoInputFilter.
#import "IFFourInputFilter.h"
NSString *const kIFFourInputTextureVertexShaderString = SHADER_STRING
(
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
attribute vec4 inputTextureCoordinate2;
attribute vec4 inputTextureCoordinate3;
attribute vec4 inputTextureCoordinate4;
varying vec2 textureCoordinate;
varying vec2 textureCoordinate2;
varying vec2 textureCoordinate3;
varying vec2 textureCoordinate4;
void main()
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
textureCoordinate2 = inputTextureCoordinate2.xy;
textureCoordinate3 = inputTextureCoordinate3.xy;
textureCoordinate4 = inputTextureCoordinate4.xy;
}
);
@implementation IFFourInputFilter
#pragma mark -
#pragma mark Initialization and teardown
- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
{
    if (!(self = [self initWithVertexShaderFromString:kIFFourInputTextureVertexShaderString fragmentShaderFromString:fragmentShaderString]))
    {
        return nil;
    }
    return self;
}

- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
{
    if (!(self = [super initWithVertexShaderFromString:vertexShaderString fragmentShaderFromString:fragmentShaderString]))
    {
        return nil;
    }

    inputRotation2 = kGPUImageNoRotation;
    inputRotation3 = kGPUImageNoRotation;
    inputRotation4 = kGPUImageNoRotation;

    hasSetTexture1 = NO;
    hasSetTexture2 = NO;
    hasSetTexture3 = NO;

    hasReceivedFrame1 = NO;
    hasReceivedFrame2 = NO;
    hasReceivedFrame3 = NO;
    hasReceivedFrame4 = NO;

    frameWasVideo1 = NO;
    frameWasVideo2 = NO;
    frameWasVideo3 = NO;
    frameWasVideo4 = NO;

    frameCheckDisabled1 = NO;
    frameCheckDisabled2 = NO;
    frameCheckDisabled3 = NO;
    frameCheckDisabled4 = NO;

    frameTime1 = kCMTimeInvalid;
    frameTime2 = kCMTimeInvalid;
    frameTime3 = kCMTimeInvalid;
    frameTime4 = kCMTimeInvalid;

    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageOpenGLESContext useImageProcessingContext];

        filterTextureCoordinateAttribute2 = [filterProgram attributeIndex:@"inputTextureCoordinate2"];
        filterInputTextureUniform2 = [filterProgram uniformIndex:@"inputImageTexture2"]; // This assumes a name of "inputImageTexture2" for the second input texture in the fragment shader
        glEnableVertexAttribArray(filterTextureCoordinateAttribute2);

        filterTextureCoordinateAttribute3 = [filterProgram attributeIndex:@"inputTextureCoordinate3"];
        filterInputTextureUniform3 = [filterProgram uniformIndex:@"inputImageTexture3"]; // This assumes a name of "inputImageTexture3" for the third input texture in the fragment shader
        glEnableVertexAttribArray(filterTextureCoordinateAttribute3);

        filterTextureCoordinateAttribute4 = [filterProgram attributeIndex:@"inputTextureCoordinate4"];
        filterInputTextureUniform4 = [filterProgram uniformIndex:@"inputImageTexture4"]; // This assumes a name of "inputImageTexture4" for the fourth input texture in the fragment shader
        glEnableVertexAttribArray(filterTextureCoordinateAttribute4);
    });

    return self;
}
- (void)initializeAttributes;
{
    [super initializeAttributes];
    [filterProgram addAttribute:@"inputTextureCoordinate2"];
    [filterProgram addAttribute:@"inputTextureCoordinate3"];
    [filterProgram addAttribute:@"inputTextureCoordinate4"];
}

- (void)disableFrameCheck1;
{
    frameCheckDisabled1 = YES;
}

- (void)disableFrameCheck2;
{
    frameCheckDisabled2 = YES;
}

- (void)disableFrameCheck3;
{
    frameCheckDisabled3 = YES;
}

- (void)disableFrameCheck4;
{
    frameCheckDisabled4 = YES;
}
#pragma mark -
#pragma mark Rendering
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates sourceTexture:(GLuint)sourceTexture;
{
    if (self.preventRendering)
    {
        return;
    }

    [GPUImageOpenGLESContext setActiveShaderProgram:filterProgram];
    [self setUniformsForProgramAtIndex:0];
    [self setFilterFBO];

    glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
    glClear(GL_COLOR_BUFFER_BIT);

    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, sourceTexture);
    glUniform1i(filterInputTextureUniform, 2);

    glActiveTexture(GL_TEXTURE3);
    glBindTexture(GL_TEXTURE_2D, filterSourceTexture2);
    glUniform1i(filterInputTextureUniform2, 3);

    glActiveTexture(GL_TEXTURE4);
    glBindTexture(GL_TEXTURE_2D, filterSourceTexture3);
    glUniform1i(filterInputTextureUniform3, 4);

    glActiveTexture(GL_TEXTURE5);
    glBindTexture(GL_TEXTURE_2D, filterSourceTexture4);
    glUniform1i(filterInputTextureUniform4, 5);

    glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
    glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
    glVertexAttribPointer(filterTextureCoordinateAttribute2, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
    glVertexAttribPointer(filterTextureCoordinateAttribute3, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation3]);
    glVertexAttribPointer(filterTextureCoordinateAttribute4, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation4]);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
- (void)releaseInputTexturesIfNeeded;
{
    if (shouldConserveMemoryForNextFrame)
    {
        [firstTextureDelegate textureNoLongerNeededForTarget:self];
        [textureDelegate2 textureNoLongerNeededForTarget:self];
        [textureDelegate3 textureNoLongerNeededForTarget:self];
        [textureDelegate4 textureNoLongerNeededForTarget:self];
        shouldConserveMemoryForNextFrame = NO;
    }
}

#pragma mark -
#pragma mark GPUImageInput

- (NSInteger)nextAvailableTextureIndex;
{
    if (!hasSetTexture1) {
        return 0;
    } else if (!hasSetTexture2) {
        return 1;
    } else if (!hasSetTexture3) {
        return 2;
    } else {
        return 3;
    }
}
- (void)setInputTexture:(GLuint)newInputTexture atIndex:(NSInteger)textureIndex;
{
    switch (textureIndex) {
        case 0:
            filterSourceTexture = newInputTexture;
            hasSetTexture1 = YES;
            break;
        case 1:
            filterSourceTexture2 = newInputTexture;
            hasSetTexture2 = YES;
            break;
        case 2:
            filterSourceTexture3 = newInputTexture;
            hasSetTexture3 = YES;
            break;
        case 3:
            filterSourceTexture4 = newInputTexture;
            break;
        default:
            break;
    }
}
- (void)setInputSize:(CGSize)newSize atIndex:(NSInteger)textureIndex;
{
    if (textureIndex == 0)
    {
        [super setInputSize:newSize atIndex:textureIndex];
        if (CGSizeEqualToSize(newSize, CGSizeZero))
        {
            hasSetTexture1 = NO;
        }
    }
}
- (void)setInputRotation:(GPUImageRotationMode)newInputRotation atIndex:(NSInteger)textureIndex;
{
    switch (textureIndex) {
        case 0:
            inputRotation = newInputRotation;
            break;
        case 1:
            inputRotation2 = newInputRotation;
            break;
        case 2:
            inputRotation3 = newInputRotation;
            break;
        case 3:
            inputRotation4 = newInputRotation;
            break;
        default:
            break;
    }
}
- (CGSize)rotatedSize:(CGSize)sizeToRotate forIndex:(NSInteger)textureIndex;
{
    CGSize rotatedSize = sizeToRotate;
    GPUImageRotationMode rotationToCheck = kGPUImageNoRotation; // initialized so the default case is well defined
    switch (textureIndex) {
        case 0:
            rotationToCheck = inputRotation;
            break;
        case 1:
            rotationToCheck = inputRotation2;
            break;
        case 2:
            rotationToCheck = inputRotation3;
            break;
        case 3:
            rotationToCheck = inputRotation4;
            break;
        default:
            break;
    }

    if (GPUImageRotationSwapsWidthAndHeight(rotationToCheck))
    {
        rotatedSize.width = sizeToRotate.height;
        rotatedSize.height = sizeToRotate.width;
    }
    return rotatedSize;
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
    outputTextureRetainCount = [targets count];

    // You can set up infinite update loops, so this helps to short circuit them
    if (hasReceivedFrame1 && hasReceivedFrame2 && hasReceivedFrame3 && hasReceivedFrame4)
    {
        return;
    }

    BOOL updatedMovieFrameOppositeStillImage = NO;

    switch (textureIndex) {
        case 0:
            hasReceivedFrame1 = YES;
            frameTime1 = frameTime;
            if (frameCheckDisabled2)
            {
                hasReceivedFrame2 = YES;
            }
            if (frameCheckDisabled3)
            {
                hasReceivedFrame3 = YES;
            }
            if (frameCheckDisabled4)
            {
                hasReceivedFrame4 = YES;
            }
            if (!CMTIME_IS_INDEFINITE(frameTime))
            {
                if (CMTIME_IS_INDEFINITE(frameTime2) && CMTIME_IS_INDEFINITE(frameTime3) && CMTIME_IS_INDEFINITE(frameTime4))
                {
                    updatedMovieFrameOppositeStillImage = YES;
                }
            }
            break;
        case 1:
            hasReceivedFrame2 = YES;
            frameTime2 = frameTime;
            if (frameCheckDisabled1)
            {
                hasReceivedFrame1 = YES;
            }
            if (frameCheckDisabled3)
            {
                hasReceivedFrame3 = YES;
            }
            if (frameCheckDisabled4)
            {
                hasReceivedFrame4 = YES;
            }
            if (!CMTIME_IS_INDEFINITE(frameTime))
            {
                if (CMTIME_IS_INDEFINITE(frameTime1) && CMTIME_IS_INDEFINITE(frameTime3) && CMTIME_IS_INDEFINITE(frameTime4))
                {
                    updatedMovieFrameOppositeStillImage = YES;
                }
            }
            break;
        case 2:
            hasReceivedFrame3 = YES;
            frameTime3 = frameTime;
            if (frameCheckDisabled1)
            {
                hasReceivedFrame1 = YES;
            }
            if (frameCheckDisabled2)
            {
                hasReceivedFrame2 = YES;
            }
            if (frameCheckDisabled4)
            {
                hasReceivedFrame4 = YES;
            }
            if (!CMTIME_IS_INDEFINITE(frameTime))
            {
                if (CMTIME_IS_INDEFINITE(frameTime1) && CMTIME_IS_INDEFINITE(frameTime2) && CMTIME_IS_INDEFINITE(frameTime4))
                {
                    updatedMovieFrameOppositeStillImage = YES;
                }
            }
            break;
        case 3:
            hasReceivedFrame4 = YES;
            frameTime4 = frameTime;
            if (frameCheckDisabled1)
            {
                hasReceivedFrame1 = YES;
            }
            if (frameCheckDisabled3)
            {
                hasReceivedFrame3 = YES;
            }
            if (frameCheckDisabled2)
            {
                hasReceivedFrame2 = YES;
            }
            if (!CMTIME_IS_INDEFINITE(frameTime))
            {
                if (CMTIME_IS_INDEFINITE(frameTime1) && CMTIME_IS_INDEFINITE(frameTime3) && CMTIME_IS_INDEFINITE(frameTime2))
                {
                    updatedMovieFrameOppositeStillImage = YES;
                }
            }
            break;
        default:
            break;
    }

    // || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
    if ((hasReceivedFrame1 && hasReceivedFrame2 && hasReceivedFrame3 && hasReceivedFrame4) || updatedMovieFrameOppositeStillImage)
    {
        [super newFrameReadyAtTime:frameTime atIndex:0];
        hasReceivedFrame1 = NO;
        hasReceivedFrame2 = NO;
        hasReceivedFrame3 = NO;
        hasReceivedFrame4 = NO;
    }
}
- (void)setTextureDelegate:(id<GPUImageTextureDelegate>)newTextureDelegate atIndex:(NSInteger)textureIndex;
{
    switch (textureIndex) {
        case 0:
            firstTextureDelegate = newTextureDelegate;
            break;
        case 1:
            textureDelegate2 = newTextureDelegate;
            break;
        case 2:
            textureDelegate3 = newTextureDelegate;
            break;
        case 3:
            textureDelegate4 = newTextureDelegate;
            break;
        default:
            break;
    }
}

@end
A class named IFAmaroFilter extends IFFourInputFilter:
#import "IFAmaroFilter.h"
NSString *const kIFAmaroFilterFragmentShaderString = SHADER_STRING
(
precision lowp float;
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2; //blowout;
uniform sampler2D inputImageTexture3; //overlay;
uniform sampler2D inputImageTexture4; //map
void main()
{
vec4 texel = texture2D(inputImageTexture, textureCoordinate);
vec3 bbTexel = texture2D(inputImageTexture2, textureCoordinate).rgb;
texel.r = texture2D(inputImageTexture3, vec2(bbTexel.r, texel.r)).r;
texel.g = texture2D(inputImageTexture3, vec2(bbTexel.g, texel.g)).g;
texel.b = texture2D(inputImageTexture3, vec2(bbTexel.b, texel.b)).b;
vec4 mapped;
mapped.r = texture2D(inputImageTexture4, vec2(texel.r, 0.16666)).r;
mapped.g = texture2D(inputImageTexture4, vec2(texel.g, .5)).g;
mapped.b = texture2D(inputImageTexture4, vec2(texel.b, .83333)).b;
mapped.a = 1.0;
gl_FragColor = texel;
}
);
@implementation IFAmaroFilter

- (id)init;
{
    if (!(self = [super initWithFragmentShaderFromString:kIFAmaroFilterFragmentShaderString]))
    {
        return nil;
    }
    return self;
}

@end
When I use the filter I get a black output. Code below:
filter = [[IFAmaroFilter alloc] init];
GPUImagePicture *gp1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"blackboard1024" ofType:@"png"]]];
GPUImagePicture *gp2 = [[GPUImagePicture alloc] initWithImage:[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"overlayMap" ofType:@"png"]]];
GPUImagePicture *gp3 = [[GPUImagePicture alloc] initWithImage:[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"amaroMap" ofType:@"png"]]];
[stillCamera addTarget:filter atTextureLocation:0];
[gp1 addTarget:filter atTextureLocation:1];
[gp1 processImage];
[gp2 addTarget:filter atTextureLocation:2];
[gp2 processImage];
[gp3 addTarget:filter atTextureLocation:3];
[gp3 processImage];
[filter addTarget:(GPUImageView *)self.view];
I found that the GPUImagePicture gets autoreleased, so the filter never receives the texture.
If you run into the same problem, check the textures' lifetimes carefully and watch when they get deallocated.
I agree with ZhouQi's answer, but I'd like to clarify a bit.
You need to have a GPUImagePicture property in your interface declaration so that your GPUImagePicture object isn't autoreleased right away.
In your interface declaration:
@property (strong, nonatomic) GPUImagePicture *sourcePicture;
In your implementation:
UIImage *inputImage = [UIImage imageNamed:@"MyPicture.png"];
_sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[_sourcePicture processImage];
[_videoCamera addTarget:_myFilter];
[_sourcePicture addTarget:_myFilter];
[_myFilter addTarget:_myView];
Check out any of the blend filters in the Filter Showcase example project here:
https://github.com/BradLarson/GPUImage/tree/master/examples/iOS/FilterShowcase
I tried to use this shader to merge my three videos to make an MV effect:
filter0 = [[GPUImageThreeInputFilter alloc] initWithFragmentShaderFromString:
    @"precision highp float;\n"
    @"uniform sampler2D inputImageTexture;  // video\n"
    @"uniform sampler2D inputImageTexture2; // mv\n"
    @"uniform sampler2D inputImageTexture3; // alpha\n"
    @"varying vec2 textureCoordinate;\n"
    @"void main()\n"
    @"{\n"
    @"    vec4 video = texture2D(inputImageTexture, textureCoordinate);\n"
    @"    vec4 mv = texture2D(inputImageTexture2, textureCoordinate);\n"
    @"    vec4 alpha = texture2D(inputImageTexture3, textureCoordinate);\n"
    @"    gl_FragColor = video * (1.0 - alpha.r) + mv;\n"
    @"}"];
I have the same issue; could you give me any ideas?
