GPUImage RawDataInput and RawDataOutput rotate image - iOS

I'm trying to use GPUImage rawInput and rawOutput to do some custom stuff, but somehow my output is rotated.
I'm setting my GPUImageVideoCamera like this:
self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
self.videoCamera.runBenchmark = YES;
And then just this.
CGSize rawSize = CGSizeMake(640.0, 480.0 );
GPUImageRawDataOutput *rawOutput = [[GPUImageRawDataOutput alloc] initWithImageSize:rawSize resultsInBGRAFormat:YES];
GPUImageRawDataInput __block *rawInput = [[GPUImageRawDataInput alloc] initWithBytes:[rawOutput rawBytesForImage] size:rawSize];
__weak GPUImageRawDataOutput *weakRawOutput = rawOutput;
[rawOutput setNewFrameAvailableBlock:^{
[weakRawOutput rawBytesForImage];
[rawInput updateDataFromBytes:[weakRawOutput rawBytesForImage] size:rawSize];
[rawInput processData];
}];
and of course
[self.videoCamera addTarget:rawOutput];
[rawInput addTarget:self.cameraView];
[self.videoCamera startCameraCapture];
This is what I get:
https://www.dropbox.com/s/yo19o7ryagk58le/2013-12-05%2018.02.56.png
Any ideas how to PROPERLY handle this?
EDIT:
I just found out that putting a sepia filter before rawOutput corrects the rotation, but a grayscale filter doesn't ...
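A common workaround (a sketch, untested here) is to insert a no-op GPUImageFilter between the camera and the raw output; like the sepia filter apparently does, it forces an extra render pass in which the camera's outputImageOrientation rotation gets applied:

```objc
// Hypothetical workaround: a passthrough filter before rawOutput forces a
// render pass that honors outputImageOrientation.
GPUImageFilter *passthrough = [[GPUImageFilter alloc] init];
[self.videoCamera addTarget:passthrough];
[passthrough addTarget:rawOutput];
```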

Related

Image auto-rotates after using CIFilter

I am writing an app that lets users take a picture and then edit it. I am working on implementing tools with UISliders for brightness/contrast/saturation and am using the Core Image Filter class to do so. When I open the app, I can take a picture and display it correctly. However, if I choose to edit a picture, and then use any of the described slider tools, the image will rotate counterclockwise 90 degrees. Here's the code in question:
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view.
self.navigationItem.hidesBackButton = YES; //hide default nav
//get image to display
DBConnector *dbconnector = [[DBConnector alloc] init];
album.moments = [dbconnector getMomentsForAlbum:album.title];
Moment *mmt = [album.moments firstObject];
_imageView.image = [mmt.moment firstObject];
CGImageRef aCGImage = _imageView.image.CGImage;
CIImage *aCIImage = [CIImage imageWithCGImage:aCGImage];
_editor = [CIFilter filterWithName:@"CIColorControls" keysAndValues:@"inputImage", aCIImage, nil];
_context = [CIContext contextWithOptions: nil];
[self startEditControllerFromViewController:self];
}
//cancel and finish buttons
- (BOOL) startEditControllerFromViewController: (UIViewController*) controller {
[_cancelEdit addTarget:self action:@selector(cancelEdit:) forControlEvents:UIControlEventTouchUpInside];
[_finishEdit addTarget:self action:@selector(finishEdit:) forControlEvents:UIControlEventTouchUpInside];
return YES;
}
//adjust brightness
- (IBAction)brightnessSlider:(UISlider *)sender {
[_editor setValue:[NSNumber numberWithFloat:_brightnessSlider.value] forKey:@"inputBrightness"];
CGImageRef cgiimg = [_context createCGImage:_editor.outputImage fromRect:_editor.outputImage.extent];
_imageView.image = [UIImage imageWithCGImage: cgiimg];
CGImageRelease(cgiimg);
}
I believe that the problem stems from the brightnessSlider method, based on breakpoints that I've placed. Is there a way to stop the auto-rotating of my photo? If not, how can I rotate it back to the normal orientation?
Mere minutes after posting, I figured out the answer to my own question. Go figure. Anyway, I simply changed the slider method to the following:
- (IBAction)brightnessSlider:(UISlider *)sender {
[_editor setValue:[NSNumber numberWithFloat:_brightnessSlider.value] forKey:@"inputBrightness"];
CGImageRef cgiimg = [_context createCGImage:_editor.outputImage fromRect:_editor.outputImage.extent];
UIImageOrientation originalOrientation = _imageView.image.imageOrientation;
CGFloat originalScale = _imageView.image.scale;
_imageView.image = [UIImage imageWithCGImage: cgiimg scale:originalScale orientation:originalOrientation];
CGImageRelease(cgiimg);
}
This simply records the original orientation and scale of the image, and re-sets them when the data is converted back to a UIImage. Hope this helps someone else!

Blurry image after using UIDynamicAnimator

I'm using UIDynamicAnimator with good results, except that after the animations are done, the image that has been pushed and bounced is blurry. Please help me.
@property (nonatomic, weak) IBOutlet UIImageView *someImage;
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];
self.gravity = [[UIGravityBehavior alloc] initWithItems:@[self.someImage]];
CGVector gravityDirection = {-1.0, 0.5};
[self.gravity setGravityDirection:gravityDirection];
self.collision = [[UICollisionBehavior alloc] initWithItems:@[self.someImage]];
self.collision.translatesReferenceBoundsIntoBoundary = YES;
self.itemBehaviour = [[UIDynamicItemBehavior alloc] initWithItems:@[self.someImage]];
self.itemBehaviour.elasticity = 0.5;
[self.animator removeAllBehaviors]; // to avoid problems from the last behaviour animation
[self.collision addBoundaryWithIdentifier:@"barrier"
fromPoint:CGPointMake(444,0)
toPoint:CGPointMake(444,768)];
self.pushBehavior = [[UIPushBehavior alloc] initWithItems:@[self.someImage] mode:UIPushBehaviorModeInstantaneous];
self.pushBehavior.magnitude = 1.0f;
self.pushBehavior.angle = 0.0f;
[self.animator addBehavior:self.pushBehavior];
self.pushBehavior.pushDirection = CGVectorMake(1.0f, 0.0f);
self.pushBehavior.active = YES;
[self.animator addBehavior:self.itemBehaviour];
[self.animator addBehavior:self.collision];
[self.animator addBehavior:self.gravity];
I think your problem is that the image isn't on an exact point when the animation is done. I would suggest that you add a delegate, and when the animation pauses, round the image's x and y to the closest integer.
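A sketch of that suggestion, assuming the view controller adopts UIDynamicAnimatorDelegate and is set as the animator's delegate:

```objc
// Snap the image view onto whole-pixel coordinates once the dynamics settle;
// a fractional frame origin is what causes the blur.
- (void)dynamicAnimatorDidPause:(UIDynamicAnimator *)animator {
    self.someImage.frame = CGRectIntegral(self.someImage.frame);
}
```

Remember to set `self.animator.delegate = self;` when creating the animator.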

dataProvider is 0x0 / nil (GPUImage Framework)

I wrote some code which creates a filter and can be controlled via a UISlider.
But if I slide the UISlider, the app crashes.
My code:
.m file:
- (void) viewDidLoad {
[_sliderBrightness addTarget:self action:@selector(brightnessFilter) forControlEvents:UIControlEventValueChanged];
_sliderBrightness.minimumValue = -1.0;
_sliderBrightness.maximumValue = 1.0;
_sliderBrightness.value = 0.0;
}
- (IBAction)sliderBrightness:(UISlider *)sender {
CGFloat midpoint = [(UISlider *)sender value];
[(GPUImageBrightnessFilter *)brightFilter setBrightness:midpoint - 0.1];
[(GPUImageBrightnessFilter *)brightFilter setBrightness:midpoint + 0.1];
[sourcePicture processImage];
}
- (void) brightnessFilter {
UIImage *inputImage = _imgView.image;
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
brightFilter = [[GPUImageBrightnessFilter alloc] init];
GPUImageView *imgView2 = (GPUImageView *)self.view;
[brightFilter useNextFrameForImageCapture];
[sourcePicture addTarget:brightFilter];
[sourcePicture processImage];
UIImage* outputImage = [brightFilter imageFromCurrentFramebufferWithOrientation:0];
[_imgView setImage:outputImage];
}
Error:
GPUImageFramebuffer.m:
}
else
{
[self activateFramebuffer];
rawImagePixels = (GLubyte *)malloc(totalBytesForImage);
glReadPixels(0, 0, (int)_size.width, (int)_size.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
dataProvider = CGDataProviderCreateWithData(NULL, rawImagePixels, totalBytesForImage, dataProviderReleaseCallback);
[self unlock]; // Don't need to keep this around anymore
}
In this line of code:
[self activateFramebuffer];
Error message:
Thread 1: EXC_BAD_ACCESS (code=EXC_I386_GPFLT)
Console:
self = (GPUImageFramebuffer *const) 0x10a0a6960
rawImagePixels = (GLubyte *) 0x190
dataProvider = (CGDataProviderRef) 0x0
renderTarget = (CVPixelBufferRef) 0x8
Maybe the dataProvider causes the crash, but I don't really know, because I'm new to developing iOS apps.
This obviously isn't going to work (and shouldn't even compile) because GPUImageBrightnessFilter has no -setTopFocusLevel: or -setBottomFocusLevel: method. You copied this from my sample application without changing these methods to the one appropriate to a brightness filter (which is the brightness property).
It's also rather confusing (and potentially problematic) to have both a brightnessFilter instance variable and -brightnessFilter method. You probably want to rename the former to make it clear that's where you're performing your initial setup of the filter and source image. You'll also need to call that in your view controller's setup (after your Nib is loaded).
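A sketch of that restructuring (the setup-method name is illustrative, not from the original code):

```objc
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupBrightnessFilter]; // renamed from -brightnessFilter to avoid the clash
}

// One-time setup: build the source picture and filter once, not per slider event.
- (void)setupBrightnessFilter {
    sourcePicture = [[GPUImagePicture alloc] initWithImage:_imgView.image];
    brightFilter = [[GPUImageBrightnessFilter alloc] init];
    [sourcePicture addTarget:brightFilter];
}

- (IBAction)sliderBrightness:(UISlider *)sender {
    brightFilter.brightness = sender.value; // brightness is the real property
    [sourcePicture processImage];
}
```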

Having issue with SKEffectNode

I am following Apple's SpriteKit documentation, and now I am trying to use SKEffectNode, but the effect is not applied! Here is my code:
SKEffectNode *lightingNode = [[SKEffectNode alloc] init];
SKTexture *texture = [SKTexture textureWithImage:[UIImage imageNamed:@"Spaceship"]];
SKSpriteNode *light = [SKSpriteNode spriteNodeWithTexture:texture];
self.filter = [self blurFilter];
lightingNode.position = self.view.center;
lightingNode.blendMode = SKBlendModeAdd;
[lightingNode addChild: light];
[self addChild:lightingNode];
//applying blur
- (CIFilter *)blurFilter
{
CIFilter *filter = [CIFilter filterWithName:@"CIBoxBlur"]; // 3
[filter setDefaults];
[filter setValue:[NSNumber numberWithFloat:20] forKey:@"inputRadius"];
return filter;
}
When I run the app, it just shows the spaceship without any blur effect.
It looks like the @"CIBoxBlur" filter doesn't exist anymore, at least in iOS 7.0. You can use @"CIGaussianBlur" instead. You can see the full list of filters by running:
NSArray* filters = [CIFilter filterNamesInCategories:nil];
for (NSString* filterName in filters)
{
NSLog(@"Filter: %@", filterName);
}
I don't see anywhere that you're setting the filter property of the effect node you've created, nor where you're setting its shouldEnableEffects property to YES.
As implied by the aforelinked documentation, both of those properties need a meaningful value if your effect node is to apply a filter.
I used your code with the following changes and all tested great!
//switch this
lightingNode.position
//to this
light.position
//add this
lightingNode.shouldEnableEffects = YES;
//change
CIBoxBlur
//to
CIGaussianBlur
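Putting those changes together, a sketch of the corrected setup might look like this (untested, with @"CIGaussianBlur" substituted in -blurFilter):

```objc
SKEffectNode *lightingNode = [[SKEffectNode alloc] init];
lightingNode.filter = [self blurFilter];   // attach the filter to the effect node
lightingNode.shouldEnableEffects = YES;    // without this the filter is ignored
lightingNode.blendMode = SKBlendModeAdd;

SKTexture *texture = [SKTexture textureWithImage:[UIImage imageNamed:@"Spaceship"]];
SKSpriteNode *light = [SKSpriteNode spriteNodeWithTexture:texture];
light.position = self.view.center;         // position the sprite, not the effect node

[lightingNode addChild:light];
[self addChild:lightingNode];
```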

GPUImage slow memory accumulation

I looked through the forums for similar questions, but it seems my issue is different.
I am using GPUImage; I built the framework using the bash script. I looked at samples of creating filters and chaining them for processing.
But in my app I am facing a constant increase in memory consumption. There are no memory leaks; I profiled my app.
image is an image taken from the camera. I call the code described below in a for loop. I witness the memory usage grow from 7 MB up to 150 MB, after which the app is terminated.
Here is the code :
- (void)processImageInternal:(UIImage *)image
{
GPUImagePicture * picture = [[GPUImagePicture alloc]initWithImage:image];
GPUImageLanczosResamplingFilter *lanczosResamplingFilter = [GPUImageLanczosResamplingFilter new];
CGFloat scale = image.scale;
CGSize size = image.size;
CGFloat newScale = roundf(2*scale);
CGSize newSize = CGSizeMake((size.width *scale)/newScale,(size.height *scale)/newScale);
[lanczosResamplingFilter forceProcessingAtSize:newSize];
[picture addTarget:lanczosResamplingFilter];
GPUImageGrayscaleFilter * grayScaleFilter = [GPUImageGrayscaleFilter new];
[grayScaleFilter forceProcessingAtSize:newSize];
[lanczosResamplingFilter addTarget:grayScaleFilter];
GPUImageMedianFilter * medianFilter = [GPUImageMedianFilter new];
[medianFilter forceProcessingAtSize:newSize];
[grayScaleFilter addTarget:medianFilter];
GPUImageSharpenFilter * sharpenFilter = [GPUImageSharpenFilter new];
sharpenFilter.sharpness +=2.0;
[sharpenFilter forceProcessingAtSize:newSize];
[medianFilter addTarget:sharpenFilter];
GPUImageGaussianBlurFilter * blurFilter = [GPUImageGaussianBlurFilter new];
blurFilter.blurSize = 0.5;
[blurFilter forceProcessingAtSize:newSize];
[sharpenFilter addTarget:blurFilter];
GPUImageUnsharpMaskFilter * unsharpMask = [GPUImageUnsharpMaskFilter new];
[unsharpMask forceProcessingAtSize:newSize];
[blurFilter addTarget:unsharpMask];
[picture processImage];
image = [unsharpMask imageFromCurrentlyProcessedOutput];
}
The code is executed in background thread.
Here is calling code:
for (NSUInteger i=0;i <100;i++)
{
NSLog(@" INDEX OF I : %d", (int)i);
[self processImageInternal:image];
}
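One standard mitigation for loops like this (a sketch of general advice, not a confirmed fix for this case) is to wrap each iteration in an explicit autorelease pool, so that intermediate UIImage and framebuffer objects are released per iteration instead of accumulating until the enclosing pool drains:

```objc
for (NSUInteger i = 0; i < 100; i++) {
    @autoreleasepool {
        [self processImageInternal:image];
    }
}
```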
I even added some cleanup logic to the - (void)processImageInternal:(UIImage *)image method:
- (void)processImageInternal:(UIImage *)image
{
.........
[picture processImage];
image = [unsharpMask imageFromCurrentlyProcessedOutput];
//clean up code....
[picture removeAllTargets];
[self releaseResourcesForGPUOutput:unsharpMask];
[self releaseResourcesForGPUOutput:blurFilter];
[self releaseResourcesForGPUOutput:sharpenFilter];
[self releaseResourcesForGPUOutput:medianFilter];
[self releaseResourcesForGPUOutput:lanczosResamplingFilter];
[self releaseResourcesForGPUOutput:grayScaleFilter];
In the release method, I basically release whatever can be released: FBOs, textures, targets. Here is the code:
- (void)releaseResourcesForGPUOutput:(GPUImageOutput *) output
{
if ([output isKindOfClass:[GPUImageFilterGroup class]])
{
GPUImageFilterGroup *group = (GPUImageFilterGroup *) output;
for (NSUInteger i=0; i<group.filterCount;i++)
{
GPUImageFilter * curFilter = (GPUImageFilter * )[group filterAtIndex:i];
[self releaseResourcesForGPUFilter:curFilter];
}
[group removeAllTargets];
}
else if ([output isKindOfClass:[GPUImageFilter class]])
{
[self releaseResourcesForGPUFilter:(GPUImageFilter *)output];
}
}
The unsharp mask is a GPU group filter, i.e. a composite of filters.
- (void)releaseResourcesForGPUFilter:(GPUImageFilter *)filter
{
if ([filter respondsToSelector:@selector(releaseInputTexturesIfNeeded)])
{
[filter releaseInputTexturesIfNeeded];
[filter destroyFilterFBO];
[filter cleanupOutputImage];
[filter deleteOutputTexture];
[filter removeAllTargets];
}
}
I did as Brad suggested: my code was called in the UI touch processing loop. Autoreleased objects are drained, but in the Allocations instrument I cannot see anything strange. Still, my application is terminated.
Total allocation was 130 MB at the end, but for some reason my app was terminated.
Here is how it looks (I placed screenshots, a log from the device, and even a trace from Instruments). My app is called TesseractSample, but I completely switched off Tesseract usage, even its initialisation.
http://goo.gl/w5XrVb
In the log from the device I see that the maximum allowed rpages were used:
http://goo.gl/ImG2Wt
But I do not have a clue what that means. rpages == recent_max ( 167601)