inputImage = nil: Filtered image not displaying? - iOS

I have been developing an image-filtering app with the help of online tutorials on bitFountain. The user should be able to select a photo they have added to an album and then either add a filter to that photo or delete it.
My delete-image functionality runs fine, but adding a filter is not working.
I have logged three of the filter instances to the console as they are returned by the method, and each comes back with inputImage = nil.
2015-10-19 10:41:53.634 FilterApp[78451:28732768] <CISepiaTone: inputImage=nil inputIntensity=1>
2015-10-19 10:41:53.634 FilterApp[78451:28732768] <CIGaussianBlur: inputImage=nil inputRadius=1>
2015-10-19 10:41:53.635 FilterApp[78451:28732768] <CIColorClamp: inputImage=nil inputMinComponents=[0.2 0.2 0.2 0.2] inputMaxComponents=[0.9 0.9 0.9 0.9]>
What exactly does inputImage=nil mean?
I'm not sure where the code could be going wrong, or if this problem is even external to the code.
(Using Xcode 7 and the iPhone 4s simulator.)
Edit: This was the code used to convert to UIImage.
- (UIImage *)filteredImageFromImage:(UIImage *)image andFilter:(CIFilter *)filter
{
    CIImage *unfilteredImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    [filter setValue:unfilteredImage forKey:kCIInputImageKey];
    CIImage *filteredImage = [filter outputImage];
    CGRect extent = [filteredImage extent];
    CGImageRef cgImage = [self.context createCGImage:filteredImage fromRect:extent];
    UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage returns an owned reference
    return finalImage;
}

This problem occurs when you use CIImage: when you try to display the image, it contains a nil value. So convert your image to a UIImage via a CGImage first.
For example:
CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err]; // generator: an AVAssetImageGenerator
UIImage *imageDisplay = [UIImage imageWithCGImage:image];
It may be helpful to you.
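One more thing worth checking in the original question (an assumption on my part, since the calling code isn't shown): a UIImage created with imageWithCIImage: has a nil CGImage property, and a CIImage initialized from a nil CGImage gives you nothing to assign, so the filter's inputImage stays nil. A defensive sketch:
if (!image.CGImage) {
    // UIImage instances created with imageWithCIImage: have a nil CGImage,
    // so the CIImage below would be created from nil and the filter's
    // inputImage would never be set.
    return nil;
}
CIImage *unfilteredImage = [[CIImage alloc] initWithCGImage:image.CGImage];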

Related

iOS crash "could not execute support code to read Objective-C" on iOS 12.4.2 but not on 12.0.1

This method returns a QR code image for a string. It works correctly on iOS 12.0.1 (iPhone SE) but crashes on 12.4.2 (iPhone 6). It crashes when I try to assign the resulting UIImage to a UIImageView; the resulting UIImage is not nil.
-(UIImage*)get_QR_image :(NSString*)qrString :(UIColor*)ForeGroundCol :(UIColor*)BackGroundCol{
    NSData *stringData = [qrString dataUsingEncoding:NSUTF8StringEncoding];
    CIFilter *qrFilter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
    [qrFilter setValue:stringData forKey:@"inputMessage"];
    [qrFilter setValue:@"H" forKey:@"inputCorrectionLevel"];
    CIImage *qrImage = qrFilter.outputImage;
    float scaleX = 320;
    float scaleY = 320;
    CIColor *iForegroundColor = [CIColor colorWithCGColor:[ForeGroundCol CGColor]];
    CIColor *iBackgroundColor = [CIColor colorWithCGColor:[BackGroundCol CGColor]];
    CIFilter *filterColor = [CIFilter filterWithName:@"CIFalseColor" keysAndValues:@"inputImage", qrImage, @"inputColor0", iForegroundColor, @"inputColor1", iBackgroundColor, nil];
    CIImage *filtered_image = [filterColor valueForKey:@"outputImage"];
    filtered_image = [filtered_image imageByApplyingTransform:CGAffineTransformMakeScale(scaleX, scaleY)];
    UIImage *result_image = [UIImage imageWithCIImage:filtered_image
                                                scale:[UIScreen mainScreen].scale
                                          orientation:UIImageOrientationUp];
    return result_image;
}
The line involved in the crash is:
filtered_image = [filtered_image imageByApplyingTransform:CGAffineTransformMakeScale(scaleX, scaleY)];
It generates this log:
warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available.
Is there something in my method that works only on 12.0.1? Or is something wrong? How can I investigate further to solve this crash?
EDIT
In red I have:
MyQrCodeImageViewBig.image = qrimage;
with the message:
Thread 1: EXC_BREAKPOINT (code=1, subcode=0x1a83e146c)
I see a lot of problems resulting from the [UIImage imageWithCIImage:] initializer. The main problem is that a CIImage does not actually contain any bitmap data; it needs to be rendered by a CIContext first. So the target you assign the UIImage to needs to know that it is backed by a CIImage that still needs rendering. Usually UIImageView handles this well, but I wouldn't trust it too much.
What you can do instead is render the image yourself into a bitmap (a CGImage) and initialize the UIImage with that. You need a CIContext for this, which I recommend you create once, somewhere outside this method, and re-use every time you need to render an image (it's an expensive object):
self.context = [CIContext context];
Then in your method, you render the image like this:
CGImageRef cgImage = [self.context createCGImage:filtered_image fromRect:[filtered_image extent]];
UIImage* result_image = [UIImage imageWithCGImage:cgImage];
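One detail worth adding: createCGImage:fromRect: follows the Core Foundation create rule, so the returned bitmap should be released once the UIImage has wrapped (and retained) it:
CGImageRef cgImage = [self.context createCGImage:filtered_image fromRect:[filtered_image extent]];
UIImage *result_image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // the context returns an owned reference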

Reset button to original image from filtered image

I am adding filters to the image in my image view, but when I tap the filter buttons the filters stack on top of each other. Can you help me create a reset or undo button that goes back to the original image? I get the original image from the camera or the camera roll.
- (IBAction)filter:(id)sender {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *image = [CIImage imageWithCGImage:imageview.image.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:image forKey:kCIInputImageKey];
    [filter setValue:@1.0f forKey:@"inputIntensity"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [context createCGImage:result fromRect:result.extent];
    UIImage *uiImage = [[UIImage alloc] initWithCGImage:cgImage];
    [self.imageview setImage:uiImage];
}
Your filters are stacking because you're modifying the image in the imageview, and then putting that modified image back into the imageview. You should store your original image in a separate property and always start your modifications from that. For example, a property like:
@property (nonatomic, strong) UIImage *originalImage;
Then in your method, use the original image instead of the one already in the image view:
CIImage *image = [CIImage imageWithCGImage:self.originalImage.CGImage];
Storing data in member variables/properties is also just better practice than storing data in the view.
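The reset action then just puts the stored image back. A minimal sketch, assuming the originalImage property above is assigned when the photo is first picked:
- (IBAction)reset:(id)sender {
    // Discard all stacked filters by restoring the untouched original.
    [self.imageview setImage:self.originalImage];
}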

UIImagePNGRepresentation returns nil after CIFilter

Running into a problem getting a PNG representation for a UIImage after having rotated it with CIAffineTransform. First, I have a category on UIImage that rotates an image 90 degrees clockwise. It seems to work correctly when I display the rotated image in a UIImageView.
-(UIImage *)cwRotatedRepresentation
{
    // Not very precise, stop yelling at me.
    CGAffineTransform xfrm = CGAffineTransformMakeRotation(-(6.28 / 4.0));
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:self.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIAffineTransform"];
    [filter setValue:inputImage forKey:@"inputImage"];
    [filter setValue:[NSValue valueWithBytes:&xfrm objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
    CIImage *result = [filter valueForKey:@"outputImage"];
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
    return [[UIImage alloc] initWithCIImage:result];
}
However, when I try to actually get a PNG for the newly rotated image, UIImagePNGRepresentation returns nil.
-(NSData *)getPNG
{
    UIImage *myImg = [UIImage imageNamed:@"canada"];
    myImg = [myImg cwRotatedRepresentation];
    NSData *d = UIImagePNGRepresentation(myImg);
    // d == nil :(
    return d;
}
Is core image overwriting the PNG headers or something? Is there a way around this behavior, or a better means of achieving the desired result of a PNG representation of a UIImage rotated 90 degrees clockwise?
Not yelling, but -M_PI_2 will give you the constant you want with maximum precision :)
The only other thing I see is that you probably want to use [result extent] instead of [inputImage extent], unless your image is known to be square.
I'm not sure how that would cause UIImagePNGRepresentation to fail, though. One other thought: you create a CGImage and then use the CIImage in the UIImage; perhaps initializing with initWithCGImage: would give better results.
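Putting those suggestions together, one possible sketch of the category method (untested, and assuming the goal is still a 90-degree clockwise rotation): the rect now comes from the filter's output, and the UIImage wraps the rendered bitmap, so UIImagePNGRepresentation has actual pixels to encode.
-(UIImage *)cwRotatedRepresentation
{
    CGAffineTransform xfrm = CGAffineTransformMakeRotation(-M_PI_2); // exact quarter turn
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:self.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIAffineTransform"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSValue valueWithBytes:&xfrm objCType:@encode(CGAffineTransform)]
              forKey:@"inputTransform"];
    CIImage *result = [filter outputImage];
    // Render to a bitmap; a CGImage-backed UIImage is what the PNG encoder expects.
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
    UIImage *rotated = [[UIImage alloc] initWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return rotated;
}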

UIImage face detection

I am trying to write a routine that takes a UIImage and returns a new UIImage that contains just the face. This would seem to be very straightforward, but my brain is having problems getting around the Core Image vs. UIImage coordinate spaces.
Here's the basics:
- (UIImage *)imageFromImage:(UIImage *)image inRect:(CGRect)rect {
    CGImageRef sourceImageRef = [image CGImage];
    CGImageRef newImageRef = CGImageCreateWithImageInRect(sourceImageRef, rect);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    CGImageRelease(newImageRef);
    return newImage;
}
-(UIImage *)getFaceImage:(UIImage *)picture {
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
    CIImage *ciImage = [CIImage imageWithCGImage:[picture CGImage]];
    NSArray *features = [detector featuresInImage:ciImage];
    // For simplicity, I'm grabbing the first one in this code sample,
    // and we can all pretend that the photo has one face for sure. :-)
    CIFaceFeature *faceFeature = [features objectAtIndex:0];
    return [self imageFromImage:picture inRect:faceFeature.bounds];
}
The returned image is cropped from the vertically flipped position (Core Image's coordinates are flipped relative to UIKit's). I've tried adjusting faceFeature.bounds using something like this:
CGAffineTransform t = CGAffineTransformMakeScale(1.0f,-1.0f);
CGRect newRect = CGRectApplyAffineTransform(faceFeature.bounds,t);
... but that gives me results outside the image.
I'm sure there's something simple to fix this, but short of calculating the distance from the bottom and then creating a new rect using that as the Y origin, is there a "proper" way to do this?
Thanks!
It's much easier and less messy to just use CIContext to crop your face from the image. Something like this:
CGImageRef cgImage = [_ciContext createCGImage:[CIImage imageWithCGImage:inputImage.CGImage] fromRect:faceFeature.bounds];
UIImage *croppedFace = [UIImage imageWithCGImage:cgImage];
Where inputImage is your UIImage object and faceFeature is a CIFaceFeature you get from CIDetector's featuresInImage: method. This works without any flipping, because both the CIImage and the detected bounds live in Core Image's coordinate space.
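For completeness, a small sketch of the supporting setup (assuming _ciContext is an ivar you create once, since contexts are expensive):
// Create once, e.g. in init or viewDidLoad, and reuse for every crop.
_ciContext = [CIContext contextWithOptions:nil];
And since createCGImage:fromRect: returns an owned reference, call CGImageRelease(cgImage) after wrapping it in the UIImage.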
Since there doesn't seem to be a simple way to do this, I just wrote some code to do it:
CGRect newBounds = CGRectMake(faceFeature.bounds.origin.x,
                              _picture.size.height - faceFeature.bounds.origin.y - faceFeature.bounds.size.height,
                              faceFeature.bounds.size.width,
                              faceFeature.bounds.size.height);
This worked like a charm. The flip is needed because Core Image's origin is at the bottom-left while UIKit's is at the top-left, so the Y origin has to be mirrored about the image height.
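The same conversion can be packaged as a tiny helper (a hypothetical name, not from the original answer):
// Convert a rect between Core Image's bottom-left origin and UIKit's top-left origin.
static CGRect FlipRectVertically(CGRect rect, CGFloat imageHeight) {
    rect.origin.y = imageHeight - rect.origin.y - rect.size.height;
    return rect;
}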
There is no simple way to achieve this. The problem is that images from the iPhone camera are always in portrait mode, and metadata settings are used to get them to display correctly. You will also get better accuracy in your face detection call if you tell it the rotation of the image beforehand. Just to make things complicated, you have to pass it the image orientation in EXIF format.
Fortunately there is an Apple sample project that covers all of this, called SquareCam; I suggest you check it for details.

Multiply image and color

I have an image with a transparent background.
I need to create many images in different colors, and I want to take this one image and multiply it with a color to create each of the others.
Could you please help me with some lines of code? Thanks.
This might help:
UIImage *beginUIImage = [UIImage imageNamed:@"myImage.png"];
CIImage *beginImage = [CIImage imageWithCGImage:beginUIImage.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, beginImage,
                                            @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
CIImage *outputImage = [filter outputImage];
UIImage *endImage = [[UIImage alloc] initWithCIImage:outputImage];
The beginUIImage is the initial transparent image. I then change it into a CIImage to ease the process of applying filters, apply a sepia filter, and output the filtered result into another CIImage called outputImage. Lastly, I change outputImage into a UIImage to be used later, perhaps put into a UIImageView or saved to the photo library. You can change the type of filter to change the output image's colors.
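If the goal is literally to multiply the image by a color (an assumption about what's wanted, since the answer above shows sepia toning instead), one way is to composite a solid color over the image with the CIMultiplyCompositing filter; CIConstantColorGenerator supplies the color layer:
CIImage *base = [CIImage imageWithCGImage:beginUIImage.CGImage];
// Generate an infinite solid-color image, then crop it to the source size.
CIColor *tint = [CIColor colorWithRed:1.0 green:0.4 blue:0.2 alpha:1.0]; // example color
CIFilter *colorGen = [CIFilter filterWithName:@"CIConstantColorGenerator"];
[colorGen setValue:tint forKey:kCIInputColorKey];
CIImage *colorImage = [[colorGen outputImage] imageByCroppingToRect:base.extent];
// Multiply the color with the artwork; transparent pixels stay transparent.
CIFilter *multiply = [CIFilter filterWithName:@"CIMultiplyCompositing"];
[multiply setValue:colorImage forKey:kCIInputImageKey];
[multiply setValue:base forKey:kCIInputBackgroundImageKey];
CIImage *tinted = [multiply outputImage];
Render tinted through a CIContext (as in the earlier answers) once per color to produce the set of images.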
