In my iPhone app, I've always used the following function to horizontally mirror an image.
-(UIImage*)mirrorImage:(UIImage*)img
{
    CIImage *coreImage = [CIImage imageWithCGImage:img.CGImage];
    coreImage = [coreImage imageByApplyingTransform:CGAffineTransformMakeScale(-1, 1)];
    img = [UIImage imageWithCIImage:coreImage scale:img.scale orientation:UIImageOrientationUp];
    return img;
}
On iOS 10.0.1, though, this function still runs with no errors, but when I try to use the UIImage it returns, the following warning appears and the image just doesn't seem to be there.
Failed to render 921600 pixels because a CIKernel's ROI function did not allow tiling.
This error actually appears in the output window when I attempt to use the UIImage (on the second line of this code):
UIImage* flippedImage = [self mirrorImage:originalImage];
UIImageView* photo = [[UIImageView alloc] initWithImage:flippedImage];
After calling mirrorImage, the flippedImage variable does contain a value (it's not nil), but when I try to use the image I get that error message.
If I don't call the mirrorImage function, the code works fine:
UIImageView* photo = [[UIImageView alloc] initWithImage:originalImage];
Is there some new quirk in iOS 10 that would prevent my mirrorImage function from working?
Just to add, in the mirrorImage function, I tried testing the size of the image before and after the transformation (as the error is complaining about having to tile the image), and the size is identical.
I fixed it by converting CIImage -> CGImage -> UIImage
let ciImage: CIImage = myCIImageFile // placeholder: your existing CIImage
let cgImage: CGImage = {
    let context = CIContext(options: nil)
    return context.createCGImage(ciImage, from: ciImage.extent)!
}()
let uiImage = UIImage(cgImage: cgImage)
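Applied to the Objective-C mirrorImage method above, the same fix could look roughly like this sketch (rendering through a CIContext before wrapping the result; the context is created inline here for brevity, but it's an expensive object that could be created once and reused):
-(UIImage*)mirrorImage:(UIImage*)img
{
    CIImage *coreImage = [CIImage imageWithCGImage:img.CGImage];
    coreImage = [coreImage imageByApplyingTransform:CGAffineTransformMakeScale(-1, 1)];
    // Render the CIImage into a real bitmap instead of wrapping it directly.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:coreImage fromRect:[coreImage extent]];
    UIImage *result = [UIImage imageWithCGImage:cgImage scale:img.scale orientation:UIImageOrientationUp];
    CGImageRelease(cgImage); // createCGImage:fromRect: returns an owned reference
    return result;
}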
Never mind.
I don't know what iOS 10 has broken, but I managed to fix the problem by replacing my function with this:
-(UIImage*)mirrorImage:(UIImage*)img
{
    UIImage* flippedImage = [UIImage imageWithCGImage:img.CGImage
                                                scale:img.scale
                                          orientation:UIImageOrientationUpMirrored];
    return flippedImage;
}
Related
This method returns a QR code image for a string. It works correctly on iOS 12.0.1 (iPhone SE) but crashes on 12.4.2 (iPhone 6). The method crashes when I try to assign the resulting UIImage to a UIImageView; the resulting UIImage is not nil.
-(UIImage*)get_QR_image :(NSString*)qrString :(UIColor*)ForeGroundCol :(UIColor*)BackGroundCol{
    NSData *stringData = [qrString dataUsingEncoding:NSUTF8StringEncoding];
    CIFilter *qrFilter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
    [qrFilter setValue:stringData forKey:@"inputMessage"];
    [qrFilter setValue:@"H" forKey:@"inputCorrectionLevel"];
    CIImage *qrImage = qrFilter.outputImage;
    float scaleX = 320;
    float scaleY = 320;
    CIColor *iForegroundColor = [CIColor colorWithCGColor:[ForeGroundCol CGColor]];
    CIColor *iBackgroundColor = [CIColor colorWithCGColor:[BackGroundCol CGColor]];
    CIFilter *filterColor = [CIFilter filterWithName:@"CIFalseColor" keysAndValues:@"inputImage", qrImage, @"inputColor0", iForegroundColor, @"inputColor1", iBackgroundColor, nil];
    CIImage *filtered_image = [filterColor valueForKey:@"outputImage"];
    filtered_image = [filtered_image imageByApplyingTransform:CGAffineTransformMakeScale(scaleX, scaleY)];
    UIImage *result_image = [UIImage imageWithCIImage:filtered_image
                                                scale:[UIScreen mainScreen].scale
                                          orientation:UIImageOrientationUp];
    return result_image;
}
The line involved in the crash is:
filtered_image = [filtered_image imageByApplyingTransform:CGAffineTransformMakeScale(scaleX, scaleY)];
It generates this log:
warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available.
Is there something in my method that works only on 12.0.1? Or maybe something wrong? How can I investigate further to solve this crash?
EDIT
In red I have:
MyQrCodeImageViewBig.image=qrimage;
with the message:
Thread 1: EXC_BREAKPOINT (code=1, subcode=0x1a83e146c)
I see a lot of problems resulting from the [UIImage imageWithCIImage:] initializer. The main problem is that a CIImage does not actually contain any bitmap data; it needs to be rendered by a CIContext first. So the target you assign the UIImage to needs to know that it is backed by a CIImage that still needs rendering. Usually UIImageView handles this well, but I wouldn't trust it too much.
What you can do instead is to render the image yourself into a bitmap (CGImage) and initialize the UIImage with that instead. You need a CIContext for that, which I recommend you create somewhere outside this method once and re-use it every time you need to render an image (it's an expensive object):
self.context = [CIContext context];
Then in your method, you render the image like this:
CGImageRef cgImage = [self.context createCGImage:filtered_image fromRect:[filtered_image extent]];
UIImage* result_image = [UIImage imageWithCGImage:cgImage];
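Putting it together, the tail of your method could then look roughly like this sketch (keeping your existing scale and orientation, and releasing the CGImageRef, which createCGImage:fromRect: hands back as an owned reference):
CGImageRef cgImage = [self.context createCGImage:filtered_image fromRect:[filtered_image extent]];
UIImage *result_image = [UIImage imageWithCGImage:cgImage
                                            scale:[UIScreen mainScreen].scale
                                      orientation:UIImageOrientationUp];
CGImageRelease(cgImage); // we own this reference, so release it once it's wrapped
return result_image;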
In an app I am working on I have some UIImageViews that may (or may not) need to be customised. How can I change the rendering mode of an image that was loaded as original to template at runtime?
You can init a new UIImage from the existing image's CGImage and then apply whatever rendering mode you need.
Example code (Swift)
let originalImage = UIImage(named: "TimeClock2Filled")?.withRenderingMode(.alwaysOriginal)
if let original = originalImage?.cgImage {
    let image2 = UIImage(cgImage: original).withRenderingMode(.alwaysTemplate)
}
Example code (Objective-C)
UIImage *image = [[UIImage imageNamed:@"TimeClock2Filled"] imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
if (image.CGImage != nil) {
    UIImage *image2 = [[UIImage imageWithCGImage:image.CGImage] imageWithRenderingMode:UIImageRenderingModeAlwaysTemplate];
}
This works just fine; it was tested.
I have been developing an image filtering app with the help of online tutorials on bitFountain. The user should be able to select a photo they have added to an album and then either add a filter to it or delete it.
My delete-image functionality runs fine, but adding a filter is not working.
I have logged three of the filter instances to the console as they are returned by the method, but each comes back with inputImage=nil.
2015-10-19 10:41:53.634 FilterApp[78451:28732768] <CISepiaTone: inputImage=nil inputIntensity=1>
2015-10-19 10:41:53.634 FilterApp[78451:28732768] <CIGaussianBlur: inputImage=nil inputRadius=1>
2015-10-19 10:41:53.635 FilterApp[78451:28732768] <CIColorClamp: inputImage=nil inputMinComponents=[0.2 0.2 0.2 0.2] inputMaxComponents=[0.9 0.9 0.9 0.9]>
What exactly does inputImage=nil mean?
I'm not sure where the code could be going wrong, or if this problem is even external to the code.
(Using Xcode 7 and the iPhone 4s simulator.)
Edit: This was the code used to convert to UIImage.
- (UIImage *)filteredImageFromImage:(UIImage *)image andFilter:(CIFilter *)filter
{
    CIImage *unfilteredImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    [filter setValue:unfilteredImage forKey:kCIInputImageKey];
    CIImage *filteredImage = [filter outputImage];
    CGRect extent = [filteredImage extent];
    CGImageRef cgImage = [self.context createCGImage:filteredImage fromRect:extent];
    UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
    return finalImage;
}
This problem occurs when you use a CIImage directly: when you try to display the image, it contains a nil value. So convert your CIImage to a UIImage.
e.g.
// Here `generator` is assumed to be an AVAssetImageGenerator; the point is to obtain
// a real CGImage and wrap it in a UIImage rather than displaying a CIImage-backed UIImage.
CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
UIImage *imageDisplay = [UIImage imageWithCGImage:image];
It may be helpful to you.
I am trying to capture a screen portion to post an image on social media.
I am using the following code to capture the screen.
- (UIImage *) imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
The above code works perfectly for capturing the screen.
Problem:
My UIView contains a GPUImageView with the filtered image. When I try to capture the screen using the above code, the GPUImageView portion does not contain the filtered image.
I am using GPUImageSwirlFilter with a static image (no camera). I have also tried
UIImage *outImage = [swirlFilter imageFromCurrentFramebuffer];
but it's not returning an image.
Note: The following is working code, which gives perfect output for the swirl effect, but I want the same image in a UIImage object.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    GPUImageSwirlFilter *swirlFilter = [[GPUImageSwirlFilter alloc] init];
    swirlLevel = 4;
    [swirlFilter setAngle:(float)swirlLevel/10];
    UIImage *inputImage = [UIImage imageNamed:gi.wordImage];
    GPUImagePicture *swirlSourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage];
    inputImage = nil;
    [swirlSourcePicture addTarget:swirlFilter];
    dispatch_async(dispatch_get_main_queue(), ^{
        [swirlFilter addTarget:imgSwirl];
        [swirlSourcePicture processImage];
        // This works perfectly and I have the filtered image in my imgSwirl. But I want the
        // filtered image in a UIImage to use elsewhere, such as posting on social media.
        sharingImage = [swirlFilter imageFromCurrentFramebuffer]; // This also returns nothing.
    });
});
1) Am I doing something wrong with GPUImage's imageFromCurrentFramebuffer?
2) Why is the screen-capture code not including the GPUImageView portion in the output image?
3) How do I get the filtered image into a UIImage?
First, -renderInContext: won't work with a GPUImageView, because a GPUImageView renders using OpenGL ES. -renderInContext: does not capture from CAEAGLLayers, which are used to back views presenting OpenGL ES content.
Second, you're probably getting a nil image in the latter code because you've forgotten to set -useNextFrameForImageCapture on your filter before triggering -processImage. Without that, your filter won't hang on to its backing framebuffer long enough to capture an image from it. This is due to a recent change in the way that framebuffers are handled in memory (although this change did not seem to get communicated very well).
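Applied to the code in the question, that means calling it on the filter just before -processImage, roughly like this sketch (using the question's variable names):
[swirlFilter addTarget:imgSwirl];
[swirlFilter useNextFrameForImageCapture]; // keep the framebuffer around so it can be read back
[swirlSourcePicture processImage];
sharingImage = [swirlFilter imageFromCurrentFramebuffer];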
I'm trying to save a cropped image to the camera roll.
(I need to do it programmatically; I can't have the user edit it.)
This is my (still quite basic) cut and save code:
- (void)cutAndSaveImage:(UIImage*)rawImage
{
    CIImage *workingImage = [[CIImage alloc] initWithImage:rawImage];
    CGRect croppingRect = CGRectMake(0.0f, 0.0f, 3264.0f, 1224.0f);
    CIImage *croppedImage = [workingImage imageByCroppingToRect:croppingRect];
    UIImage *endImage = [UIImage imageWithCIImage:croppedImage scale:1.0f orientation:UIImageOrientationRight];
    self.testImage.image = endImage;
    UIImageWriteToSavedPhotosAlbum(rawImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    UIImageWriteToSavedPhotosAlbum(endImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
The method is called within:
- (void)imagePickerController:(UIImagePickerController*)picker didFinishPickingMediaWithInfo:(NSDictionary*)info
I first create a CIImage using the raw UIImage.
Then I get a cropped CIImage using an instance method of the first one.
After that I create a new UIImage using the cropped CIImage.
At this point, to have some feedback, I set the new cropped UIImage as the backing image of a UIImageView. This works, and I can clearly see the image cropped exactly how I desired.
When I try to save it to the camera roll, however, things stop working.
I can't save the newly created endImage.
As you can see, I added a line to save the original UIImage too, just for comparison. The original one saves normally.
Another confusing thing is that the NSError object passed to the image:didFinishSavingWithError:contextInfo: callback is nil. (The callback is executed normally for both saving attempts.)
EDIT:
Just made an experiment:
NSLog(#"rawImage: %# - rawImage.CGImage: %#", rawImage, rawImage.CGImage);
NSLog(#"endImage: %# - endImage.CGImage: %#", endImage, endImage.CGImage);
It looks like only the rawImage (coming from the UIImagePickerController) possesses a backing CGImageRef. The other one, created from a CIImage, doesn't.
Can it be that UIImageWriteToSavedPhotosAlbum works using the backing CGImageRef?
Can it be that UIImageWriteToSavedPhotosAlbum works using the backing CGImageRef?
Correct. A CIImage is not an image, and a UIImage backed only by a CIImage is not an image either; it is just a kind of wrapper. Why are you using CIImage at all here? You aren't using CIFilter so this makes no sense. Or if you are using CIFilter, you must render through a CIContext to get the output as a bitmap.
You can crop easily by drawing into a smaller graphics context.
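A minimal sketch of that idea, using the rawImage and croppingRect from the question's code (names assumed from above; the resulting image is CGImage-backed, so it saves normally with UIImageWriteToSavedPhotosAlbum):
UIGraphicsBeginImageContextWithOptions(croppingRect.size, NO, rawImage.scale);
// Draw the full image offset so only the desired rect lands inside the context.
[rawImage drawAtPoint:CGPointMake(-croppingRect.origin.x, -croppingRect.origin.y)];
UIImage *croppedUIImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();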
If the UIImage object was initialized using a CIImage object, the value of the property is NULL.
You can generate a UIImage from a CIImage like this:
let lecturePicture = UIImage(data: NSData(contentsOfURL: NSURL(string:"http://i.stack.imgur.com/Xs4RX.jpg")!)!)!
let controlsFilter = CIFilter(name: "CIColorControls")
controlsFilter.setValue(CIImage(image: lecturePicture), forKey: kCIInputImageKey)
controlsFilter.setValue(1.5, forKey: kCIInputContrastKey)
let displayImage = UIImage(CGImage: CIContext(options:nil).createCGImage(controlsFilter.outputImage, fromRect:controlsFilter.outputImage.extent()))!
displayImage