UIGraphicsBeginImageContextWithOptions analog in macOS

Here is the code I'm using to scale images in iOS, e.g. scaling a 500x500 image down to 100x100 and then storing the scaled copy:
+ (UIImage *)image:(UIImage *)originalImage scaledToSize:(CGSize)desiredSize {
    UIGraphicsBeginImageContextWithOptions(desiredSize, YES, [UIScreen mainScreen].scale);
    [originalImage drawInRect:CGRectMake(0, 0, desiredSize.width, desiredSize.height)];
    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext(); // note the parentheses; without them the function is never called
    return finalImage;
}
Now I need to implement the same functionality in my macOS app. How can I do that? I saw a similar question, but I still can't understand the logic of doing this in macOS.

After some searching I found a question like mine, but in Swift. So I translated it to Objective-C, and here it is:
+ (NSImage *)image:(NSImage *)originalImage scaledToSize:(NSSize)desiredSize {
    NSImage *newImage = [[NSImage alloc] initWithSize:desiredSize];
    [newImage lockFocus];
    [originalImage drawInRect:NSMakeRect(0, 0, desiredSize.width, desiredSize.height)
                     fromRect:NSMakeRect(0, 0, originalImage.size.width, originalImage.size.height)
                    operation:NSCompositingOperationSourceOver
                     fraction:1.0];
    [newImage unlockFocus];
    newImage.size = desiredSize;
    return [[NSImage alloc] initWithData:[newImage TIFFRepresentation]];
}
But there's still an issue: if desiredSize = NSMakeSize(50, 50), it returns an image that is 50 by 50 pixels. It should take the screen scale into account somehow, I guess.
Here is the Swift code that I translated:
Example 1
Example 2
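For reference, one way to make the macOS version honor the screen scale is to render into an NSBitmapImageRep whose pixel dimensions are the point size multiplied by an explicit scale factor. This is a minimal sketch, not the original answer; the scale parameter is my addition (pass the window's backingScaleFactor, or [NSScreen mainScreen].backingScaleFactor):
+ (NSImage *)image:(NSImage *)originalImage scaledToSize:(NSSize)desiredSize scale:(CGFloat)scale {
    // Back the image with a bitmap whose pixel size carries the scale factor.
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:NULL
                      pixelsWide:(NSInteger)(desiredSize.width * scale)
                      pixelsHigh:(NSInteger)(desiredSize.height * scale)
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                     bytesPerRow:0
                    bitsPerPixel:0];
    rep.size = desiredSize; // point size stays at desiredSize; the pixels carry the scale

    [NSGraphicsContext saveGraphicsState];
    [NSGraphicsContext setCurrentContext:[NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
    [originalImage drawInRect:NSMakeRect(0, 0, desiredSize.width, desiredSize.height)
                     fromRect:NSZeroRect
                    operation:NSCompositingOperationCopy
                     fraction:1.0];
    [NSGraphicsContext restoreGraphicsState];

    NSImage *scaledImage = [[NSImage alloc] initWithSize:desiredSize];
    [scaledImage addRepresentation:rep];
    return scaledImage;
}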

Related

unable to resize image properly

I am having a problem with UIImage resizing. Image masking is working fine, but after applying the mask the UIImage is stretched; the problem is with scaling, as the image is not scaled properly.
CCClippingNode *clippingNode = [[CCClippingNode alloc] initWithStencil:pMaskingFrame];
pTobeMasked.scaleX = (float)pMaskingFrame.contentSize.width / (float)pTobeMasked.contentSize.width;
pTobeMasked.scaleY = (float)pMaskingFrame.contentSize.height / (float)pTobeMasked.contentSize.height;
clippingNode.alphaThreshold = 0;
[pContainerNode addChild:clippingNode];
pTobeMasked.position = ccp(pMaskingFrame.position.x, pMaskingFrame.position.y);
[clippingNode addChild:pTobeMasked];
In one of my projects I have used the function below to resize an image:
/*
 Method parameters:
   image : original image to be resized
   size  : new size
*/
+ (UIImage *)resizeImage:(UIImage *)image size:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    // Here is the scaled image, drawn at the size specified.
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
This will work like a charm. It's similar to the already posted answer, but it has some more options:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    //UIGraphicsBeginImageContext(newSize);
    // In the next line, pass 0.0 to use the current device's pixel scaling factor
    // (and thus account for Retina resolution). Pass 1.0 to force exact pixel size.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
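As a hypothetical call site (Resizer stands in for whatever class declares the method above): on a 2x Retina device, passing 0.0 yields a 100x100-point image backed by 200x200 pixels, whereas 1.0 would yield exactly 100x100 pixels.
UIImage *thumb = [Resizer imageWithImage:original scaledToSize:CGSizeMake(100, 100)];
NSLog(@"points: %@, scale: %.1f", NSStringFromCGSize(thumb.size), thumb.scale); // e.g. {100, 100}, 2.0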

Objective-c UIImage bad image quality

I am not the best graphic designer, but I need to set up a few icons. I made them in GIMP with options like Border, etc. All were originally designed at 1000x1000 px and are used from Objective-C code in a UIImageView.
I am wondering why the resized icons look so terrible. Any suggestions?
In app:
http://s14.postimg.org/k2g9uayld/Screen_Shot_2014_12_18_at_11_22_53.png
http://s14.postimg.org/biwvwjq8x/Screen_Shot_2014_12_18_at_11_23_02.png
Original image:
Can't post more than 2 links, so: s17.postimg.org/qgi4p80an/fav.png
I don't think it matters, but:
UIImage *image = [UIImage imageNamed:@"fav.png"];
image = [UIImage imageWithCGImage:[image CGImage] scale:25 orientation:UIImageOrientationUp];
self.navigationItem.titleView = [[UIImageView alloc] initWithImage:image];
One of the images was set up in a storyboard, though, and the effect is the same.
[self.favO.layer setMinificationFilter:kCAFilterTrilinear];
[self.favO setImage:[self resizeImage:[UIImage imageNamed:@"fav.png"] newSize:CGSizeMake(35, 35)]
           forState:UIControlStateNormal];
- (UIImage *)resizeImage:(UIImage *)image newSize:(CGSize)newSize {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = image.CGImage;

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    // CGContextDrawImage uses a bottom-left origin, so flip the context vertically.
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);
    CGContextDrawImage(context, newRect, imageRef);
    CGImageRef newImageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    CGImageRelease(newImageRef);
    UIGraphicsEndImageContext();
    return newImage;
}
This fixed the problem; it occurs when UIImageView takes care of the resizing itself.
To avoid this kind of behavior, I suggest you use SVGs instead of PNG files. With the SVGKit library you can then resize them to your heart's content. Since SVG is a vector format, there won't be any loss when scaling.
https://github.com/SVGKit/SVGKit
EDIT:
To add to this solution: your PNG file at 1000x1000 px must be really heavy, whereas an SVG file is just a text file, which makes it very lightweight.
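For illustration, a minimal sketch of the SVGKit route. The fav.svg resource name is an assumption, and you should check the SVGKit headers for the exact API; this reflects common usage of the library, not a verified snippet:
#import <SVGKit/SVGKit.h>

// Hypothetical usage: load a bundled SVG and rasterize it at the size you need.
SVGKImage *svg = [SVGKImage imageNamed:@"fav.svg"]; // "fav.svg" is an assumed resource
svg.size = CGSizeMake(35, 35);   // rescale the vector; no raster loss
UIImage *icon = svg.UIImage;     // rasterize at the requested size
self.navigationItem.titleView = [[UIImageView alloc] initWithImage:icon];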

GPUImage: strange image deformation (but only with some photo) [duplicate]

I'm trying to prepare an image for OCR using GPUImage. The code works fine until I crop the image; after cropping I get a bad result...
Crop area:
https://www.dropbox.com/s/e3mlp25sl6m55yk/IMG_0709.PNG
Bad Result=(
https://www.dropbox.com/s/wtxw7li6paltx21/IMG_0710.PNG
+ (UIImage *)doBinarize:(UIImage *)sourceImage
{
    // First, grayscale the image using the drawing routine below.
    UIImage *grayScaledImg = [self grayImage:sourceImage];
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:grayScaledImg];
    GPUImageAdaptiveThresholdFilter *stillImageFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];
    stillImageFilter.blurRadiusInPixels = 8.0;
    [stillImageFilter prepareForImageCapture];
    [imageSource addTarget:stillImageFilter];
    [imageSource processImage];
    UIImage *retImage = [stillImageFilter imageFromCurrentlyProcessedOutput];
    [imageSource removeAllTargets];
    return retImage;
}
+ (UIImage *)grayImage:(UIImage *)inputImage
{
    // Create a graphics context.
    UIGraphicsBeginImageContextWithOptions(inputImage.size, NO, 1.0);
    CGRect imageRect = CGRectMake(0, 0, inputImage.size.width, inputImage.size.height);
    // Draw the image with the luminosity blend mode.
    // On top of a white background, this will give a black and white image.
    [inputImage drawInRect:imageRect blendMode:kCGBlendModeLuminosity alpha:1.0];
    // Get the resulting image.
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}
UPDATE:
"In the meantime, when you crop your images, do so to the nearest multiple of 8 pixels in width and you should see the correct result."
Thank you @Brad Larson! I resized the image width to the nearest multiple of 8 and got what I wanted:
- (UIImage *)imageWithMultiple8ImageWidth:(UIImage *)image
{
    float fixSize = next8(image.size.width);
    CGSize newSize = CGSizeMake(fixSize, image.size.height);

    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

// Round n up to the next multiple of 8, e.g. next8(50) returns 56.
float next8(float n) {
    int bits = (int)n & 7; // remainder past the previous multiple of 8
    if (bits == 0)
        return (float)n;
    return (float)n + (8 - bits);
}
Before I even get to the core issue here, I should point out that the GPUImageAdaptiveThresholdFilter already does a conversion to grayscale as a first step, so your -grayImage: code in the above is unnecessary and will only slow things down. You can remove all that code and just pass your input image directly to the adaptive threshold filter.
What I believe is the problem here is a recent set of changes to the way that GPUImagePicture pulls in image data. It appears that images which aren't a multiple of 8 pixels wide end up looking like the above when imported. Some fixes were proposed about this, but if the latest code from the repository (not CocoaPods, which is often out of date relative to the GitHub repository) is still doing this, some more work may need to be done.
In the meantime, when you crop your images, do so to the nearest multiple of 8 pixels in width and you should see the correct result.
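Putting those two points together, the doBinarize: method above can likely be trimmed to the sketch below. The GPUImage calls are the same ones already used in the question; only the manual grayscale pass is dropped:
+ (UIImage *)doBinarize:(UIImage *)sourceImage
{
    // The adaptive threshold filter grayscales internally, so feed it the source directly.
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:sourceImage];
    GPUImageAdaptiveThresholdFilter *stillImageFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];
    stillImageFilter.blurRadiusInPixels = 8.0;
    [stillImageFilter prepareForImageCapture];
    [imageSource addTarget:stillImageFilter];
    [imageSource processImage];
    UIImage *retImage = [stillImageFilter imageFromCurrentlyProcessedOutput];
    [imageSource removeAllTargets];
    return retImage;
}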


How to resize an image in iOS? [duplicate]

Hi, I want to send an image to a web service, but I want to resize whatever image the user selects to 75 x 75 before sending it. How can I resize the image to 75 x 75?
Thanks
Try implementing the code below; it may be helpful:
CGRect rect = CGRectMake(0, 0, 75, 75);
UIGraphicsBeginImageContext(rect.size);
[yourCurrentOriginalImage drawInRect:rect];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Use this method to resize your image:
+ (UIImage *)imageResize:(UIImage *)img andResizeTo:(CGSize)newSize
{
    CGFloat scale = [[UIScreen mainScreen] scale];
    /* Swap the comments on the two lines below if you don't want the image
       scaled for Retina devices. */
    //UIGraphicsBeginImageContext(newSize);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, scale);
    [img drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
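For completeness, a hypothetical call site for the web-service case; Utils and selectedImage are illustrative names, not from the original question:
UIImage *thumbnail = [Utils imageResize:selectedImage andResizeTo:CGSizeMake(75, 75)];
NSData *payload = UIImageJPEGRepresentation(thumbnail, 0.8); // JPEG data ready to upload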
Try this code:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
