Image loses quality when scaled - iOS

I know that scaling down an image means accepting some loss of quality, but when I assign an image to a 75×75 UIButton the quality is great.
When I scale the same image to 75×75 in code for copy/paste using UIPasteboard, the quality is really bad.
Background: my app is a keyboard extension, so I have buttons with assigned images. When a button is tapped, I get the image from the button, scale it to the right size, copy it to UIPasteboard, then paste.
Code:
Here is my code for detecting a button click and copying an image:
- (IBAction)clickedImage:(id)sender {
    UIButton *btn = sender;
    UIImage *scaledImage = btn.imageView.image;
    UIImage *newImage = [scaledImage imageWithImage:scaledImage andSize:CGSizeMake(75, 75)];
    NSData *imgData = UIImagePNGRepresentation(newImage);
    UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
    [pasteboard setData:imgData forPasteboardType:[UIPasteboardTypeListImage objectAtIndex:0]];
}
And I have a UIImage category with the imageWithImage:andSize: method for scaling the image. This is the scaling method:
- (UIImage *)imageWithImage:(UIImage *)image andSize:(CGSize)newSize {
    // Create a bitmap context at the source image's scale factor.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
What doesn't make sense is that the image placed in the UIButton is scaled down to exactly the same size as the image I scale in code, yet the quality is far better in the UIButton than in the scaled image I return. Is there something wrong with my scaling code? Does anyone know why there is such a drop in quality between the two images?

A better way to do this is to use ImageIO to resize your images. It takes a little bit longer, but it is far better for scaling images than redrawing into a graphics context.
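As a minimal sketch of that ImageIO approach (assuming `imageData` holds the original PNG or JPEG bytes and 75 is the desired maximum pixel dimension; both names are placeholders, not from the question's code):

```objc
#import <ImageIO/ImageIO.h>
#import <UIKit/UIKit.h>

CGImageSourceRef source =
    CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
NSDictionary *options = @{
    (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
    (id)kCGImageSourceCreateThumbnailWithTransform : @YES, // honor EXIF orientation
    (id)kCGImageSourceThumbnailMaxPixelSize : @75
};
CGImageRef thumb =
    CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
UIImage *resized = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb);
CFRelease(source);
```

Because ImageIO works from the encoded data rather than an already-decoded, possibly already-downscaled UIImage, it tends to preserve more detail at small sizes.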

Did you try this https://github.com/mbcharbonneau/UIImage-Categories ?
There is an interesting method in the Resize category
- (UIImage *)resizedImage:(CGSize)newSize
interpolationQuality:(CGInterpolationQuality)quality;
Setting quality to kCGInterpolationHigh seems to give a good result, though it is a little slower.
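If you don't want to pull in the whole category, the same idea can be sketched directly with Core Graphics: request high-quality interpolation on the current context before drawing (`image` and `newSize` here stand in for whatever image and target size you are using):

```objc
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
// Ask Core Graphics for its best resampling filter before drawing.
CGContextSetInterpolationQuality(UIGraphicsGetCurrentContext(), kCGInterpolationHigh);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```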

Related

UIImage picker changes image size

I have an application that scans text from a picture using OCR. The problem is that my source image is 3024×3024. I copied the image into the iPhone 6 Simulator in Xcode, but when I upload it using UIImagePickerController it is resized to 748×748 and the quality degrades.
When I tried to increase the image size, the quality was still not the same.
What I need is the image at its actual size; its quality/resolution should not change.
Try this code to resize the image:
- (UIImage *)imageWithImage:(UIImage *)img scaledToSize:(CGSize)newSize {
    // Note: UIGraphicsBeginImageContext renders at scale 1.0; use
    // UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0) to render
    // at the screen's scale on Retina displays.
    UIGraphicsBeginImageContext(newSize);
    [img drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImg;
}
And for JPEG compression of the result:
NSData *dataForJPEGFile = UIImageJPEGRepresentation(theImage, 0.6);
Thanks
Did you set the qualityType property of your UIImagePickerController to UIImagePickerControllerQualityTypeHigh? The default is medium.

UIImage Distortion from UIGraphicsBeginImageContext with larger files (pixel formats, codecs?)

I'm cropping UIImages with a UIBezierPath using UIGraphicsContext:
CGSize thumbnailSize = CGSizeMake(54.0f, 45.0f); // dimensions of UIBezierPath
UIGraphicsBeginImageContextWithOptions(thumbnailSize, NO, 0);
[path addClip];
[originalImage drawInRect:CGRectMake(0, originalImage.size.height/-3, thumbnailSize.width, originalImage.size.height)];
UIImage *maskedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But for some reason my images are getting stretched vertically (everything looks slightly long and skinny), and this effect is stronger the bigger my originalImage is. I'm sure the originalImages are perfectly fine before I do these operations (I've checked).
My images are all 9:16 (say 72px wide by 128px tall) if that matters.
I've seen UIGraphics creates a bitmap with an "ARGB 32-bit integer pixel format using host-byte order"; and I'll admit a bit of ignorance when it comes to pixel formats, but felt this MAY be relevant because I'm not sure if that's the same pixel format I use to encode the picture data.
No idea how relevant this is but here is the FULL processing pipeline:
I'm capturing using AVFoundation and I set my photoSettings as
NSDictionary *photoSettings = @{AVVideoCodecKey : AVVideoCodecH264};
I capture using captureStillImageAsynchronouslyFromConnection:, turn the result into NSData with [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer], then downsize it into a thumbnail by creating a CGDataProviderRef with that CFData, converting to a CGImageRef using CGImageSourceCreateThumbnailAtIndex, and getting a UIImage from that.
Later, I once again turn it into NSData using UIImageJPEGRepresentation(thumbnail, 0.7) so I can store it. Finally, when I'm ready to display, I call my own method detailed on top, [self maskImage:[UIImage imageWithData:imageData] toPath:_thumbnailPath], display it in a UIImageView, and set contentMode = UIViewContentModeScaleAspectFit.
If the method I'm using to mask the UIImage with the UIBezierPath is fine, I may end up explicitly setting the photoOutput settings with [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] for (id)kCVPixelBufferPixelFormatTypeKey, and then I can probably use something like "how to convert a CVImageBufferRef to UIImage" and change a lot of my code... but I'd really rather not do that unless completely necessary since, as I've mentioned, I don't know much about video encoding or these low-level graphics objects.
This line:
[originalImage drawInRect:CGRectMake(0, originalImage.size.height/-3, thumbnailSize.width, originalImage.size.height)];
is a problem. You are drawing originalImage, but you take the width from thumbnailSize and the height from originalImage. That combination distorts the image's aspect ratio.
The width and height you draw with need to be based on the same image size. Pick one as needed to maintain the proper aspect ratio.
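One way to sketch the fix, keeping the question's variable names: derive the drawn height from the image's own aspect ratio, then center the oversized draw vertically inside the clipped thumbnail context.

```objc
// Width comes from the thumbnail; height follows the image's aspect ratio,
// so the clipped result is cropped rather than stretched.
CGFloat drawnWidth  = thumbnailSize.width;
CGFloat drawnHeight = drawnWidth *
    (originalImage.size.height / originalImage.size.width);
// Center vertically so the crop takes the middle of the image.
CGFloat yOffset = (thumbnailSize.height - drawnHeight) / 2.0;
[originalImage drawInRect:CGRectMake(0, yOffset, drawnWidth, drawnHeight)];
```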

iOS: How to get a piece of a stretched image?

The generic problem I'm facing is this:
I have a stretchable 50x50 PNG. I'm stretching it to 300x100. I want to get three UIImages of size 100x100 cut from the stretched image, A, B & C in the picture below:
I'm trying to do it like this:
// stretchedImage is the 50x50 UIImage, abcImageView is the 300x100 UIImageView
UIImage *stretchedImage = [abcImageView.image stretchableImageWithLeftCapWidth:25 topCapHeight:25];
CGImageRef image = CGImageCreateWithImageInRect(stretchedImage.CGImage, bButton.frame);
UIImage *result = [UIImage imageWithCGImage:image];
[bButton setBackgroundImage:result forState:UIControlStateSelected];
CGImageRelease(image);
I'm trying to crop the middle 100×100 ("B") using CGImageCreateWithImageInRect, but this isn't right, since stretchedImage is 50×50, not 300×100. How do I get the 300×100 image to crop from? If the original image were 300×100 there would be no problem, but then I would lose the advantage of a stretchable image.
I guess to generalize the problem even more, the question would be as simple as: if you scale or stretch an image to a bigger image view, how do you get the scaled/stretched image?
Background for the specific task I'd like to apply the solution for (if you can come up with an alternative solution):
I'm trying to implement a UI that's similar to the one you see during a call in native iPhone call application: a plate containing buttons for mute, speaker, hold, etc. Some of them are toggle type buttons with a different background color for selected state.
I have two graphics for the whole plate, for non-selected and selected states. I'm stretching both images to the desired size. For the buttons in selected state I want to get a piece of the stretched selected graphic.
You should be able to do this by rendering abcImageView into a UIImage:
UIGraphicsBeginImageContextWithOptions(abcImageView.bounds.size, NO, 0.f);
[abcImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then, you can crop the image like this (given cropRect):
CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:cgImage];
// Do something with the image.
CGImageRelease(cgImage);
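One caveat worth sketching: CGImageCreateWithImageInRect works in pixels, while a frame-derived cropRect is in points. Because the context above was created with scale 0 (the screen scale), the rendered image's scale will be 2.0 or 3.0 on Retina devices, so the rect should be converted first (this is an addition to the answer above, not part of it):

```objc
// Convert the point-based cropRect into the image's pixel space.
CGFloat s = image.scale;
CGRect pixelRect = CGRectMake(cropRect.origin.x * s, cropRect.origin.y * s,
                              cropRect.size.width * s, cropRect.size.height * s);
CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, pixelRect);
// Preserve the scale so the cropped UIImage reports the right point size.
UIImage *croppedImage = [UIImage imageWithCGImage:cgImage
                                            scale:s
                                      orientation:UIImageOrientationUp];
CGImageRelease(cgImage);
```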

Reduce Resolution UIImage/UIImageView [duplicate]

This question already has answers here:
The simplest way to resize an UIImage?
(34 answers)
Closed 9 years ago.
I've created a UIImageView that contains a high-resolution image. The problem is that the image's resolution is far higher than the image view needs: the view is only 92×91, so it's small, and the high-resolution UIImage looks ugly displayed in it.
So how can I reduce the resolution of that UIImage?
My code for the UIImageView:
UIImageView *myImageView = [[UIImageView alloc] initWithImage:[UIImage imageWithContentsOfFile:pngFilePath]];
myImageView.frame = CGRectMake(212.0, 27, 92,91);
Have a look at this answer: https://stackoverflow.com/a/2658801
It will help you resize your image to whatever size you need. Add the method to your code and call it like this:
UIImage *myImage =[UIImage imageWithContentsOfFile:pngFilePath];
UIImage *newImage =[UIImage imageWithImage:myImage scaledToSize:CGSizeMake(92,91)];
You can resize an image using this method that returns a resized image :
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    // Pass the new size you need here.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Hope it helps you.
Try this:
myImageView.contentMode = UIViewContentModeScaleToFill;
You need to downsize the image, which will reduce the resolution and make it easier to store. I actually wrote a class (building on some material from SO) that does just that. It's on my GitHub; take a look:
https://github.com/pavlovonline/UIImageResizer
The main method is
- (UIImage *)resizeImage:(UIImage *)image toSize:(CGFloat)size
You give this method the size to which you want to downsize your image. If the height is greater than the width, it auto-calculates the middle and gives you a perfectly centered square, and likewise when the width is greater than the height. If you need an image that is not square, make your own adjustments.
You'll get back a downsized UIImage that you can put into your UIImageView, saving some memory too.

UIImagePickerController image Scaling and maintain its quality

The camera returns a captured image of size 720×960.
The captured image is displayed in a 320×436 UIImageView, like this:
UIImageView *imgView=[[UIImageView alloc] initWithFrame:CGRectMake(0.0,0.0,320.0,436.0)];
imgView.image=img;//Image received from camera.
[self.view addSubview:imgView];
This works fine: the 720×960 image is scaled to 320×436 and displayed.
Now the actual problem starts. I have another image of size 72×72 that is overlaid on the image received from the camera at some arbitrary coordinates:
CGRectMake(0.0, 0.0, 72.0, 72.0);
I can't find a good way to handle scaling and applying the overlay image at the same time while maintaining quality.
The image needs to be sent to a server.
Use the following code to scale images:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
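Since the question also asks about the overlay, here is one possible sketch of compositing and scaling in a single pass. To preserve quality for the server upload, it draws at the camera image's full resolution rather than the on-screen 320×436 size; `cameraImage`, `overlayImage`, and `overlayOrigin` are assumed names, not from the question's code.

```objc
// Composite the 72x72 overlay onto the full-resolution camera image.
UIGraphicsBeginImageContextWithOptions(cameraImage.size, NO, cameraImage.scale);
[cameraImage drawInRect:CGRectMake(0, 0,
                                   cameraImage.size.width,
                                   cameraImage.size.height)];
// Draw the overlay at its intended coordinates (in the camera image's space).
[overlayImage drawInRect:CGRectMake(overlayOrigin.x, overlayOrigin.y, 72.0, 72.0)];
UIImage *composited = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```

The composited image can then be scaled down for display with the method above, while the full-resolution version is the one sent to the server.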
