Manage quality and size of a UIImage with alpha channel - iOS

I have a UIView with UIImageViews and UILabels, which I have to capture into an image and then export to the photo gallery. The image has a fixed size in pixels and must have an alpha channel, because the UIView's background color is clear.
Right now I use UIGraphicsBeginImageContextWithOptions with renderInContext: or drawViewHierarchyInRect:, then I resize the image to the given size and save it with UIImagePNGRepresentation. It works - I get a UIImage of the exact pixel size I need, with an alpha channel, saved to the gallery.
// Capture the view hierarchy at the view's native size.
UIGraphicsBeginImageContextWithOptions(_templateView.bounds.size, NO, 0.0);
[_templateView drawViewHierarchyInRect:_templateView.bounds afterScreenUpdates:NO];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Resize the capture to the required 1080x1080 pixels.
UIGraphicsBeginImageContext(CGSizeMake(1080.0f, 1080.0f));
[img drawInRect:CGRectMake(0, 0, 1080.0f, 1080.0f)];
img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *pngImageData = UIImagePNGRepresentation(img);
The problem is the size of the resulting file. It is way larger than expected: when I add a single UIImageView (filling the parent UIView) with a 1.2 MB image, its capture results in 1.65 MB. This is crucial because I have a size limit for the image. How can I reduce its size? Is it possible to reduce the quality of an image that has an alpha channel?
I tried resizing it to 50% and then back to 100%, but that resulted in an even larger file.
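A note on the closing questions: UIImagePNGRepresentation takes no quality parameter (PNG is lossless), and while UIImageJPEGRepresentation does take one, it discards the alpha channel, so there is no direct "lower-quality PNG" option. One thing that may shrink the file is rendering in a single pass: each extra resample produces noisier pixels that PNG compresses less well, which would also explain why the 50%/100% round trip grew the file. A sketch of that idea, using the same _templateView as above (not a guaranteed fix):
// Render the view hierarchy straight into the 1080x1080 target,
// keeping the alpha channel and resampling only once.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(1080.0f, 1080.0f), NO, 1.0f);
[_templateView drawViewHierarchyInRect:CGRectMake(0, 0, 1080.0f, 1080.0f)
                    afterScreenUpdates:NO];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *pngImageData = UIImagePNGRepresentation(img);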

Related

iOS - UIImage Resize to Bigger get Blurry

I'm trying to resize a UIImage. (Before and after screenshots accompanied the original question.) Here's the code:
UIGraphicsBeginImageContext(CGSizeMake(155, 139));
[currentImageView.image drawInRect:CGRectMake(0, 0, 155, 139)];
currentImageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSLog(@"get size : %@", NSStringFromCGSize(currentImageView.image.size));
NSLog: get size : {155, 139}
Finally, I try to restore the original size with this code, but the result is blurry.
UIGraphicsBeginImageContext(OriginalSize);
[currentImageView.image drawInRect:CGRectMake(0, 0, OriginalSize.width, OriginalSize.height)];
currentImageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSLog(@"get size : %@", NSStringFromCGSize(currentImageView.image.size));
NSLog: get size : {1024, 352}
What happened? Any suggestions or advice?
You are making two mistakes:
1. Reducing the image and then setting it on a larger image view. This can be fixed with:
imageView.contentMode = UIViewContentModeScaleAspectFit;
2. Resizing the original image to a smaller size, then resizing the smaller image back to the original size. This stretches and blurs the image.
For the first mistake: use this code so that image quality isn't lost. Instead of
UIGraphicsBeginImageContext
use
UIGraphicsBeginImageContextWithOptions(currentImageView.frame.size, NO, 0);
Since the size you want is smaller and the image view is larger, you also need to set
yourImageView.contentMode = UIViewContentModeScaleAspectFit;
For the second mistake: always resize from the original image, never from an already-downscaled copy.
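Putting both fixes together, a minimal sketch (assuming originalImage holds the full-resolution source) might look like this:
// Downscale from the original in a retina-aware context
// (a scale of 0 means "use the device screen's scale").
UIGraphicsBeginImageContextWithOptions(CGSizeMake(155, 139), NO, 0);
[originalImage drawInRect:CGRectMake(0, 0, 155, 139)];
UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

currentImageView.contentMode = UIViewContentModeScaleAspectFit;
currentImageView.image = smallImage;

// To restore the large version later, draw from originalImage again;
// upscaling smallImage would stay blurry.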

iOS: How to get a piece of a stretched image?

The generic problem I'm facing is this:
I have a stretchable 50x50 PNG that I'm stretching to 300x100. I want to get three 100x100 UIImages cut from the stretched image: A, B, and C in the picture that accompanied the original question.
I'm trying to do it like this:
// stretchedImage is the 50x50 UIImage, abcImageView is the 300x100 UIImageView
UIImage *stretchedImage = [abcImageView.image stretchableImageWithLeftCapWidth:25 topCapHeight:25];
CGImageRef image = CGImageCreateWithImageInRect(stretchedImage.CGImage, bButton.frame);
UIImage *result = [UIImage imageWithCGImage:image];
[bButton setBackgroundImage:result forState:UIControlStateSelected];
CGImageRelease(image);
I'm trying to crop the middle 100x100 ("B") using CGImageCreateWithImageInRect, but this is not right, since stretchedImage is 50x50, not 300x100. How do I get the 300x100 image to crop from? If the original image were 300x100 there would be no problem, but then I would lose the advantage of a stretchable image.
To generalize the problem even more, the question is simply: if you scale or stretch an image into a bigger image view, how do you get the scaled/stretched image out?
Background for the specific task I'd like to apply the solution to (in case you can come up with an alternative):
I'm trying to implement a UI similar to the one in the native iPhone call screen: a plate containing buttons for mute, speaker, hold, etc. Some of them are toggle-type buttons with a different background color for the selected state.
I have two graphics for the whole plate, one for the non-selected and one for the selected state. I'm stretching both images to the desired size. For the buttons in the selected state I want to extract the corresponding piece of the stretched selected graphic.
You should be able to do this by rendering abcImageView into a UIImage:
UIGraphicsBeginImageContextWithOptions(abcImageView.bounds.size, NO, 0.f);
[abcImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then, you can crop the image like this (given cropRect):
CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:cgImage];
// Do something with the image.
CGImageRelease(cgImage);
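One caveat worth adding (my note, not part of the original answer): the context above was created with a scale of 0, so on retina screens image.CGImage has twice the pixel dimensions of the view's points, and CGImageCreateWithImageInRect works in pixels. If cropRect is expressed in points, convert it first:
// Convert a point-based rect to the CGImage's pixel coordinates.
CGRect pixelRect = CGRectMake(cropRect.origin.x * image.scale,
                              cropRect.origin.y * image.scale,
                              cropRect.size.width * image.scale,
                              cropRect.size.height * image.scale);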

UIImagePickerController image Scaling and maintain its quality

The camera returns a captured image of size 720x960.
The captured image is displayed in a UIImageView of 320x436, like this:
UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 320.0, 436.0)];
imgView.image = img; // Image received from the camera.
[self.view addSubview:imgView];
This works fine: the 720x960 image is scaled down to 320x436 and displayed.
Now the actual problem starts. I have another image, of size 72x72, that is overlaid on the image received from the camera at some arbitrary coordinates, for example:
CGRectMake(0.0, 0.0, 72.0, 72.0);
I can't find a good way to handle scaling and applying the overlay image at the same time while maintaining quality.
The image then needs to be sent to a server.
Use the following code to scale images:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
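The answer above covers only the scaling half of the question. For the overlay, a minimal sketch (my addition; cameraImage, overlayImage, and overlayRect are assumed names) could draw both images into one retina-aware context so scaling and compositing happen in a single pass:
- (UIImage *)imageWithImage:(UIImage *)cameraImage
                    overlay:(UIImage *)overlayImage
                     atRect:(CGRect)overlayRect
               scaledToSize:(CGSize)newSize
{
    // A scale of 0 uses the device screen's scale, preserving retina quality.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    [cameraImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    [overlayImage drawInRect:overlayRect]; // e.g. CGRectMake(0, 0, 72, 72)
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}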

Crop UIImage according to Image Resolution

I have one UIImageView. Its content mode is set to AspectFit.
[imageView setContentMode:UIViewContentModeScaleAspectFit];
I need to crop a sub-image from this image. This is the code that crops it:
CGImageRef imageRef = CGImageCreateWithImageInRect([imageView.image CGImage], customRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
where customRect is the rectangle from which I need to crop the image.
This is how I calculate it:
CGRect customRect = CGRectMake(cropView.frame.origin.x / xFactor,
                               cropView.frame.origin.y / yFactor,
                               cropView.frame.size.width / xFactor,
                               cropView.frame.size.height / yFactor);
The problem comes in cropping: CGImageCreateWithImageInRect crops the given area according to the actual image size, which in some cases is larger than the image view's size. I tried other approaches such as UIGraphicsGetImageFromCurrentImageContext, but those degrade the image quality.
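No answer is quoted for this one, but a sketch of a common approach (my addition) is to compute xFactor and yFactor from the aspect-fit ratio and crop the full-resolution CGImage directly, which copies pixels without resampling. This assumes the crop view's coordinates are relative to the displayed image (any letterbox offset already subtracted):
// Displayed-points-per-image-point factor for an aspect-fit image view;
// the same factor applies to both axes because the aspect ratio is kept.
UIImage *image = imageView.image;
CGFloat xFactor = MIN(imageView.bounds.size.width / image.size.width,
                      imageView.bounds.size.height / image.size.height);
CGFloat yFactor = xFactor;

CGRect customRect = CGRectMake(cropView.frame.origin.x / xFactor,
                               cropView.frame.origin.y / yFactor,
                               cropView.frame.size.width / xFactor,
                               cropView.frame.size.height / yFactor);

// Cropping the CGImage copies pixels without redrawing, so no quality is lost.
// (If image.scale != 1.0, multiply customRect by image.scale first.)
CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, customRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef
                                       scale:image.scale
                                 orientation:image.imageOrientation];
CGImageRelease(imageRef);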

iOS : Reduce image size without reducing image quality

I am displaying an image in a table view cell (the image name is saved in a plist). Before setting it on the cell, I resize the image to
imageSize = CGSizeMake(32, 32);
But after resizing, the image quality is degraded on retina displays.
I have both versions of the image added to the project (i.e. @1x and @2x).
This is how I reduce the image to 32x32:
+ (UIImage *)scale:(UIImage *)image toSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
Any pointers on this are very much appreciated.
Thanks
Try this: instead of UIGraphicsBeginImageContext(size); use UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
From what I understand, your code resizes the image to 32x32 points regardless of resolution. UIGraphicsBeginImageContextWithOptions scales the context to the device screen's scale, so the image is still 32x32 points but the pixel resolution is kept for retina displays.
(Note: this is my reading of Apple's UIKit reference; it may not be exact, but it should be close.)
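Applied to the helper above, the suggested one-line change would look like this:
+ (UIImage *)scale:(UIImage *)image toSize:(CGSize)size
{
    // A scale of 0.0 means "use the main screen's scale", so the result
    // is 32x32 points but 64x64 pixels on a retina device.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}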
