CGRect image resizing - iOS

I'm using CGRect to display an image. I'd like the CGRect to use the width and height of the image without me specifying it.
Can this:
CGRectMake(0.0f, 40.0f, 480.0f, 280.0f);
become this:
CGRectMake(0.0f, 40.0f, myImage.width, myImage.height);
Some images get distorted when I specify the dimensions manually.
Here's the code:
CGRect myImageRect = CGRectMake(0.0f, 40.0f, 480.0f, 280.0f);
UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
[myImage setImage:[UIImage imageNamed:recipe.img]];
Thanks for any help.

Once you have a UIImage, you can find its size by looking at the size property:
UIImage * image = [UIImage imageNamed:recipe.img];
CGRect rect = CGRectMake(0.0f, 40.0f, image.size.width, image.size.height);
UIImageView * imageView = [[UIImageView alloc] initWithFrame:rect];
[imageView setImage:image];

This category on UIImage might be helpful.
Use it like this: aImage = [aImage imageByScalingProportionallyToSize:myImageRect.size];
@implementation UIImage (Extras)

- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize {

    UIImage *sourceImage = self;
    UIImage *newImage = nil;

    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;

    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;

    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;

    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);

    if (CGSizeEqualToSize(imageSize, targetSize) == NO) {

        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;

        if (widthFactor < heightFactor)
            scaleFactor = widthFactor;
        else
            scaleFactor = heightFactor;

        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;

        // center the image
        // if (widthFactor < heightFactor) {
        //     thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        // } else if (widthFactor > heightFactor) {
        //     thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        // }
        // thumbnailPoint.x
    }

    // this is actually the interesting part:
    UIGraphicsBeginImageContext(targetSize);

    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;

    [sourceImage drawInRect:thumbnailRect];

    newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    if (newImage == nil) NSLog(@"could not scale image");

    return newImage;
}

@end
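For example, combining it with the answer above (a sketch of my own; recipe.img and myImageRect come from the question, and the image view name is illustrative):

UIImage *image = [UIImage imageNamed:recipe.img];
// scale proportionally so the result fits inside myImageRect without distortion
image = [image imageByScalingProportionallyToSize:myImageRect.size];
UIImageView *myImageView = [[UIImageView alloc] initWithFrame:myImageRect];
[myImageView setImage:image];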

Related

iOS Image Resize

I want to compress a (900 * 900) picture to (600 * 600) on an iPhone 6 Plus.
If I use:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(600, 600), NO, 0)
the generated UIImage becomes (1800 * 1800).
If I use:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(200, 200), NO, 0)
or
UIGraphicsBeginImageContextWithOptions(CGSizeMake(600, 600), NO, 1)
the resulting image will be blurred, which is obviously wrong.
Is there a good way to solve this problem?
This is the code:
- (UIImage*)imageByScalingForSize:(CGSize)targetSize withSourceImage:(UIImage *)sourceImage
{
UIImage *newImage = nil;
//Omit scaledWidth/scaledHeight calculation
CGFloat scaledWidth = targetSize.width;
CGFloat scaledHeight = targetSize.height;
//UIGraphicsBeginImageContext(targetSize);
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0);
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = CGPointMake(0, 0);
thumbnailRect.size.width= scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
Try this lib!
UIImage+Zoom.h
#import <UIKit/UIKit.h>
enum {
    enSvResizeScale,      // image scaled to fill
    enSvResizeAspectFit,  // image scaled to fit with fixed aspect ratio; the remainder is transparent
    enSvResizeAspectFill, // image scaled to fill with fixed aspect ratio; some of the content may be clipped
};
typedef NSInteger SvResizeMode;

@interface UIImage (Zoom)

- (UIImage*)resizeImageToSize:(CGSize)newSize resizeMode:(SvResizeMode)resizeMode;

@end
UIImage+Zoom.m
#import "UIImage+Zoom.h"
@implementation UIImage (Zoom)
/*
 * @brief resizeImage
 * @param newSize the dimensions (in pixels) of the output image
 */
- (UIImage*)resizeImageToSize:(CGSize)newSize resizeMode:(SvResizeMode)resizeMode
{
CGRect drawRect = [self caculateDrawRect:newSize resizeMode:resizeMode];
UIGraphicsBeginImageContext(newSize);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, newSize.width, newSize.height));
CGContextSetInterpolationQuality(context, kCGInterpolationHigh); // the second argument is a CGInterpolationQuality constant, not a float
[self drawInRect:drawRect blendMode:kCGBlendModeNormal alpha:1];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
// calculate the draw rect for the given resize mode
- (CGRect)caculateDrawRect:(CGSize)newSize resizeMode:(SvResizeMode)resizeMode
{
CGRect drawRect = CGRectMake(0, 0, newSize.width, newSize.height);
CGFloat imageRatio = self.size.width / self.size.height;
CGFloat newSizeRatio = newSize.width / newSize.height;
switch (resizeMode) {
case enSvResizeScale:
{
// scale to fill
break;
}
case enSvResizeAspectFit: // any remain area is white
{
CGFloat newHeight = 0;
CGFloat newWidth = 0;
if (newSizeRatio >= imageRatio) { // max height is newSize.height
newHeight = newSize.height;
newWidth = newHeight * imageRatio;
}
else {
newWidth = newSize.width;
newHeight = newWidth / imageRatio;
}
drawRect.size.width = newWidth;
drawRect.size.height = newHeight;
drawRect.origin.x = newSize.width / 2 - newWidth / 2;
drawRect.origin.y = newSize.height / 2 - newHeight / 2;
break;
}
case enSvResizeAspectFill:
{
CGFloat newHeight = 0;
CGFloat newWidth = 0;
if (newSizeRatio >= imageRatio) { // max height is newSize.height
newWidth = newSize.width;
newHeight = newWidth / imageRatio;
}
else {
newHeight = newSize.height;
newWidth = newHeight * imageRatio;
}
drawRect.size.width = newWidth;
drawRect.size.height = newHeight;
drawRect.origin.x = newSize.width / 2 - newWidth / 2;
drawRect.origin.y = newSize.height / 2 - newHeight / 2;
break;
}
default:
break;
}
return drawRect;
}
@end
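Usage might look like this (a sketch of my own; sourceImage stands for the 900 * 900 original):

UIImage *resized = [sourceImage resizeImageToSize:CGSizeMake(600, 600) resizeMode:enSvResizeAspectFill];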
If I use:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(600, 600), NO, 0)
the generated UIImage becomes (1800 * 1800).
That is not quite true. It's true that the bitmap underlying the image is 1800 by 1800, but that is not the only thing about the image that matters; the UIImage also has a scale, and that scale is 3. Thus the UIImage is in fact treated (with respect to the screen geometry) as 600 by 600, and it is triple-resolution to match the triple-resolution of the Plus screen, which is exactly what you want.
(The scaling will work best, of course, if you also start with the triple-resolution version of your original image. But that's a different matter.)
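To see the distinction in code, you can log the point size and the scale of the result (an illustrative snippet I've added; it uses the asker's imageByScalingForSize:withSourceImage: method, and original stands for the 900 * 900 source image):

UIImage *resized = [self imageByScalingForSize:CGSizeMake(600, 600) withSourceImage:original];
// On a 3x device this prints: points {600, 600}, scale 3, pixels 1800 x 1800.
NSLog(@"points: %@  scale: %.0f  pixels: %.0f x %.0f",
      NSStringFromCGSize(resized.size), resized.scale,
      resized.size.width * resized.scale, resized.size.height * resized.scale);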

Instagram iOS hooks with non-square images

Now that the Instagram app can handle and post non-square images from within the app, I was hoping I could send non-square images to Instagram from my app using the same iPhone hooks I've been using (https://instagram.com/developer/iphone-hooks/?hl=en). However, it still seems to crop my images to square, and it doesn't give me the option to expand them to their non-square size (unlike when I load a non-square photo from the library directly within the Instagram app, where it lets me expand it to its original non-square dimensions). Has anyone had any luck sending non-square images? I'm hoping there's some tweak that will make it work.
I too was hoping that they would take non-square photos with the update, but you're stuck with the old solution for posting non-square photos: matte them to square with white.
https://github.com/ShareKit/ShareKit/blob/master/Classes/ShareKit/Sharers/Services/Instagram/SHKInstagram.m
- (UIImage *)imageByScalingImage:(UIImage*)image proportionallyToSize:(CGSize)targetSize {
UIImage *sourceImage = image;
UIImage *newImage = nil;
CGSize imageSize = sourceImage.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor < heightFactor)
scaleFactor = widthFactor;
else
scaleFactor = heightFactor;
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
if (widthFactor < heightFactor) {
thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
} else if (widthFactor > heightFactor) {
thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
}
}
// this is actually the interesting part:
UIGraphicsBeginImageContext(targetSize);
[(UIColor*)SHKCONFIG(instagramLetterBoxColor) set];
CGContextFillRect(UIGraphicsGetCurrentContext(), CGRectMake(0,0,targetSize.width,targetSize.height));
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
if(newImage == nil) NSLog(@"could not scale image");
return newImage ;
}
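A minimal usage sketch (my own illustration, not part of ShareKit; image is the photo to post): use the longer edge as the square's side so the shorter edge gets the matte.

CGFloat side = MAX(image.size.width, image.size.height);
UIImage *squareImage = [self imageByScalingImage:image proportionallyToSize:CGSizeMake(side, side)];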

Cropping UIImage in iOS

I am very new to iOS and the first task given to me is image cropping: if I am using an image as a banner image and the given frame is larger or smaller than the image, my code should automatically resize the image according to its aspect ratio and then set it in that frame.
I have done a lot of R&D, and this is the code I have written:
-(UIImage *)MyScaleNEwMethodwithImage:(UIImage *)image andframe:(CGRect)frame{
float bmHeight= image.size.height;
float bmWidth= image.size.width;
UIImage *RecievedImage=image;
if (bmHeight>bmWidth) {
float ratio = frame.size.height/frame.size.width;
float newbmHeight = ratio*bmWidth;
float cropedHeight= (bmHeight-newbmHeight)/2;
if (cropedHeight<0) {
float ratio1= frame.size.width/frame.size.height;
float newbmHeight1= (ratio1*bmHeight);
float cropedimageHeight1 = (bmWidth- newbmHeight1)/2;
CGRect cliprect = CGRectMake(cropedimageHeight1, 0,bmWidth-cropedimageHeight1,bmHeight);
CGImageRef imref = CGImageCreateWithImageInRect([image CGImage],cliprect);
UIImage *newSubImage = [UIImage imageWithCGImage:imref];
return newSubImage;
}
else
{
CGRect cliprect = CGRectMake(0,cropedHeight,bmWidth,bmHeight-cropedHeight);
CGImageRef imref = CGImageCreateWithImageInRect([image CGImage],cliprect);
UIImage *newSubImage = [UIImage imageWithCGImage:imref];
return newSubImage;
}
}
else
{
float ratio = frame.size.height/frame.size.width;
float newbmHeight = ratio*bmHeight;
float cropedHeight= (bmHeight-newbmHeight)/4;
if (cropedHeight<0) {
float ratio1= frame.size.width/frame.size.height;
float newbmHeight1= (ratio1*bmWidth);
float cropedimageHeight1 = (bmWidth- newbmHeight1)/2;
UIImageView *DummyImage=[[UIImageView alloc]initWithFrame:CGRectMake(0,cropedimageHeight1,bmWidth,(bmHeight-cropedimageHeight1))];
[DummyImage setImage:RecievedImage];
CGImageRef imageRef = CGImageCreateWithImageInRect([DummyImage.image CGImage], CGRectMake(0,cropedimageHeight1/2,bmWidth/2,(bmHeight-cropedimageHeight1)/2));
// or use the UIImage wherever you like
[DummyImage setImage:[UIImage imageWithCGImage:imageRef]];
CGImageRelease(imageRef);
UIImage *ScaledImage=[UIImage imageWithCGImage:imageRef];
return ScaledImage;
} else {
UIImageView *DummyImage=[[UIImageView alloc]initWithFrame:CGRectMake(cropedHeight,0,bmWidth-cropedHeight,bmHeight)];
[DummyImage setImage:RecievedImage];
CGImageRef imageRef = CGImageCreateWithImageInRect([DummyImage.image CGImage],CGRectMake(cropedHeight,2*cropedHeight,(bmWidth-cropedHeight),bmHeight/2));
// or use the UIImage wherever you like
[DummyImage setImage:[UIImage imageWithCGImage:imageRef]];
CGImageRelease(imageRef);
UIImage *ScaledImage=[UIImage imageWithCGImage:imageRef];
return ScaledImage;
}
}
}
In my frame I am getting the required image, but when the screen changes I can see the full image. I want to cut away the unwanted part of the image.
This piece of code may help you out:
-(CGRect) cropImage:(CGRect)frame
{
    NSAssert(self.contentMode == UIViewContentModeScaleAspectFit, @"content mode should be aspect fit");

    CGFloat wScale = self.bounds.size.width / self.image.size.width;
    CGFloat hScale = self.bounds.size.height / self.image.size.height;

    float x, y, w, h, offset;
    if (wScale < hScale) {
        // image fills the width, so it is letterboxed vertically
        offset = (self.bounds.size.height - (self.image.size.height * wScale)) / 2;
        x = frame.origin.x / wScale;
        y = (frame.origin.y - offset) / wScale;
        w = frame.size.width / wScale;
        h = frame.size.height / wScale;
    } else {
        // image fills the height, so it is letterboxed horizontally
        offset = (self.bounds.size.width - (self.image.size.width * hScale)) / 2;
        x = (frame.origin.x - offset) / hScale;
        y = frame.origin.y / hScale;
        w = frame.size.width / hScale;
        h = frame.size.height / hScale;
    }
    return CGRectMake(x, y, w, h);
}
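A usage sketch (my own illustration; it assumes the method above lives in a UIImageView category and that selectionFrame is a rect expressed in the image view's coordinate space):

CGRect imageRect = [imageView cropImage:selectionFrame]; // convert view coordinates to image coordinates
CGImageRef croppedRef = CGImageCreateWithImageInRect(imageView.image.CGImage, imageRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);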
The code I referred to for cropping the image according to its aspect ratio is:
typedef enum {
MGImageResizeCrop,
MGImageResizeCropStart,
MGImageResizeCropEnd,
MGImageResizeScale
} MGImageResizingMethod;
- (UIImage *)imageToFitSize:(CGSize)fitSize method:(MGImageResizingMethod)resizeMethod
{
float imageScaleFactor = 1.0;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 40000
if ([self respondsToSelector:@selector(scale)]) {
imageScaleFactor = [self scale];
}
#endif
float sourceWidth = [self size].width * imageScaleFactor;
float sourceHeight = [self size].height * imageScaleFactor;
float targetWidth = fitSize.width;
float targetHeight = fitSize.height;
BOOL cropping = !(resizeMethod == MGImageResizeScale);
// Calculate aspect ratios
float sourceRatio = sourceWidth / sourceHeight;
float targetRatio = targetWidth / targetHeight;
// Determine what side of the source image to use for proportional scaling
BOOL scaleWidth = (sourceRatio <= targetRatio);
// Deal with the case of just scaling proportionally to fit, without cropping
scaleWidth = (cropping) ? scaleWidth : !scaleWidth;
// Proportionally scale source image
float scalingFactor, scaledWidth, scaledHeight;
if (scaleWidth) {
scalingFactor = 1.0 / sourceRatio;
scaledWidth = targetWidth;
scaledHeight = round(targetWidth * scalingFactor);
} else {
scalingFactor = sourceRatio;
scaledWidth = round(targetHeight * scalingFactor);
scaledHeight = targetHeight;
}
float scaleFactor = scaledHeight / sourceHeight;
// Calculate compositing rectangles
CGRect sourceRect, destRect;
if (cropping) {
destRect = CGRectMake(0, 0, targetWidth, targetHeight);
float destX, destY;
if (resizeMethod == MGImageResizeCrop) {
// Crop center
destX = round((scaledWidth - targetWidth) / 2.0);
destY = round((scaledHeight - targetHeight) / 2.0);
} else if (resizeMethod == MGImageResizeCropStart) {
// Crop top or left (prefer top)
if (scaleWidth) {
// Crop top
destX = 0.0;
destY = 0.0;
} else {
// Crop left
destX = 0.0;
destY = round((scaledHeight - targetHeight) / 2.0);
}
} else if (resizeMethod == MGImageResizeCropEnd) {
// Crop bottom or right
if (scaleWidth) {
// Crop bottom
destX = round((scaledWidth - targetWidth) / 2.0);
destY = round(scaledHeight - targetHeight);
} else {
// Crop right
destX = round(scaledWidth - targetWidth);
destY = round((scaledHeight - targetHeight) / 2.0);
}
}
sourceRect = CGRectMake(destX / scaleFactor, destY / scaleFactor,
targetWidth / scaleFactor, targetHeight / scaleFactor);
} else {
sourceRect = CGRectMake(0, 0, sourceWidth, sourceHeight);
destRect = CGRectMake(0, 0, scaledWidth, scaledHeight);
}
// Create appropriately modified image.
UIImage *image = nil;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 40000
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 4.0) {
UIGraphicsBeginImageContextWithOptions(destRect.size, NO, 0.0); // 0.0 for scale means "correct scale for device's main screen".
CGImageRef sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect); // cropping happens here.
image = [UIImage imageWithCGImage:sourceImg scale:0.0 orientation:self.imageOrientation]; // create cropped UIImage.
[image drawInRect:destRect]; // the actual scaling happens here, and orientation is taken care of automatically.
CGImageRelease(sourceImg);
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
#endif
if (!image) {
// Try older method.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, fitSize.width, fitSize.height, 8, (fitSize.width * 4),
colorSpace, kCGImageAlphaPremultipliedLast);
CGImageRef sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect);
CGContextDrawImage(context, destRect, sourceImg);
CGImageRelease(sourceImg);
CGImageRef finalImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
image = [UIImage imageWithCGImage:finalImage];
CGImageRelease(finalImage);
}
return image;
}
Check if this helps you.
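For example (a sketch of my own; it assumes the method above sits in a UIImage category, and bannerFrame and bannerImageView are your banner's frame and image view):

UIImage *banner = [originalImage imageToFitSize:bannerFrame.size method:MGImageResizeCrop]; // center-crop to the banner's aspect ratio
bannerImageView.image = banner;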

When using AVFoundation - crop visible portion of captured image represented in display layer

I am using the PBJVision library to capture images.
Under the hood it uses AVFoundation.
I set up the camera with the following options:
PBJVision *camera = [[PBJVision alloc] init];
self.camera = camera;
self.camera.delegate = self;
[self.camera setCameraMode:PBJCameraModePhoto];
[self.camera setCameraDevice:PBJCameraDeviceFront];
[self.camera setCameraOrientation:PBJCameraOrientationPortrait];
[self.camera setFocusMode:PBJFocusModeAutoFocus];
[self.camera setPresentationFrame:self.previewView.frame];
[self.camera previewLayer].frame = self.previewView.bounds;
[self.camera previewLayer].videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.previewView.layer addSublayer:[self.camera previewLayer]];
Because the preview layer gravity is set to AVLayerVideoGravityResizeAspectFill, the captured image isn't identical to the previewed image.
How can I crop it according to the video gravity?
Based on Erica Sadun's excellent Cookbook, adding the code below to your view controller will allow you to do:
UIImage *newImage = [self applyAspectFillImage:image InRect:self.previewView.bounds];
You can obtain the maximum image size by using the smaller edge (width or height) of the original photo image to size your destination rectangle.
CGFloat scaleW = image.size.width / previewView.bounds.size.width;
CGRect destRect = CGRectMake(0, 0, image.size.width, previewView.bounds.size.height * scaleW);
UIImage *newImage = [self applyAspectFillImage:image InRect:destRect];
The code:
CGRect CGRectCenteredInRect(CGRect rect, CGRect mainRect)
{
CGFloat xOffset = CGRectGetMidX(mainRect)-CGRectGetMidX(rect);
CGFloat yOffset = CGRectGetMidY(mainRect)-CGRectGetMidY(rect);
return CGRectOffset(rect, xOffset, yOffset);
}
// Calculate the destination scale for filling
CGFloat CGAspectScaleFill(CGSize sourceSize, CGRect destRect)
{
CGSize destSize = destRect.size;
CGFloat scaleW = destSize.width / sourceSize.width;
CGFloat scaleH = destSize.height / sourceSize.height;
return MAX(scaleW, scaleH);
}
CGRect CGRectAspectFillRect(CGSize sourceSize, CGRect destRect)
{
CGSize destSize = destRect.size;
CGFloat destScale = CGAspectScaleFill(sourceSize, destRect);
CGFloat newWidth = sourceSize.width * destScale;
CGFloat newHeight = sourceSize.height * destScale;
CGFloat dWidth = ((destSize.width - newWidth) / 2.0f);
CGFloat dHeight = ((destSize.height - newHeight) / 2.0f);
CGRect rect = CGRectMake (dWidth, dHeight, newWidth, newHeight);
return rect;
}
- (UIImage *) applyAspectFillImage: (UIImage *) image InRect: (CGRect) bounds
{
CGRect destRect;
UIGraphicsBeginImageContext(bounds.size);
CGRect rect = CGRectAspectFillRect(image.size, bounds);
destRect = CGRectCenteredInRect(rect, bounds);
[image drawInRect: destRect];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}

How to draw a UIImage in "fit out" (aspect fill) style

I want to get a thumbnail from a JPG file and display it as a UIImage. The display style is "fit out" mode, like UIViewContentModeScaleAspectFill. Below is my sample code, but it is too complicated.
-(UIImage*)scaleToSize:(CGSize)size
{
CGFloat width = CGImageGetWidth(self.CGImage);
CGFloat height = CGImageGetHeight(self.CGImage);
NSLog(@"size w=%f, h=%f, image w=%f, h=%f", size.width, size.height, width, height);
float verticalRadio = size.height*1.0/height;
float horizontalRadio = size.width*1.0/width;
float radio = 1;
if(verticalRadio>1 && horizontalRadio>1)
{
radio = verticalRadio > horizontalRadio ? horizontalRadio : verticalRadio;
}
else
{
radio = verticalRadio < horizontalRadio ? verticalRadio : horizontalRadio;
}
width = width*radio;
height = height*radio;
NSLog(#"width=%f, height=%f", width, height);
int xPos = (size.width - width)/2;
int yPos = (size.height-height)/2;
NSLog(@"xpos=%d, ypos=%d", xPos, yPos);
CGSize sz = CGSizeMake(width, height);
UIGraphicsBeginImageContext(sz);
//
[self drawInRect:CGRectMake(0, 0, width, height)];
UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGRect rt = CGRectMake(-xPos, -yPos, size.width, size.height);
UIImage* thub = [scaledImage getSubImage:rt];
scaledImage = nil;
return thub;
}
- (UIImage *)getSubImage:(CGRect) rect{
CGImageRef subImageRef = CGImageCreateWithImageInRect(self.CGImage, rect);
CGRect smallBounds = CGRectMake(rect.origin.x, rect.origin.y, CGImageGetWidth(subImageRef), CGImageGetHeight(subImageRef));
NSLog(@"small bounds x=%f, y=%f, w=%f, h=%f", smallBounds.origin.x, smallBounds.origin.y, smallBounds.size.width, smallBounds.size.height);
UIGraphicsBeginImageContext(smallBounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextDrawImage(context, smallBounds, subImageRef);
UIImage* smallImg = [UIImage imageWithCGImage:subImageRef];
UIGraphicsEndImageContext();
return smallImg;
}
I have updated the method:
-(UIImage*)getSubImage:(CGSize)size
{
CGFloat width = CGImageGetWidth(self.CGImage);
CGFloat height = CGImageGetHeight(self.CGImage);
float verticalRadio = size.height*1.0/height;
float horizontalRadio = size.width*1.0/width;
float radio = 1;
radio = verticalRadio < horizontalRadio ? horizontalRadio : verticalRadio;
CGRect displayRect = CGRectMake((size.width - width*radio)/2.0f,
(size.height-height*radio)/2.0f,
width*radio,
height*radio);
// create a bitmap context and make it the current context
UIGraphicsBeginImageContext(size);
//clip
UIRectClip(displayRect);
//draw the bitmap in rect
[self drawInRect:displayRect];
// from context to get a UIIMage
UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
// end the image context
UIGraphicsEndImageContext();
NSLog(@"getSubImage ---->");
// return.
return scaledImage;
}
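A usage sketch (my own illustration; jpgPath and thumbnailView are hypothetical names):

UIImage *photo = [UIImage imageWithContentsOfFile:jpgPath];
UIImage *thumbnail = [photo getSubImage:CGSizeMake(120.0f, 120.0f)]; // aspect-fill crop baked into the thumbnail
thumbnailView.image = thumbnail;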
