I want to get a thumbnail from a JPG file and display it as a UIImage. The display style is aspect fill, like UIViewContentModeScaleAspectFill. Below is my sample code, but it is too complicated.
-(UIImage*)scaleToSize:(CGSize)size
{
CGFloat width = CGImageGetWidth(self.CGImage);
CGFloat height = CGImageGetHeight(self.CGImage);
NSLog(#"size w=%f, h=%f, image w=%f, h=%f", size.width, size.height, width, height);
float verticalRadio = size.height*1.0/height;
float horizontalRadio = size.width*1.0/width;
float radio = 1;
if(verticalRadio>1 && horizontalRadio>1)
{
radio = verticalRadio > horizontalRadio ? horizontalRadio : verticalRadio;
}
else
{
radio = verticalRadio < horizontalRadio ? verticalRadio : horizontalRadio;
}
width = width*radio;
height = height*radio;
NSLog(#"width=%f, height=%f", width, height);
int xPos = (size.width - width)/2;
int yPos = (size.height-height)/2;
NSLog(#"xpos=%d, ypos=%d", xPos, yPos);
CGSize sz = CGSizeMake(width, height);
UIGraphicsBeginImageContext(sz);
//
[self drawInRect:CGRectMake(0, 0, width, height)];
UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGRect rt = CGRectMake(-xPos, -yPos, size.width, size.height);
UIImage* thub = [scaledImage getSubImage:rt];
scaledImage = nil;
return thub;
}
- (UIImage *)getSubImage:(CGRect) rect{
CGImageRef subImageRef = CGImageCreateWithImageInRect(self.CGImage, rect);
CGRect smallBounds = CGRectMake(rect.origin.x, rect.origin.y, CGImageGetWidth(subImageRef), CGImageGetHeight(subImageRef));
NSLog(#"small bounds x=%f, y=%f, w=%f, h=%f", smallBounds.origin.x, smallBounds.origin.y, smallBounds.size.width, smallBounds.size.height);
UIGraphicsBeginImageContext(smallBounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextDrawImage(context, smallBounds, subImageRef);
UIImage* smallImg = [UIImage imageWithCGImage:subImageRef];
CGImageRelease(subImageRef); // subImageRef comes from a Create function, so release it to avoid a leak
UIGraphicsEndImageContext();
return smallImg;
}
I have updated the method:
-(UIImage*)getSubImage:(CGSize)size
{
CGFloat width = CGImageGetWidth(self.CGImage);
CGFloat height = CGImageGetHeight(self.CGImage);
float verticalRadio = size.height*1.0/height;
float horizontalRadio = size.width*1.0/width;
float radio = 1;
radio = verticalRadio < horizontalRadio ? horizontalRadio : verticalRadio;
CGRect displayRect = CGRectMake((size.width - width*radio)/2.0f,
(size.height-height*radio)/2.0f,
width*radio,
height*radio);
// create a bitmap context and set it as the current context
UIGraphicsBeginImageContext(size);
//clip
UIRectClip(displayRect);
//draw the bitmap in rect
[self drawInRect:displayRect];
// get a UIImage from the context
UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
// end the image context
UIGraphicsEndImageContext();
NSLog(#"getSubImage ---->");
// return.
return scaledImage;
}
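For clarity, a hypothetical call site for the category method above might look like the sketch below; the file path, target size, and image view name are illustrative and not part of the original code.
// Hypothetical usage of the getSubImage: category method above.
UIImage *original = [UIImage imageWithContentsOfFile:jpgPath]; // jpgPath is a placeholder
UIImage *thumbnail = [original getSubImage:CGSizeMake(120.0f, 120.0f)];
self.thumbnailView.image = thumbnail; // thumbnailView is an assumed UIImageView outlet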
Hi, I am working with UIImage. I need to crop my image to a specific size: the width equals the device width and the height is fixed at 200.
I tried a lot of examples from Google, but none of them worked for me. I tried the sample below:
-(UIImage *)resizeImage:(UIImage *)image
{
float h=200;
float w=400;
float actualHeight = image.size.height;
float actualWidth = image.size.width;
float maxHeight = h;
float maxWidth = w;
float imgRatio = actualWidth/actualHeight;
float maxRatio = maxWidth/maxHeight;
float compressionQuality = 0.8; // 80 percent JPEG quality
if (actualHeight > maxHeight || actualWidth > maxWidth)
{
if(imgRatio < maxRatio)
{
//adjust width according to maxHeight
imgRatio = maxHeight / actualHeight;
actualWidth = imgRatio * actualWidth;
actualHeight = maxHeight;
}
else if(imgRatio > maxRatio)
{
//adjust height according to maxWidth
imgRatio = maxWidth / actualWidth;
actualHeight = imgRatio * actualHeight;
actualWidth = maxWidth;
}
else
{
actualHeight = maxHeight;
actualWidth = maxWidth;
}
}
CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
UIGraphicsBeginImageContext(rect.size);
//UIGraphicsBeginImageContextWithOptions(rect.size,NO,0.0);
[image drawInRect:rect];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
NSData *imageData = UIImageJPEGRepresentation(img, compressionQuality);
UIGraphicsEndImageContext();
return [UIImage imageWithData:imageData];
}
Below is my actual image.
After cropping I am getting a result like the image below.
All of my examples are stretching the image, but I want to scale the image without stretching it. Where am I going wrong? Please help me.
The code you are using will resize your image; we use it for image compression or resizing.
If you want to crop an image there are two ways.
1) Allow the user to edit when the image is selected.
You can achieve this by setting the allowsEditing property on the picker, like this:
pickerView.allowsEditing = true
This will give you a rectangle to crop the image, and you can retrieve the edited image from the picker's didFinishPickingMediaWithInfo: delegate method (an Objective-C sketch of that callback is shown after the crop function below).
2) Programmatically
To crop an image programmatically you need to pass the frame to crop to.
Here is the function I am using:
func croppImage(originalImage:UIImage, toRect rect:CGRect) -> UIImage{
var imageRef:CGImageRef = CGImageCreateWithImageInRect(originalImage.CGImage, rect)
var croppedimage:UIImage = UIImage(CGImage:imageRef)
return croppedimage
}
The above code will return the cropped image.
This will crop your image to the size you require.
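For option 1, reading the user-cropped image back in the picker delegate could look roughly like this Objective-C sketch (the imageView outlet is an assumption for illustration):
// Requires allowsEditing to be enabled on the picker before presenting it.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *editedImage = info[UIImagePickerControllerEditedImage];
    if (!editedImage) {
        // Fall back to the untouched photo if no edit was made.
        editedImage = info[UIImagePickerControllerOriginalImage];
    }
    self.imageView.image = editedImage; // imageView is a placeholder outlet
    [picker dismissViewControllerAnimated:YES completion:nil];
}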
- (CGImageRef)newTransformedImage:(CGAffineTransform)transform
sourceImage:(CGImageRef)sourceImage
sourceSize:(CGSize)sourceSize
sourceOrientation:(UIImageOrientation)sourceOrientation
outputWidth:(CGFloat)outputWidth
cropSize:(CGSize)cropSize
imageViewSize:(CGSize)imageViewSize
{
CGImageRef source = [self newScaledImage:sourceImage
withOrientation:sourceOrientation
toSize:sourceSize
withQuality:kCGInterpolationNone];
CGFloat aspect = cropSize.height/cropSize.width;
CGSize outputSize = CGSizeMake(outputWidth, outputWidth*aspect);
CGContextRef context = CGBitmapContextCreate(NULL,
outputSize.width,
outputSize.height,
CGImageGetBitsPerComponent(source),
0,
CGImageGetColorSpace(source),
CGImageGetBitmapInfo(source));
CGContextSetFillColorWithColor(context, [[UIColor clearColor] CGColor]);
CGContextFillRect(context, CGRectMake(0, 0, outputSize.width, outputSize.height));
CGAffineTransform uiCoords = CGAffineTransformMakeScale(outputSize.width / cropSize.width,
outputSize.height / cropSize.height);
uiCoords = CGAffineTransformTranslate(uiCoords, cropSize.width/2.0, cropSize.height / 2.0);
uiCoords = CGAffineTransformScale(uiCoords, 1.0, -1.0);
CGContextConcatCTM(context, uiCoords);
CGContextConcatCTM(context, transform);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextDrawImage(context, CGRectMake(-imageViewSize.width/2.0,
-imageViewSize.height/2.0,
imageViewSize.width,
imageViewSize.height)
, source);
CGImageRef resultRef = CGBitmapContextCreateImage(context);
CGContextRelease(context);
CGImageRelease(source);
return resultRef;
}
- (CGImageRef)newScaledImage:(CGImageRef)source withOrientation:(UIImageOrientation)orientation toSize:(CGSize)size withQuality:(CGInterpolationQuality)quality
{
CGSize srcSize = size;
CGFloat rotation = 0.0;
switch(orientation)
{
case UIImageOrientationUp: {
rotation = 0;
} break;
case UIImageOrientationDown: {
rotation = M_PI;
} break;
case UIImageOrientationLeft:{
rotation = M_PI_2;
srcSize = CGSizeMake(size.height, size.width);
} break;
case UIImageOrientationRight: {
rotation = -M_PI_2;
srcSize = CGSizeMake(size.height, size.width);
} break;
default:
break;
}
CGContextRef context = CGBitmapContextCreate(NULL,
size.width,
size.height,
8, //CGImageGetBitsPerComponent(source),
0,
CGImageGetColorSpace(source),
(CGBitmapInfo)kCGImageAlphaNoneSkipFirst //CGImageGetBitmapInfo(source)
);
CGContextSetInterpolationQuality(context, quality);
CGContextTranslateCTM(context, size.width/2, size.height/2);
CGContextRotateCTM(context,rotation);
CGContextDrawImage(context, CGRectMake(-srcSize.width/2 ,
-srcSize.height/2,
srcSize.width,
srcSize.height),
source);
CGImageRef resultRef = CGBitmapContextCreateImage(context);
CGContextRelease(context);
return resultRef;
}
// Where you want to crop:
CGImageRef imageRef = [self newTransformedImage:transform
sourceImage:self.image.CGImage
sourceSize:self.image.size
sourceOrientation:self.image.imageOrientation
outputWidth:self.image.size.width
cropSize:self.photoView.cropView.frame.size
imageViewSize:self.photoView.photoContentView.bounds.size];
UIImage *image = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
Thank you. Enjoy Coding :)
Hi, I am developing an iOS app. I have a UIImageView with an image associated with it. I am changing its dimensions in the viewDidLoad method.
Initially, when I change the dimensions, I am able to resize the image on the view. However, after I crop the image (using Photoshop) to the shape of the object in it, i.e. to get rid of the unwanted part of the image, my resize method no longer seems to work: the size of the image does not change even though I call the same method.
The method I am using for resizing is given below.
-(void)initXYZ{
CGSize size;
CGFloat x,y;
x = 0+myImageView1.frame.size.width;
y = myImageView2.center.y;
size.width = _myImageView2.frame.size.width/2;
size.height = _myImageView2.frame.size.width/2;
UIImage *image = [UIImage imageNamed:@"xyz.png"];
image = [HomeViewController imageWithImage:image scaledToSize:size xCord:x yCord:y];}
The utility method is given below:
+(UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize xCord:(CGFloat)X yCord:(CGFloat)Y{
UIGraphicsBeginImageContextWithOptions(newSize,NO,0.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;}
Try this...
+ (UIImage*)imageWithImage:(UIImage*)image
scaledToSize:(CGSize)newSize
{
UIGraphicsBeginImageContext( newSize );
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
OR
+ (UIImage*)imageWithImage:(UIImage*)image
scaledToSize:(CGSize)newSize {
CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
CGImageRef imageRef = image.CGImage;
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
CGAffineTransform flipV = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
CGContextConcatCTM(context, flipV);
CGContextDrawImage(context, newRect, imageRef);
CGImageRef newImageRef = CGBitmapContextCreateImage(context);
UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
CGImageRelease(newImageRef);
UIGraphicsEndImageContext();
return newImage;
}
Try this:
- (UIImage*)resizeAndStoreImages:(UIImage*)img
{
UIImage *chosenImage = img;
NSData *imageData = UIImageJPEGRepresentation(chosenImage, 1.0);
int resizedImgMaxHeight = 500;
int resizedImgMaxWidth = 500;
UIImage *resizedImageData;
if (chosenImage.size.height > chosenImage.size.width && chosenImage.size.height > resizedImgMaxHeight) { // portrait
int width = (chosenImage.size.width / chosenImage.size.height) * resizedImgMaxHeight;
CGRect rect = CGRectMake( 0, 0, width, resizedImgMaxHeight);
UIGraphicsBeginImageContext(rect.size);
[chosenImage drawInRect:rect];
UIImage *pic1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
resizedImageData = [UIImage imageWithData:UIImageJPEGRepresentation(pic1, 1.0)];
pic1 = nil;
} else if (chosenImage.size.width > chosenImage.size.height && chosenImage.size.width > resizedImgMaxWidth) { // landscape
int height = (chosenImage.size.height / chosenImage.size.width) * resizedImgMaxWidth;
CGRect rect = CGRectMake( 0, 0, resizedImgMaxWidth, height);
UIGraphicsBeginImageContext(rect.size);
[chosenImage drawInRect:rect];
UIImage *pic1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
resizedImageData = [UIImage imageWithData:UIImageJPEGRepresentation(pic1, 1.0)];
pic1 = nil;
} else {
if (chosenImage.size.height > resizedImgMaxHeight) {
int width = (chosenImage.size.width / chosenImage.size.height) * resizedImgMaxHeight;
CGRect rect = CGRectMake( 0, 0, width, resizedImgMaxHeight);
UIGraphicsBeginImageContext(rect.size);
[chosenImage drawInRect:rect];
UIImage *pic1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
resizedImageData = [UIImage imageWithData:UIImageJPEGRepresentation(pic1, 1.0)];
pic1 = nil;
} else {
resizedImageData = [UIImage imageWithData:imageData];
}
}
return resizedImageData;
}
Adjust resizedImgMaxHeight and resizedImgMaxWidth as per your needs.
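As an illustrative call site (the property names here are assumptions, not from the original answer):
// Hypothetical usage: shrink a picked photo before uploading or caching it.
UIImage *picked = self.chosenImage; // however you obtained the photo
UIImage *resized = [self resizeAndStoreImages:picked];
self.previewImageView.image = resized; // previewImageView is a placeholder name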
I am very new to iOS and the first task given to me is image cropping. That means if I am using an image as a banner image and the given frame size is larger or smaller than the size of the image, my code should automatically resize the image while respecting its aspect ratio and then set the image in that frame.
I have done a lot of R&D and after that I have written the following code.
-(UIImage *)MyScaleNEwMethodwithImage:(UIImage *)image andframe:(CGRect)frame{
float bmHeight= image.size.height;
float bmWidth= image.size.width;
UIImage *RecievedImage=image;
if (bmHeight>bmWidth) {
float ratio = frame.size.height/frame.size.width;
float newbmHeight = ratio*bmWidth;
float cropedHeight= (bmHeight-newbmHeight)/2;
if (cropedHeight<0) {
float ratio1= frame.size.width/frame.size.height;
float newbmHeight1= (ratio1*bmHeight);
float cropedimageHeight1 = (bmWidth- newbmHeight1)/2;
CGRect cliprect = CGRectMake(cropedimageHeight1, 0,bmWidth-cropedimageHeight1,bmHeight);
CGImageRef imref = CGImageCreateWithImageInRect([image CGImage],cliprect);
UIImage *newSubImage = [UIImage imageWithCGImage:imref];
return newSubImage;
}
else
{
CGRect cliprect = CGRectMake(0,cropedHeight,bmWidth,bmHeight-cropedHeight);
CGImageRef imref = CGImageCreateWithImageInRect([image CGImage],cliprect);
UIImage *newSubImage = [UIImage imageWithCGImage:imref];
return newSubImage;
}
}
else
{
float ratio = frame.size.height/frame.size.width;
float newbmHeight = ratio*bmHeight;
float cropedHeight= (bmHeight-newbmHeight)/4;
if (cropedHeight<0) {
float ratio1= frame.size.width/frame.size.height;
float newbmHeight1= (ratio1*bmWidth);
float cropedimageHeight1 = (bmWidth- newbmHeight1)/2;
UIImageView *DummyImage=[[UIImageView alloc]initWithFrame:CGRectMake(0,cropedimageHeight1,bmWidth,(bmHeight-cropedimageHeight1))];
[DummyImage setImage:RecievedImage];
CGImageRef imageRef = CGImageCreateWithImageInRect([DummyImage.image CGImage], CGRectMake(0,cropedimageHeight1/2,bmWidth/2,(bmHeight-cropedimageHeight1)/2));
// or use the UIImage wherever you like
[DummyImage setImage:[UIImage imageWithCGImage:imageRef]];
CGImageRelease(imageRef);
UIImage *ScaledImage=[UIImage imageWithCGImage:imageRef];
return ScaledImage;
} else {
UIImageView *DummyImage=[[UIImageView alloc]initWithFrame:CGRectMake(cropedHeight,0,bmWidth-cropedHeight,bmHeight)];
[DummyImage setImage:RecievedImage];
CGImageRef imageRef = CGImageCreateWithImageInRect([DummyImage.image CGImage],CGRectMake(cropedHeight,2*cropedHeight,(bmWidth-cropedHeight),bmHeight/2));
// or use the UIImage wherever you like
[DummyImage setImage:[UIImage imageWithCGImage:imageRef]];
CGImageRelease(imageRef);
UIImage *ScaledImage=[UIImage imageWithCGImage:imageRef];
return ScaledImage;
}
}
}
In my frame I am getting the required image, but when the screen changes I can see the full image. I want to cut off the unwanted part of the image.
This piece of code may help you out:
-(CGRect) cropImage:(CGRect)frame
{
NSAssert(self.contentMode == UIViewContentModeScaleAspectFit, @"content mode should be aspect fit");
CGFloat wScale = self.bounds.size.width / self.image.size.width;
CGFloat hScale = self.bounds.size.height / self.image.size.height;
float x, y, w, h, offset;
if (wScale<hScale) {
offset = (self.bounds.size.height - (self.image.size.height*wScale))/2;
x = frame.origin.x / wScale;
y = (frame.origin.y-offset) / wScale;
w = frame.size.width / wScale;
h = frame.size.height / wScale;
} else {
offset = (self.bounds.size.width - (self.image.size.width*hScale))/2;
x = (frame.origin.x-offset) / hScale;
y = frame.origin.y / hScale;
w = frame.size.width / hScale;
h = frame.size.height / hScale;
}
return CGRectMake(x, y, w, h);
}
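One possible way to use the returned rect, assuming the method above lives in a UIImageView category and cropFrame is a selection rectangle in the image view's coordinate space (both names are illustrative):
// Convert the selection frame to image coordinates, then crop with Core Graphics.
CGRect imageRect = [self.imageView cropImage:cropFrame];
// Note: CGImageCreateWithImageInRect works in pixels; adjust for image.scale if needed.
CGImageRef croppedRef = CGImageCreateWithImageInRect(self.imageView.image.CGImage, imageRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                       scale:self.imageView.image.scale
                                 orientation:self.imageView.image.imageOrientation];
CGImageRelease(croppedRef); // the Create'd CGImage must be released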
The code I referred to for cropping the image as per the aspect ratio is:
typedef enum {
MGImageResizeCrop,
MGImageResizeCropStart,
MGImageResizeCropEnd,
MGImageResizeScale
} MGImageResizingMethod;
- (UIImage *)imageToFitSize:(CGSize)fitSize method:(MGImageResizingMethod)resizeMethod
{
float imageScaleFactor = 1.0;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 40000
if ([self respondsToSelector:@selector(scale)]) {
imageScaleFactor = [self scale];
}
#endif
float sourceWidth = [self size].width * imageScaleFactor;
float sourceHeight = [self size].height * imageScaleFactor;
float targetWidth = fitSize.width;
float targetHeight = fitSize.height;
BOOL cropping = !(resizeMethod == MGImageResizeScale);
// Calculate aspect ratios
float sourceRatio = sourceWidth / sourceHeight;
float targetRatio = targetWidth / targetHeight;
// Determine what side of the source image to use for proportional scaling
BOOL scaleWidth = (sourceRatio <= targetRatio);
// Deal with the case of just scaling proportionally to fit, without cropping
scaleWidth = (cropping) ? scaleWidth : !scaleWidth;
// Proportionally scale source image
float scalingFactor, scaledWidth, scaledHeight;
if (scaleWidth) {
scalingFactor = 1.0 / sourceRatio;
scaledWidth = targetWidth;
scaledHeight = round(targetWidth * scalingFactor);
} else {
scalingFactor = sourceRatio;
scaledWidth = round(targetHeight * scalingFactor);
scaledHeight = targetHeight;
}
float scaleFactor = scaledHeight / sourceHeight;
// Calculate compositing rectangles
CGRect sourceRect, destRect;
if (cropping) {
destRect = CGRectMake(0, 0, targetWidth, targetHeight);
float destX, destY;
if (resizeMethod == MGImageResizeCrop) {
// Crop center
destX = round((scaledWidth - targetWidth) / 2.0);
destY = round((scaledHeight - targetHeight) / 2.0);
} else if (resizeMethod == MGImageResizeCropStart) {
// Crop top or left (prefer top)
if (scaleWidth) {
// Crop top
destX = 0.0;
destY = 0.0;
} else {
// Crop left
destX = 0.0;
destY = round((scaledHeight - targetHeight) / 2.0);
}
} else if (resizeMethod == MGImageResizeCropEnd) {
// Crop bottom or right
if (scaleWidth) {
// Crop bottom
destX = round((scaledWidth - targetWidth) / 2.0);
destY = round(scaledHeight - targetHeight);
} else {
// Crop right
destX = round(scaledWidth - targetWidth);
destY = round((scaledHeight - targetHeight) / 2.0);
}
}
sourceRect = CGRectMake(destX / scaleFactor, destY / scaleFactor,
targetWidth / scaleFactor, targetHeight / scaleFactor);
} else {
sourceRect = CGRectMake(0, 0, sourceWidth, sourceHeight);
destRect = CGRectMake(0, 0, scaledWidth, scaledHeight);
}
// Create appropriately modified image.
UIImage *image = nil;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 40000
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 4.0) {
UIGraphicsBeginImageContextWithOptions(destRect.size, NO, 0.0); // 0.0 for scale means "correct scale for device's main screen".
CGImageRef sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect); // cropping happens here.
image = [UIImage imageWithCGImage:sourceImg scale:0.0 orientation:self.imageOrientation]; // create cropped UIImage.
[image drawInRect:destRect]; // the actual scaling happens here, and orientation is taken care of automatically.
CGImageRelease(sourceImg);
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
#endif
if (!image) {
// Try older method.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, fitSize.width, fitSize.height, 8, (fitSize.width * 4),
colorSpace, kCGImageAlphaPremultipliedLast);
CGImageRef sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect);
CGContextDrawImage(context, destRect, sourceImg);
CGImageRelease(sourceImg);
CGImageRef finalImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
image = [UIImage imageWithCGImage:finalImage];
CGImageRelease(finalImage);
}
return image;
}
Check if this helps you.
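Assuming the method above is added to a UIImage category, filling a fixed banner frame without distortion could look roughly like this (bannerImageView and the asset name are placeholders):
// Aspect-fill crop into a fixed banner frame using the category method above.
UIImage *source = [UIImage imageNamed:@"banner.jpg"];
CGSize bannerSize = bannerImageView.bounds.size;
bannerImageView.image = [source imageToFitSize:bannerSize method:MGImageResizeCrop];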
I am using the PBJVision library to capture images.
Under the hood it uses AVFoundation.
I set up the camera with the following options:
PBJVision *camera = [[PBJVision alloc] init];
self.camera = camera;
self.camera.delegate = self;
[self.camera setCameraMode:PBJCameraModePhoto];
[self.camera setCameraDevice:PBJCameraDeviceFront];
[self.camera setCameraOrientation:PBJCameraOrientationPortrait];
[self.camera setFocusMode:PBJFocusModeAutoFocus];
[self.camera setPresentationFrame:self.previewView.frame];
[self.camera previewLayer].frame = self.previewView.bounds;
[self.camera previewLayer].videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.previewView.layer addSublayer:[self.camera previewLayer]];
Because the preview layer gravity is set to AVLayerVideoGravityResizeAspectFill, the captured image isn't identical to the previewed image.
How can I crop it according to the video gravity?
Based on Erica Sadun's excellent Cookbook, adding the code below to your view controller will allow you to do:
UIImage *newImage = [self applyAspectFillImage:image InRect:self.previewView.bounds];
You can obtain the maximum image size by using the smaller edge (width or height) of the original photo to size your destination rectangle.
CGFloat scaleW = image.size.width / previewView.bounds.size.width;
CGRect destRect = CGRectMake(0, 0, image.size.width, previewView.bounds.size.height * scaleW);
UIImage *newImage = [self applyAspectFillImage:image InRect:destRect];
The code:
CGRect CGRectCenteredInRect(CGRect rect, CGRect mainRect)
{
CGFloat xOffset = CGRectGetMidX(mainRect)-CGRectGetMidX(rect);
CGFloat yOffset = CGRectGetMidY(mainRect)-CGRectGetMidY(rect);
return CGRectOffset(rect, xOffset, yOffset);
}
// Calculate the destination scale for filling
CGFloat CGAspectScaleFill(CGSize sourceSize, CGRect destRect)
{
CGSize destSize = destRect.size;
CGFloat scaleW = destSize.width / sourceSize.width;
CGFloat scaleH = destSize.height / sourceSize.height;
return MAX(scaleW, scaleH);
}
CGRect CGRectAspectFillRect(CGSize sourceSize, CGRect destRect)
{
CGSize destSize = destRect.size;
CGFloat destScale = CGAspectScaleFill(sourceSize, destRect);
CGFloat newWidth = sourceSize.width * destScale;
CGFloat newHeight = sourceSize.height * destScale;
CGFloat dWidth = ((destSize.width - newWidth) / 2.0f);
CGFloat dHeight = ((destSize.height - newHeight) / 2.0f);
CGRect rect = CGRectMake (dWidth, dHeight, newWidth, newHeight);
return rect;
}
- (UIImage *) applyAspectFillImage: (UIImage *) image InRect: (CGRect) bounds
{
CGRect destRect;
UIGraphicsBeginImageContext(bounds.size);
CGRect rect = CGRectAspectFillRect(image.size, bounds);
destRect = CGRectCenteredInRect(rect, bounds);
[image drawInRect: destRect];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
I'm using a CGRect to display an image. I'd like the CGRect to use the width and height of the image without me specifying them.
Can this:
CGRectMake(0.0f, 40.0f, 480.0f, 280.0f);
become this:
CGRectMake(0.0f, 40.0f, myImage.width, myImage.height);
Some images get distorted when I specify the parameters.
Here's the code:
CGRect myImageRect = CGRectMake(0.0f, 40.0f, 480.0f, 280.0f);
UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
[myImage setImage:[UIImage imageNamed:recipe.img]];
Thanks for any help.
Once you have a UIImage, you can find its size by looking at the size property:
UIImage * image = [UIImage imageNamed:recipe.img];
CGRect rect = CGRectMake(0.0f, 40.0f, image.size.width, image.size.height);
UIImageView * imageView = [[UIImageView alloc] initWithFrame:rect];
[imageView setImage:image];
This category on UIImage might be helpful.
Use it like this: aImage = [aImage imageByScalingProportionallyToSize:myImageRect.size];
@implementation UIImage (Extras)
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize {
UIImage *sourceImage = self;
UIImage *newImage = nil;
CGSize imageSize = sourceImage.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor < heightFactor)
scaleFactor = widthFactor;
else
scaleFactor = heightFactor;
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
// if (widthFactor < heightFactor) {
// thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
// } else if (widthFactor > heightFactor) {
// thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
// }
//thumbnailPoint.x
}
// this is actually the interesting part:
UIGraphicsBeginImageContext(targetSize);
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
if(newImage == nil) NSLog(@"could not scale image");
return newImage ;
}
@end