I have a UIImageView (the red squares) that will display a UIImage that must be scaled (I can receive images larger or smaller than the UIImageView). After scaling, the part of the UIImage that is shown is its center.
What I need is to show the part of the image in the blue squares. How can I achieve that?
I'm only able to get the image size (height and width), but it reports the original size, when it's supposed to be the scaled one.
self.viewIm = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 120, 80)];
self.viewIm.backgroundColor = [UIColor greenColor];
self.viewIm.layer.borderColor = [UIColor redColor].CGColor;
self.viewIm.layer.borderWidth = 5.0;
UIImage *im = [UIImage imageNamed:@"benjen"];
self.viewIm.image = im;
self.viewIm.contentMode = UIViewContentModeScaleAspectFill;
// self.viewIm.clipsToBounds = YES;
[self.view addSubview:self.viewIm];
To do what you're trying to do, I'd recommend looking into CALayer's contentsRect property.
Since seeing your answer, I've been trying to work out the proper solution for a while, but the math escapes me because contentsRect's x and y parameters seem somewhat mysterious... But here's some code that may point you in the right direction...
float imageAspect = self.imageView.image.size.width / self.imageView.image.size.height;
float imageViewAspect = self.imageView.frame.size.width / self.imageView.frame.size.height;

if (imageAspect > imageViewAspect) {
    float scaledImageWidth = self.imageView.frame.size.height * imageAspect;
    float offsetWidth = -((scaledImageWidth - self.imageView.frame.size.width) / 2);
    self.imageView.layer.contentsRect = CGRectMake(offsetWidth / self.imageView.frame.size.width, 0.0, 1.0, 1.0);
} else if (imageAspect < imageViewAspect) {
    float scaledImageHeight = self.imageView.frame.size.width * imageAspect;
    float offsetHeight = ((scaledImageHeight - self.imageView.frame.size.height) / 2);
    self.imageView.layer.contentsRect = CGRectMake(0.0, offsetHeight / self.imageView.frame.size.height, 1.0, 1.0);
}
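For reference, contentsRect is expressed in the unit coordinate space of the image (0 to 1 on each axis), so the visible sub-rectangle can also be computed directly. Here is a minimal sketch of that idea, assuming the same self.imageView outlet with UIViewContentModeScaleAspectFill already set (apply it after assigning the image); it anchors the crop at the top/left instead of the center, and you can adjust the origin to show a different part:

// contentsRect is a unit rectangle of the image; with aspect-fill gravity the
// selected sub-rect already matches the view's aspect ratio, so anchoring it at
// the origin shows the top (or left) of the image instead of the center.
CGFloat imageAspect = self.imageView.image.size.width / self.imageView.image.size.height;
CGFloat viewAspect  = self.imageView.bounds.size.width / self.imageView.bounds.size.height;

if (imageAspect > viewAspect) {
    // Image is proportionally wider than the view: only a horizontal slice is visible.
    CGFloat visibleWidth = viewAspect / imageAspect;   // fraction of the image width shown
    self.imageView.layer.contentsRect = CGRectMake(0.0, 0.0, visibleWidth, 1.0); // left edge
} else if (imageAspect < viewAspect) {
    // Image is proportionally taller than the view: only a vertical slice is visible.
    CGFloat visibleHeight = imageAspect / viewAspect;  // fraction of the image height shown
    self.imageView.layer.contentsRect = CGRectMake(0.0, 0.0, 1.0, visibleHeight); // top edge
}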
Try something like this:
CGRect cropRect = CGRectMake(0,0,200,200);
CGImageRef imageRef = CGImageCreateWithImageInRect([ImageToCrop CGImage],cropRect);
UIImage *image = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
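One caveat worth noting (an assumption about how the source image was created, not something stated above): CGImageCreateWithImageInRect works in the pixel coordinates of the underlying CGImage, while UIImage.size is in points, so for Retina images the crop rect usually needs to be multiplied by the image's scale. A rough sketch, reusing the hypothetical ImageToCrop and cropRect from above:

// Convert a crop rect given in points into the CGImage's pixel coordinates.
CGFloat scale = ImageToCrop.scale;
CGRect pixelRect = CGRectMake(cropRect.origin.x * scale,
                              cropRect.origin.y * scale,
                              cropRect.size.width * scale,
                              cropRect.size.height * scale);
CGImageRef croppedRef = CGImageCreateWithImageInRect([ImageToCrop CGImage], pixelRect);
// Preserve scale and orientation when wrapping the CGImage back into a UIImage.
UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef
                                            scale:ImageToCrop.scale
                                      orientation:ImageToCrop.imageOrientation];
CGImageRelease(croppedRef);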
I found a very good approximation in this answer. There, a category resizes the image and then crops it around its center point. I adapted it to crop using (0,0) as the origin point, and since I don't really need a category, I use it as a single method.
- (UIImage *)imageByScalingAndCropping:(UIImage *)image forSize:(CGSize)targetSize {
    UIImage *sourceImage = image;
    UIImage *newImage = nil;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetSize.width;
    CGFloat scaledHeight = targetSize.height;

    if (CGSizeEqualToSize(image.size, targetSize) == NO) {
        if ((targetSize.width / image.size.width) > (targetSize.height / image.size.height)) {
            scaleFactor = targetSize.width / image.size.width;   // fill the width; the height overflows and is cropped
        } else {
            scaleFactor = targetSize.height / image.size.height; // fill the height; the width overflows and is cropped
        }
        scaledWidth = image.size.width * scaleFactor;
        scaledHeight = image.size.height * scaleFactor;
    }

    UIGraphicsBeginImageContext(targetSize); // anything drawn outside targetSize is cropped
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = CGPointZero;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;

    [sourceImage drawInRect:thumbnailRect];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    if (newImage == nil) {
        NSLog(@"could not scale image");
    }

    // pop the context to get back to the default
    UIGraphicsEndImageContext();
    return newImage;
}
And my call is something like this:
self.viewIm = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 120, 80)];
self.viewIm.image = [self imageByScalingAndCropping:[UIImage imageNamed:@"benjen"] forSize:CGSizeMake(120, 80)];
[self.view addSubview:self.viewIm];
I've spent some time on this and finally created a Swift 3.2 solution (based on one of my answers on another thread, as well as one of the answers above). This code only allows for Y translation of the image, but with some tweaks anyone should be able to add horizontal translation as well ;)
let yOffset: CGFloat = 20
myImageView.contentMode = .scaleAspectFill
//scale image to fit the imageView's width (maintaining aspect ratio), but allow control over the image's Y position
UIGraphicsBeginImageContextWithOptions(myImageView.frame.size, myImageView.isOpaque, 0.0)
let ratio = myImage.size.width / myImage.size.height
let newHeight = myImageView.frame.width / ratio
let rect = CGRect(x: 0, y: -yOffset, width: myImageView.frame.width, height: newHeight)
myImage.draw(in: rect)
let newImage = UIGraphicsGetImageFromCurrentImageContext() ?? myImage
UIGraphicsEndImageContext()
//set the new image
myImageView.image = newImage
Now you can adjust how far down or up you need the image to be by changing the yOffset.
Related
I am using the code below, first to create an image thumbnail (using a category) and then to tailor the thumbnail to the view controller in question, for example to make it round.
Somehow the aspect ratio of the images is not being preserved: some get squashed vertically, so a face looks like a sideways oval, while others get squashed horizontally, so a round ball looks like an upright football. In the code for the individual VCs I am using UIViewContentModeScaleAspectFill and setting clipsToBounds to YES, but to no avail. I also tried setting these in the Storyboard, but still no luck.
Can anyone see what might be wrong with the code below?
//code in viewDidLoad
UIImage *thumbnail = [selectedImage createThumbnailToFillSize:CGSizeMake(side, side)];
//see createThumbNail method below
self.contactImage.image = thumbnail;
//image has been selected and trimmed to thumb. Now format it
CGSize itemSize = CGSizeMake(64, 64);
UIGraphicsBeginImageContextWithOptions(itemSize, NO, UIScreen.mainScreen.scale);
CGRect imageRect = CGRectMake(0.0, 0.0, itemSize.width, itemSize.height);
self.contactImage.contentMode = UIViewContentModeScaleAspectFill;
self.contactImage.clipsToBounds = YES;
[self.contactImage.image drawInRect:imageRect];
self.contactImage.image = UIGraphicsGetImageFromCurrentImageContext();
self.contactImage.layer.cornerRadius=60.0;
UIGraphicsEndImageContext();
//Generic category to create thumb
- (UIImage *)createThumbnailToFillSize:(CGSize)size
{
    CGSize mainImageSize = size;
    UIImage *thumb;
    CGFloat widthScaler = size.width / mainImageSize.width;
    CGFloat heightScaler = size.height / mainImageSize.height;
    CGSize repositionedMainImageSize = mainImageSize;
    CGFloat scaleFactor;

    // Determine if we should shrink based on width or height
    if (widthScaler > heightScaler)
    {
        // calculate based on width scaler
        scaleFactor = widthScaler;
        repositionedMainImageSize.height = ceil(size.height / scaleFactor);
    }
    else {
        // calculate based on height scaler
        scaleFactor = heightScaler;
        repositionedMainImageSize.width = ceil(size.width / heightScaler);
    }

    UIGraphicsBeginImageContext(size);
    CGFloat xInc = ((repositionedMainImageSize.width - mainImageSize.width) / 2.f) * scaleFactor;
    CGFloat yInc = ((repositionedMainImageSize.height - mainImageSize.height) / 2.f) * scaleFactor;

    [self drawInRect:CGRectMake(xInc, yInc, mainImageSize.width * scaleFactor, mainImageSize.height * scaleFactor)];
    thumb = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return thumb;
}
How can I retrieve the image from the imageView sized as it is displayed (given the content mode), and not at its native size?
Code:
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, WID, WID)];
imageView.center = CGPointMake(point.x, point.y + Y_OFFSET);
imageView.image = [UIImage imageNamed:@"img"];
imageView.contentMode = UIViewContentModeScaleAspectFit;
You have to draw the image again and then save it.
// Image frame size
CGSize size = imageView.bounds.size;
// Grab a new CGContext
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
// Draw the image
[image drawInRect:CGRectMake(0, 0, size.width, size.height)];
// Grab the new image
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The above code draws the image into the frame, stretched to the bounds. If you want any other drawing mode, you have to calculate the rect yourself and use it in the "Draw the image" line of code.
For example, for aspect fit, check out this algorithm:
- (CGRect)aspectFittedRect:(CGSize)inSize max:(CGRect)maxRect {
    float originalAspectRatio = inSize.width / inSize.height;
    float maxAspectRatio = maxRect.size.width / maxRect.size.height;

    CGRect newRect = maxRect;
    if (originalAspectRatio > maxAspectRatio) { // image is proportionally wider: scale by width
        newRect.size.height = maxRect.size.width * inSize.height / inSize.width;
        newRect.origin.y += (maxRect.size.height - newRect.size.height) / 2.0;
    } else {                                    // image is proportionally taller: scale by height
        newRect.size.width = maxRect.size.height * inSize.width / inSize.height;
        newRect.origin.x += (maxRect.size.width - newRect.size.width) / 2.0;
    }
    return CGRectIntegral(newRect);
}
Just pass in imageView.image.size as inSize and imageView.bounds as maxRect.
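To illustrate, here is a short sketch (my own addition, not from the linked source) of how that helper might be combined with the drawing code above to capture the image exactly as an aspect-fit image view displays it; self.imageView is assumed to be the image view in question:

// Render the image the way the aspect-fit image view shows it (letterboxed and centered).
UIImageView *imageView = self.imageView;
CGRect drawRect = [self aspectFittedRect:imageView.image.size max:imageView.bounds];

UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, NO, 0.0);
[imageView.image drawInRect:drawRect];
UIImage *displayedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();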
Source:
http://iphonedevsdk.com/forum/iphone-sdk-development-advanced-discussion/15001-aspect-fit-algorithm.html
I am using the PBJVision library to capture images.
Under the hood it uses AVFoundation.
I set up the camera with the following options:
PBJVision *camera = [[PBJVision alloc] init];
self.camera = camera;
self.camera.delegate = self;
[self.camera setCameraMode:PBJCameraModePhoto];
[self.camera setCameraDevice:PBJCameraDeviceFront];
[self.camera setCameraOrientation:PBJCameraOrientationPortrait];
[self.camera setFocusMode:PBJFocusModeAutoFocus];
[self.camera setPresentationFrame:self.previewView.frame];
[self.camera previewLayer].frame = self.previewView.bounds;
[self.camera previewLayer].videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.previewView.layer addSublayer:[self.camera previewLayer]];
Because the preview layer's gravity is set to AVLayerVideoGravityResizeAspectFill, the captured image isn't identical to the previewed image.
How can I crop it according to the video gravity?
Based on Erica Sadun's excellent Cookbook, adding the code below to your view controller will allow you to do:
UIImage *newImage = [self applyAspectFillImage:image InRect:self.previewView.bounds];
You can obtain the maximum image size by using the smaller edge (width or height) of the original photo image to size your destination rectangle.
CGFloat scaleW = image.size.width / previewView.bounds.size.width;
CGRect destRect = CGRectMake(0, 0, image.size.width, previewView.bounds.size.height * scaleW);
UIImage *newImage = [self applyAspectFillImage:image InRect:destRect];
The code:
CGRect CGRectCenteredInRect(CGRect rect, CGRect mainRect)
{
    CGFloat xOffset = CGRectGetMidX(mainRect) - CGRectGetMidX(rect);
    CGFloat yOffset = CGRectGetMidY(mainRect) - CGRectGetMidY(rect);
    return CGRectOffset(rect, xOffset, yOffset);
}

// Calculate the destination scale for filling
CGFloat CGAspectScaleFill(CGSize sourceSize, CGRect destRect)
{
    CGSize destSize = destRect.size;
    CGFloat scaleW = destSize.width / sourceSize.width;
    CGFloat scaleH = destSize.height / sourceSize.height;
    return MAX(scaleW, scaleH);
}

CGRect CGRectAspectFillRect(CGSize sourceSize, CGRect destRect)
{
    CGSize destSize = destRect.size;
    CGFloat destScale = CGAspectScaleFill(sourceSize, destRect);

    CGFloat newWidth = sourceSize.width * destScale;
    CGFloat newHeight = sourceSize.height * destScale;

    CGFloat dWidth = ((destSize.width - newWidth) / 2.0f);
    CGFloat dHeight = ((destSize.height - newHeight) / 2.0f);

    CGRect rect = CGRectMake(dWidth, dHeight, newWidth, newHeight);
    return rect;
}

- (UIImage *)applyAspectFillImage:(UIImage *)image InRect:(CGRect)bounds
{
    CGRect destRect;

    UIGraphicsBeginImageContext(bounds.size);
    CGRect rect = CGRectAspectFillRect(image.size, bounds);
    destRect = CGRectCenteredInRect(rect, bounds);

    [image drawInRect:destRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return newImage;
}
I am creating an iPhone app that has an image cropping feature. I get the photo from UIImagePickerController and pass it on for cropping. The cropping screen has a scroll view, and the selected image is added as a subview to the scroll view. I use a UIButton for selecting the area to crop: the user can move the button over the image view and place it anywhere, and when the CROP button is tapped, the area matching the button's frame should be cropped from the image view.
I used the following code, but it is not returning the actual image.
CGRect clippedRect = CGRectMake(self.scrollView.frame.origin.x+90, self.scrollView.frame.origin.y, self.scrollView.frame.size.width-180, self.scrollView.frame.size.height-220);
CGImageRef imageRef = CGImageCreateWithImageInRect([self.myPhoto CGImage], clippedRect);
UIImage *newImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
self.imageView.image = newImage;
I also used:
- (UIImage *)cropImage:(UIImage *)oldImage {
    CGSize imageSize = self.cropFrame.frame.size;
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(imageSize.width, imageSize.height), NO, 0.);
    [oldImage drawAtPoint:CGPointMake(xPosition, yPosition)
                blendMode:kCGBlendModeCopy
                    alpha:1.];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}
but the resulting image is not the exact area under the button's frame; I am getting the image from another area.
Updated code
- (void)loadPhoto {
    CGFloat w = self.myPhoto.size.width;
    CGFloat h = self.myPhoto.size.height;
    CGRect imageViewFrame = CGRectMake(0.0f, 0.0f, roundf(w / 2.0f), roundf(h / 2.0f));
    self.scrollView.contentSize = imageViewFrame.size;

    UIImageView *iv = [[UIImageView alloc] initWithFrame:imageViewFrame];
    iv.contentMode = UIViewContentModeScaleAspectFit;
    iv.image = self.myPhoto;
    [self.view addSubview:iv];

    self.imageView = iv;
    [iv release];
}
CGRect crop; // = CGRectMake(10, 10, 360, 360);
crop.origin.x = self.cropFrame.frame.origin.x;
crop.origin.y = self.cropFrame.frame.origin.y;
crop.size.width = roundf(self.cropFrame.frame.size.width * 2.0f);   // self.cropFrame.frame.size.width * 2;
crop.size.height = roundf(self.cropFrame.frame.size.height * 2.0f); // self.cropFrame.frame.size.height * 2;

NSLog(@"Rect: %@", NSStringFromCGRect(crop));
self.imageView.image = [self croppedImage:crop];
- (UIImage *)croppedImage:(CGRect)bounds {
    CGImageRef imageRef = CGImageCreateWithImageInRect([self.imageView.image CGImage], bounds);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:self.myPhoto.imageOrientation];
    CGImageRelease(imageRef);
    return croppedImage;
}
Please help to find a solution.
iOS has a built-in feature for cropping images. Try this code:
picker.allowsEditing = YES;
Also check out this controller for cropping; I think it is exactly the one you are looking for: https://github.com/barrettj/BJImageCropper. Hope this helps.
Since you are using a scrollView that allows the image to be scrolled, you need to adjust your crop rect to the scrollView's position:
float zoomScale = self.scrollView.zoomScale;
int cropX = (self.scrollView.contentOffset.x-imageView.frame.origin.x)/zoomScale;
int cropY = (self.scrollView.contentOffset.y-imageView.frame.origin.y)/zoomScale;
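From there, a hedged sketch of how those adjusted values might feed into the actual crop (the width and height are divided by the zoom scale as well; imageView.image is assumed to be the full photo, and, as noted earlier, the rect may still need multiplying by image.scale on Retina devices):

// Crop rect in the image's coordinate space, derived from the scroll view's state.
CGRect cropRect = CGRectMake(cropX,
                             cropY,
                             self.scrollView.bounds.size.width / zoomScale,
                             self.scrollView.bounds.size.height / zoomScale);
CGImageRef croppedRef = CGImageCreateWithImageInRect([imageView.image CGImage], cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef
                                            scale:imageView.image.scale
                                      orientation:imageView.image.imageOrientation];
CGImageRelease(croppedRef);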
You could use this crop tool that I made. It essentially gives you an interface that lets the user select the crop area. I think it is in line with what you are looking for:
https://github.com/nicholjs/BFCropInterface
I believe you have solved this by now, but I ran into the same issue when I tried to implement cropping.
Set the imageView size and the scrollView.contentSize to the image.size. The code below will give the rect to crop:
cropRect.origin = scrollView.contentOffset;
cropRect.size = scrollView.bounds.size;
cropRect.origin.x /= scrollView.zoomScale;
cropRect.origin.y /= scrollView.zoomScale;
cropRect.size.width /= scrollView.zoomScale;
cropRect.size.height /= scrollView.zoomScale;
If you plan to show the full image first in the visible rect, setting the imageView size and scrollView.contentSize to the visible view's size will give a cropped image of some other area. Instead, try finding the zoom scale with
CGFloat dxWidth = viewCrop.frame.size.width / imageView.image.size.width;
CGFloat dxHeight = viewCrop.frame.size.height / imageView.image.size.height;
CGFloat zoomScale = fmaxf(dxWidth, dxHeight);
and apply it (if the image view is added as a subview, do this after calling addSubview:):
scrollView.minimumZoomScale = zoomScale; // to disable further zoom-out
[scrollView setZoomScale: zoomScale];
I want to resize a UIImage while maintaining its aspect ratio. I have written the following code, but it is not working as expected.
Code
- (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize {
    CGFloat scaleFactor = 1.0;
    if (image.size.width > targetSize.width || image.size.height > targetSize.height)
        if (!((scaleFactor = (targetSize.width / image.size.width)) > (targetSize.height / image.size.height))) // scale to fit width, or
            scaleFactor = targetSize.height / image.size.height; // scale to fit height

    UIGraphicsBeginImageContext(targetSize);
    CGRect rect = CGRectMake((targetSize.width - image.size.width * scaleFactor) / 2,
                             (targetSize.height - image.size.height * scaleFactor) / 2,
                             image.size.width * scaleFactor,
                             image.size.height * scaleFactor);
    [image drawInRect:rect];

    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
What exactly is wrong here?
I'm using something similar to this in a few projects:
- (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)newSize {
    CGSize scaledSize = newSize;
    float scaleFactor = 1.0;

    if (image.size.width > image.size.height) {
        scaleFactor = image.size.width / image.size.height;
        scaledSize.width = newSize.width;
        scaledSize.height = newSize.height / scaleFactor;
    }
    else {
        scaleFactor = image.size.height / image.size.width;
        scaledSize.height = newSize.height;
        scaledSize.width = newSize.width / scaleFactor;
    }

    UIGraphicsBeginImageContextWithOptions(scaledSize, NO, 0.0);
    CGRect scaledImageRect = CGRectMake(0.0, 0.0, scaledSize.width, scaledSize.height);
    [image drawInRect:scaledImageRect];

    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
If this is for displaying in a UI, you can use Interface Builder and specify the "Aspect Fit" property.
You can also do this in code by setting the content mode to UIViewContentModeScaleAspectFit:
imageView.contentMode = UIViewContentModeScaleAspectFit;
This method takes an image and a max dimension. If the max original image dimension is less than the specified max dimension, you get the original back so it won't get blown up. Otherwise, you get a new image having a max dimension of the one specified and the other determined by the original aspect ratio. So as an example, if you give it a 1024x768 image and max dimension of 640 you get back a 640x480 version, if you give it a 768x1024 image and max dimension of 640 you get back a 480x640 version.
- (UIImage *)resizeImage:(UIImage *)image
        withMaxDimension:(CGFloat)maxDimension
{
    if (fmax(image.size.width, image.size.height) <= maxDimension) {
        return image;
    }

    CGFloat aspect = image.size.width / image.size.height;
    CGSize newSize;

    if (image.size.width > image.size.height) {
        newSize = CGSizeMake(maxDimension, maxDimension / aspect);
    } else {
        newSize = CGSizeMake(maxDimension * aspect, maxDimension);
    }

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    CGRect newImageRect = CGRectMake(0.0, 0.0, newSize.width, newSize.height);
    [image drawInRect:newImageRect];

    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return newImage;
}
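A call site might look something like this (photo here is just a placeholder for whatever image you already have; 640.0 matches the examples above):

// Downscale only if the photo's longer side exceeds 640; otherwise the original is returned.
UIImage *photo = self.imageView.image; // hypothetical source image
UIImage *thumb = [self resizeImage:photo withMaxDimension:640.0];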
In my case (I wanted to have an image in a UIView and keep its aspect ratio with Auto Layout), the approach described here worked great. The most important part of the article for me was (almost quoting):
constraint = [NSLayoutConstraint constraintWithItem:self.imageView
                                          attribute:NSLayoutAttributeHeight
                                          relatedBy:NSLayoutRelationEqual
                                             toItem:self.imageView
                                          attribute:NSLayoutAttributeWidth
                                         multiplier:ratio
                                           constant:0.0f];
[self.imageView addConstraint:constraint];
The only remaining step is to calculate the ratio from the actual width and height of the image. I found this approach easy and elegant.
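As a sketch of that calculation (my own addition; self.aspectConstraint is a hypothetical stored property used so the constraint can be replaced if the image changes, since a constraint's multiplier cannot be modified after creation):

// The multiplier is the image's height expressed as a multiple of its width.
CGFloat ratio = self.imageView.image.size.height / self.imageView.image.size.width;

self.aspectConstraint.active = NO; // drop the previous constraint, if any
self.aspectConstraint = [NSLayoutConstraint constraintWithItem:self.imageView
                                                     attribute:NSLayoutAttributeHeight
                                                     relatedBy:NSLayoutRelationEqual
                                                        toItem:self.imageView
                                                     attribute:NSLayoutAttributeWidth
                                                    multiplier:ratio
                                                      constant:0.0f];
self.aspectConstraint.active = YES;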
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    float ratio = newSize.width / image.size.width;
    [image drawInRect:CGRectMake(0, 0, newSize.width, ratio * image.size.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    NSLog(@"New Image Size : (%f, %f)", newImage.size.width, newImage.size.height);
    UIGraphicsEndImageContext();
    return newImage;
}
In Swift 5 you can use .scaleAspectFit for the contentMode property:
let photo = UIImage(imageLiteralResourceName: "your_img_name")
let photoImageView = UIImageView(image: photo)
photoImageView.contentMode = .scaleAspectFit