Resizing an image to Aspect Fit - iOS

I am trying to resize images using the following popular code. It does resize the image, but it resizes it as Scale to Fill; I would like to resize it as Aspect Fit. How do I do that?
func resizeImage(image: UIImage, newSize: CGSize) -> (UIImage) {
    let newRect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height).integral
    UIGraphicsBeginImageContextWithOptions(newSize, true, 0)
    let context = UIGraphicsGetCurrentContext()
    // Set the quality level to use when rescaling
    context!.interpolationQuality = CGInterpolationQuality.default
    let flipVertical = CGAffineTransform(a: 1, b: 0, c: 0, d: -1, tx: 0, ty: newSize.height)
    context!.concatenate(flipVertical)
    // Draw into the context; this scales the image
    context?.draw(image.cgImage!, in: CGRect(x: 0.0, y: 0.0, width: newRect.width, height: newRect.height))
    // Get the resized image from the context as a UIImage
    let newImageRef = context!.makeImage()! as CGImage
    let newImage = UIImage(cgImage: newImageRef)
    UIGraphicsEndImageContext()
    return newImage
}
I have already set the content mode of the image view to Aspect Fit, but it is still not working.
This is how I called the above code in my collection view controller:
cell.imageView.image = UIImage(named: dogImages[indexPath.row])?.resizeImage(image: UIImage(named: dogImages[indexPath.row])
I manually selected my image view in the storyboard and set its content mode to Aspect Fit.

Did you try setting newSize according to the aspect ratio of the original image? If you want a fixed width, calculate the height from the width; if you want a fixed height, calculate the width from the height.
Calculate the height when the width is fixed:
let fixedWidth: CGFloat = 200
let newHeight = fixedWidth * image.size.height / image.size.width
let convertedImage = resizeImage(image: image, newSize: CGSize(width: fixedWidth, height: newHeight))
Calculate the width when the height is fixed:
let fixedHeight: CGFloat = 200
let newWidth = fixedHeight * image.size.width / image.size.height
let convertedImage = resizeImage(image: image, newSize: CGSize(width: newWidth, height: fixedHeight))
You can use this resized image with an aspect-fit ratio.
Also check this answer: https://stackoverflow.com/a/8858464/2677551, which may help.
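As a minimal sketch (not from the original answer), both cases can be wrapped in one helper that picks the smaller scale factor so the result always fits inside a bounding size. The name aspectFitResize is made up, and it assumes the free resizeImage(image:newSize:) function from the question:
// Sketch: compute an aspect-fit size inside a bounding box, then resize.
// Assumes resizeImage(image:newSize:) from the question is available.
func aspectFitResize(image: UIImage, boundingSize: CGSize) -> UIImage {
    let widthRatio = boundingSize.width / image.size.width
    let heightRatio = boundingSize.height / image.size.height
    let scale = min(widthRatio, heightRatio) // the smaller ratio keeps the whole image visible
    let fittedSize = CGSize(width: image.size.width * scale,
                            height: image.size.height * scale)
    return resizeImage(image: image, newSize: fittedSize)
}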

extension UIImage {
    func scaleImageAspectFit(newSize: CGSize) -> UIImage? {
        var scaledImageRect: CGRect = CGRect.zero
        let aspectWidth: CGFloat = newSize.width / size.width
        let aspectHeight: CGFloat = newSize.height / size.height
        let aspectRatio: CGFloat = min(aspectWidth, aspectHeight)
        scaledImageRect.size.width = size.width * aspectRatio
        scaledImageRect.size.height = size.height * aspectRatio
        scaledImageRect.origin.x = (newSize.width - scaledImageRect.size.width) / 2.0
        scaledImageRect.origin.y = (newSize.height - scaledImageRect.size.height) / 2.0
        UIGraphicsBeginImageContextWithOptions(newSize, false, 0)
        if UIGraphicsGetCurrentContext() != nil {
            draw(in: scaledImageRect)
            let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return scaledImage
        }
        return nil
    }
}
Usage:
let resizedImage = oldImage.scaleImageAspectFit(newSize: CGSize(width: newSize.width, height: newSize.height))
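On iOS 10 and later, the same aspect-fit drawing can also be done with UIGraphicsImageRenderer, which manages the context for you. This is only a sketch of an alternative, not part of the original answer, and the method name rendererScaleAspectFit is made up:
import UIKit

extension UIImage {
    // Sketch: aspect-fit resize using UIGraphicsImageRenderer (iOS 10+).
    func rendererScaleAspectFit(newSize: CGSize) -> UIImage {
        let ratio = min(newSize.width / size.width, newSize.height / size.height)
        let fittedSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        let origin = CGPoint(x: (newSize.width - fittedSize.width) / 2.0,
                             y: (newSize.height - fittedSize.height) / 2.0)
        let renderer = UIGraphicsImageRenderer(size: newSize)
        return renderer.image { _ in
            // Areas outside the fitted rect stay transparent (the default format is not opaque).
            draw(in: CGRect(origin: origin, size: fittedSize))
        }
    }
}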

Related

White line appears on image when image is converted to data using jpegData?

When I convert a UIImage to data using jpegData compression, a thin line is added at the top of the image. Why is that?
I'm also using this function to resize the image.
After resizing, I'm converting it to jpegData.
func resizeImage(image: UIImage, targetSize: CGSize, percentage: Double) -> UIImage {
    let size = image.size
    let widthRatio = targetSize.width / size.width
    let heightRatio = targetSize.height / size.height
    // Figure out what our orientation is, and use that to form the rectangle
    var newSize: CGSize
    if widthRatio > heightRatio {
        newSize = CGSize(width: size.width * heightRatio, height: size.height * heightRatio)
    } else {
        newSize = CGSize(width: size.width * widthRatio, height: size.height * widthRatio)
    }
    // This is the rect that we've calculated out and this is what is actually used below
    let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
    // Actually do the resizing to the rect using the ImageContext stuff
    UIGraphicsBeginImageContextWithOptions(newSize, false, percentage)
    image.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return newImage
}
let image = resizeImage.jpegData(compressionQuality: 0.60)!
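A note on this code (not part of the original question): the third argument of UIGraphicsBeginImageContextWithOptions is the display scale factor, not a compression quality, so passing percentage there changes the backing-store resolution rather than the JPEG quality. A non-integral newSize can also leave an edge of the bitmap undrawn, which may show up as a thin line. The following is only a sketch of those two fixes, with resizeAndCompress being a made-up helper name:
import UIKit

// Sketch only: resize to an integral size, then compress separately.
// `quality` is the JPEG quality; the context scale stays at the screen scale.
func resizeAndCompress(image: UIImage, targetSize: CGSize, quality: CGFloat) -> Data? {
    let ratio = min(targetSize.width / image.size.width,
                    targetSize.height / image.size.height)
    // Round down to whole points so the drawn rect covers the whole context.
    let newSize = CGSize(width: floor(image.size.width * ratio),
                         height: floor(image.size.height * ratio))
    UIGraphicsBeginImageContextWithOptions(newSize, false, UIScreen.main.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: newSize))
    guard let resized = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
    return resized.jpegData(compressionQuality: quality)
}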

Resize transparent image (UIImage) without getting black background

I tried to resize my UIImage, which contains a transparent image, using the following solution, but it returns the image without transparency; instead, the transparent area becomes black.
extension UIImage {
    func resizeImageWith(newSize: CGSize) -> UIImage {
        let horizontalRatio = newSize.width / size.width
        let verticalRatio = newSize.height / size.height
        let ratio = max(horizontalRatio, verticalRatio)
        let newSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        UIGraphicsBeginImageContextWithOptions(newSize, true, 0)
        draw(in: CGRect(origin: CGPoint(x: 0, y: 0), size: newSize))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
You are setting the opaque parameter to true. If you would like the result to be transparent, you need to set it to false:
UIGraphicsBeginImageContextWithOptions(newSize, false, 0)
Note that UIGraphicsGetImageFromCurrentImageContext returns an optional image, so you should change your return type as well, and you can use defer to end the context after returning the result:
extension UIImage {
    func resizeImageWith(newSize: CGSize) -> UIImage? {
        let horizontalRatio = newSize.width / size.width
        let verticalRatio = newSize.height / size.height
        let ratio = max(horizontalRatio, verticalRatio)
        let newSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        UIGraphicsBeginImageContextWithOptions(newSize, false, 0)
        defer { UIGraphicsEndImageContext() }
        draw(in: CGRect(origin: .zero, size: newSize))
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
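Usage might then look like this (the image and size here are just placeholders):
let thumbnail = originalImage.resizeImageWith(newSize: CGSize(width: 200, height: 200))
imageView.image = thumbnail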

Is it possible to use the aspect fill content mode combined with the top content mode in UIImageView?

For a UIImageView, I am using the aspect fill content mode. It's OK, but for some images it cuts off the top because I used clipsToBounds = true. Here is what I want: I want both behaviours active at the same time, for example:
Here is an image view that I set to aspect fill:
...and an image view I set using contentMode = .top:
So I want to merge these two content modes. Is it possible? Thanks in advance.
Update: device scaling is now properly handled, thanks to budidino for that!
You should resize the image so that it has the width of your image view while keeping its aspect ratio. After that, set the image view's content mode to .top and enable clipping to bounds.
The resizeTopAlignedToFill function is a modified version of this answer.
func setImageView() {
    imageView.contentMode = .top
    imageView.clipsToBounds = true
    let image = <custom image>
    imageView.image = image.resizeTopAlignedToFill(newWidth: imageView.frame.width)
}

extension UIImage {
    func resizeTopAlignedToFill(newWidth: CGFloat) -> UIImage? {
        let newHeight = size.height * newWidth / size.width
        let newSize = CGSize(width: newWidth, height: newHeight)
        UIGraphicsBeginImageContextWithOptions(newSize, false, UIScreen.main.scale)
        draw(in: CGRect(origin: .zero, size: newSize))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    }
}
Try this:
imageView.contentMode = UIViewContentModeTop;
imageView.image = [UIImage imageWithCGImage:image.CGImage scale:image.size.width / imageView.frame.size.width orientation:UIImageOrientationUp];
This is my solution. First, set contentMode to .top, then resize the image:
extension UIImage {
    func resize(toWidth scaledToWidth: CGFloat) -> UIImage {
        let image = self
        let oldWidth = image.size.width
        let scaleFactor = scaledToWidth / oldWidth
        let newHeight = image.size.height * scaleFactor
        let newWidth = oldWidth * scaleFactor
        let scaledSize = CGSize(width: newWidth, height: newHeight)
        UIGraphicsBeginImageContextWithOptions(scaledSize, false, 0)
        image.draw(in: CGRect(x: 0, y: 0, width: scaledSize.width, height: scaledSize.height))
        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return scaledImage!
    }
}

imageView.contentMode = .top
imageView.image = imageView.image?.resize(toWidth: imageView.frame.width)
The accepted answer scales down the image and therefore lowers the quality on @2x and @3x devices. This should produce the same result with better image quality:
extension UIImage {
    func resizeTopAlignedToFill(containerSize: CGSize) -> UIImage? {
        let scaleTarget = containerSize.height / containerSize.width
        let scaleOriginal = size.height / size.width
        if scaleOriginal <= scaleTarget { return self }
        let newHeight = size.width * scaleTarget
        let newSize = CGSize(width: size.width, height: newHeight)
        UIGraphicsBeginImageContextWithOptions(newSize, false, scale)
        // Draw at the original size; everything below newHeight is clipped by the context,
        // so the top of the image is kept instead of the image being squashed.
        self.draw(in: CGRect(origin: .zero, size: size))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    }
}
Then, to use it:
imageView.contentMode = .scaleAspectFill
imageView.clipsToBounds = true
imageView.image = UIImage(named: "portrait")?.resizeTopAlignedToFill(containerSize: imageView.frame.size)
As a matter of performance, I would suggest putting the UIImageView at the full image size / aspect ratio inside a container view with clipsToBounds set to true, as sketched below.
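This comment does not include code; the following is only a rough sketch of that container idea, and the view names and constraint choices are assumptions:
// Sketch: top-aligned "aspect fill" without redrawing the image.
// The container clips; the image view keeps the image's own aspect ratio.
let containerView = UIView()
containerView.clipsToBounds = true

let imageView = UIImageView(image: UIImage(named: "portrait"))
imageView.contentMode = .scaleAspectFit
imageView.translatesAutoresizingMaskIntoConstraints = false
containerView.addSubview(imageView)

NSLayoutConstraint.activate([
    // Pin to the top and sides; the bottom is left free, so overflow is clipped.
    imageView.topAnchor.constraint(equalTo: containerView.topAnchor),
    imageView.leadingAnchor.constraint(equalTo: containerView.leadingAnchor),
    imageView.trailingAnchor.constraint(equalTo: containerView.trailingAnchor),
    // Keep the image view at the image's own aspect ratio.
    imageView.heightAnchor.constraint(
        equalTo: imageView.widthAnchor,
        multiplier: (imageView.image?.size.height ?? 1) / (imageView.image?.size.width ?? 1))
])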

iOS: Get displayed image size in pixels

In my app, I'm displaying an image of a rectangle from the assets library. The image is 100x100 pixels. I'm only using the 1x slot for this asset.
I want to display this image at 300x300 pixels. Doing this using points is quite simple but I can't figure out how to get UIImageView to set the size in pixels.
Alternatively, if I can't set the display size in pixels, I'd like to get the size in pixels at which the image is being displayed.
I have tried using .scale on the UIImageView and UIImage instances, but it's always 1, even though I have set constraints to 150 and 300.
To get the size in pixels of a UIImageView:
let widthInPixels = imageView.frame.width * UIScreen.main.scale
let heightInPixels = imageView.frame.height * UIScreen.main.scale
To get the size in pixels of a UIImage:
let widthInPixels = image.size.width * image.scale
let heightInPixels = image.size.height * image.scale
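For example, with the numbers from the question, a 300x300-point image view on a @2x device is 600x600 pixels, so the 100x100-pixel @1x asset is scaled up 6x in each dimension when displayed there.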
Swift 5
Take a look at this:
// This extension renders a UIView (such as a UIImageView) into a UIImage
extension UIView {
    func asImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { rendererContext in
            layer.render(in: rendererContext.cgContext)
        }
    }
}

// This extension resizes an image
extension UIImage {
    func resizeImage(targetSize: CGSize) -> UIImage {
        let size = self.size
        let widthRatio = targetSize.width / size.width
        let heightRatio = targetSize.height / size.height
        let newSize = widthRatio > heightRatio
            ? CGSize(width: size.width * heightRatio, height: size.height * heightRatio)
            : CGSize(width: size.width * widthRatio, height: size.height * widthRatio)
        let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
        // Scale of 1.0 so the result is measured in pixels, not points
        UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
        self.draw(in: rect)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}

// This extension returns the image rendered at 300x300 pixels
extension UIImage {
    func Size300X300() -> UIImage? {
        let imageView = UIImageView()
        imageView.contentMode = .scaleAspectFit
        imageView.frame = CGRect(x: 0, y: 0, width: 300, height: 300)
        imageView.image = self // draw the receiver into the image view
        let image = imageView.asImage()
        let newImage = image.resizeImage(targetSize: CGSize(width: 300, height: 300))
        return newImage
    }
}
let image = YOURIMAGE.Size300X300()
imageView.image = image!

Resize UIImage without squashing

I would like to resize a UIImage so I can upload it to parse.com. I would like to resize it without squashing it. How can I make sure it is small enough to upload to parse.com (10485760 bytes) without squashing it to a fixed size?
This is the code I tried below, but it obviously sets the image size exactly.
Any ideas?
var newSize:CGSize = CGSize(width: 600,height: 600)
let rect = CGRectMake(0,0, newSize.width, newSize.height)
UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0)
// image is a variable of type UIImage
profileImage?.drawInRect(rect)
profileImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
Try using this method to scale your image while maintaining the aspect ratio:
func scaleImage(image: UIImage, maxDimension: CGFloat) -> UIImage {
    var scaledSize = CGSize(width: maxDimension, height: maxDimension)
    var scaleFactor: CGFloat
    if image.size.width > image.size.height {
        scaleFactor = image.size.height / image.size.width
        scaledSize.width = maxDimension
        scaledSize.height = scaledSize.width * scaleFactor
    } else {
        scaleFactor = image.size.width / image.size.height
        scaledSize.height = maxDimension
        scaledSize.width = scaledSize.height * scaleFactor
    }
    UIGraphicsBeginImageContext(scaledSize)
    image.drawInRect(CGRectMake(0, 0, scaledSize.width, scaledSize.height))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
In this case, you would call it like so:
profileImage = scaleImage(profileImage!, maxDimension: 600)
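The question also mentions Parse's 10485760-byte limit, which the answer above does not check. As a rough sketch (not from the original answer, and the helper name is made up), you could step the JPEG quality down until the encoded data fits under the limit:
import UIKit

// Sketch: after the aspect-preserving resize, lower the JPEG quality
// until the data is under the given byte limit (10485760 for parse.com).
func compressedJPEGData(for image: UIImage, maxBytes: Int = 10_485_760) -> Data? {
    var quality: CGFloat = 0.9
    while quality > 0.1 {
        if let data = image.jpegData(compressionQuality: quality), data.count <= maxBytes {
            return data
        }
        quality -= 0.1
    }
    // Fall back to the lowest quality if nothing smaller was found.
    return image.jpegData(compressionQuality: 0.1)
}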
