UIImageView in UITableViewCell low resolution after resizing - iOS

I'm really frustrated with Auto Layout and UITableViewCell. I have a UITableView with dynamically sized cells. A cell can contain an image, and the image should fit the UIImageView's width.
The downloaded images are larger than the UIImageView, so after downloading I resize each image and then assign it to the UIImageView. However, resizing the image to the UIImageView's width in points produces a low-resolution result, because the UIScreen's scale is greater than 1. So I tried multiplying the target size by the scale (size = (scale * imageViewWidth, scale * imageViewHeight)).
This time, though, the cell's height becomes twice as big. What I'm trying to do is increase the resolution while keeping the UIImageView's height the same.
class ProgressImageView: UIImageView {
    func setImage(imagePath: String?, placeholder: String, showProgress: Bool, showDetail: Bool) {
        if imagePath == nil {
            self.image = UIImage(named: placeholder)
        } else {
            if showProgress {
                self.setShowActivityIndicatorView(true)
                self.setIndicatorStyle(.Gray)
            }
            let placeholderImage = UIImage(named: placeholder)
            SDWebImageManager.sharedManager().downloadImageWithURL(NSURL(string: imagePath!), options: SDWebImageOptions.RefreshCached, progress: nil, completed: { (image, error, _, _, _) in
                // resize image to fit width
                if error != nil {
                    self.image = placeholderImage
                } else {
                    self.setImageFitWidth(image)
                }
            })
        }
    }
}
extension UIImageView {
    func setImageFitWidth(image: UIImage) {
        let w = self.bounds.size.width //* UIScreen.mainScreen().scale
        print("ImageView size: \(self.bounds.size) scaled width: \(w)")
        self.image = image.convertToWidth(w)
    }
}
extension UIImage {
    func convertToWidth(width: CGFloat) -> UIImage {
        print("Old image size: \(self.size)")
        let ratio = self.size.width / width
        let height = self.size.height / ratio
        let size = CGSizeMake(width, height)
        print("New image size: \(size)")
        UIGraphicsBeginImageContext(size)
        self.drawInRect(CGRectMake(0, 0, size.width, size.height))
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
As you can see in the setImageFitWidth method, without UIScreen.mainScreen().scale everything works fine except the resolution, and with the scale the UIImageView doubles in size. I have tried every content mode (aspect fit, aspect fill, etc.), and I am using Auto Layout.
Result in low resolution:
Result in high resolution:
I want my screen to look like the first screenshot, but with the image resolution of the second.

Try using UIGraphicsBeginImageContextWithOptions(size, true, 0) to draw your image.
A scale of 0 means the function will pick up the proper scale factor from the device.
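For example, a minimal sketch of the question's convertToWidth(_:) helper using that call; apart from the context options it follows the asker's original code:
extension UIImage {
    func convertToWidth(width: CGFloat) -> UIImage {
        let ratio = self.size.width / width
        let size = CGSizeMake(width, self.size.height / ratio)
        // A scale of 0 renders the bitmap at the screen's scale (@2x/@3x) while the
        // result stays `width` points wide, so the cell height is unchanged.
        // Pass false for the opaque flag if the image can contain transparency.
        UIGraphicsBeginImageContextWithOptions(size, true, 0)
        self.drawInRect(CGRectMake(0, 0, size.width, size.height))
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}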

Related

Quality gets reduced when converting an image view to an image

In a photo editor screen, I have an image view with a background image, and on top of it I add elements such as text (labels), stickers (images), etc. To produce the final image containing all the elements added to the image view, I use the code below.
clipRect is the rect of the background image inside the image view (the image is aspect-fit in the image view).
The code below lives in a UIView extension with a function that generates an image from the view.
// self is the UIView being rendered; clipRect and outputSize are defined by the caller
let op_format = UIGraphicsImageRendererFormat()
op_format.scale = 1.0
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize), format: op_format)
let originalBound = self.bounds
self.bounds = CGRect(origin: clipRect!.origin, size: clipRect!.size)
let finalImage = renderer.image { ctx in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}
self.bounds = CGRect(origin: originalBound.origin, size: originalBound.size)
The issue is that the quality of the final image is very poor compared to the original background image.
Don't set the scale of your UIGraphicsImageRendererFormat to 1. That forces the output image to @1x (non-retina). For most (all?) iOS devices, that causes a 2x or 3x loss of resolution. Leave the scale of the UIGraphicsImageRendererFormat at its default value.
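A minimal sketch under the same assumptions as the question's snippet (clipRect, outputSize, and the surrounding extension are the asker's):
// Omitting the format gives UIGraphicsImageRendererFormat.default(),
// which uses the screen's scale, so retina resolution is preserved.
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize))
let finalImage = renderer.image { _ in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}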
You can take a screenshot of the view as well:
// Convert a UIView to a UIImage
func captureView() -> UIImage {
    // Increase this number gradually for higher resolution.
    let scale: CGFloat = 1.0
    UIGraphicsBeginImageContextWithOptions(bounds.size, opaque, scale)
    layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}
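If you want the capture at full retina resolution, the same advice as above applies; a sketch of the only line that changes, assuming everything else stays the same:
// Passing 0 as the scale lets UIKit pick the device's scale factor instead of forcing @1x
UIGraphicsBeginImageContextWithOptions(bounds.size, opaque, 0)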

How to reduce the image size coming from web services?

As shown in the image below, I am getting images from a web service and displaying them in a table view. When scrolling up and down, the image size increases and overlaps the labels, even though I have set constraints. Can anyone help me avoid this?
You don't need to resize the image.
First, set a fixed height and width for your image view with constraints in the table view cell.
Second, set the image view's content mode to aspect fit:
imageView.contentMode = UIViewContentModeScaleAspectFit;
Add constraints to your image view as shown in the screenshot below.
Hope this works for you; if you have any questions about it, just comment.
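If you are building the cell in code rather than Interface Builder, a minimal Swift sketch of the same idea (the thumbnailImageView name and the 80-point size are assumptions):
// Fixed-size image view that aspect-fits its image inside a cell's contentView
thumbnailImageView.contentMode = .scaleAspectFit
thumbnailImageView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    thumbnailImageView.widthAnchor.constraint(equalToConstant: 80),
    thumbnailImageView.heightAnchor.constraint(equalToConstant: 80),
    thumbnailImageView.leadingAnchor.constraint(equalTo: contentView.leadingAnchor, constant: 8),
    thumbnailImageView.centerYAnchor.constraint(equalTo: contentView.centerYAnchor)
])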
Using the content mode to fit the image is an option, but if you want to crop, resize, or compress the image, check the code below.
Call it like let imageData = image.compressImage(rate: 0.5), and you can then write the data out if needed.
func compressImage(rate: CGFloat) -> Data? {
    return UIImageJPEGRepresentation(self, rate)
}
Or, if you want to crop the image:
func croppedImage(_ bound: CGRect) -> UIImage? {
    guard self.size.width > bound.origin.x else {
        print("X coordinate is larger than the image width")
        return nil
    }
    guard self.size.height > bound.origin.y else {
        print("Y coordinate is larger than the image height")
        return nil
    }
    // Convert the rect from points to pixels before cropping the CGImage
    let scaledBounds = CGRect(x: bound.origin.x * self.scale,
                              y: bound.origin.y * self.scale,
                              width: bound.size.width * self.scale,
                              height: bound.size.height * self.scale)
    guard let imageRef = self.cgImage?.cropping(to: scaledBounds) else { return nil }
    let croppedImage = UIImage(cgImage: imageRef, scale: self.scale, orientation: UIImageOrientation.up)
    return croppedImage
}
Make sure to add the above methods to a UIImage extension.
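A usage sketch, assuming a UIImage named photo and a crop rect in points (both hypothetical):
// Crop a 100x100-point square starting at (10, 10); croppedImage returns nil
// if the origin falls outside the image
if let cropped = photo.croppedImage(CGRect(x: 10, y: 10, width: 100, height: 100)) {
    imageView.image = cropped
}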
I placed a container view on the table view cell, put all the elements in it, and gave constraints to the view and the elements. That fixed my problem; it now works perfectly while scrolling, as shown in the image below.
Here is the layout for this screen:

Resize Image without distorting or cropping it

Users upload an image of any size and we need to resize it so it becomes a square without distorting or cropping the image. Basically, it should do something similar to the "Aspect Fit" content mode in an image view. So if we have a 200x100px png image, I want to make it 200x200px and have the extra 100px in the height be transparent space. It should not crop the image to 200x200.
I tried to use this image processor but it does not do what I want. https://github.com/gavinbunney/Toucan. It only crops the image.
How would I do this in Swift, and is there a framework better than the one I mentioned above that makes it easier? Basically, I am looking for the simplest way to do this.
Posting this as an answer, along with example usage...
The scaling code is not mine, it's from: https://gist.github.com/tomasbasham/10533743#gistcomment-1988471
Here is code you can run in a playground to test:
import UIKit
import PlaygroundSupport

let container = UIView(frame: CGRect(x: 0, y: 0, width: 800, height: 800))
container.backgroundColor = UIColor.blue
PlaygroundPage.current.liveView = container

// MARK: - Image Scaling.
extension UIImage {

    /// Represents a scaling mode
    enum ScalingMode {
        case aspectFill
        case aspectFit

        /// Calculates the aspect ratio between two sizes
        ///
        /// - parameters:
        ///     - size:      the first size used to calculate the ratio
        ///     - otherSize: the second size used to calculate the ratio
        ///
        /// - return: the aspect ratio between the two sizes
        func aspectRatio(between size: CGSize, and otherSize: CGSize) -> CGFloat {
            let aspectWidth = size.width / otherSize.width
            let aspectHeight = size.height / otherSize.height

            switch self {
            case .aspectFill:
                return max(aspectWidth, aspectHeight)
            case .aspectFit:
                return min(aspectWidth, aspectHeight)
            }
        }
    }

    /// Scales an image to fit within a bounds with a size governed by the passed size. Also keeps the aspect ratio.
    ///
    /// - parameters:
    ///     - newSize: the size of the bounds the image must fit within.
    ///     - scalingMode: the desired scaling mode
    ///
    /// - returns: a new scaled image.
    func scaled(to newSize: CGSize, scalingMode: UIImage.ScalingMode = .aspectFill) -> UIImage {
        let aspectRatio = scalingMode.aspectRatio(between: newSize, and: size)

        /* Build the rectangle representing the area to be drawn */
        var scaledImageRect = CGRect.zero
        scaledImageRect.size.width = size.width * aspectRatio
        scaledImageRect.size.height = size.height * aspectRatio
        scaledImageRect.origin.x = (newSize.width - size.width * aspectRatio) / 2.0
        scaledImageRect.origin.y = (newSize.height - size.height * aspectRatio) / 2.0

        /* Draw and retrieve the scaled image */
        UIGraphicsBeginImageContext(newSize)
        draw(in: scaledImageRect)
        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        return scaledImage!
    }
}

if let srcimg = UIImage(named: "flags") {
    let w = srcimg.size.width
    let h = srcimg.size.height
    // determine whether width or height is greater
    let longer = max(w, h)
    // create a square size
    let sz = CGSize(width: longer, height: longer)
    // call the scaling function to scale the image to the square dimensions,
    // using "aspect fit"
    let newImage = srcimg.scaled(to: sz, scalingMode: .aspectFit)
    // create a UIImageView with the resulting image
    let v = UIImageView(image: newImage)
    v.backgroundColor = UIColor.white
    // add it to the container view
    container.addSubview(v)
}
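One caveat that ties back to the first question on this page: UIGraphicsBeginImageContext always renders at scale 1, so the squared image loses retina detail. A small variation of the drawing section, assuming you also want the padding transparent:
/* Draw at the screen's scale instead of @1x; false keeps the padding transparent */
UIGraphicsBeginImageContextWithOptions(newSize, false, 0)
draw(in: scaledImageRect)
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()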

UIImageView AspectFill failing

I'm programmatically rendering a 16:9 pixel image and then using a UIImageView with the kCAFilterNearest filter and the .ScaleAspectFill content mode to display it as a background image. The result I get is this:
In the picture you can see that, for whatever reason, it scales the picture exactly right but shifts it up slightly (by about half a scaled pixel) and leaves a line at the bottom. That line belongs to the UIImageView the image is displayed in (it's the only view in this UIViewController, and its background color is navy).
What might be the reason for this, considering that I use the UIScreen.mainScreen().bounds rectangle for the UIImageView?
P.S. The ScaleToFill content mode gives much the same result (the line at the bottom remains).
I use a function to resize the image to fit the screen:
func imageResize(imageObj: UIImage, sizeChange: CGSize) -> UIImage {
    let hasAlpha = false
    let scale: CGFloat = 0.0 // Automatically use the scale factor of the main screen
    UIGraphicsBeginImageContextWithOptions(sizeChange, !hasAlpha, scale)
    imageObj.drawInRect(CGRect(origin: CGPointZero, size: sizeChange))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
And to call it:
let mainScreenSize: CGRect = UIScreen.mainScreen().bounds
let image: UIImage! = self.imageResize(UIImage(named: "IMAGE_NAME")!, sizeChange: CGSizeMake(mainScreenSize.width, mainScreenSize.height))
The initial image I used had a 16:9 ratio as well.

Pasteboard UIImage not using scale

I am building a custom keyboard and am having trouble adding an image to the pasteboard while maintaining the appropriate scale and resolution within the pasted image. Let me start with a screenshot of the keyboard to illustrate the problem:
The picture of the face in the top left of the keyboard is just a UIButton with the original photo set as its background. When the button is pressed, the image is resized with the following function:
func imageResize(image: UIImage, size: CGSize) -> UIImage {
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(size, false, scale)
    let context = UIGraphicsGetCurrentContext()
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh)
    image.drawInRect(CGRect(origin: CGPointZero, size: size))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
This function creates a UIImage the same size as the UIButton, with the appropriate scale to reflect the device's screen resolution. To verify that the function is correct, I added a UIImageView filled with the scaled image; it is the image that looks misplaced near the center of the keyboard. I added the UIImageView with this function:
func addImageToBottomRight(image: UIImage) {
    var tempImgView = UIImageView(image: image)
    self.view.addSubview(tempImgView)
    tempImgView.frame.offset(dx: 100.0, dy: 50.0)
}
I have tried a few different methods for adding the image to the pasteboard, but all seem to ignore the scale of the image and display it twice as large as opposed to displaying it at a higher resolution:
let pb = UIPasteboard.generalPasteboard()!
var pngType = UIPasteboardTypeListImage[0] as! String
var jpegType = UIPasteboardTypeListImage[2] as! String
pb.image = image
pb.setData(UIImagePNGRepresentation(image), forPasteboardType: pngType)
pb.setData(UIImageJPEGRepresentation(image, 1.0), forPasteboardType: jpegType)
None of these three methods works correctly; they all produce the same result as illustrated in the screenshot. Does anyone have suggestions for other methods? To clarify my goal further, I would like the image in the message text box to look identical to both UIImages in the keyboard in terms of size and resolution.
Here are a few properties of the UIImage before and after the resize, in case anyone is curious:
Starting Image Size is (750.0, 750.0)
Size to scale to is: (78.0, 78.0)
Initial Scale: 1.0
Resized Image Size is (78.0, 78.0)
Resized Image Scale: 2.0
I know this is an old post, but I thought I'd share the workaround I found for this specific case of copying images and pasting them into messaging apps. The thing is, when you send a picture with apps like iMessage, WhatsApp, Messenger, etc., they display it aspect-fit to some fixed horizontal width (let's say around 260 pt for this demo).
As you can see from the diagram below, if you send a 150x150 image at @1x resolution in iMessage, it will be stretched to fill the required 260-wide box, making the image grainy.
But if you add an empty margin of width 185 to both the left and right sides of the image, you end up with an image of size 520x150. Now if you send that image in iMessage, it has to fit it into the 260-wide box, cramming a 520x150 image into a 260x75 box and effectively giving you a 75x75 image at @2x resolution.
You can add a clear margin to a UIImage with code like this:
extension UIImage {
    func addMargin(withInsets inset: UIEdgeInsets) -> UIImage? {
        let finalSize = CGSize(width: size.width + inset.left + inset.right, height: size.height + inset.top + inset.bottom)
        let finalRect = CGRect(origin: CGPoint(x: 0, y: 0), size: finalSize)
        UIGraphicsBeginImageContextWithOptions(finalSize, false, scale)
        UIColor.clear.setFill()
        UIGraphicsGetCurrentContext()?.fill(finalRect)
        let pictureOrigin = CGPoint(x: inset.left, y: inset.top)
        let pictureRect = CGRect(origin: pictureOrigin, size: size)
        draw(in: pictureRect)
        let finalImage = UIGraphicsGetImageFromCurrentImageContext()
        defer { UIGraphicsEndImageContext() }
        return finalImage
    }
}
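A usage sketch for the 150x150 example above (the image name is hypothetical):
// Pad a 150x150 sticker to 520x150 so a ~260 pt message bubble shows it at roughly @2x
let original = UIImage(named: "sticker")!
let padded = original.addMargin(withInsets: UIEdgeInsets(top: 0, left: 185, bottom: 0, right: 185))
UIPasteboard.general.image = padded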
