I have images that are 176 points by 176 points. I am trying to resize them so that they fit into a tableViewCell's imageView. There are a few problems that I am running into.
Problems
I don't know what size the image view in the tableViewCell actually is.
If I simply add the image without resizing it, it is so sharp that it loses detail:
If I use this extension on UIImage (below), the transparent parts of the UIImage turn black, which is not what I want:
extension UIImage {
    /// Scales the image so that it fills the given bounds, preserving aspect ratio.
    func resizeImageWithBounds(bounds: CGSize) -> UIImage {
        let horizontalRatio = bounds.width / size.width
        let verticalRatio = bounds.height / size.height
        let ratio = max(horizontalRatio, verticalRatio)
        let newSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        // `true` here makes the context opaque, which is what turns transparent areas black
        UIGraphicsBeginImageContextWithOptions(newSize, true, 0)
        draw(in: CGRect(origin: .zero, size: newSize))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
I am looking for information on how big the UIImageView is and how to best resize an image into it. I really don't want to create another set of assets (I have a lot of images), and I don't think I should have to.
Try changing the contentMode of the cell's imageView:
cell.imageView?.contentMode = .scaleAspectFit
Or, if that doesn't work, here's how to fix the issue of your resized images turning black:
UIGraphicsBeginImageContextWithOptions(newSize, false, 0) // <-- changed opaque setting from true to false
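With that change in place, a minimal usage sketch inside tableView(_:cellForRowAt:) might look like this. The 40x40 target size, the "Cell" identifier, and the "MyIcon" asset name are assumptions for illustration, not values from the question:
let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
cell.imageView?.contentMode = .scaleAspectFit
// Resize into a small square; measure your own cell if you need an exact fit
let icon = UIImage(named: "MyIcon")
cell.imageView?.image = icon?.resizeImageWithBounds(bounds: CGSize(width: 40, height: 40))
return cell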
Related
On the photo editor screen I have an image view with a background image, and on top of it I add elements such as text (labels), stickers (images), etc. To produce the final image containing all the elements added to the image view, I use the code below.
clipRect is the rect of the background image inside the image view; the image is aspect-fit in the image view.
Below is the code, which lives in a UIView extension that has a function to generate an image from the view.
// self == the UIView being rendered (this code sits in a UIView extension)
let op_format = UIGraphicsImageRendererFormat()
op_format.scale = 1.0
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize), format: op_format)
// Temporarily restrict the view's bounds to the clip rect while drawing
let originalBounds = self.bounds
self.bounds = CGRect(origin: clipRect!.origin, size: clipRect!.size)
var finalImage = renderer.image { _ in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}
// Restore the original bounds afterwards
self.bounds = originalBounds
The issue is that the quality of the final image is very poor compared to the original background image.
Don't set the scale of your UIGraphicsImageRendererFormat to 1. That forces the output image to @1x (non-Retina). For most (all?) iOS devices, that causes a 2x or 3x loss of resolution. Leave the scale of the UIGraphicsImageRendererFormat at its default value.
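As a minimal sketch of that fix, reusing clipRect and outputSize from the snippet above (both assumed to be defined by the caller), you can simply omit the custom format so the renderer picks up the device's default scale:
// Omitting the format gives a renderer whose scale matches the screen (@2x/@3x)
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize))
let finalImage = renderer.image { _ in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}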
You can take a screenshot as well:
// Convert a UIView to a UIImage (place this in a UIView extension)
func captureView() -> UIImage {
    // Gradually increase the number for higher resolution
    let scale: CGFloat = 1.0
    UIGraphicsBeginImageContextWithOptions(bounds.size, isOpaque, scale)
    layer.render(in: UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return image
}
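Assuming the function above lives in a UIView extension, usage is just (viewToCapture is a placeholder name):
let snapshot = viewToCapture.captureView()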
I am working on a subclass of UIButton, let’s call it MyButton. In most cases, buttons of this subclass will have both image and title.
I am having trouble configuring MyButton’s imageView size. The goal is to limit the size of the image, so that larger images are resized to fit into a 32-by-32 square.
For testing purposes, I added a 128-by-128-pixel image, and set imageView’s backgroundColor to green.
Below is what the button looked like when I added the image. Both the imageView and its image had the size of 128 by 128 pixels.
After I overrode MyButton’s intrinsicContentSize with CGSize(width: 160, height: 50) and set the button’s imageView’s contentMode to .scaleAspectFit, the imageView resized, but only in height—it’s still 128 pixels wide, which causes the button’s title to truncate.
I’ve seen a lot of articles on how to resize the image inside the imageView using imageEdgeInsets. This is not what I am looking for since in all those articles the imageView preserves its size and only adds padding around the image.
Changing the imageView’s frame or bounds produces no result.
Any thoughts on how I can resize the button’s imageView?
let customButtonImage = self.currentImage
Now that you have the button's image, resize it using @jay's extension for resizing images:
let newImage = customButtonImage?.resizedImage(Size: requiredSize)
Then set the resized image back on the button:
self.setImage(newImage, for: .normal) // .normal, or whichever state you want
@jay's extension is reproduced in full in the next answer below.
You can resize the image and then add it to the button's image. Below you can find the image-resizing code.
extension UIImage {
    /// Redraws the image at the given size, preserving transparency.
    func resizedImage(Size sizeImage: CGSize) -> UIImage? {
        let frame = CGRect(origin: .zero, size: sizeImage)
        // `false` keeps the context transparent; scale 0 uses the screen's scale
        UIGraphicsBeginImageContextWithOptions(frame.size, false, 0)
        self.draw(in: frame)
        let resizedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return resizedImage
    }
}
And the UIButton extension:
extension UIButton {
    @IBInspectable var sizeImage: CGSize {
        get {
            return self.imageView?.image?.size ?? CGSize.zero
        }
        set {
            if let image = self.imageView?.image {
                let imgUpdate = image.resizedImage(Size: newValue)
                self.setImage(imgUpdate, for: .normal)
            }
        }
    }
}
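With both extensions in place, a usage sketch for the 32-by-32 square from the question (button is a placeholder for your MyButton instance, which must already have an image set):
// Replaces the button's current image with a 32x32 copy
button.sizeImage = CGSize(width: 32, height: 32)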
I have a UIImageView, created in Interface Builder, that I resize in my code based on a selected image that I get from the web. However, on first load the image is displayed at its normal resolution instead of within the UIImageView's newly set bounds.
Resizing the UIImageView:
fullScreenImage.bounds.size = CGSize(width: scaledWidth, height: scaledHeight)
Setting the UIImageView's image:
let imageStringURL = images[indexPath.row].urls!["regular"]
let imageURL = URL(string: imageStringURL!)!
let imageData = try! Data(contentsOf: imageURL)
let image = UIImage(data: imageData)
fullScreenImage.image = image
This is how it looks when the image is first clicked on to enter "fullscreen mode"
This is how it looks the second time I click it:
I'm not really sure why the image isn't confining itself to the specified UIImageView bounds.
Instead of resizing the bounds, you can set the contentMode property of the UIImageView. This will resize the image view's image to fit inside its bounds.
fullScreenImage.contentMode = .scaleAspectFit
I found a workaround. I tried setting the contentMode to .scaleAspectFit, and I also tried enabling Clip to Bounds in the inspector, but neither worked. So I simply looked into resizing the UIImage itself and placing it into the UIImageView.
I found an extension for UIImage in another post that resizes it:
extension UIImage {
    func resizeImage(newSize: CGSize) -> UIImage {
        // Only redraw if the target size actually differs
        guard self.size != newSize else { return self }
        UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0)
        self.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
        let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return newImage
    }
}
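Using it with the names from the earlier snippets (fullScreenImage and image come from the question; resizing to the image view's current bounds is a sketch, not a prescribed value):
fullScreenImage.image = image?.resizeImage(newSize: fullScreenImage.bounds.size)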
If anyone knows of a better way please let me know :)
I'm currently trying to add an image into the navigation item for one view. In the view's viewDidLoad(), a function is called with the following code, similar to this post:
let logo = UIImage(named: "Menu_Logo")
let imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: 122, height: 26))
imageView.contentMode = .scaleAspectFit
imageView.clipsToBounds = true
imageView.image = logo
self.navigationItem.titleView = imageView
Instead of giving me the expected size however, the view ends up looking like this:
Removing the UIImage from the UIImageView makes the view sized correctly like this:
This seems like strange behaviour to me, especially since I did set the content mode to .scaleAspectFit. Is there something I am forgetting when adding a UIImageView as the navigationItem.titleView?
On a UINavigationBar, the title view takes the full size of its content if the content is large.
Resize the image rather than the UIImageView, as follows, passing the size (122, 26). This will solve your problem.
extension UIImage {
    func imageResize(sizeChange: CGSize) -> UIImage {
        let hasAlpha = true
        let scale: CGFloat = 0.0 // Use scale factor of main screen
        UIGraphicsBeginImageContextWithOptions(sizeChange, !hasAlpha, scale)
        self.draw(in: CGRect(origin: CGPoint.zero, size: sizeChange))
        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return scaledImage!
    }
}
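Applied to the question's snippet, usage would look roughly like this (names taken from the code above):
let logo = UIImage(named: "Menu_Logo")
let imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: 122, height: 26))
imageView.contentMode = .scaleAspectFit
// Resize the image itself to the title view's size before assigning it
imageView.image = logo?.imageResize(sizeChange: CGSize(width: 122, height: 26))
self.navigationItem.titleView = imageView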
I am building a custom keyboard and am having trouble adding an image to the pasteboard while maintaining the appropriate scale and resolution within the pasted image. Let me start with a screenshot of the keyboard to illustrate my trouble:
So the picture of the face in the top left of the keyboard is just a UIButton with the original photo set to the background. When the button is pressed the image is resized with the following function:
func imageResize(image: UIImage, size: CGSize) -> UIImage {
    let scale = UIScreen.main.scale
    UIGraphicsBeginImageContextWithOptions(size, false, scale)
    let context = UIGraphicsGetCurrentContext()
    context?.interpolationQuality = .high
    image.draw(in: CGRect(origin: .zero, size: size))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return scaledImage
}
This function creates a UIImage the same size as the UIButton, with the appropriate scale to reflect the device's screen resolution. To verify that the function is correct, I added a UIImageView filled with the scaled image. The scaled image is the one that looks misplaced near the center of the keyboard. I added the UIImageView with this function:
func addImageToBottomRight(image: UIImage) {
    let tempImgView = UIImageView(image: image)
    self.view.addSubview(tempImgView)
    // offsetBy returns a new rect, so assign it back to the frame
    tempImgView.frame = tempImgView.frame.offsetBy(dx: 100.0, dy: 50.0)
}
I have tried a few different methods for adding the image to the pasteboard, but all seem to ignore the scale of the image and display it twice as large as opposed to displaying it at a higher resolution:
let pb = UIPasteboard.general
let pngType = UIPasteboardTypeListImage[0] as! String
let jpegType = UIPasteboardTypeListImage[2] as! String
pb.image = image
pb.setData(UIImagePNGRepresentation(image)!, forPasteboardType: pngType)
pb.setData(UIImageJPEGRepresentation(image, 1.0)!, forPasteboardType: jpegType)
None of these three methods works correctly; they all produce the same result as illustrated in the screenshot. Does anyone have suggestions for other methods? To further clarify my goal, I would like the image in the message text box to look identical to both UIImages in the keyboard in terms of size and resolution.
Here are a few properties of the UIImage before and after the resize, in case anyone is curious:
Starting image size: (750.0, 750.0)
Size to scale to: (78.0, 78.0)
Initial scale: 1.0
Resized image size: (78.0, 78.0)
Resized image scale: 2.0
I know this is an old post, but I thought I'd share the workaround I found for this specific case of copying images and pasting them into messaging apps. When you send a picture with apps like iMessage, WhatsApp, Messenger, etc., they display the image by aspect-fitting it to a certain horizontal width (let's say around 260 pt for this demo).
As you can see from the diagram below, if you send a 150x150 image at @1x resolution in iMessage, it will be stretched to fill the required 260-wide box, making the image grainy.
But if you add an empty margin 185 points wide to both the left and right sides of the image, you end up with an image of size 520x150. Now when you send that image in iMessage, it has to fit into the 260-wide box, cramming a 520x150 image into a 260x75 box, which in effect gives you a 75x75 image at @2x resolution.
You can add a clear-color margin to a UIImage with code like this:
extension UIImage {
    /// Returns a copy of the image padded with transparent margins of the given insets.
    func addMargin(withInsets inset: UIEdgeInsets) -> UIImage? {
        let finalSize = CGSize(width: size.width + inset.left + inset.right, height: size.height + inset.top + inset.bottom)
        let finalRect = CGRect(origin: .zero, size: finalSize)
        UIGraphicsBeginImageContextWithOptions(finalSize, false, scale)
        defer { UIGraphicsEndImageContext() }
        // Fill the enlarged canvas with clear color, then draw the original image inset within it
        UIColor.clear.setFill()
        UIGraphicsGetCurrentContext()?.fill(finalRect)
        let pictureOrigin = CGPoint(x: inset.left, y: inset.top)
        let pictureRect = CGRect(origin: pictureOrigin, size: size)
        draw(in: pictureRect)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
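A usage sketch for the numbers above (the 185-point side margins pad a 150x150 image out to 520x150; originalImage is a placeholder name):
let insets = UIEdgeInsets(top: 0, left: 185, bottom: 0, right: 185)
let paddedImage = originalImage.addMargin(withInsets: insets)
// paddedImage can now be put on the pasteboard; the messaging app will aspect-fit
// the wider image, effectively showing the original content at @2x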