I have a UICollectionView whose cells contain a UIImageView. Some images in the list have very high resolutions, 3000 x 2000 and upwards. I'm using the AlamofireImage library to display and cache the images, but memory usage still spikes heavily. I tried
let filter = AspectScaledToFillSizeFilter(size: imageView.frame.size)
imageView.af_setImage(withURL: url, filter: filter)
but it didn't make much of a difference.
Is there a better way to downscale an image after downloading it but before displaying it? iOS memory spikes are driven more by an image's pixel dimensions than by its file size.
Swift 4.1
This is how I resize my images. You could process yours the same way before displaying them.
// Resize to ~1.5K x 2K resolution and compress to <200 KB (JPEG quality 0.2)
private func resizePhoto(_ originalPhoto: UIImage) -> UIImage? {
    var size: CGSize
    let scale = UIScreen.main.scale
    if originalPhoto.size.width > originalPhoto.size.height { // Landscape
        size = CGSize(width: 2016/scale, height: 1512/scale)
    } else { // Portrait
        size = CGSize(width: 1512/scale, height: 2016/scale)
    }
    // Redraw the photo into a smaller bitmap context
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    originalPhoto.draw(in: CGRect(x: 0, y: 0, width: size.width, height: size.height))
    let resizedPhoto = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Re-encode as a low-quality JPEG to shrink the data
    guard let resized = resizedPhoto,
          let scaledPhotoData = UIImageJPEGRepresentation(resized, 0.2) else {
        return nil
    }
    //print(">>> Resized data size: \(scaledPhotoData.count)")
    return UIImage(data: scaledPhotoData)
}
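If the spikes come from decoding the full-resolution image, another option (not from the answer above) is to downsample with ImageIO so the full-size bitmap is never decoded into memory. A minimal sketch; the function name is mine, and the point size is an assumption you should replace with your cell's size:

import ImageIO
import UIKit

// Downsample encoded image data to a thumbnail without decoding the full-size bitmap first.
func downsampledImage(from data: Data, to pointSize: CGSize, scale: CGFloat) -> UIImage? {
    // Don't decode/cache the full image when creating the source
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, sourceOptions) else { return nil }
    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}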
I'm building an app that contains iOS 14 widgets. Users can change the widget content by pressing Edit Widget and selecting an item from the provided intent collection. For the INObjectCollection I'm using an array of INObject, and for each object I'm setting an image this way:
let image = INImage(url: imageRemoteURL, width: 80, height: 80)
let myObject = MyCustomObject(identifier: "an id", display: "some text", subtitle: nil, image: image)
// MyCustomObject is a subclass of INObject
In the list the images are displayed properly, but after selecting an item and opening the view again via Edit Widget, the image is rendered entirely in system blue and oversized. See the attached screenshot.
I could find only one topic on this issue, and it has no solution yet.
If I were using an image from the app bundle, the solution would be to set Render As to Original Image, but I have no idea how to fix this when using a remote image URL.
Another problem is the image size, which is too large.
Any help would be appreciated.
Thanks to Edward's hints I managed to fix my issues this way:
func getIntentImage(imageLink: String) -> INImage? {
    guard let url = URL(string: imageLink),
          let imageData = try? Data(contentsOf: url), // consider caching the image locally
          var image = UIImage(data: imageData) else {
        return nil
    }
    let maxSize: CGFloat = 80
    let size = CGSize(width: maxSize, height: maxSize)
    if let resizedImage = image.resizeImage(targetSize: size) {
        image = resizedImage
    }
    return INImage(uiImage: image.withRenderingMode(.alwaysOriginal))
}
// Image resizer:
extension UIImage {
    func resizeImage(targetSize: CGSize) -> UIImage? {
        let size = self.size
        let widthRatio = targetSize.width / size.width
        let heightRatio = targetSize.height / size.height
        // Scale by the smaller of the two ratios so the result fits
        // inside targetSize while keeping the original aspect ratio
        var newSize: CGSize
        if widthRatio > heightRatio {
            newSize = CGSize(width: size.width * heightRatio, height: size.height * heightRatio)
        } else {
            newSize = CGSize(width: size.width * widthRatio, height: size.height * widthRatio)
        }
        // This is the rect the image is actually drawn into
        let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
        // Do the resizing in an image context
        UIGraphicsBeginImageContextWithOptions(newSize, false, 1)
        self.draw(in: rect)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    }
}
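For reference, a call site might look like this (the URL is a placeholder). Note that Data(contentsOf:) blocks the calling thread, so this is best run off the main thread:

let image = getIntentImage(imageLink: "https://example.com/icon.png")
let myObject = MyCustomObject(identifier: "an id", display: "some text", subtitle: nil, image: image)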
1. Image Size
I had the same issue with INImage's size in an intent and found https://developer.apple.com/documentation/sirikit/inimage/1648800-imagesize, but I couldn't find anything about how to use it properly, so I eventually decided to remove the intent image altogether. Maybe it helps you!
2. Blue Tint
You can change the image's default rendering type in your asset catalog or programmatically, and this should remove tint color.
- Asset Catalog
Click on the image you want to change in your asset catalog, then go to the Attributes Inspector, and change the "Render As" attribute to "Original Image".
- Programmatically
INImage itself actually has a way to set the rendering mode, but it is private API, and using it could get your application rejected by Apple. So if you're planning to release on the App Store, keep this in mind.
The "legal" solution is to use INImage(uiImage:) and set the rendering mode on the UIImage.
if let renderedImage = UIImage(named: "myImage")?.withRenderingMode(.alwaysOriginal) {
    let image = INImage(uiImage: renderedImage, width: 40, height: 40)
}
imageToUpload is 375x500 here, but after uploading to Firebase Storage its width and height are doubled. Is there a way to keep the size unchanged after uploading to Firebase Storage?
if let uploadData = UIImageJPEGRepresentation(imageToUpload, 0.0) {
    uploadImageRef.putData(uploadData, metadata: nil) { (metadata, error) in
        if error != nil {
            print("error")
            completion?(false)
        } else {
            // Metadata contains file metadata such as size and content type.
            let size = metadata?.size
            // You can also access the download URL after upload.
            uploadImageRef.downloadURL { (url, error) in
                guard let downloadURL = url else {
                    // Uh-oh, an error occurred!
                    completion?(false)
                    return
                }
                print("Download url: \(downloadURL)")
                completion?(true)
            }
        }
    }
}
Note that I am using the extension below to resize the image to 375x500 (the size of imageToUpload) before uploading.
extension UIImage {
    func resized(to size: CGSize) -> UIImage {
        return UIGraphicsImageRenderer(size: size).image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}
let imageToUploadResize: UIImage = image.resized(to: CGSize(width: 375.0, height: 500.0))
As I mentioned in my comment, IMO the function in the extension is really being called like a standalone function and isn't really extending the functionality. I would suggest making it a function that resizes a passed-in image and returns a new image.
Using this function, your image will be resized to the correct size and uploaded at that size (verified that it works):
func resizeImage(image: UIImage, targetSize: CGSize) -> UIImage {
    let rect = CGRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
    // A scale of 1.0 makes the bitmap exactly targetSize in pixels,
    // regardless of the device's screen scale
    UIGraphicsBeginImageContextWithOptions(targetSize, false, 1.0)
    image.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage!
}
and then called like this
let updatedSize = CGSize(width: 300.0, height: 400.0)
let resizedImage = self.resizeImage(image: origImage!, targetSize: updatedSize)
Now to address the issue with the extension at a 10,000-foot level.
It all goes back to points versus how they're rendered on an iDevice. Back with the iPhone 2G and 3G, rendering was 1:1, so if you ran your code in the simulator for one of those devices and set the image size to 320x480, a 320x480 image would have been stored in Firebase. However, as screens improved and resolutions went up, so did the rendering scale, which affects the UIImage.
So if you were to run the same code in the iPhone 6 simulator, that same image would be 640x960 (2x), and on an iPhone 8+ it would be 960x1440 (3x). (There is upsampling involved, but we're ignoring that here.)
The UIImage knows what device it's on, so that should be taken into consideration.
Again, generally speaking, there are a lot of other components involved; in particular, pixels = logicalPoints * scaleFactor.
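To make that concrete, here's a minimal sketch (the extension name is mine, not from the answer) that pins UIGraphicsImageRenderer to a scale of 1 so the pixel dimensions match the requested size on any device:

extension UIImage {
    // Resize so the output bitmap is exactly `size` in pixels on every device.
    func resizedExact(to size: CGSize) -> UIImage {
        let format = UIGraphicsImageRendererFormat.default()
        format.scale = 1 // 1 point == 1 pixel, regardless of screen scale
        return UIGraphicsImageRenderer(size: size, format: format).image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}

With this, the 375x500 image uploads as 375x500 rather than 750x1000 on a 2x device.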
Try using the resizing code provided in the answer above. Your extension creates a new instance of the image and returns it rather than updating the image you call it on, so the returned value must actually be used.
If you want to keep using your extension, note that UIImage is an immutable class, so a method cannot assign to self (self = ... will not compile); keep returning a new image and reassign it at the call site:
extension UIImage {
    func resized(to size: CGSize) -> UIImage {
        return UIGraphicsImageRenderer(size: size).image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}
// Call site: reassign the result
// imageToUpload = imageToUpload.resized(to: CGSize(width: 375, height: 500))
I am trying to reduce the file size of my images as much as possible before I upload them. Right now I am doing this:
if let firstImageData = UIImageJPEGRepresentation(pickedImage, 0.1) {
    self.imgArray.append(firstImageData)
}
This takes any image coming from the camera or photo album, converts it to JPEG, and reduces its size.
I have set the compression quality to 0.1, but when I upload my images the size still ends up around 300-350 KB. Is there any way I could shrink them even more? I am aiming for 50-70 KB.
You can first resize your image to a smaller size using the extensions below, scaling it either by a percentage or to a target width:
extension UIImage {
    func resizeWithPercent(percentage: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: size.width * percentage, height: size.height * percentage)))
        imageView.contentMode = .scaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, scale)
        defer { UIGraphicsEndImageContext() } // always balance the begin call
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.render(in: context)
        return UIGraphicsGetImageFromCurrentImageContext()
    }

    func resizeWithWidth(width: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: width, height: CGFloat(ceil(width / size.width * size.height)))))
        imageView.contentMode = .scaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, scale)
        defer { UIGraphicsEndImageContext() }
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.render(in: context)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
To use it, just call it like this:
myImage = myImage.resizeWithWidth(width: 700)!
Next, you can still compress it using a compression quality of your choice:
let compressData = UIImageJPEGRepresentation(myImage, 0.5) // quality ranges from 0.0 (max compression) to 1.0 (max quality)
let compressedImage = UIImage(data: compressData!)
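If you need to land under a specific byte budget (the question aims for 50-70 KB), one common approach is to step the JPEG quality down until the data fits. A minimal sketch; the function name, starting quality, and step size are assumptions to tune:

// Compress to at most maxBytes by lowering the JPEG quality step by step.
func jpegData(for image: UIImage, maxBytes: Int) -> Data? {
    // Try qualities 0.7, 0.6, ..., 0.1 until the encoded data fits.
    for step in stride(from: 7, through: 1, by: -1) {
        let quality = CGFloat(step) / 10
        if let data = UIImageJPEGRepresentation(image, quality), data.count <= maxBytes {
            return data
        }
    }
    return nil // even the lowest quality is still too large
}
// Usage: aim for ~70 KB
// let data = jpegData(for: myImage, maxBytes: 70 * 1024)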
You can only change the size of the image (smaller size = less data) and compress it using an image compression algorithm like JPEG; there is no other way (a better algorithm = smaller size at the same quality).
I heard Google recently improved on the JPEG algorithm using neural networks (Google's TensorFlow).
I am resizing and compressing my photos and getting an unusual result.
When I choose an image from the photo album, it compresses and resizes fine. However, if I do it with an image that was passed from the camera, the image becomes oddly small (and unwatchable). As a test, I wired a button to run my compression and resizing functions on an image taken from either the camera or the photo album. Below are my code and console output.
#IBAction func testBtnPressed(sender: AnyObject) {
    let img = selectedImageView.image!
    print("before resize image \(img.dataLengh_kb)kb size \(img.size)")
    let resizedImg = img.resizeWithWidth(1080)
    print("1080 After resize image \(resizedImg!.dataLengh_kb)kb size \(resizedImg!.size)")
    let compressedImageData = resizedImg!.mediumQualityJPEGNSData
    print("Compress to medium quality = \(compressedImageData.length / 1024)kb")
}

extension UIImage {
    var mediumQualityJPEGNSData: NSData { return UIImageJPEGRepresentation(self, 0.5)! }

    func resizeWithWidth(width: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: width, height: CGFloat(ceil(width/size.width * size.height)))))
        imageView.contentMode = .ScaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, scale)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.renderInContext(context)
        guard let result = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
        UIGraphicsEndImageContext()
        return result
    }
}
When photo was selected from photo album
before resize image 5004kb size (3024.0, 3024.0)
1080 After resize image 1023kb size (1080.0, 1080.0)
Compress to medium quality = 119kb
When photo was passed by camera
before resize image 4653kb size (24385.536, 24385.536)
1080 After resize image 25kb size (1080.576, 1080.576)
Compress to medium quality = 4kb
I replaced the image resizing function with the following one and it works a lot better:
func resizeImage(newHeight: CGFloat) -> UIImage {
    let scale = newHeight / self.size.height
    let newWidth = self.size.width * scale
    UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight))
    self.drawInRect(CGRectMake(0, 0, newWidth, newHeight))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
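A plausible reason this helps (my reading, not stated in the answer): UIGraphicsBeginImageContext always creates a context with a scale factor of 1.0, while UIGraphicsBeginImageContextWithOptions with the image's own scale multiplies the bitmap by that factor, and camera images can carry unusual size/scale metadata, as the (24385.536, 24385.536) size above suggests. To diagnose, log the point size, scale, and pixel size before resizing:

let img = selectedImageView.image!
// size is in points; multiply by scale to get actual pixels
print("points: \(img.size), scale: \(img.scale), pixels: \(img.size.width * img.scale) x \(img.size.height * img.scale)")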
I made a custom UITabBar class and tried to set a background image:
tabBar.backgroundImage = UIImage(named: "my_image")?.imageWithRenderingMode(.AlwaysOriginal)
I named the image file my_image@2x and the image file is 640x98.
I ran it on the iPhone 6 simulator and it seems the image is not wide enough; Google's "C" is repeated in the sample below.
Am I using the wrong image size, or is something else wrong?
Just redraw the image:
var image = UIImage(named: "my_image")
if let image = image {
    let centerImage = false
    var resizeImage: UIImage?
    let size = CGSize(width: UIScreen.mainScreen().bounds.size.width, height: 98)
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    if centerImage {
        // if you want to center the image, use this code
        image.drawInRect(CGRect(origin: CGPoint(x: (size.width - image.size.width) / 2, y: 0), size: image.size))
    } else {
        // stretch the image
        image.drawInRect(CGRect(origin: CGPoint.zero, size: size))
    }
    resizeImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    tabBar.backgroundImage = resizeImage?.imageWithRenderingMode(.AlwaysOriginal)
}
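Alternatively (a suggestion beyond the original answer, written in current Swift syntax), you can let UIKit stretch the image instead of redrawing it per screen width; the 10-point cap insets are placeholder values to tune to your artwork:

// Stretch the middle of the image; the caps at the edges stay unscaled.
let insets = UIEdgeInsets(top: 0, left: 10, bottom: 0, right: 10)
tabBar.backgroundImage = UIImage(named: "my_image")?
    .resizableImage(withCapInsets: insets, resizingMode: .stretch)
    .withRenderingMode(.alwaysOriginal)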