Downgrade UIImage quality to max bytes - iOS

I have an application where I require a given UIImage to be 300x300 px in physical size and less than 500 KB in file size.
The image is taken from the device's photo library and resized to 300x300 px.
In most cases the resulting photo is less than 500 KB, but depending on the original image (such as its resolution) it may be larger.
I do not want to downsize the image further, as I require it to be exactly 300 px in its largest dimension.
How can I downgrade the quality of those images to the point where they are less than 500 KB in file size (PNG representation)?
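PNG is lossless, so pngData()/UIImagePNGRepresentation has no quality setting to lower; the usual workaround, sketched here under the assumption that a JPEG upload is acceptable (the helper function is illustrative, not from the question), is to re-encode as JPEG and step the compression quality down until the data fits the byte budget:

import UIKit

// Returns JPEG data at most `maxBytes` long, stepping the compression
// quality down until the encoded size fits the budget.
func jpegData(for image: UIImage, maxBytes: Int) -> Data? {
    var quality: CGFloat = 1.0
    var data = image.jpegData(compressionQuality: quality)
    while let encoded = data, encoded.count > maxBytes, quality > 0.1 {
        quality -= 0.1
        data = image.jpegData(compressionQuality: quality)
    }
    // nil if even the lowest quality still overshoots the budget.
    guard let encoded = data, encoded.count <= maxBytes else { return nil }
    return encoded
}

// Usage: keep the 300x300 image under 500 KB.
// let uploadData = jpegData(for: resizedImage, maxBytes: 500 * 1024)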

Related

Why does the image size increase after writing to the Documents directory?

// Read the image back from the file written to the Documents directory.
let str = localpath?.path
let image = UIImage(contentsOfFile: str!)
Original size = 228 KB
Size after writing the file = 700 KB
Now, my question is: why does writing the file into the Documents directory increase the file size by that much? I even uploaded this image to my server and, naturally, got the larger size there as well. I want to upload the image to my server at its original quality, but without this inflated file size (its actual size on the device is quite different). Is there any way to preserve both the original file size and the quality?
After researching, I found the reason:
Let's take this example from the WWDC 2018 session 416, "iOS Memory Deep Dive". If you have a 590 KB image file whose dimensions are 2048 pixels x 1536 pixels, the total memory for this image once decoded is about 10 MB (2048 pixels x 1536 pixels x 4 bytes per pixel). For more details, take a look at this video:
https://developer.apple.com/videos/play/wwdc2018/416/
I watched the full WWDC video, and I understand the reason behind this problem. But what I want is: get the image from the image picker controller, show it to the user, and once the user confirms, upload it to my server (via multipart). If I upload this (rendered) image, I get the larger size on the server too. Is there any way to keep the rendered image and the uploaded image separate, so the upload has the actual file size? Even when I save the image after rendering, the file size is still the same as the rendered one.
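One way to get that split, sketched under the assumption of iOS 11+ and an image picked from the photo library (the picker only supplies .imageURL for library images, not camera shots; the upload helper is hypothetical): display the decoded UIImage, but upload the original file's bytes rather than a re-encoded representation.

import UIKit

// In your UIImagePickerControllerDelegate:
func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // Show the decoded image to the user...
    let preview = info[.originalImage] as? UIImage

    // ...but upload the file's original bytes, not a re-encoded representation.
    if let url = info[.imageURL] as? URL,
       let originalData = try? Data(contentsOf: url) {
        // originalData keeps the on-disk size, unlike a PNG/JPEG re-encoding.
        // upload(originalData) // hypothetical upload helper
    }
    picker.dismiss(animated: true)
}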

Why does the image size increase after UIImagePNGRepresentation?

I have an image file in the Photos app on my iPhone.
I transferred it to my MacBook via AirDrop and checked the size of the image = 1.9 MB
I saved the same image to the Files app on the iPhone and checked the image size there = 1.9 MB
I brought the same image into my app via UIImagePickerController,
Used UIImagePNGRepresentation and printed the data count/byte count = 15.8 MB
Used UIImageJPEGRepresentation with compressionQuality 1.0 and printed the data count/byte count = 5.4 MB
Used UIImagePNGRepresentation, saved it locally (DocumentsDirectory), and checked the file size = 15.8 MB
Used UIImageJPEGRepresentation with compressionQuality 1.0, saved it locally (DocumentsDirectory), and checked the file size = 5.4 MB
Now, my question is: why do UIImagePNGRepresentation and UIImageJPEGRepresentation increase the file size by that much? I even uploaded this image to my server and, naturally, got the larger size there as well. I want to upload the image at its original quality, but without this inflated file size (its actual size on the device is quite different). Is there any way to preserve both the original file size and the quality?
Let's take this example from the WWDC 2018 session 416, "iOS Memory Deep Dive". If you have a 590 KB image file whose dimensions are 2048 pixels x 1536 pixels, the total memory for this image once decoded is about 10 MB (2048 pixels x 1536 pixels x 4 bytes per pixel). For more details, see https://developer.apple.com/videos/play/wwdc2018/416/
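To see the gap yourself, you can compare the encoded byte counts against the decoded bitmap's footprint; a minimal sketch (the helper is illustrative, not from the post):

import UIKit

func logSizes(of image: UIImage) {
    // Encoded sizes: what you would write to disk or upload.
    let pngBytes = image.pngData()?.count ?? 0
    let jpegBytes = image.jpegData(compressionQuality: 1.0)?.count ?? 0

    // Decoded size: roughly width x height x 4 bytes per pixel,
    // independent of how small the file on disk was.
    let decodedBytes = (image.cgImage?.width ?? 0) * (image.cgImage?.height ?? 0) * 4

    print("PNG: \(pngBytes) bytes, JPEG(1.0): \(jpegBytes) bytes, decoded bitmap: ~\(decodedBytes) bytes")
}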

How to optimize memory when displaying the same image multiple times?

I have an instance of UIImage with an image 200 KB in size, and I create 5 instances of UIImageView that all reference this same UIImage.
I wonder how much memory is allocated in this case: only 200 KB (for one UIImage instance) or 1 MB (for 5 cloned UIImage instances)? If memory is being wasted, is there an effective way to avoid it?
A couple of thoughts:
UIImage is a reference type, so when you reference the same image five times, you will generally have one image object in memory. It depends a little on how you do this. For example, if you call UIImage(data:) each time, or something like that, it's possible to instantiate a new object each time; but if you instantiate only one UIImage and then proceed to use it five times, you won't see duplicative memory consumption.
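A minimal sketch of that point (the asset name is hypothetical):

import UIKit

// One decoded bitmap in memory, shared by every view that references it.
let image = UIImage(named: "photo")

// Five image views, one shared UIImage: no duplicate bitmap copies.
let imageViews = (0..<5).map { _ in UIImageView(image: image) }

// By contrast, calling UIImage(data:) five times on the same data
// can decode five separate bitmaps.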
As an aside:
You say the image has a size of 200 KB. Is that the size of the original asset, or have you determined that this is how much memory it will take at run time?
The reason I ask is that JPG and PNG files are generally compressed, but when you use one in an image view, it gets decompressed. The amount of memory an image takes has little to do with the file size of the original asset; it corresponds to the dimensions (in pixels) of the image. So a 676 KB PNG that is 2560 x 1440 pixels may actually require about 14 MB of memory (four bytes per pixel).
Note, this memory consumption corresponds to the dimension of the image in question, not the dimensions of the image view to which you added it. If you're concerned about memory usage and if the image dimensions exceed the size of the image view (times the device scale), then you might want to consider resizing the image.
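As a rough check for that condition (a sketch; the helper and its threshold logic are assumptions, not from this answer):

import UIKit

// True when the image carries more pixels than the view can actually display.
func needsDownscaling(_ image: UIImage, for imageView: UIImageView) -> Bool {
    // Image dimensions in pixels (points x the image's own scale).
    let imagePixels = CGSize(width: image.size.width * image.scale,
                             height: image.size.height * image.scale)
    // View dimensions in pixels (points x the device scale).
    let screenScale = UIScreen.main.scale
    let viewPixels = CGSize(width: imageView.bounds.width * screenScale,
                            height: imageView.bounds.height * screenScale)
    return imagePixels.width > viewPixels.width || imagePixels.height > viewPixels.height
}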
In the future, you can answer these questions empirically using Instruments. For example, in the following timeline: at the green signpost I loaded a UIImage from the 676 KB asset, with modest memory impact; at the purple signpost I set the image view's image, with a significant memory impact as it decompressed this 2560 x 1440 px image; and at the orange signpost I loaded five more image views with the same image, with negligible further memory impact.

Is it OK to scale down UIImages?

I've been told not to scale down images, but always to use them in their original resolution.
I have a custom UITableViewCell that has a 'Respond Action' on the cell. The UIImageView in Storyboard is 16x16.
Now originally I was using a 256x256 image, and my UIImageView has an Aspect Ratio constraint, and a height constraint of 16.
Then it was suggested to me to not ever scale down images, but to use an image that fit the exact size I needed. So a 16x16 was designed for me, and I used that. It looks awfully blurry though.
The results are here (the font next to it is 11.0 point, to give you an idea of its size):
What is the correct way to go about this? Should you not scale down images, and what is the reason? The scaled-down 256x256 looks much better than the 16x16.
Scale your image down before adding it to your project...
Take your 256x256 png image and scale it to 1x, 2x, and 3x size.
1x should equal the size that you need your image to display in the view.
2x and 3x will support retina and retina HD displays.
ex:
Name: image.png | Size: 16x16
Name: image@2x.png | Size: 32x32
Name: image@3x.png | Size: 48x48
Drop these images into your Images.xcassets
see: Specifying High-Resolution Images in iOS
also: Icon and Image Sizes
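Once the variants are in the asset catalog, UIKit picks the right scale automatically; a minimal sketch using the example name above:

// Loads image.png, image@2x.png, or image@3x.png to match the screen scale.
let respondIcon = UIImage(named: "image")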
You don't need to scale down images, but it's a good idea. Cramming a 256x256 image into a 16x16 view means the user downloads ~65 KB instead of <1 KB of data. Or, if it's in your bundle, your app is that much 'heavier'.
But it depends on what screen resolution you're targeting. Even an iPhone 4 uses Retina, which means your 16x16 image view should contain a 32x32 image. You should also supply a 48x48 image for iPhone 6+ screens. This will make your images look great at each screen scale.
Use Images.xcassets to manage your images.
You shouldn't scale down the images, ideally, but you also need to consider the screen resolution. Old devices are 1x, Retina devices are 2x, and the latest iPhone 6+ devices use 3x images, which means you should have 16x16, 32x32, and 48x48 images to be used by each device (preferably managed in an image asset). What you're seeing at the moment is a 1x image being scaled up to 2x or 3x, so it's blurry. Previously you had a large image scaled down, which is bad for memory usage but looks sharp.
If you are using an image from your asset catalog, provided during development, you don't want to scale down images at runtime, as that adds unnecessary overhead to your app. Rather, have the assets sized to the required dimensions and add them to your asset catalog for 1x, 2x, and 3x displays. Think optimisation in terms of memory and bundle size: what's the minimum size required to make it work and look good? If your image is displayed in a 20x20 square, the optimal sizes will always be 20x20, 40x40, and 60x60 for 1x, 2x, and 3x displays.
Fetching images from the network is an entirely different ballgame. You will find that sometimes you want to scale down the image, but only when the image you fetch is bigger than what you need to display, like when you're trying to fit a 2000x2000 px image into a 36x26 px square. You want to do the scale-down for three reasons:
it will reduce virtual memory usage.
it will reduce storage memory if you persist the image.
you will save yourself from image downsampling artifacts that appear when the system tries to render the image at runtime.
The best thing to do here is to scale down the image as soon as you download it from the network, before using it anywhere within your app, as in the sketch below.
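One common way to do that efficiently (a sketch based on the ImageIO downsampling technique from the WWDC session linked earlier; the function name is illustrative) is to decode a thumbnail directly from the downloaded file, without ever inflating the full-size bitmap:

import UIKit
import ImageIO

// Decodes a downsampled image straight from a file URL, capping the pixel size.
func downsampledImage(at url: URL, to pointSize: CGSize, scale: CGFloat) -> UIImage? {
    // Don't decode (cache) the full-size image when creating the source.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }
    let maxPixelSize = max(pointSize.width, pointSize.height) * scale
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,  // respect EXIF orientation
        kCGImageSourceShouldCacheImmediately: true,        // decode now, at thumbnail size
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let thumbnail = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }
    return UIImage(cgImage: thumbnail)
}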
Hope this helps!
There is a purist answer to this, and then there is the answer that saves you from spending your life converting images all day.
The following is a discussion on the topic of scaling.
iOS - Different images based on device OR scaling the same image?
If you want to avoid creating too many images, then using 2x images at twice the expected point size is a reasonable compromise, as that covers the majority of current devices. You would then only be scaling down for the iPhone 3GS and up for the iPhone 6+. You only need large images if you expect to use large images; again, I would go for 2x resolution for those. It is often worth having an iPhone and an iPad version of an image in case your expected usage on iPad is at a bigger point size.
As noted in the link, the impact of scaling depends on the image. I would only worry about the images that look bad when scaled. So far I have not found scaling to be an issue, and it is much easier than creating lots of bespoke PNGs.
Scale it down before setting it:
Swift extension:
extension UIImage {
    // Returns a scaled version of the image.
    func imageScaledToSize(_ size: CGSize, isOpaque: Bool) -> UIImage {
        // Begin a context of the desired size (scale 0.0 uses the device's scale).
        UIGraphicsBeginImageContextWithOptions(size, isOpaque, 0.0)
        defer { UIGraphicsEndImageContext() }
        // Draw the image into a rect with zero origin and the size of the context.
        let imageRect = CGRect(origin: .zero, size: size)
        draw(in: imageRect)
        // Get the scaled image from the context.
        return UIGraphicsGetImageFromCurrentImageContext() ?? self
    }
}
Example:
aUIImageView.image = aUIImage.imageScaledToSize(aUIImageView.bounds.size, isOpaque: false)
Set isOpaque to true if the image has no alpha: drawing will have better performance.

Does UIImage Cache the File or the Rendered Bitmap?

I generally use Fireworks PNGs (with different layers, some hidden, etc.) in my iOS projects (loaded into NIBs in UIImageView instances). Often, I take the PNG and resave it as PNG-32 to make the file smaller, but I'm questioning this now (because then I have to store the Fireworks PNG separately)....
In my cursory tests, a smaller file size does NOT affect the resultant memory use. Is there a relationship, or is it only the final rendered bitmap that matters?
Note: I'm not asking about using an image that is too big in pixels. A valid comparison would be a high-quality JPEG that weighs 1 MB vs. a low-quality JPEG of the same content that weighs 100 KB. Is the memory use the same?
UIImageView does not do any processing, so if you set a large image, the whole thing is loaded into memory when the image view needs it, regardless of the size of the image view. So yes, the pixel dimensions make a difference: you should store the smallest images that work within the image view.
While your current example uses NIBs, if you were creating an app that displays large images acquired from other sources (e.g., the device camera or an external service), you would scale those to a display size before using them in a UIImageView.
