UIImage compressing but not resizing to max size - ios

I have the following extension:
extension UIImage {
    func compressToMaxSize(bytes: Int) -> UIImage {
        var data: Data = UIImageJPEGRepresentation(self, 1.0)!
        var compressionRatio: CGFloat = CGFloat(bytes) / CGFloat(data.count)
        while data.count > bytes && compressionRatio > 0 {
            data = UIImageJPEGRepresentation(self, compressionRatio)!
            compressionRatio = compressionRatio - 0.05
        }
        return UIImage(data: data)!
    }
}
It does return a UIImage whose JPEG data is smaller than the bytes parameter, but I am suspicious, because the compressed size is about a tenth of the original even though the compression ratio was around 0.46. What am I missing here?

What am I missing here?
What you're missing is what a JPEG is.
JPEG is lossy. To implement that lossiness, it encodes the image data in a special way — where, for example, things like repeated pixels (same color) are notated in a brief shorthand. The greater the compression, the more the pixels will be considered to be repetitions of one another (exactness of color is thrown away).
Thus, the output size depends upon the image. How detailed is it? How much color repetition is there? An image of just one solid color could compress to a very tiny file, regardless of the compression ratio.
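To see how much the content matters, here is a small sketch of ours (reusing the same pre-deprecation UIImageJPEGRepresentation API the question uses): a large solid-color image still encodes to a tiny JPEG at any quality setting, while a detailed photo of the same dimensions would be orders of magnitude larger.

import UIKit

// Render a solid-color bitmap of the given pixel size.
func solidColorImage(size: CGSize, color: UIColor) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(size, true, 1.0)
    color.setFill()
    UIRectFill(CGRect(origin: .zero, size: size))
    let image = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return image
}

let flat = solidColorImage(size: CGSize(width: 1000, height: 1000), color: .red)
// Even at the question's ratio of roughly 0.46 this encodes to a few KB:
print(UIImageJPEGRepresentation(flat, 0.46)!.count)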

Related

UIImageJPEGRepresentation vs UIImagePNGRepresentation for images in iOS

I am using UIImagePickerController to read images from the photo library. I use the following code to calculate their size:
if let file = info[UIImagePickerControllerOriginalImage] {
    let imageValue = (file as? UIImage)!
    let data = UIImageJPEGRepresentation(imageValue, 1)
    let imageSize = (data?.count)! / 1024
    print("imsize in MB: ", Double(imageSize) / 1024.0)

    if let imageData = UIImagePNGRepresentation(imageValue) {
        let bytes = imageData.count
        let KB = Double(bytes) / 1024.0
        let MB = Double(KB) / 1024.0
        print("we have image size as MB", MB)
    }
}
To my surprise, the two report different sizes for the image, and both also differ from the file's size in the photo library. What is happening here, and which is more accurate? I'm a bit confused and would appreciate help understanding this.
JPEG and PNG are different formats. Here is what I found when I googled the difference between JPEG and PNG:
The main difference between JPG and PNG is the compression algorithms that they use. JPG uses a lossy compression algorithm that discards some of the image information in order to reduce the size of the file. ... With PNG, the quality of the image will not change, but the size of the file will usually be larger.
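As a rough sketch of that difference (our code, assuming the same pre-deprecation APIs the question uses): encode the one decoded image both ways and compare. Note that both numbers are fresh encodings of the in-memory bitmap, so neither is expected to match the original file's size on disk.

import UIKit

// Compare the two encodings of the same decoded bitmap.
func printEncodedSizes(of image: UIImage) {
    if let jpegData = UIImageJPEGRepresentation(image, 1.0) {
        print("JPEG (quality 1.0): \(Double(jpegData.count) / 1048576.0) MB")
    }
    if let pngData = UIImagePNGRepresentation(image) {
        print("PNG (lossless): \(Double(pngData.count) / 1048576.0) MB")
    }
}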

Confused about UIImage's init(data:scale:) method in Swift 3.0

Here is my goal: I have an image with a width of 750 and I want to scale it down to a width of 128.
Then I found an init method of UIImage called
init(data:scale:)
Here is my code:
func scaleImage(image: UIImage, ToSpecificWidth targetWidth: Int) -> UIImage {
    let scale = Double(targetWidth) / Double(image.size.width)
    let scaledImage = UIImage(data: UIImagePNGRepresentation(image)! as Data, scale: CGFloat(scale))
    return scaledImage!
}
print("originImageBeforeWidth: \(portrait?.size.width)") // output: 750
let newImage = Tools.scaleImage(image: portrait!, ToSpecificWidth: 128) // the scale is about 0.17
print("newImageWidth: \(newImage.size.width)") // output: 4394.53125
Apparently the new width is far from my intention:
I'm looking for 750 * 0.17 = 128
But I get 750 / 0.17 = 4394
Then I changed my scaleImage func. Here is the updated code:
func scaleImage(image: UIImage, ToSpecificWidth targetWidth: Int) -> UIImage {
    var scale = Double(targetWidth) / Double(image.size.width)
    scale = 1 / scale // new added
    let scaledImage = UIImage(data: UIImagePNGRepresentation(image)! as Data, scale: CGFloat(scale))
    return scaledImage!
}
print("originImageBeforeWidth: \(portrait?.size.width)") // output: 750
let newImage = Tools.scaleImage(image: portrait!, ToSpecificWidth: 128) // the scale is about 5.88
print("newImageWidth: \(newImage.size.width)") // output: 128
This is exactly what I want, but the line scale = 1/scale doesn't seem to make any sense.
What is going on here?
The init method you are trying to use is not for the purpose of resizing an image, and the scale parameter is not a resizing scale: it's the 1x/2x/3x screen scale. Essentially, the only values that make sense for scale are currently 1.0, 2.0, or 3.0.
While setting the scale to the inverse of what you expected gives you a size property returning your desired result, it is not at all what you should be doing.
There are proper ways to resize an image, such as those shown in How to Resize image in Swift? as well as others.
According to Apple's documentation:
UIImage.scale
The scale factor to assume when interpreting the image data. Applying a scale factor of 1.0 results in an image whose size matches the pixel-based dimensions of the image. Applying a different scale factor changes the size of the image as reported by the size property.
UIImage.size
This value reflects the logical size of the image and takes the image’s current orientation into account. Multiply the size values by the value in the scale property to get the pixel dimensions of the image.
so,
real pixel size = size * scale
That's why you need to set 1/scale.
By the way, scale only affects the size value reported by UIImage. That means it only affects the size used for display on screen; it does not change the pixel dimensions.
If you want to actually resize, you can draw the image at the target size and use UIGraphicsGetImageFromCurrentImageContext().
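For example, a minimal sketch of that drawing-based approach (the extension method resized(to:) is our name, not from the original answer):

import UIKit

extension UIImage {
    // Draw the image into a context of the target size and return the result.
    func resized(to targetSize: CGSize) -> UIImage? {
        // Scale 1.0 so the context's pixel size matches targetSize exactly.
        UIGraphicsBeginImageContextWithOptions(targetSize, false, 1.0)
        draw(in: CGRect(origin: .zero, size: targetSize))
        let result = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return result
    }
}

// Usage: produces a real 128-pixel-wide bitmap, unlike init(data:scale:).
// let small = portrait.resized(to: CGSize(width: 128, height: 128 * portrait.size.height / portrait.size.width))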

NSData to UIImage back to NSData results in tripled data length

I am doing a simple image file size reduction task but ran into this issue: when using the no-loss option for JPEGs, the file size tripled compared to the original NSData for the image (same resolution). Any suggestions?
Here is the simple code:
let data = someImageData
print(data.length)

let image = UIImage(data: data)!
let newImageData = UIImageJPEGRepresentation(image, 1)!
print(newImageData.length)

// renamed: the original redeclared newImageData, which does not compile
let newImageData2 = UIImageJPEGRepresentation(image, 0.8)!
print(newImageData2.length)
and output:
2604768 //original length
8112955 //new length if compression = 1
2588870 //new length if compression = 0.8
It seems I have to take the 0.8 quality loss to get back to the same data length. Did I miss something? Please help.
Edit: I did some more testing by converting the data to a UIImage and then to UIImageJPEGRepresentation(image, 1); the length of the new data increases every cycle. If I use UIImageJPEGRepresentation(image, 0.8) instead, the length of the new data decreases a bit (very little) each cycle, but the compounding quality loss is a concern.
What your code is doing is decompressing/extracting the image into memory with the line let image = UIImage(data: data)!, and then recompressing it as JPEG with let newImageData = UIImageJPEGRepresentation(image, 1), i.e. at the highest quality ratio (q = 1.0). That's why the data is suddenly so big.
So the moment you get your image as a UIImage from NSData, the decompression/conversion changes the encoded size; the original compression settings are gone.
The re-encoded image's data length really will be 8112955 bytes.
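A sketch of how one might avoid that inflation (ours, not from the original answer; it assumes someImageData is a Data and uses a purely hypothetical size limit): keep the original bytes whenever they are already small enough, and re-encode at most once.

let data = someImageData
let maxBytes = 3_000_000 // hypothetical upload limit, for illustration only
let uploadData: Data
if data.count <= maxBytes {
    uploadData = data // original JPEG bytes: no generation loss, no inflation
} else {
    let image = UIImage(data: data)!
    uploadData = UIImageJPEGRepresentation(image, 0.8)! // a single lossy re-encode
}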

How do I reduce the quality of UIImage without reducing the resolution?

I have an upload limit of 2 MB for my images. So if a user tries to upload an image bigger than 2 MB, I would like to reduce its size without reducing the resolution.
How do I achieve that? I tried something like this but it didn't work:
var fileSize = UIImageJPEGRepresentation(image, 1)!.length
print("before File size:")
print(fileSize)

while fileSize > MyConstants.MAX_ATTACHMENT_SIZE {
    let mydata = UIImageJPEGRepresentation(image, 0.75)
    fileSize = mydata!.length
    image = UIImage(data: mydata!)!
    print("make smaller \(fileSize)")
}

print("after File size:")
print(UIImageJPEGRepresentation(image, 1)!.length)
output:
before File size:
2298429
make smaller 846683
after File size:
2737491
As @Lion said, you will have to play around with the quality to achieve an acceptable file size. I noticed, however, that:
print(UIImageJPEGRepresentation(image, 1)!.length)
will print the image length at max quality. This is misleading, since inside the while loop you are actually achieving a smaller file size.
while fileSize > MyConstants.MAX_ATTACHMENT_SIZE {
    let mydata = UIImageJPEGRepresentation(image, 0.75)
    fileSize = mydata!.length
    image = UIImage(data: mydata!)!
    print("make smaller \(fileSize)")
}
let mydata = UIImageJPEGRepresentation(image, 0.75)
fileSize = mydata!.length
image = UIImage(data: mydata!)!
This is the right approach. You can pass a compression quality anywhere from 0.1 to 1 to control how much you want to reduce the quality and size of the image.
print(UIImageJPEGRepresentation(image, 1)!.length): this line re-encodes the image at maximum quality again (because the quality parameter is 1), which is why it reports a large size.
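In other words (a sketch reusing the question's image variable, in Swift 3 naming): report the size of the data you actually produced, instead of re-encoding at maximum quality.

let finalData = UIImageJPEGRepresentation(image, 0.75)!
print("after File size: \(finalData.count)") // the bytes you would actually upload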

Downscale UIImage to fit a maximum Base64 length

In my app I have to upload a UIImage to a server in the form of a Base64 string, where the condition is that the string must not exceed 54999 characters.
The code I currently use generally works, but it takes a lot of time and memory, and the length of the uploaded Base64 string usually ends up far from 54999, sometimes by a factor of 10.
var imageData = UIImagePNGRepresentation(image)
var base64String = imageData.base64EncodedStringWithOptions(.allZeros)
var scaledImage: UIImage = image
var newSize: CGSize = image.size

while (base64String as NSString).length > 54999 {
    newSize.width *= 0.5
    newSize.height *= 0.5
    scaledImage = image.imageScaledToFitSize(newSize)
    imageData = UIImagePNGRepresentation(scaledImage)
    base64String = imageData.base64EncodedStringWithOptions(.allZeros)
}
// proceed to upload Base64 string...
At the beginning I thought I could do this the following way, but it obviously didn't work at all, because there is no linear correlation between the image's dimensions and the resulting Base64 length:
let maxLength: NSInteger = 54999
if (base64String as NSString).length > maxLength {
    let downScaleFactor: CGFloat = CGFloat(maxLength) / CGFloat((base64String as NSString).length)
    var size: CGSize = image.size
    size.width *= downScaleFactor
    size.height *= downScaleFactor
    scaledImage = image.imageScaledToFitSize(size)
    imageData = UIImagePNGRepresentation(scaledImage)
    base64String = imageData.base64EncodedStringWithOptions(.allZeros)
}
There must be a better way to do this.
This is a tough problem. Base64 encoding creates a predictable increase in data size (every 3 bytes become 4, a 33% increase in size).
You can't be sure of the result size from PNG or JPEG compression since the decrease in byte size depends on the image being compressed. (A solid-colored rectangle compresses EXTREMELY well. A cartoon using solid colors compresses quite well, and a continuous tone photograph with lots of detail does not compress as well.)
You will probably have better luck using JPEG images, since you can adjust the compression level on those, and JPEG offers higher compression ratios.
I would suggest experimenting with the lowest image quality you can tolerate (use quality setting .5, look at the image, and adjust up/down from there until you get an image that looks "good enough".)
Then use that compression setting. You can skip the Base64 encoding until the end, since you can simply multiply your raw byte size by 4/3 to estimate the size after Base64 encoding.
I would suggest shrinking your image by 1/sqrt(2) (~0.7) per step instead of 0.5. Halving both dimensions cuts the pixel count to a quarter, which causes large jumps in image size.
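Putting those suggestions together, here is a minimal sketch (the function name and the exact Base64-length formula are ours, not from the original answer): fix a JPEG quality, estimate the Base64 length from the raw byte count, and shrink the dimensions by 1/sqrt(2) per iteration until the estimate fits.

import UIKit

func jpegDataFittingBase64(image: UIImage, maxBase64Length: Int, quality: CGFloat = 0.5) -> Data? {
    var current = image
    while true {
        guard let data = UIImageJPEGRepresentation(current, quality) else { return nil }
        // Exact Base64 output length, including padding: 4 * ceil(n / 3).
        let base64Length = ((data.count + 2) / 3) * 4
        if base64Length <= maxBase64Length { return data }

        // Shrink both dimensions by 1/sqrt(2), roughly halving the pixel count.
        let factor: CGFloat = 1.0 / sqrt(2.0)
        let newSize = CGSize(width: current.size.width * factor,
                             height: current.size.height * factor)
        guard newSize.width >= 1, newSize.height >= 1 else { return nil }

        UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
        current.draw(in: CGRect(origin: .zero, size: newSize))
        current = UIGraphicsGetImageFromCurrentImageContext() ?? current
        UIGraphicsEndImageContext()
    }
}

// Usage:
// if let data = jpegDataFittingBase64(image: image, maxBase64Length: 54999) {
//     let base64String = data.base64EncodedString()
//     // proceed to upload base64String...
// }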
