Hi, I am trying to compress a UIImage. My code is like so:
var data = UIImageJPEGRepresentation(image, 1.0)
print("Old image size is \(data?.length) bytes")
data = UIImageJPEGRepresentation(image, 0.7)
print("New image size is \(data?.length) bytes")
let optimizedImage = UIImage(data: data!, scale: 1.0)
The print out is:
Old image size is Optional(5951798) bytes
New image size is Optional(1416792) bytes
Later on in the app I upload the UIImage to a web server (it's converted to NSData first). When I check the size, it's the original 5951798 bytes. What could I be doing wrong here?
A UIImage holds a bitmap (uncompressed) version of the image, which is used for rendering it to the screen. The JPEG data of an image can be compressed by reducing the quality (in effect, the colour variance) of the image, but this compression exists only in the JPEG data. Once the compressed JPEG is unpacked into a UIImage, it again requires the full bitmap size (which is based on the colour format and the image dimensions).
Basically, keep the best quality image you can for display and compress just before upload.
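To make the bitmap cost concrete, here is a minimal sketch; the arithmetic assumes 4 bytes per pixel (8-bit RGBA), which is common but not guaranteed:

import UIKit

// Rough estimate of a decoded UIImage's in-memory footprint. It depends
// only on pixel dimensions, not on how small the JPEG file was.
// Assumes 4 bytes per pixel (8-bit RGBA), which is typical but not guaranteed.
func approximateBitmapBytes(of image: UIImage) -> Int {
    let pixelWidth = Int(image.size.width * image.scale)
    let pixelHeight = Int(image.size.height * image.scale)
    return pixelWidth * pixelHeight * 4
}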
Send this data (data = UIImageJPEGRepresentation(image, 0.7)) to the server, instead of sending optimizedImage.
You already compressed the image into data. Converting optimizedImage back to NSData for upload repeats the process and resets it to the original size.
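A minimal sketch of that idea; the URL and header are placeholders, and URLSession is just one of several ways to perform the upload:

if let jpegData = UIImageJPEGRepresentation(image, 0.7) {
    // Upload the compressed bytes directly; never round-trip through UIImage again.
    var request = URLRequest(url: URL(string: "https://example.com/upload")!)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
    URLSession.shared.uploadTask(with: request, from: jpegData) { _, _, error in
        if let error = error {
            print("Upload failed: \(error)")
        }
    }.resume()
}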
Related
I am using UIImagePickerController to read images from the photo library. I use the following code to calculate their size:
if let file = info[UIImagePickerControllerOriginalImage] {
    let imageValue = (file as? UIImage)!
    let data = UIImageJPEGRepresentation(imageValue, 1)
    let imageSize = (data?.count)! / 1024
    print("imsize in MB: ", Double(imageSize) / 1024.0)

    if let imageData = UIImagePNGRepresentation(imageValue) {
        let bytes = imageData.count
        let KB = Double(bytes) / 1024.0
        let MB = KB / 1024.0
        print("we have image size as MB", MB)
    }
}
To my surprise, the two report different sizes for the same image, and both differ from the reported size of the image file. What is happening here, and which is more accurate?
A bit confused. Help is much needed to understand this.
JPEG and PNG are different formats. Here is the difference between JPEG and PNG, from a quick Google search:
The main difference between JPG and PNG is the compression algorithms that they use. JPG uses a lossy compression algorithm that discards some of the image information in order to reduce the size of the file. ... With PNG, the quality of the image will not change, but the size of the file will usually be larger.
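A quick way to see the difference in practice is to encode the same UIImage both ways and compare byte counts (a sketch, using the same Swift 3-era APIs as the question):

import UIKit

func printEncodedSizes(of image: UIImage) {
    // JPEG at maximum quality: lossy, usually much smaller than PNG for photos.
    if let jpeg = UIImageJPEGRepresentation(image, 1.0) {
        print("JPEG (q = 1.0): \(Double(jpeg.count) / 1024.0 / 1024.0) MB")
    }
    // PNG: lossless, so typically larger, especially for photographic content.
    if let png = UIImagePNGRepresentation(image) {
        print("PNG: \(Double(png.count) / 1024.0 / 1024.0) MB")
    }
}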
I want to save an image in a database, so I convert it to Data. However, during these steps the width and height of the image change: it increases in size.
// Original Image Size
print("Original Image Size : \(capturedImage.size)") // Displays (320.0, 427.0)
// Convert to Data
var imageData: Data?
imageData = UIImagePNGRepresentation(capturedImage)
// Store imageData into Db.
// Convert it back
m_CarImgVw.image = UIImage(data: damageImage!.imageData!, scale: 1.0)
print("m_CarImgVw Image Size : \(m_CarImgVw.image.size)") // Displays (640.0, 854.0)
I do not want the imagesize to increase!
If it's originally an image from your assets, it's probably @2x, which means the size in pixels (the real size) is double the size in pts (the displayed size). So the image size isn't actually increasing: it was 640x854 pixels before and after the transform. It's just that before, the OS automatically scaled it because it was named @2x.
To use the original image scale you can replace 1.0 with capturedImage.scale.
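Applied to the line from the question, that would look something like this (a sketch reusing the question's own names):

// Carry the capture's scale across instead of hard-coding 1.0.
m_CarImgVw.image = UIImage(data: damageImage!.imageData!, scale: capturedImage.scale)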
Your problem is in this line:
m_CarImgVw.image = UIImage(data: damageImage!.imageData!, scale: 1.0)
Can you see it?
Hint: It's in scale: 1.0.
It looks like your original image was Retina (@2x), so it had scale 2.0.
So you should either pass your original image's scale there (damageImage.scale), or, if you're presenting the image on screen, use UIScreen's scale.
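For example, a small helper along these lines keeps the point size stable across the round trip (a sketch; roundTrip is a hypothetical name):

import UIKit

// Round-trip a UIImage through PNG data while preserving its point size.
func roundTrip(_ original: UIImage) -> UIImage? {
    guard let data = UIImagePNGRepresentation(original) else { return nil }
    // Passing the original scale keeps size-in-points unchanged;
    // scale: 1.0 would report doubled points for a @2x image.
    return UIImage(data: data, scale: original.scale)
}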
I am doing a simple image file size reduction task but ran into this issue: when using the no-loss option for JPEGs, the file size tripled compared to the original NSData for the image (same resolution). Any suggestions?
Here is the simple code:
let data = someImageData // NSData holding the original JPEG
print(data.length)
let image = UIImage(data: data)!
let fullQualityData = UIImageJPEGRepresentation(image, 1)!
print(fullQualityData.length)
let reducedQualityData = UIImageJPEGRepresentation(image, 0.8)!
print(reducedQualityData.length)
and output:
2604768 //original length
8112955 //new length if compression = 1
2588870 //new length if compression = 0.8
It seems I have to take the 0.8 quality loss to get the same data length. Did I miss something? Please help.
Edit: I did some more testing by converting the data to a UIImage and then to UIImageJPEGRepresentation(image, 1); the length of the new data increases every cycle. But if I use UIImageJPEGRepresentation(image, 0.8), the length of the new data decreases a bit (very little); however, the compounding quality loss should be a concern.
What your code is doing is decompressing/extracting the image into memory with the line let image = UIImage(data: data)!, and then recompressing it as JPEG with let newImageData = UIImageJPEGRepresentation(image, 1), i.e. at the highest quality ratio (q = 1.0). That's why the image is suddenly so big.
So the moment you get your image as a UIImage from NSData, the decompression/conversion changes the size of the image.
The image's file size will actually be 8112955.
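The generational growth mentioned in the edit can be reproduced with a loop like this (an illustrative sketch that treats the question's someImageData as JPEG bytes):

var data = someImageData as Data // question's NSData, bridged to Data
for generation in 1...3 {
    // Each pass decodes the JPEG to a bitmap and re-encodes at q = 1.0,
    // which typically produces more bytes than the previous generation.
    guard let image = UIImage(data: data),
          let reencoded = UIImageJPEGRepresentation(image, 1.0) else { break }
    data = reencoded
    print("generation \(generation): \(data.count) bytes")
}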
I have an app and I need to compress images to a max of 10 MB. I'm currently using:
var imageData = UIImageJPEGRepresentation(thePhoto2.image!, 0.9)
but it would be nice to get all my images in optimal quality.
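One common approach (not something the question already does) is to step the quality down until the encoded data fits a byte budget; a rough sketch:

import UIKit

// Step JPEG quality down until the data fits under maxBytes (10 MB here).
// Note: for big savings, scaling the image's dimensions down is usually
// more effective than lowering quality alone.
func jpegData(for image: UIImage, maxBytes: Int = 10 * 1024 * 1024) -> Data? {
    var quality: CGFloat = 0.9
    var data = UIImageJPEGRepresentation(image, quality)
    while let candidate = data, candidate.count > maxBytes, quality > 0.1 {
        quality -= 0.1
        data = UIImageJPEGRepresentation(image, quality)
    }
    return data
}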
Following these steps I'm getting an unexpected result:
1) Write UIImage to PNG
2) Read PNG into UIImage
Result: the sizes of the UIImages (before and after the PNG round trip) are different. The UIImage created by reading from the PNG has twice the resolution of the original UIImage used to create the PNG.
Swift pseudocode follows:
var initialImage: UIImage
// Obtain UIImage from UIImagePickerController, Images.xcassets, ...
// As an example, consider a UIImage with size = (420.0, 280.0).
println("Image size = \(initialImage.size)") // Indicates "Image size = (420.0, 280.0)"
// Write image to file
UIImagePNGRepresentation(initialImage).writeToFile(imagePath, atomically: true)
// Retrieve image from file
let newImage = UIImage(contentsOfFile: imagePath)!
println("Image size = \(initialImage.size)") // Indicates "Image size = (840.0, 560.0)"!
All is well otherwise. Displaying the newly read image works great.
I'm running this connected to an iPhone 6. Any thoughts would be greatly appreciated.
gary
It's converting it for the Retina display. Each point is represented by 2 pixels on the screen, so the saved image is double the resolution. This is a good article to reference: http://www.paintcodeapp.com/news/iphone-6-screens-demystified
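One way to keep the point size stable is to rebuild the image from data with an explicit scale, since UIImage(contentsOfFile:) only infers scale from an @2x/@3x filename suffix; a sketch, where loadImage is a hypothetical helper:

import UIKit

// Decode an image file at a caller-supplied scale so the reported
// point size matches the original, rather than the raw pixel count.
func loadImage(atPath path: String, scale: CGFloat) -> UIImage? {
    guard let data = try? Data(contentsOf: URL(fileURLWithPath: path)) else {
        return nil
    }
    return UIImage(data: data, scale: scale)
}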