I am using UIImagePickerController to read images from the photo library. I use the following code to calculate their size:
if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
    if let jpegData = UIImageJPEGRepresentation(image, 1) {
        let imageSizeKB = jpegData.count / 1024
        print("JPEG size in MB: ", Double(imageSizeKB) / 1024.0)
    }
    if let pngData = UIImagePNGRepresentation(image) {
        let bytes = pngData.count
        let KB = Double(bytes) / 1024.0
        let MB = KB / 1024.0
        print("PNG size in MB: ", MB)
    }
}
To my surprise, the two print different sizes for the same image, and both differ from the file's actual size. What is happening here, and which one is accurate?
I'm a bit confused and would appreciate help understanding this.
JPEG and PNG are different formats. Here is the difference, as commonly summarized:
The main difference between JPG and PNG is the compression algorithm each one uses. JPG uses a lossy compression algorithm that discards some of the image information in order to reduce the size of the file. ... With PNG, the quality of the image will not change, but the size of the file will usually be larger.
Note also that both UIImageJPEGRepresentation and UIImagePNGRepresentation re-encode the decoded, in-memory bitmap from scratch, so neither result will match the size of the original file on disk.
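If what you actually want is the size of the picked file itself, a sketch of one way to get it (assuming iOS 11+, inside the same didFinishPickingMediaWithInfo callback) is to ask the file system instead of re-encoding:

// A sketch, assuming iOS 11+: read the size of the picked file on disk
// instead of re-encoding the bitmap.
if let url = info[UIImagePickerControllerImageURL] as? URL,
   let attrs = try? FileManager.default.attributesOfItem(atPath: url.path),
   let bytes = attrs[.size] as? Int64 {
    print("on-disk size in MB:", Double(bytes) / (1024.0 * 1024.0))
}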
Related
I have an image and I need to save it without loss of quality or resizing. The file size should be as small as possible, but I can't even save it at its original file size.
I have an image.jpg (file size 2.1 MB). I put it in the bundle and read it in Swift. After converting it to PNG with .pngData() and saving it as .png, the file size is 18.1 MB, which is ridiculous; even if I remove the alpha channel it is still 9-12 MB.
If I use .jpegData(), the result is still much larger than the source image.
Only if I encode JPEG with quality 0.65 do I get 2.1 MB, the same as the source, but with visible loss of quality.
Where is the solution? How can I optimize this and save without quality loss and without the file size inflating?
func foo() {
    let bundleURL = Bundle.main.bundleURL
    // Source file size: 2.1 MB
    let assetURL = bundleURL.appendingPathComponent("sourceBundle.bundle/image.jpg")
    let srcUImage = UIImage(contentsOfFile: assetURL.relativePath)
    if let dataPng = srcUImage?.pngData() {
        printSize(data: dataPng) // Size: 18.1 MB
    }
    if let dataJpg = srcUImage?.jpegData(compressionQuality: 1.0) {
        printSize(data: dataJpg) // Size: 5.3 MB
    }
    if let dataJpg = srcUImage?.jpegData(compressionQuality: 0.65) {
        printSize(data: dataJpg) // Size: 2.1 MB
    }
}

func printSize(data: Data) {
    let bcf = ByteCountFormatter()
    bcf.allowedUnits = [.useMB]
    bcf.countStyle = .file
    let string = bcf.string(fromByteCount: Int64(data.count))
    print("Size: \(string)")
}
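If the goal is purely to preserve the original file, one option (a sketch, not part of the original post) is to skip UIImage entirely and copy the raw bytes, since pngData() and jpegData() always re-encode the decoded bitmap:

// A minimal sketch: keep the original 2.1 MB file bit-for-bit by never decoding it.
let assetURL = Bundle.main.bundleURL
    .appendingPathComponent("sourceBundle.bundle/image.jpg")
if let originalData = try? Data(contentsOf: assetURL) {
    printSize(data: originalData) // Size: 2.1 MB, unchanged
    // write or upload originalData directly; no UIImage round trip
}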
I have the following extension:
extension UIImage {
    func compressToMaxSize(bytes: Int) -> UIImage {
        var data: Data = UIImageJPEGRepresentation(self, 1.0)!
        var compressionRatio: CGFloat = CGFloat(bytes) / CGFloat(data.count)
        while data.count > bytes && compressionRatio > 0 {
            data = UIImageJPEGRepresentation(self, compressionRatio)!
            compressionRatio -= 0.05
        }
        return UIImage(data: data)!
    }
}
It does return a UIImage whose data is smaller than the bytes parameter, but I am suspicious, because the compressed size is about a tenth of the original even though the compression ratio was around 0.46. What am I missing here?
What am I missing here?
What you're missing is what a JPEG is.
JPEG is lossy. To implement that lossiness, it encodes the image data in a special way, where, for example, runs of similar pixels (same color) are notated in a brief shorthand. The greater the compression, the more pixels are treated as repetitions of one another (exactness of color is thrown away).
Thus, the output size depends upon the image. How detailed is it? How much color repetition is there? An image of just one solid color could compress down to almost nothing, regardless of the compression ratio. The quality parameter is not a target size ratio: passing 0.46 does not promise an output 46% the size of the input.
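A quick way to see this content dependence (a hypothetical demo, not from the original answer) is to encode a solid-color image and a busier one of the same dimensions at the same quality, and compare the byte counts:

import UIKit

let size = CGSize(width: 512, height: 512)
let renderer = UIGraphicsImageRenderer(size: size)

// A single solid color: maximum redundancy, compresses to almost nothing.
let solid = renderer.image { ctx in
    UIColor.red.setFill()
    ctx.fill(CGRect(origin: .zero, size: size))
}

// A patchwork of random colors: little redundancy for JPEG to exploit.
let noisy = renderer.image { ctx in
    for x in stride(from: 0, to: 512, by: 8) {
        for y in stride(from: 0, to: 512, by: 8) {
            UIColor(hue: .random(in: 0...1), saturation: 1, brightness: 1, alpha: 1).setFill()
            ctx.fill(CGRect(x: x, y: y, width: 8, height: 8))
        }
    }
}

print(solid.jpegData(compressionQuality: 0.8)!.count) // a few KB
print(noisy.jpegData(compressionQuality: 0.8)!.count) // many times larger

Same dimensions, same quality setting, very different output sizes.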
I am doing a simple image file-size reduction task but ran into this issue: when using the no-loss option for JPEG, the file size tripled compared to the original NSData for the image (same resolution). Any suggestions?
Here is the simple code:
let data = someImageData
print(data.length)
let image = UIImage(data: data)!
let newImageData = UIImageJPEGRepresentation(image, 1)!
print(newImageData.length)
let newImageData2 = UIImageJPEGRepresentation(image, 0.8)!
print(newImageData2.length)
and output:
2604768 //original length
8112955 //new length if compression = 1
2588870 //new length if compression = 0.8
It seems I have to accept the 0.8 quality loss just to get back to the same data length. Did I miss something? Please help.
Edit: I ran some more tests. Converting the data to a UIImage and back through UIImageJPEGRepresentation(image, 1) makes the data length grow with every cycle. With UIImageJPEGRepresentation(image, 0.8) the length shrinks slightly each cycle, but the compounding quality loss is a concern.
Your code decompresses the image into memory with the line let image = UIImage(data: data)! and then recompresses it as JPEG with let newImageData = UIImageJPEGRepresentation(image, 1), using the highest quality setting (q = 1.0). That's why the image is suddenly so big.
The moment you turn your NSData into a UIImage, the original compressed representation is gone; re-encoding the decoded bitmap produces a different size.
The image's file size will actually be 8112955 bytes.
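A short illustration of that round trip (the resource name is hypothetical; any JPEG will do) shows the first re-encode at quality 1.0 inflating the data:

import UIKit

// Load some original JPEG bytes (hypothetical bundled resource).
let url = Bundle.main.url(forResource: "photo", withExtension: "jpg")!
var data = try! Data(contentsOf: url)
print("original: \(data.count) bytes")

for pass in 1...3 {
    let image = UIImage(data: data)!                // decompress into a bitmap
    data = image.jpegData(compressionQuality: 1.0)! // re-encode at q = 1.0
    print("pass \(pass): \(data.count) bytes")      // jumps up on the first pass
}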
I have an upload limit of 2 MB for my images. So if a user tries to upload an image bigger than 2 MB, I would like to reduce its size without reducing the resolution.
How do I achieve that? I tried something like this, but it didn't work:
var fileSize = UIImageJPEGRepresentation(image, 1)!.length
print("before File size:")
print(fileSize)
while fileSize > MyConstants.MAX_ATTACHMENT_SIZE {
    let mydata = UIImageJPEGRepresentation(image, 0.75)
    fileSize = mydata!.length
    image = UIImage(data: mydata!)!
    print("make smaller \(fileSize)")
}
print("after File size:")
print(UIImageJPEGRepresentation(image, 1)!.length)
output:
before File size:
2298429
make smaller 846683
after File size:
2737491
As @Lion said, you will have to play around with the quality value to achieve an acceptable file size. I noticed, however, that:
print(UIImageJPEGRepresentation(image, 1)!.length)
prints the image length at maximum quality. This is misleading, since inside the while loop you had already achieved a smaller file size.
while fileSize > MyConstants.MAX_ATTACHMENT_SIZE {
    let mydata = UIImageJPEGRepresentation(image, 0.75)
    fileSize = mydata!.length
    image = UIImage(data: mydata!)!
    print("make smaller \(fileSize)")
}
let mydata = UIImageJPEGRepresentation(image, 0.75)
fileSize = mydata!.length
image = UIImage(data: mydata!)!
This is the right approach: you can pass any quality value from 0.1 to 1, depending on how much you want to reduce the image's quality and size.
The line print(UIImageJPEGRepresentation(image, 1)!.length) re-encodes the image at maximum quality again (because the quality parameter is 1), which is why the reported size jumps back up.
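For reference, here is a hedged sketch of a corrected loop using the modern jpegData(compressionQuality:) API (the function name is my own). It lowers the quality step by step, always recompresses from the original image rather than an already-compressed copy, and keeps the resulting Data instead of re-decoding it:

import UIKit

func jpegData(for image: UIImage, maxBytes: Int) -> Data? {
    var quality: CGFloat = 1.0
    var data = image.jpegData(compressionQuality: quality)
    while let d = data, d.count > maxBytes, quality > 0.1 {
        quality -= 0.1
        // Recompress from the original image each time to avoid
        // compounding JPEG artifacts across generations.
        data = image.jpegData(compressionQuality: quality)
    }
    return data // may still exceed maxBytes for very detailed images
}

Upload the returned Data directly; converting it back to a UIImage and re-encoding would change the size again.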
In my app I have to upload a UIImage to a server as a Base64 string, with the condition that the string must not exceed 54999 characters.
The code I currently use generally works, but it takes a lot of time and memory, and the length of the uploaded Base64 string usually lands far from 54999, sometimes off by a factor of 10.
var imageData = UIImagePNGRepresentation(image)
var base64String = imageData.base64EncodedStringWithOptions(.allZeros)
var scaledImage: UIImage = image
var newSize: CGSize = image.size
while (base64String as NSString).length > 54999 {
    newSize.width *= 0.5
    newSize.height *= 0.5
    scaledImage = image.imageScaledToFitSize(newSize)
    imageData = UIImagePNGRepresentation(scaledImage)
    base64String = imageData.base64EncodedStringWithOptions(.allZeros)
}
// proceed to upload Base64 string...
At first I thought I could do it the following way, but that didn't work at all, because the PNG file size (and hence the Base64 length) does not scale linearly with the image's dimensions:
let maxLength: NSInteger = 54999
if (base64String as NSString).length > maxLength {
    let downScaleFactor: CGFloat = CGFloat(maxLength) / CGFloat((base64String as NSString).length)
    var size: CGSize = image.size
    size.width *= downScaleFactor
    size.height *= downScaleFactor
    scaledImage = image.imageScaledToFitSize(size)
    imageData = UIImagePNGRepresentation(scaledImage)
    base64String = imageData.base64EncodedStringWithOptions(.allZeros)
}
There must be a better way to do this.
This is a tough problem. The Base64 encoding creates a predictable increase in data size: every 3 bytes become 4, a 33% increase.
You can't be sure of the result size from PNG or JPEG compression since the decrease in byte size depends on the image being compressed. (A solid-colored rectangle compresses EXTREMELY well. A cartoon using solid colors compresses quite well, and a continuous tone photograph with lots of detail does not compress as well.)
You will probably have better luck using JPEG images, since you can adjust the compression level on those, and JPEG offers higher compression ratios.
I would suggest experimenting with the lowest image quality you can tolerate (use quality setting .5, look at the image, and adjust up/down from there until you get an image that looks "good enough".)
Then use that compression. You can skip the Base64 encoding until the end, since you can simply multiply your non-Base64 byte size by 4/3 to get the size after Base64 encoding.
I would also suggest shrinking your image by 1/sqrt(2) (~0.7) per pass instead of 0.5. Halving both dimensions quarters the pixel count, which causes large jumps in image size; scaling by 1/sqrt(2) halves the pixel count each pass.
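Putting those suggestions together, a rough sketch (the function name and the 0.5 quality are assumptions, not from the answer) might fix the JPEG quality, convert the character budget into a byte budget via the 4/3 factor, and shrink dimensions by 1/sqrt(2) per pass:

import UIKit

func base64JPEG(from image: UIImage, maxChars: Int = 54999) -> String? {
    // Base64 turns every 3 bytes into 4 characters, so the real byte
    // budget is 3/4 of the character limit.
    let maxBytes = maxChars * 3 / 4
    let quality: CGFloat = 0.5        // assumed "good enough" quality
    let factor: CGFloat = 1 / sqrt(2) // halves the pixel count per pass

    var current = image
    var data = current.jpegData(compressionQuality: quality)
    while let d = data, d.count > maxBytes {
        let newSize = CGSize(width: current.size.width * factor,
                             height: current.size.height * factor)
        let renderer = UIGraphicsImageRenderer(size: newSize)
        let previous = current
        current = renderer.image { _ in
            previous.draw(in: CGRect(origin: .zero, size: newSize))
        }
        data = current.jpegData(compressionQuality: quality)
    }
    return data?.base64EncodedString()
}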