In my application I'm using YPImagePicker to select images from the library and camera. I want to know the image size in MB after selecting pictures or capturing a photo. The task is to convert the images into data and send them to the backend via a REST API. We currently limit the selection to 5 images, so I want to check the size of every image; if one is more than 1 MB, it needs to be compressed down to 1 MB.
// Re-encode the picked image as JPEG and measure the resulting byte count.
let imageData = image.jpegData(compressionQuality: 1)!
let imageSize = imageData.count
print("actual size of image in MB:", Double(imageSize) / 1024.0 / 1024.0)
I used the sample above to check the size of the image, but I'm not seeing the correct file size. For example, I capture a photo through the app and it gets saved to the album; the photo detail shows 3.4 MB, but in code I don't get the same size. What is the best way to achieve this?
Apple doesn't necessarily use JPEG for storing images in your iOS photo library. On recent devices it uses its own default format (HEIF/HEIC) with different lossy compression.
The two file formats will yield different file sizes.
When you load an image from the user’s image library and convert it to JPEG data, it gets re-compressed using JPEG compression with the image quality value you specify. A compressionQuality value of 1.0 will create an image file with the best image quality but the largest file size. That is why your results are bigger than the files from the user’s image library. Try a lower compressionQuality value.
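If you also need to enforce the 1 MB cap mentioned in the question, a rough sketch could look like the following. The helper name, the 0.1 quality step, and the byte limit are illustrative, not from the original posts.

import UIKit

// Re-encode the image as JPEG, lowering the quality step by step until the
// data fits under roughly 1 MB (or the quality floor is reached).
func jpegDataUnderLimit(for image: UIImage, maxBytes: Int = 1_000_000) -> Data? {
    var quality: CGFloat = 1.0
    var data = image.jpegData(compressionQuality: quality)

    while let current = data, current.count > maxBytes, quality > 0.1 {
        quality -= 0.1
        data = image.jpegData(compressionQuality: quality)
    }
    return data
}

// Usage:
// if let payload = jpegDataUnderLimit(for: pickedImage) {
//     print("size in MB:", Double(payload.count) / 1024.0 / 1024.0)
// }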
Related
// Load the image back from its local file path.
let str = localpath?.path
let image = UIImage(contentsOfFile: str!)
Original Size = 228 kb
Size after writing file = 700 kb
Now, my question is: why does writing the file into the documents directory increase the file size by that much? I even uploaded this image to my server and, obviously, got the same larger size there. I want to upload the image to my server in its original quality, but I don't want such a large file (its actual size on the device is completely different). Is there any way to preserve both the original image size and quality?
After some research I found the reason.
Reason -
Let's take this example from WWDC 2018 session 416, iOS Memory Deep Dive: an image with a file size of 590 KB and dimensions of 2048 x 1536 pixels needs about 10 MB of memory once decoded (2048 pixels x 1536 pixels x 4 bytes per pixel). For more details you can watch the video.
https://developer.apple.com/videos/play/wwdc2018/416/
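To make that arithmetic concrete, a decoded image costs roughly width x height x 4 bytes, regardless of the file size on disk. A minimal sketch of that estimate follows; the function name is mine, and 4 bytes per pixel is an assumption that holds for typical RGBA-backed images.

import UIKit

// Rough estimate of the decoded (in-memory) footprint of a UIImage,
// assuming a standard 4-bytes-per-pixel (RGBA) backing store.
func decodedMemoryFootprint(of image: UIImage) -> Int {
    let pixelWidth = Int(image.size.width * image.scale)
    let pixelHeight = Int(image.size.height * image.scale)
    return pixelWidth * pixelHeight * 4   // bytes, independent of the file size on disk
}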
I watched the full WWDC video and understood the reason behind this problem. But what I want is: get the image from the image picker controller, show it to the user, and once the user confirms, upload it to my server (via multipart). Right now, if I upload this (rendered) image, I get the larger size on the server too. Is there a way to show the user the rendered image but upload a different one (with the actual file size)? Even if I save the image after rendering, the file size is still the same as the rendered one.
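One approach, sketched below, is to show the decoded UIImage in the UI but upload the untouched file bytes from the picker instead of re-encoding the rendered image. This assumes UIImagePickerController with a photo-library source; .imageURL is only supplied for library picks (iOS 11+), and previewImageView / upload(_:) are placeholders for your own view and multipart call.

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    defer { picker.dismiss(animated: true) }

    // Show the rendered image to the user for confirmation.
    if let image = info[.originalImage] as? UIImage {
        previewImageView.image = image          // placeholder image view
    }

    // Upload the original file bytes so the server sees the real file size.
    if let fileURL = info[.imageURL] as? URL,
       let originalData = try? Data(contentsOf: fileURL) {
        upload(originalData)                    // placeholder multipart upload call
    }
}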
I'm trying to attach a photo (UIImage) in my iOS application from the Photos app. The original size of the image is ~10 MB, but when I process it with
image?.jpegData(compressionQuality: 0.7)
the output size of the image becomes ~25 MB.
Can anyone suggest as to what might be happening here?
There is one very interesting issue I'm facing at the moment in my iOS application.
The image size increases by a seemingly random percentage.
What I have observed is as follows:
When I choose the same image from the photo library and try to send it by converting it to data through the multipart form data API, the image size grows to several times the original image size.
I use the code below to convert the image into data:
img.jpegData(compressionQuality: 1.0)
The data length is around 90 MB.
The original image is available here.
Does anyone know where is the issue and how to resolve it?
As storing pictures is going to be one of the more expensive Firebase features my app will be using, I want to make sure I'm doing it efficiently.
The steps I'm taking are the following:
Resize the picture the user wants to upload to a width of 500 points (a point represents one pixel on non-retina screens and two pixels on retina screens)
Upload the data for the specified image in PNG format to Firebase storage
Here's my actual code:
let storageRef = FIRStorage.storage().reference().child("\(name).png")
let targetSize = CGSize(width: 500, height: Int(500 * (profImage.size.height / profImage.size.width)))
if let uploadData = UIImagePNGRepresentation(profImage.resizeImage(targetSize: targetSize)) {
storageRef.put(uploadData, metadata: nil, completion: nil)
}
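The resizeImage(targetSize:) helper isn't shown in the original post; one possible implementation, assuming a simple redraw at the requested point size, might be:

import UIKit

extension UIImage {
    // Redraws the image at the requested point size; the renderer uses the
    // screen scale, so 500 points becomes 1000/1500 pixels on retina displays.
    func resizeImage(targetSize: CGSize) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }
}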
The photos are going to be a little less than the width of an iPhone screen when displayed to the user. Is the way I'm storing them efficient or is there a better way to format them?
Edit: After a bit more research I've found out that JPEGs are more efficient than PNGs, so I'll be switching to that since transparency isn't important for me. See my answer below for an example.
I've changed the image format from PNG to JPEG and found it saves a lot of space. Here's a picture of my storage for comparison:
My code went from using UIImagePNGRepresentation to UIImageJPEGRepresentation with a compression quality of 1.0. I'm sure that if I reduce the compression quality it'll save even more space.
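For reference, a sketch of what that change looks like against the snippet from the question, using the same legacy Firebase and UIKit APIs; the 1.0 quality value mirrors what I used, and lowering it would shrink files further.

let storageRef = FIRStorage.storage().reference().child("\(name).jpg")
let targetSize = CGSize(width: 500, height: Int(500 * (profImage.size.height / profImage.size.width)))
// JPEG instead of PNG; lowering the 1.0 quality value trades quality for storage.
if let uploadData = UIImageJPEGRepresentation(profImage.resizeImage(targetSize: targetSize), 1.0) {
    storageRef.put(uploadData, metadata: nil, completion: nil)
}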
For my iPhone web app, the 3G network is very slow, so it cannot upload a large photo. I therefore want to reduce the photo size before uploading it to the server. I tried to use the HTML5 canvas, which works on Android but not on iOS 7 or 8.
Are there any other methods that I can use to reduce the photo size before upload?
You can use UIImageJPEGRepresentation (or UIImagePNGRepresentation, which has no quality parameter) to compress the photo.
compressionQuality
The quality of the resulting JPEG image, expressed as a value from 0.0 to 1.0. The value 0.0 represents the maximum compression (or lowest quality) while the value 1.0 represents the least compression (or best quality).
See more info: https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIKitFunctionReference/
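A minimal usage example of the legacy function named above; photo is assumed to be a UIImage you already have, and 0.5 is just an illustrative quality value.

import UIKit

// `photo` is an assumed UIImage; 0.5 is an illustrative compressionQuality.
if let compressed = UIImageJPEGRepresentation(photo, 0.5) {
    // Lower quality values produce smaller payloads for slow connections.
    print("compressed size in KB:", Double(compressed.count) / 1024.0)
}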