Storing pictures is going to be one of the more expensive Firebase features my app uses, so I want to make sure I'm doing it efficiently.
The steps I'm taking are the following:

1. Resize the picture the user wants to upload to a width of 500 points. (A point represents one pixel on non-Retina screens and two pixels on Retina screens.)
2. Upload the data for the resized image, in PNG format, to Firebase Storage.
Here's my actual code:
let storageRef = FIRStorage.storage().reference().child("\(name).png")

if let uploadData = UIImagePNGRepresentation(profImage.resizeImage(targetSize: CGSize(width: 500, height: Int(500 * (profImage.size.height / profImage.size.width))))) {
    storageRef.put(uploadData, metadata: nil, completion: nil)
}
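The height expression in that call can be pulled into a small helper so the aspect-ratio math is visible on its own. This is a sketch of the math only; the helper name is mine, not part of the question's code:

```swift
import Foundation

// Compute a target size with a fixed width, preserving the image's
// aspect ratio -- the same math as the CGSize expression above.
func scaledSize(originalWidth: Double, originalHeight: Double,
                fixedWidth: Double = 500) -> (width: Int, height: Int) {
    let height = fixedWidth * (originalHeight / originalWidth)
    return (Int(fixedWidth), Int(height))
}

// A 1000x2000 original maps to 500x1000.
```

Note the `Int(...)` truncation mirrors the original code, so a 3000x2000 original maps to 500x333 rather than rounding up.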
The photos are going to be a little less than the width of an iPhone screen when displayed to the user. Is the way I'm storing them efficient or is there a better way to format them?
**Edit:** After a bit more research I've found that JPEGs are more compact than PNGs, so I'll be switching to that since transparency isn't important for me. See my answer for an example.
I've changed the image format from PNG to JPEG and found it saves a lot of space in storage.
My code went from using UIImagePNGRepresentation to UIImageJPEGRepresentation with a compression factor of 1. I'm sure if I reduce the compression factor it'll save even more space.
Related
In my application I'm using YPImagePicker to select images from the library or capture them with the camera. I want to know each image's size in MB after it is selected or captured. The task is to convert the images to data and send them to the backend via a REST API. We currently limit the selection to 5 images, and I want to check the size of each one; any image over 1 MB needs to be compressed down to about 1 MB.
let imgData = image.jpegData(compressionQuality: 1)!
let imageSize = imgData.count
print("Actual size of image in MB: \(Double(imageSize) / 1024.0 / 1024.0)")
The sample above is what I used to check the size, but I'm not seeing the correct file size. For example, I capture a photo through the app and it gets saved to the album; the photo details show 3.4 MB, but in code I don't get the same number. What is the best way to achieve this?
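As an aside on the size check above: dividing by 1024 twice yields MB, not KB, and Photos reports sizes in decimal (powers-of-1000) units, which is another source of mismatch. Foundation's `ByteCountFormatter` prints sizes the way the system apps do; a small sketch (the function name is mine):

```swift
import Foundation

// Format a raw byte count the way Finder/Photos display file sizes
// (decimal units: 1 MB = 1,000,000 bytes).
func humanReadableSize(_ bytes: Int) -> String {
    let formatter = ByteCountFormatter()
    formatter.countStyle = .file
    return formatter.string(fromByteCount: Int64(bytes))
}
```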
Apple doesn’t use JPEG for storing images in your iOS photo library. By default, modern devices use HEIC (High Efficiency Image Container), a format with its own lossy compression.
The two file formats will yield different file sizes.
When you load an image from the user’s image library and convert it to JPEG data, it gets re-compressed using JPEG compression with the image quality value you specify. A compressionQuality value of 1.0 will create an image file with the best image quality but the largest file size. That is why your results are bigger than the files from the user’s image library. Try a lower compressionQuality value.
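"Try a lower compressionQuality value" can be automated by stepping the quality down until the encoded data fits a byte limit. The sketch below abstracts the encoder as a closure so the loop logic stands alone; in the app it would be `{ image.jpegData(compressionQuality: CGFloat($0)) }`. The function and parameter names are mine:

```swift
import Foundation

// Lower the quality in steps until the encoded data fits maxBytes or the
// quality reaches its floor. `encode` stands in for
// image.jpegData(compressionQuality:).
func compressToFit(maxBytes: Int,
                   startQuality: Double = 1.0,
                   step: Double = 0.1,
                   minQuality: Double = 0.1,
                   encode: (Double) -> Data?) -> Data? {
    var quality = startQuality
    var data = encode(quality)
    while let d = data, d.count > maxBytes, quality - step >= minQuality {
        quality -= step
        data = encode(quality)
    }
    return data
}
```

Note that JPEG size does not shrink linearly with quality, so the result can still exceed the limit when the quality bottoms out; scaling the pixel dimensions down first is usually the bigger lever.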
There is one very interesting issue I'm facing at the moment in my iOS application: the image size increases by a seemingly random percentage.

What I have observed is this: when I choose an image from the photo library and send it by converting it to data through a multipart form-data API, the uploaded size ends up several times the original image size.
I use the below code to convert the image into data bytes
img.jpegData(compressionQuality: 1.0)
The data length is around 90 MB.
The original image is available here.
Does anyone know where is the issue and how to resolve it?
In an app I'm developing, the customer enters some personal information and signs a box on the screen. I then programmatically screenshot the full view and convert it to base64.
This is working fine, however, due to the size of the image it takes approximately 15 seconds to convert it to base64 and send it to an API server. The image size at full quality is the same size as the iPad Air resolution (1536x2048).
Once saved on the server as a PNG, the image weighs in at around 410kb.
I want to resize the image capture down to 768x1024 (half of what it is) without losing clarity. I think this will save both time and storage space.
I'm currently taking the "screenshot" using the following function:
func screenCaptureAndGo() {
    // Create the UIImage.
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0)
    view.layer.renderInContext(UIGraphicsGetCurrentContext())
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    imageData = UIImagePNGRepresentation(image)
    imageDataString = imageData.base64EncodedStringWithOptions(NSDataBase64EncodingOptions.Encoding64CharacterLineLength)
}
I left out the API server stuff as it's not really relevant.
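One detail worth knowing about the conversion step: base64 output is 4 characters for every 3 input bytes (4 * ceil(n / 3), before any line breaks the `Encoding64CharacterLineLength` option inserts), so the encoded string is about a third larger than the PNG data itself. A quick Foundation check of that arithmetic, with an illustrative payload:

```swift
import Foundation

// Base64 output length is 4 characters for every 3 input bytes:
// 4 * ceil(n / 3), before any 64-character line breaks are added.
func base64Length(forByteCount n: Int) -> Int {
    4 * ((n + 2) / 3)
}

let raw = Data(repeating: 0xAB, count: 300_000)   // stand-in for PNG data
let encoded = raw.base64EncodedString()           // no line-break option here
// encoded.count == 400_000 == base64Length(forByteCount: 300_000)
```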
I ended up just using the scale parameter. It didn't work the first time I tried it, which is why I hadn't thought to use it again, but after toying with it I got it to work.
I'm using 0.7 as the scale amount, and the images now clock in at around 100 KB, and it takes 4 seconds to convert and send the base64 string.
I need to reduce the size of a UIImage captured from the camera or gallery to a maximum of 200 KB.

The UIImage will then be saved to an SQLite database in the app.

I tried using UIImageJPEGRepresentation(image, compression) in a while loop, but couldn't crack it.

Thanks for the help!
You need to scale the image down first; straight out of the camera it will likely be too big. Something like 1000 pixels on the longest side might be enough, but you can play with it and see what works for you. Then apply JPEG compression to the scaled image.

Also, because of the way JPEG works, it's pointless to run the algorithm over and over again on the same input. Just pick a good compression rate and run it once.

On a tangent: in many cases where you need to save a large data blob in a database, you may find it more memory-efficient to save the data to a separate file in the file system and store the path in the database.
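The file-plus-path pattern in the last paragraph can be sketched as below. The directory parameter defaults to the temporary directory only so the sketch is self-contained; on iOS you would typically pass the app's Documents directory. The function name is illustrative:

```swift
import Foundation

// Write an image blob to a file and return the path to store in SQLite,
// instead of storing the bytes in the database itself.
func saveBlob(_ data: Data, name: String,
              in directory: URL = FileManager.default.temporaryDirectory) throws -> String {
    let url = directory.appendingPathComponent(name)
    try data.write(to: url, options: .atomic)
    return url.path   // persist this string in the database, not the Data
}
```

The `.atomic` option writes to a side file and renames it into place, so a crash mid-write cannot leave a half-written image behind a path the database already references.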
In my iOS app, I need the user to be able to send email with a GIF image attachment. I implemented this by using MFMailComposeViewController. If the file size of the GIF image is small, everything works OK. However, if the image size is large, iOS asks to reduce the image size. If the user accepts to reduce image size, the animation of GIF is gone. Actually, this is the same problem as asked here: Preventing MFMailComposeViewController from scaling animated GIFs
My understanding is that there is no way to avoid iOS to ask to reduce size. Therefore, the solution I am thinking is as follows: I will pre-compress and generate a new gif with reduced file size before attaching so that it will always be small enough.
So my question is: is there an image file size that is guaranteed not to result in iOS asking to reduce the image size? For example, is there a rule like "Mail will never ask to reduce the image size if the attached file is less than X KB", and if so, what is X?
I have an answer to the threshold question, and a method for reducing an image that reliably avoids Apple asking whether you want to scale the image down.
Some background:
In my App, I give the user the option of automatically scaling their image down to 1024x768 before E-Mailing it as a way of avoiding Apple's 'do you want to scale your image down?' query.
For a long time, I thought that this amount of descaling was sufficient. But I've discovered that if their image has enough fine detail in it, then even at 1024x768, it can still trigger Apple's query.
So, the code below is how I deal with this problem. Note that if the getMinImgSizFlag is TRUE, I've already descaled the image to 1024x768 elsewhere.
//...convert the UIImage into NSData, as the email controller requires, using
//   a default JPG compression value of 90%.

float jpgCompression = 0.9f;

imageAsNSData = UIImageJPEGRepresentation( [self camImage], jpgCompression );

if ( [gDB getMinImgSizFlag] == TRUE )
{
    //...if we are here, the user has opted to preemptively scale their
    //   image down to 1024x768 to avoid Apple's 'scale the image down?'
    //   query.
    //
    //   if so, then we will do a bit more testing, because even with
    //   the image scaled down, if it has a lot of fine detail it may
    //   still exceed a critical size threshold and trigger the query.
    //
    //   it's been empirically determined that the critical size threshold
    //   falls between 391K and 394K bytes.
    //
    //   if we determine that the compressed image, at the default JPG
    //   compression, is still larger than the critical size threshold,
    //   then we will begin looping and increasing the JPG compression
    //   until the image size drops below 380K.
    //
    //   the approximately 10K between our limit, 380K, and Apple's
    //   critical size threshold allows for the possibility that Apple
    //   may be including the contribution of the E-Mail's text size in
    //   its threshold calculations.
    //
    //   the floor on jpgCompression keeps the loop from driving the
    //   quality to zero or below if the image never fits.

    while ( [imageAsNSData length] > 380000 && jpgCompression > 0.05f )
    {
        jpgCompression -= 0.05f;

        imageAsNSData = UIImageJPEGRepresentation( [self camImage], jpgCompression );
    }
}
That's it. I've tested this code, and it reliably allows me to avoid Apple's "do you want to scale your image down before emailing it?" query.
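Since stepping the quality down 0.05 at a time re-encodes the image many times, a binary search over the quality reaches the same 380,000-byte limit in a handful of encodes. A sketch in Swift, with the encoder abstracted as a closure standing in for UIImageJPEGRepresentation; it assumes encoded size shrinks as quality drops, which holds for JPEG in practice. The names are mine:

```swift
import Foundation

// Binary-search the highest JPEG quality whose encoded size fits maxBytes.
// `encode` stands in for UIImageJPEGRepresentation / jpegData.
func bestFit(maxBytes: Int, iterations: Int = 7,
             encode: (Double) -> Data?) -> Data? {
    var lo = 0.0
    var hi = 1.0
    var best: Data? = nil
    for _ in 0..<iterations {
        let mid = (lo + hi) / 2
        if let d = encode(mid), d.count <= maxBytes {
            best = d        // fits: try a higher quality next
            lo = mid
        } else {
            hi = mid        // too big: lower the quality
        }
    }
    return best
}
```

Seven iterations narrow the quality to within 1/128 of the best achievable value, at a cost of seven encodes regardless of image content.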