iOS huge memory consumption when showing an image

My application receives a response from a server. The response body contains a PNG representation of an image, a few kilobytes in size. I create an image object with
UIImage(data: response.data)
The image is shown by
imageView.image = image
Right after that, the application consumes around 100 MB more. This does not reproduce on the simulator. How is that possible, and how can I fix it?
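A decoded image costs roughly width × height × 4 bytes of memory, no matter how small the compressed file is, so a few-kilobyte PNG with large pixel dimensions can balloon to around 100 MB once UIImage decodes it for display. A common mitigation is to downsample with ImageIO before creating the UIImage; here is a minimal sketch (the function name and maxPixelSize parameter are illustrative, not from the original question):

import ImageIO
import UIKit

// Decode the response data straight to a bitmap no larger than
// maxPixelSize on its longest side, instead of the full-resolution image.
func downsampledImage(from data: Data, maxPixelSize: Int) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, sourceOptions) else {
        return nil
    }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}

Calling imageView.image = downsampledImage(from: response.data, maxPixelSize: 1024) keeps memory proportional to the displayed size rather than the image's intrinsic dimensions.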

Related

Size of document increases on iOS device compared to the Android/Windows device

There is one very interesting issue I am facing in my iOS application: the image size increases by a seemingly random percentage.
What I have observed is this: when I choose an image from the photo library and convert it to data to send through a multipart form-data API, the resulting data is several times the size of the original image.
I use the code below to convert the image to data:
img.jpegData(compressionQuality: 1.0)
The data length is around 90 MB.
The original image is available here.
Does anyone know where the issue is and how to resolve it?
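For what it's worth, jpegData(compressionQuality: 1.0) does not return the original file's bytes: it re-encodes the fully decoded bitmap at maximum quality, discarding the original file's compression (modern iPhones often store photos as HEIC, which is far smaller than a quality-1.0 JPEG). If the goal is to upload the photo unchanged, one option is to request the original data from the Photos library instead of re-encoding; a minimal sketch, assuming the picked photo is available as a PHAsset (the helper name is illustrative):

import Photos

// Fetch the photo's original file bytes, avoiding the decode-and-re-encode
// round trip that inflates jpegData(compressionQuality: 1.0).
func originalData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true // fetch iCloud originals if needed
    PHImageManager.default().requestImageData(for: asset, options: options) { data, _, _, _ in
        completion(data) // the bytes as stored, with no re-encoding
    }
}

Alternatively, a lower compressionQuality (0.7-0.8, say) usually shrinks the re-encoded data dramatically with little visible loss.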

Efficiently store picture in Firebase Storage?

As storing pictures is going to be one of the more expensive Firebase features my app will be using, I want to make sure I'm doing it efficiently.
The steps I'm taking are the following:
Resize the picture the user wants to upload to a width of 500 points (a point represents one pixel on non-Retina screens and two pixels on Retina screens)
Upload the data for the resized image in PNG format to Firebase Storage
Here's my actual code:
let storageRef = FIRStorage.storage().reference().child("\(name).png")

// Resize to 500 points wide, preserving the aspect ratio, then encode as PNG
let targetSize = CGSize(width: 500, height: Int(500 * (profImage.size.height / profImage.size.width)))
if let uploadData = UIImagePNGRepresentation(profImage.resizeImage(targetSize: targetSize)) {
    storageRef.put(uploadData, metadata: nil, completion: nil)
}
The photos are going to be a little less than the width of an iPhone screen when displayed to the user. Is the way I'm storing them efficient or is there a better way to format them?
Edit: After a bit more research I've found that JPEG is more efficient than PNG for photos, so I'll be switching to that since transparency isn't important for me. See my answer below for an example.
I've changed the image format from PNG to JPEG and found it saves a lot of space in my storage console.
My code went from using UIImagePNGRepresentation to UIImageJPEGRepresentation with a compression factor of 1. I'm sure that if I reduce the compression factor it'll save even more space.
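For reference, the change amounts to swapping the encoder and adding a compression factor; a sketch in the same Firebase API style as the question above, assuming the same resizeImage helper (the 0.8 factor is illustrative):

let storageRef = FIRStorage.storage().reference().child("\(name).jpg")
let targetSize = CGSize(width: 500, height: Int(500 * (profImage.size.height / profImage.size.width)))

// JPEG instead of PNG: much smaller for photos, at the cost of transparency
if let uploadData = UIImageJPEGRepresentation(profImage.resizeImage(targetSize: targetSize), 0.8) {
    storageRef.put(uploadData, metadata: nil, completion: nil)
}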

UIImageJPEGRepresentation crashes in Share Extension on iPhone SE

I have a share extension that lets you crop an image and then upload it to our service. We call UIImageJPEGRepresentation to get the image's data before we upload it, but this causes a crash due to excessive memory use. It only happens with large images, and (as far as we can tell) only on the iPhone SE, and didReceiveMemoryWarning is not called first. It happens when sharing from the Photos app.
Is there any way to safely call UIImageJPEGRepresentation, or to determine beforehand whether the image is too large?
Why not check the encoded image's size? If it exceeds a certain quota, resize the image first.
if let imageData = UIImagePNGRepresentation(image) {
    let imageSizeKB = Double(imageData.count) / 1024.0 // size in KB
}
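A rough sketch of that idea: encode once, check the result against a quota, and if it is too large redraw the image at half the dimensions before encoding again (the 0.9 quality and the helper name are illustrative, and note that the first encode can itself be the memory spike for very large images):

import UIKit

// Encode, compare against a quota in KB, and resize once if over it.
func jpegDataCapped(_ image: UIImage, maxKB: Double) -> Data? {
    guard let data = UIImageJPEGRepresentation(image, 0.9) else { return nil }
    if Double(data.count) / 1024.0 <= maxKB { return data }

    // Too large: redraw at half the point dimensions and encode again
    let halfSize = CGSize(width: image.size.width / 2, height: image.size.height / 2)
    UIGraphicsBeginImageContextWithOptions(halfSize, true, image.scale)
    image.draw(in: CGRect(origin: .zero, size: halfSize))
    let smaller = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return smaller.flatMap { UIImageJPEGRepresentation($0, 0.9) }
}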

Resizing a Screenshot Taken With UIGraphicsBeginImageContextWithOptions

In an app I'm developing, the customer enters some personal information and signs a box on the screen. I then programmatically screenshot the full view and convert it to Base64.
This works fine; however, due to the size of the image, it takes approximately 15 seconds to convert it to Base64 and send it to an API server. The image at full quality has the same dimensions as the iPad Air's resolution (1536x2048).
Once saved on the server as a PNG, the image weighs in at around 410 KB.
I want to resize the captured image down to 768x1024 (half of what it is now) without losing clarity. I think this will save both time and storage space.
I'm currently taking the "screenshot" using the following function:
func screenCaptureAndGo() {
    // Create the UIImage by rendering the view's layer into a bitmap context
    // (a scale of 0.0 means "use the device's screen scale")
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0)
    view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    // Encode as PNG, then Base64 for transport
    imageData = UIImagePNGRepresentation(image)
    imageDataString = imageData.base64EncodedStringWithOptions(.Encoding64CharacterLineLength)
}
I left out the API server stuff as it's not really relevant.
I ended up just using the scale argument. It didn't work the first time, which is why I didn't think to use it again, but after toying with it I got it to work.
I'm using 0.7 as the scale, and the images now clock in at around 100 KB; it takes 4 seconds to convert and send the Base64 string.
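For comparison, the working version differs from the original function only in the scale argument; this assumes the same view, imageData, and imageDataString members as the question's code:

func screenCaptureAndGo() {
    // 0.7 pixels per point instead of 0.0 (0.0 means "use the device's
    // native scale", 2.0 on Retina), shrinking the bitmap and the payload
    UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.7)
    view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    imageData = UIImagePNGRepresentation(image)
    imageDataString = imageData.base64EncodedStringWithOptions(.Encoding64CharacterLineLength)
}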

Are asset thumbnails pre-saved in iOS?

I am writing an app that relays an image saved on the iOS device to a larger screen. Since a full-resolution image is too large and takes too long to transfer (using an on-device CocoaHTTP server), I am trying to load a thumbnail first.
On Windows we have thumbs.db, which means that if we access it there is no image resizing: it's a thumbnail version of the image pre-saved by the OS.
Does [UIImage imageWithCGImage:asset.aspectRatioThumbnail] for the ALAsset class in iOS do the same, or does it load the complete hi-res image and then scale it down before returning?
The documentation does not specify, but in my experiments reading the thumbnail is about five times faster than loading the image from disk (even without decoding or scaling it). I assume iOS stores the pre-made thumbnails somewhere for fast access.
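For completeness, a minimal sketch of reading that pre-built thumbnail in the AssetsLibrary era this question dates from (the helper name is illustrative, and the Swift bridging of the unmanaged CGImage may vary by SDK version):

import AssetsLibrary
import UIKit

// Build a UIImage from the asset's pre-generated thumbnail rather than
// decoding the full-resolution file.
func thumbnail(for asset: ALAsset) -> UIImage? {
    guard let cgThumb = asset.aspectRatioThumbnail()?.takeUnretainedValue() else {
        return nil
    }
    return UIImage(CGImage: cgThumb)
}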
