UIImageJPEGRepresentation crashes in Share Extension on iPhone SE - iOS

I have a share extension that lets you crop an image and then upload it to our service. We call UIImageJPEGRepresentation to get the image's data before we upload it, but this causes a crash due to excessive memory use. It only happens with large images and, as far as we can tell, only on the iPhone SE, and didReceiveMemoryWarning is never called first. The crash occurs when the extension is launched from the Photos app.
Is there any way to safely call UIImageJPEGRepresentation, or to determine beforehand whether the image is too large?

Why not check the image's file size? If it exceeds a certain quota, resize it.
guard let imageData = UIImagePNGRepresentation(image) else { return }
let imageSizeKB = Double(imageData.count) / 1024 // in KB (Data uses .count, not NSData's .length)
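If it is over your quota, a downscale before encoding keeps the peak memory in check. A minimal sketch, assuming a 2048 px cap (pick a threshold that suits your upload requirements):
import UIKit

// Scale the image down before encoding so UIImageJPEGRepresentation never sees
// the full-resolution bitmap. The 2048 px cap is illustrative, not a magic number.
func resizedIfNeeded(_ image: UIImage, maxDimension: CGFloat = 2048) -> UIImage {
    let largestSide = max(image.size.width, image.size.height)
    guard largestSide > maxDimension else { return image }
    let scale = maxDimension / largestSide
    let newSize = CGSize(width: image.size.width * scale, height: image.size.height * scale)
    return UIGraphicsImageRenderer(size: newSize).image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
}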

Related

Size of document increases on an iOS device compared to Android/Windows devices

There is one very interesting issue I'm facing at the moment in my iOS application: the image size increases by a seemingly random percentage. Here is what I have observed.
When I choose an image from the photo library and send it through the multipart form data API, I first convert it to data.
The resulting data is several times the size of the original image.
I use the below code to convert the image into data bytes
img.jpegData(compressionQuality: 1.0)
The data length is around 90 MB.
The original image is available here.
Does anyone know where the issue is and how to resolve it?
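For what it's worth, jpegData(compressionQuality: 1.0) re-encodes the decoded bitmap at maximum quality, which is usually far larger than the camera's original (often HEIC) file. One common alternative is to upload the asset's original bytes instead of re-encoding; a sketch, assuming you have a PHAsset (requestImageDataAndOrientation requires iOS 13):
import Photos

// Fetch the asset's original file bytes instead of re-encoding the decoded bitmap.
func originalData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true // allow an iCloud download if needed
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        completion(data)
    }
}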

iOS: huge memory consumption when showing an image

My application receives a response from a server. The response body contains a PNG representation of an image; its size is a few kilobytes. I create an image object with
UIImage(data: response.data)
The image is shown by
imageView.image = image
Right after that, the application consumes around 100 MB more. The issue does not reproduce on the simulator. How is that possible, and how can I fix it?
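A UIImage's memory footprint is set by its pixel dimensions, not by the PNG's byte count, so a few-kilobyte file with huge dimensions can decode to a very large bitmap. A quick diagnostic sketch:
import UIKit

// The decoded bitmap is what actually occupies memory once the image is rendered.
if let cgImage = image.cgImage {
    let decodedBytes = cgImage.bytesPerRow * cgImage.height
    print("decoded size: \(decodedBytes / 1_048_576) MB")
}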

Large JPEG using more memory than smaller JPEG when loaded in a UIImageView. Why?

The login view in our app uses large background images. To try to save on memory/app size I resized and compressed these images, which reduced their file size significantly (to less than 1 MB, down from several MB).
When monitoring my app's memory usage (Xcode debugger) there is a clear spike when a modified image is displayed (around 30-40 MB). I'd accepted this as normal and simply made sure to release the image ASAP to limit memory usage.
I've recently started replacing a couple of the images and wanted to preview the new ones before resizing/compressing them. I noticed that these images (one of which is 11 MB on disk and 4640x3472 pixels) have no visible effect on app memory usage whatsoever, increasing 'Other Processes' instead (by around 20-30 MB).
Can anyone explain what's happening here? I want to confirm it is advisable to continue resizing/compressing the images.
Note that I'm loading the images using UIImage(contentsOfFile:) and I resized/compressed the images using GIMP. The new images have been taken straight from Flickr and unmodified.
Cheers.
The in-memory size of the image (as a UIImage) is different from the compressed on-disk size (your JPEG).
The UIImage takes 4 bytes (RGBA) per pixel x width x height, so for a 4640 x 3472 image you're looking at 64,440,320 bytes: quite different from the 11 MB on disk.
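So yes, continuing to resize is advisable. If you do want to show a large image without paying the full decode cost, ImageIO can downsample while decoding; a sketch, assuming the image comes from a file URL (which matches your UIImage(contentsOfFile:) usage):
import ImageIO
import UIKit

// CGImageSourceCreateThumbnailAtIndex decodes directly at the requested size,
// so the full 4640x3472 bitmap is never materialised in memory.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }
    let thumbnailOptions: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions as CFDictionary) else { return nil }
    return UIImage(cgImage: cgImage)
}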

Memory Management On Large Image Collection Apps

What's the basic idea behind how large apps that contain many images (especially in a table or collection view) manage caching with dequeued cells? No one really answered my previous question about this (iOS Memory Warnings), and I'm trying to populate a collection view with 100+ images from Parse. After failing to find a good solution with manual code, along with down-scaling my images to super low quality, I've tried a number of different libraries, including SDWebImage, LRImageManager, Haneke, and PINRemoteImage. I just want a good idea of how to approach my app's memory management. How is it that other apps can load hundreds of "fair" quality photos in a dequeue-able view? Is it rocket science?
The answer is a combination of things. Don't load all your images in memory at the same time. Save them as files to your documents or caches directory, and load them for each cell in cellForRowAtIndexPath, and simply discard the in-memory image when the user scrolls it off-screen.
Also, don't load a large image as a thumbnail. Instead, when you download it, immediately generate a thumbnail version at your target display size and save that (AND the original version, if you need it). Then you can load and display the thumbnails, with much lower memory impact. Since you're generating the thumbnails for a particular device, you can generate only the thumbnail size you need: retina or non-retina, sized for the screen of the current device (iPad, 4" phone, 5" phone, iPhone 6, 6+, etc.).
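A sketch of that thumbnail step, rendering at display size and saving to the caches directory (names and sizes are illustrative):
import UIKit

// Render a device-sized thumbnail once, at download time, and persist it;
// cells then load the small file instead of the full-resolution original.
func saveThumbnail(from original: UIImage, named name: String, targetSize: CGSize) throws -> URL {
    let thumbnail = UIGraphicsImageRenderer(size: targetSize).image { _ in
        original.draw(in: CGRect(origin: .zero, size: targetSize))
    }
    let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let url = caches.appendingPathComponent("\(name)-thumb.jpg")
    try thumbnail.jpegData(compressionQuality: 0.8)?.write(to: url)
    return url
}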
You should have a properly configured memory cache (NSCache, for instance) that stores images resized to their displayed size, which is good for both memory and drawing performance. Here's a tutorial on how to configure NSCache to store images. And that's exactly what DFImageManager does.
let targetSize = CGSize(width: 100, height: 100)
let request = DFImageRequest(resource: imageURL, targetSize: targetSize, contentMode: .AspectFill, options: nil)
let task = DFImageManager.imageTaskForRequest(request) { (image, _, _, _) -> Void in
    // The image is resized to fill the 100x100 px target size and stored in the
    // memory cache, so the next identical request finishes synchronously.
    let fetchedImage = image
}
task.resume()
// NSURLSession stores original data into NSURLCache
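A minimal NSCache configuration along those lines, assuming you want to bound the cache by decoded byte size (limits are illustrative):
import UIKit

// Cache decoded, display-sized images with an explicit cost per entry.
let imageCache = NSCache<NSURL, UIImage>()
imageCache.countLimit = 100                    // at most 100 decoded images
imageCache.totalCostLimit = 50 * 1024 * 1024   // roughly 50 MB of bitmap data

func cache(_ image: UIImage, for url: NSURL) {
    // 4 bytes per rendered pixel (RGBA), accounting for screen scale.
    let pixels = image.size.width * image.size.height * image.scale * image.scale
    imageCache.setObject(image, forKey: url, cost: Int(pixels * 4))
}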

Are asset thumbnails pre-saved in iOS?

I am writing an app that relays an image saved on the iOS device to a larger screen. Since a full-res image is too large and takes too long to transfer (using an on-device CocoaHTTP server), I am trying to load the thumbnail first.
In Windows we have a thumbs.db, which means that if we access it there is no image resizing: it's a thumbnail version of the image pre-saved by the OS.
Does [UIImage imageWithCGImage:asset.aspectRatioThumbnail] for the ALAsset class in iOS do the same thing, or does it load the complete hi-res image and then scale it down before returning?
The documentation does not specify, but in my experiments reading the thumbnail is about five times faster than loading the image from disk (even without decoding or scaling it). I assume iOS stores the pre-made thumbnails somewhere for fast access.
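ALAssetsLibrary has since been deprecated; the rough equivalent with the modern Photos framework, where a small target size plus fast-format delivery lets the system hand back its pre-generated thumbnail, would look something like this:
import Photos
import UIKit

// Request a thumbnail-sized image; .fastFormat prefers the cheap cached
// representation over decoding and scaling the full-resolution original.
func loadThumbnail(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .fastFormat
    options.resizeMode = .fast
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 200, height: 200),
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        completion(image)
    }
}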
