Storing UIImage for later use - iOS

I am making what can be described as an image manipulation app. My idea is that the user imports UIImages through the pod YPImagePicker and they are stored in two arrays, one for thumbnails and one for the full-sized images. A UICollectionView is then populated with the thumbnails, and when the user taps a thumbnail the full-sized image is displayed in a UIImageView.
I am having memory issues with this solution. RAM usage hits 300 MB on an iPhone X with roughly 10-12 images imported, which I understand is too much. I assume this is because I store all the full-sized images in an array. Should I store the full-sized images on the user's disk instead of in RAM? Or is there a way to leave the images in the user's photo library and only fetch the full-sized image when the user taps its thumbnail?
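For reference, the second option (keeping only references and asking the Photos framework for images on demand) could look roughly like the sketch below. The fetch options, target sizes, and the `AssetGalleryController` name are illustrative assumptions, and photo-library authorization is assumed to have been granted already.

```swift
import UIKit
import Photos

final class AssetGalleryController {
    // Keep lightweight PHAsset references instead of decoded UIImages.
    // (Assumes the user has already granted photo-library access.)
    private let assets: PHFetchResult<PHAsset> =
        PHAsset.fetchAssets(with: .image, options: nil)
    private let imageManager = PHCachingImageManager()

    // Small thumbnails for the collection view cells.
    func requestThumbnail(at index: Int,
                          targetSize: CGSize = CGSize(width: 200, height: 200),
                          completion: @escaping (UIImage?) -> Void) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .opportunistic
        imageManager.requestImage(for: assets.object(at: index),
                                  targetSize: targetSize,
                                  contentMode: .aspectFill,
                                  options: options) { image, _ in
            completion(image)
        }
    }

    // Full-sized image only when the user taps a thumbnail.
    func requestFullImage(at index: Int,
                          completion: @escaping (UIImage?) -> Void) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat
        options.isNetworkAccessAllowed = true
        imageManager.requestImage(for: assets.object(at: index),
                                  targetSize: PHImageManagerMaximumSize,
                                  contentMode: .aspectFit,
                                  options: options) { image, _ in
            completion(image)
        }
    }
}
```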

Related

Picking images from camera and album in iOS Swift 5

I'm working on an application where users select multiple images from the album or take pictures with the camera and send them to a server via a REST API. The size limit is 5 MB for all images. Is there any way to determine the size of the images while selecting them from UIImagePickerController, before loading them into the application? Suggestions for a better approach are also welcome.
You can't do it using the built-in UIImagePickerController; you can write your own asset picker.
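If the 5 MB limit applies to the encoded data you actually upload, one workaround after picking is to re-encode the UIImage as JPEG and check the byte count before sending, lowering the quality until it fits. A rough sketch; the threshold and quality steps are assumptions:

```swift
import UIKit

// Returns JPEG data for upload, stepping the quality down until the
// encoded size fits under the limit (or nil if it never fits).
func jpegDataWithinLimit(for image: UIImage,
                         maxBytes: Int = 5 * 1024 * 1024) -> Data? {
    var quality: CGFloat = 0.9
    while quality > 0.1 {
        if let data = image.jpegData(compressionQuality: quality),
           data.count <= maxBytes {
            return data
        }
        quality -= 0.1
    }
    return nil
}

// Usage, e.g. in imagePickerController(_:didFinishPickingMediaWithInfo:):
// if let image = info[.originalImage] as? UIImage,
//    let data = jpegDataWithinLimit(for: image) {
//     upload(data) // hypothetical upload helper
// }
```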

Memory issue in storing and retrieving Image from database

I am working on an application that includes a native SQLite database, in which I store and retrieve images and display them in my application.
My problem is that I store lots of images in a directory and keep their paths in the database. When I retrieve a path from the database and load the image into the application, memory increases by 10-20 MB per image.
I also tried storing the image data itself in the database, but it's the same issue: memory increases by 10-20 MB per image.
What should I do about this memory issue?
Images, when used in the app, may require considerably more memory than the size of the asset in persistent storage would suggest. Assets are frequently compressed (e.g. JPEG or PNG), but when you use the image it is uncompressed, often requiring 4 bytes per pixel (one byte each for red, green, blue, and alpha). So, for example, an iPhone 7 Plus full-screen retina image can require about 14 MB when you use it. The memory-efficient technique is therefore to employ lazy loading, not creating the UIImage objects until you absolutely need them. And, as Jerry suggested, because the amount of memory is determined by the size of the image and not of the image view in which you use it, if your images have dimensions greater than required by the UIImageView in which you use them (i.e. the width and height of the image view times the "scale" of the device), you may want to resize the image accordingly.
It may be the case that the images you are trying to display are much larger than they need to be on screen. Try loading the image into memory, creating a version at the size you need, and then using that; once the original image goes out of scope, its memory will be released. Of course, you could implement some caching so you don't have to keep resizing the same images. If you always need your images at the same size, resize them before storing them in the database; but if you want to support multiple sizes (for different devices, maybe), store the images at the largest size you need and resize for the others.
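For the resizing step, one memory-friendly option is to have ImageIO decode a downsampled version straight from the file on disk, so the full-resolution bitmap (roughly width × height × 4 bytes) never has to be held in memory. A sketch along those lines; the target size handling is illustrative:

```swift
import UIKit
import ImageIO

// Decodes a downsampled image directly from disk, so the full-resolution
// bitmap never has to live in memory.
func downsampledImage(at url: URL,
                      maxPointSize: CGFloat,
                      scale: CGFloat = UIScreen.main.scale) -> UIImage? {
    // Don't decode the full image up front.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPointSize * scale
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else {
        return nil
    }
    return UIImage(cgImage: cgImage, scale: scale, orientation: .up)
}
```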

Best practice for retrieving many images from Parse?

I have a photo-based app with a ton of large, high-resolution images on Parse. I currently have my app set up to grab these images from Parse with a query, store each photo in a UIImage array with a loop, and then display these photos in a UICollectionView.
It works great if we are pulling fewer than 10 photos from Parse. However, if I retrieve, say, 20 photos, then when I scroll down my UICollectionView after the photos have loaded, around the 18th or so photo my app crashes and Xcode's console outputs "Received memory warning".
What is the best practice for retrieving a large number of large photos from Parse, if you are displaying them in a UICollectionView?
I would download only the thumbnails. You can even do this using lazy loading (http://www.theappguruz.com/blog/ios-lazy-loading-images) if so inclined.
When the image is clicked on and you want to see it full size, then you download the full size image :)
Set a limit on how many images you want to retain in memory, and use a queue to decide when to store/get rid of the old images.
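A simple way to cap how many decoded thumbnails stay in memory is an NSCache with a countLimit, re-downloading or re-decoding on a miss. A minimal sketch; the limit and the key scheme are placeholders:

```swift
import UIKit

final class ThumbnailCache {
    private let cache = NSCache<NSString, UIImage>()

    init(countLimit: Int = 50) {
        // Keep at most `countLimit` decoded thumbnails in memory;
        // NSCache also evicts automatically under memory pressure.
        cache.countLimit = countLimit
    }

    func image(forKey key: String) -> UIImage? {
        cache.object(forKey: key as NSString)
    }

    func insert(_ image: UIImage, forKey key: String) {
        cache.setObject(image, forKey: key as NSString)
    }
}

// Usage in cellForItemAt: check the cache first; on a miss, download the
// thumbnail (e.g. from Parse) and insert it once it arrives.
```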

Camera App iOS memory usage

I'm making a camera app in iOS. Everything is fine except from the memory usage. This is the flow of the "takeImageTask":
AVCaptureStillImageOutput -> NSData -> UIImage (which I crop and do other changes to) -> save to ALAssetsLibrary
I need to keep the UIImage in an array because I need to have them in a UICollectionView, and I have the possibility to show the images in a "big size" view.
This flow uses a lot of memory, so I thought a better solution would be to use the asset URL I get back from writeImageToSavedPhotosAlbum (on ALAssetsLibrary) to fetch the images from the phone's storage. My app also has the ability to fetch images from the photo album using UIImagePickerController, and I've noticed that this uses only a fraction of the memory compared to the "takeImageTask". I first built the app as an Android version, where I just stored all the URLs and used them to present the images. Does anyone have any experience using asset URLs to present images in iOS?
I need to keep the UIImage in an array because I need to have them in a UICollectionView, and I have the possibility to show the images in a "big size" view.
This is flawed logic. You might want to keep a minimal size thumbnail image in memory for the collection view, but you should even drop these from memory before the count gets too big. And the big images should be loaded from disk on demand because it is relatively infrequent and relatively expensive (using the asset URL is a good plan for this).
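Loading the big image from disk on demand can be as simple as decoding on a background queue and handing the result back to the main thread once it is ready. A sketch, assuming the full-sized files live in the app's sandbox:

```swift
import UIKit

// Decode the full-sized image off the main thread, then display it.
// UIImage(contentsOfFile:) bypasses the shared image cache, so the decoded
// bitmap is released once nothing references it any more.
func showFullImage(from fileURL: URL, in imageView: UIImageView) {
    DispatchQueue.global(qos: .userInitiated).async {
        let image = UIImage(contentsOfFile: fileURL.path)
        DispatchQueue.main.async {
            imageView.image = image
        }
    }
}
```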

Display large images on iOS without precut tiles

I'm building a camera application that saves the image data to a single JPEG file in the sandbox. The images average at about 2mb in size.
Problem: I cannot display the images in a photo viewer, because having just a few images in memory throws memory warnings and makes scrolling through the images very slow.
I cannot split the image into tiles and save them to disk, because that's even more expensive than displaying the single image. I tried splitting the image into tiles upon capture, but on the 5S it took, on average, 5.5 seconds to save all the tiles to disk, and it will only get worse as the app runs on older devices. This is very bad: what if the user exits the app in the middle of the save? I would have missing tiles and no uncompressed original file from which to regenerate them later.
Question: what's the best way to show a full-sized image without causing memory issues while keeping the scrolling fast? Tons of camera applications on the App Store do this, and the Photos app does this, so there has to be a good solution.
Note: I'm currently showing a thumbnail of the image and then loading the full-size image from disk on another thread. Once the full-size image has loaded, I present it on the main thread. This removes the memory issues, because I only have one full-size image in memory at once (plus two thumbnails), but it still causes lag in the scroll view because drawing the full-size image on the main thread is still pretty expensive.
I would greatly appreciate any input!
You could create a downsized thumbnail: generate a smaller version of each image, save it in a separate "sandbox" folder, and read that while browsing. Then load the full image only when the user wants to look at it at full size.
One way to deal with this is to tile the image.
You can save the large decompressed image to "disk" as a series of tiles, and as the user pans around pull out only the tiles you need to actually display. You only ever need 1 tile in memory at a time because you draw it to the screen, then throw it out and load the next tile. (You'll probably want to cache the visible tiles in memory, but that's an implementation detail. Even having the whole image as tiles may relieve memory pressure as you don't need one large contiguous block.)
This is how applications like Photoshop deal with this situation.
A second way I would suggest is to check Apple's example for processing large images, PhotoScroller. Its images have already been tiled. If you need an example of tiling an image in Cocoa, check out cimgf.com.
Hope this helps.
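If you do go the tiling route, CATiledLayer does most of the work of requesting tiles as they scroll into view and discarding them again, which is essentially what PhotoScroller demonstrates. A bare-bones sketch, assuming tiles have already been written to disk under a naming scheme like tile_<col>_<row>.jpg (the scheme, and the omission of zoom levels and screen-scale handling, are simplifications):

```swift
import UIKit

// A view backed by CATiledLayer: UIKit asks draw(_:) for one tile-sized rect
// at a time and discards tiles that scroll offscreen, so the whole image is
// never decoded into memory at once.
final class TiledImageView: UIView {
    override class var layerClass: AnyClass { CATiledLayer.self }

    /// Directory containing pre-cut tiles named "tile_<col>_<row>.jpg" (assumed scheme).
    var tileDirectory: URL?
    var tileSize = CGSize(width: 256, height: 256) {
        didSet { (layer as! CATiledLayer).tileSize = tileSize }
    }

    override func draw(_ rect: CGRect) {
        guard let directory = tileDirectory else { return }
        // Work out which tile covers this rect (zoom levels and screen scale
        // are ignored here for brevity).
        let col = Int(rect.minX / tileSize.width)
        let row = Int(rect.minY / tileSize.height)
        let tileURL = directory.appendingPathComponent("tile_\(col)_\(row).jpg")
        // CATiledLayer calls draw(_:) on background threads, so only do
        // thread-safe work here; drawing a UIImage into the current context is fine.
        UIImage(contentsOfFile: tileURL.path)?.draw(in: rect)
    }
}
```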

Resources