As you can imagine, these keywords (download image uitableviewcell) return results about "lazy loading", which is not what I want to know.
What I want to know is whether there is a way to download the images before the cell is displayed.
It would be great if we could download the images for the visible cells plus 5 or 6 in advance, and keep downloading 5 or 6 images ahead while the table scrolls.
Of course all the links are stored in one array, so it's easy to get them in advance.
I'm using AFNetworking for all the network operations.
Thanks
Naive solution
If your images have a caching policy, then you can prefetch as many as you like, and the next time you actually load an image in a cell it will load faster (from the cache). This is a bit naive, since it rests on the premise that the server sends responses the client can cache to reduce the number of network operations, which is not always the case.
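For illustration, here is a rough sketch of that idea in Swift; it uses URLSession and the shared URLCache instead of AFNetworking, so treat the specific calls as an assumption rather than the original setup:

```swift
import Foundation

// Warm the shared URL cache by requesting upcoming image URLs ahead of time.
// This only helps if the server sends cacheable responses (Cache-Control / ETag).
func warmCache(with urls: [URL]) {
    for url in urls {
        let request = URLRequest(url: url, cachePolicy: .returnCacheDataElseLoad)
        URLSession.shared.dataTask(with: request) { _, _, _ in
            // Nothing to do here: if the response was cacheable it is now in
            // URLCache.shared, so the real load in cellForRowAt hits the cache.
        }.resume()
    }
}
```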
Better Solution
You can also implement your own data structure (for example, extend NSMutableDictionary) to store and hand out the images, and download them whenever you want. For example, when you load the table view you can start downloading the images that are going to be displayed, plus a few more for the next cells. However, as @picciano said, the scrolling of a table view can be pretty quick, so prefetching can easily end up being a waste of your time.
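A minimal sketch of such a structure in Swift, assuming an NSCache-backed store and plain URLSession downloads in place of the NSMutableDictionary subclass and AFNetworking; the ImagePrefetcher name is made up for illustration:

```swift
import UIKit

// Hypothetical prefetching image store: keeps downloaded images in memory
// and lets the table view ask for the next few rows ahead of time.
final class ImagePrefetcher {
    static let shared = ImagePrefetcher()
    private let cache = NSCache<NSURL, UIImage>()
    private var inFlight = Set<URL>()

    func image(for url: URL) -> UIImage? {
        return cache.object(forKey: url as NSURL)
    }

    // Call from the main thread (cellForRowAt / scrollViewDidScroll) with the
    // URLs of the visible rows plus the next 5-6 rows.
    func prefetch(_ urls: [URL]) {
        for url in urls where image(for: url) == nil && !inFlight.contains(url) {
            inFlight.insert(url)
            URLSession.shared.dataTask(with: url) { [weak self] data, _, _ in
                guard let self = self else { return }
                if let data = data, let image = UIImage(data: data) {
                    self.cache.setObject(image, forKey: url as NSURL)
                }
                // Bookkeeping back on the main thread, matching the insert above.
                DispatchQueue.main.async { self.inFlight.remove(url) }
            }.resume()
        }
    }
}
```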
Comment
This kind of prefetching makes sense only when the user is connected to a high-speed network. If the user is on a slow network, you will create a big batch of network operations that take a long time to complete (and are not very battery efficient).
Related
I have a collection view that displays 3 images, two labels, and 1 attributed string (the strings have different colors and font sizes, and the values are not unique for every cell). One of the images comes from the web, and I used AFNetworking to do the downloading and caching. The collection view displays 15 cells simultaneously.
When I scroll I can only achieve 25 frames/sec.
Below are the things I did:
- Data processing was done ahead of time and cached in objects
- Images and views are opaque
- Cells and views are reused
I have done all the optimizations I know of, but I can't achieve at least 55 frames/sec.
It would help if you could share other techniques to speed up the reuse of cells.
I was even thinking of pre-rendering the subviews off screen and caching them somewhere, but I am not sure how that is done.
When I run the app on the iPhone it is fast, since it only shows about four cells at a time.
The first thing you need to do is fire up Instruments and find out whether you're CPU-bound (computation or regular I/O is the bottleneck) or GPU-bound (the graphics card is struggling). Depending on which of these is the issue, the solution varies.
Here's a video tutorial that shows how to do this (among other things). This one is from Sean Woodhouse at Itty Bitty Apps (they make the fine Reveal tool).
NB: In the context of performance tuning we usually talk about being I/O-bound or CPU-bound as separate concerns; here I've grouped them together to mean "due to either slow computation or slow I/O, data is not getting to the graphics card fast enough". If this is indeed the problem, the next step is to find out whether you are really waiting on I/O or the CPU is maxed out.
Instruments can be really confusing at first, but the above videos helped me harness its power.
And here's another great tutorial from Anthony Egerton.
What is the size of the image that you use?
One optimization technique that would work is to resize the image so that it matches the size of the view you are displaying it in.
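For example, a small sketch in Swift of redrawing an image at the display size before handing it to the cell; UIGraphicsImageRenderer is just one possible way to do it, not something this answer prescribes:

```swift
import UIKit

// Redraw a large UIImage at the size of the view that will display it,
// so the GPU doesn't have to scale a huge bitmap on every frame.
func resizedImage(_ image: UIImage, to targetSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}

// Usage: resize on a background queue, then assign to the image view on the main thread.
// let thumb = resizedImage(bigImage, to: imageView.bounds.size)
```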
I'm working on an iPad-only iOS app that essentially downloads large, high quality images (JPEG) from Dropbox and shows the selected image in a UIScrollView and UIImageView, allowing the user to zoom and pan the image.
The app is mainly used for showing the images to potential clients who are interested in buying them as framed prints. The way it works is that the image is first shown, zoomed and panned to show the potential client if they like the image. If they do like it, they can decide if they want to crop a specific area (while keeping to specific aspect ratios/sizes) and the final image (cropped or not) is then sent as an email attachment to production.
The problem I've been facing for a while now is that even though the app will only be running on new iPads (i.e. more memory, etc.), I'm unable to find a way of handling the images so that the app doesn't get a memory warning and then crash.
Most of the images are sized 4256x2832, which brings the memory usage to at least 40MB per image. While I'm only displaying one image at a time, image cropping (which is the main memory/crash problem at the moment) is creating a new cropped image, which in turn momentarily bumps the apps total RAM usage to about 120MB, causing a crash.
So in short: I'm looking for a way to manage very large images, have the ability to crop them and after cropping still have enough memory to send them as email attachments.
I've been thinking about implementing a singleton image manager, which all the views would use and it would only contain one big image at a time, but I'm not sure if that's the right way to go, or even if it'd help in any way.
One way to deal with this is to tile the image. You can save the large decompressed image to "disk" as a series of tiles, and as the user pans around pull out only the tiles you need to actually display. You only ever need 1 tile in memory at a time because you draw it to the screen, then throw it out and load the next tile. (You'll probably want to cache the visible tiles in memory, but that's an implementation detail. Even having the whole image as tiles may relieve memory pressure as you don't need one large contiguous block.) This is how applications like Photoshop deal with this situation.
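Here is a rough sketch in Swift of the slicing step, under the assumption that tiles are written out as JPEG files; how you load and draw them afterwards (CATiledLayer or your own scroll-view logic) is up to you:

```swift
import UIKit

// Slice a large image into fixed-size tiles and write each tile to disk.
// A tile-drawing view can then load only the tiles intersecting the visible rect.
func writeTiles(of image: UIImage, tileSize: CGFloat, to directory: URL) throws {
    guard let cgImage = image.cgImage else { return }
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cols = Int(ceil(width / tileSize))
    let rows = Int(ceil(height / tileSize))

    for row in 0..<rows {
        for col in 0..<cols {
            // Clamp the tile rect to the image bounds for the last row/column.
            let rect = CGRect(x: CGFloat(col) * tileSize,
                              y: CGFloat(row) * tileSize,
                              width: tileSize,
                              height: tileSize)
                .intersection(CGRect(x: 0, y: 0, width: width, height: height))
            guard let tile = cgImage.cropping(to: rect),
                  let data = UIImage(cgImage: tile).jpegData(compressionQuality: 0.9)
            else { continue }
            let url = directory.appendingPathComponent("tile_\(col)_\(row).jpg")
            try data.write(to: url)
        }
    }
}
```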
I ended up more or less solving the problem. Since I couldn't resize the original files in Dropbox (the client has their reasons), I went ahead and used BOSImageResizeOperation, which is essentially a fast, thread-safe library for resizing images.
Using this library, I noticed that images that previously took 40-60 MB of memory each now seemed to take roughly half that. Additionally, the resizing is so quick that the original image is released from memory fast enough that iOS doesn't issue a memory warning.
With this, I've gotten further with the app and I appreciate all the ideas, suggestions and comments. I'm hoping this will get the app done and I can get as far away from large image handling as possible, heh.
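For anyone who can't pull in a third-party resizer, a similar effect can be had with ImageIO's thumbnail support, which never decodes the full 4256x2832 bitmap into memory. This is a hedged sketch of that technique, not a claim about what BOSImageResizeOperation does internally:

```swift
import ImageIO
import UIKit

// Downsample a huge image file without decompressing the full bitmap,
// using ImageIO's built-in thumbnail generation.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }

    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,   // respect EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary

    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}
```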
When you apply a number of Core Image filters to an image, memory can quickly become a limiting factor (and often leads to a crash of the application). I was therefore wondering what a good approach is to add one filter at a time and wait for each operation to complete.
The example I am working on involves one photo to which the user can apply various effects/filters. The user is presented with a small thumbnail to get an idea of what each filter looks like. When all the filters are applied at once, the application runs out of its assigned amount of memory and crashes.
In short, how do I go about applying one filter at a time and getting notified when the operation is complete, so that I can apply the next filter to the next thumbnail?
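One possible approach (an assumption on my part, not code from the question) is to render each filtered thumbnail on a serial queue, so only one filter is working at a time, and call a completion block when it finishes:

```swift
import UIKit
import CoreImage

// A serial queue guarantees only one filter renders at a time, and a single
// reused CIContext avoids the cost of creating one per filter.
let filterQueue = DispatchQueue(label: "filter.thumbnails")
let ciContext = CIContext()

func renderThumbnail(from input: CIImage,
                     filterName: String,
                     completion: @escaping (UIImage?) -> Void) {
    filterQueue.async {
        let filter = CIFilter(name: filterName)
        filter?.setValue(input, forKey: kCIInputImageKey)

        var result: UIImage?
        if let output = filter?.outputImage,
           let cgImage = ciContext.createCGImage(output, from: output.extent) {
            result = UIImage(cgImage: cgImage)
        }
        // Hand the finished thumbnail back on the main thread; kick off the
        // next filter from there if you want strict one-after-the-other order.
        DispatchQueue.main.async { completion(result) }
    }
}
```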
I'm trying to code the Apple Photos.app for iOS myself.
Everything works, but when I select an album to see all my pictures, it's a bit slow to load them. All my thumbnails are saved in a database, which I manage with Core Data.
So when I select an album, I run a specific fetch request and add all my thumbnails to a scroll view. But I have to wait before I see all my thumbnails.
In Photos.app, when I select an album, all the pictures are loaded immediately.
How has Apple achieved that?
Thanks a lot!
I am assuming that you are using UITableViewCells with 4 images in each, and that you are recycling the cells.
JPEGs take a lot of decompression CPU cycles. There was a specific mention of this at Apple's recent developer conference. Most likely this is what is slowing you down.
Solution: use PNGs and make sure they are absolutely optimized for the minimum size and resolution required for the thumbnail images. Core Data should be fast enough to provide smooth scrolling for thousands of thumbnail images.
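As a rough sketch of preparing such a thumbnail in Swift (the display size and the idea of storing the PNG data in a Core Data binary attribute are assumptions for illustration):

```swift
import UIKit

// Render a thumbnail at exactly the size it will be displayed, then encode it
// as PNG data, ready to be stored in a Core Data binary attribute.
func thumbnailPNGData(from original: UIImage, displaySize: CGSize, screenScale: CGFloat) -> Data? {
    let format = UIGraphicsImageRendererFormat()
    format.scale = screenScale   // match the screen scale so nothing is rescaled at draw time
    let renderer = UIGraphicsImageRenderer(size: displaySize, format: format)
    let thumbnail = renderer.image { _ in
        original.draw(in: CGRect(origin: .zero, size: displaySize))
    }
    return thumbnail.pngData()
}
```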
When developing a mobile app that lets the user take photos (which will later also be shown in full size) that are also viewed in table views (mid size) and even in the Google Maps pin title view, should I create thumbnails for every image the user takes for the smaller views, or should I just use the regular image?
I am asking because, from the tutorials I saw and from my experience as a web developer, all I could figure out is that when using a web service to get groups of small images, you usually get the thumbnails first and only fetch the full-size image when needed.
But this is an "embedded" app (I know it is not really embedded, but I don't have a better way to describe it) where all the data sits on the device, so there are no upload performance issues, just memory and processor-time issues (loading the big HD photos that today's cameras take just to view them is very heavy, I think).
Anyway, what is the best practice for this?
Thank you,
Erez
It's all about memory usage balanced with performance. If you don't create thumbnails for each photo, there are only so many photos you can hold in memory before you receive memory warnings or have your app terminated by the system (maybe only 6-8 full-size UIImages). To avoid that, you might write the photos out to the file system and keep a reference to their location. But then your table view scrolling will suffer as it attempts to read photos from the file system for display.
So the solution is to create thumbnails for each photo so that you can store a lot of them in memory without any troubles. Your tableview will perform well as the photos are quickly accessible from memory. You'll also want to write the full size photos to the file system (and keep a reference to their location) to avoid having to store them in memory. When it's time to display the full size image, retrieve it from the file system and store it in memory. When it's no longer needed, release it.
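A minimal sketch of that arrangement in Swift; the PhotoStore name and the JPEG-on-disk layout are made up for illustration:

```swift
import UIKit

// Hypothetical photo store: full-size photos live on disk, only small
// thumbnails stay in memory for the table view.
final class PhotoStore {
    private let thumbnails = NSCache<NSString, UIImage>()
    private let directory = FileManager.default.urls(for: .documentDirectory,
                                                     in: .userDomainMask)[0]

    // Save the full-size photo to disk and keep a small thumbnail in memory.
    func add(_ photo: UIImage, thumbnail: UIImage, id: String) throws {
        if let data = photo.jpegData(compressionQuality: 0.9) {
            try data.write(to: directory.appendingPathComponent("\(id).jpg"))
        }
        thumbnails.setObject(thumbnail, forKey: id as NSString)
    }

    // Fast path for table view cells.
    func thumbnail(for id: String) -> UIImage? {
        return thumbnails.object(forKey: id as NSString)
    }

    // Load the full-size image only when it is actually needed; let it go out
    // of scope as soon as the detail view is dismissed so the memory is freed.
    func fullSizeImage(for id: String) -> UIImage? {
        return UIImage(contentsOfFile: directory.appendingPathComponent("\(id).jpg").path)
    }
}
```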
I'm assuming that you're on iOS 4 and that you are saving the photos in the asset library; in that case there is already a method for you.
http://developer.apple.com/library/ios/#documentation/AssetsLibrary/Reference/ALAsset_Class/Reference/Reference.html
You're looking for the "thumbnail" method.
So, save the large image and compute the thumbnail when required; that, I believe, is the way to go.
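For illustration, a hedged Swift sketch of pulling that thumbnail out of the (now deprecated) AssetsLibrary framework; the signatures are shown as they bridge to Swift, so double-check them against the SDK you are using:

```swift
import AssetsLibrary
import UIKit

// Hypothetical helper: fetch the pre-built thumbnail for a saved photo
// identified by its asset URL. AssetsLibrary is deprecated in favor of the
// Photos framework, but it is what this answer refers to.
func loadThumbnail(for assetURL: URL, completion: @escaping (UIImage?) -> Void) {
    let library = ALAssetsLibrary()
    library.asset(for: assetURL, resultBlock: { asset in
        // thumbnail() returns an unmanaged CGImage; the asset keeps ownership.
        if let cgThumb = asset?.thumbnail()?.takeUnretainedValue() {
            completion(UIImage(cgImage: cgThumb))
        } else {
            completion(nil)
        }
    }, failureBlock: { _ in
        completion(nil)
    })
}
```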