I want to display a high-resolution image in a UIImageView. Here is what I do:
1. Prepare the UIImage in the background.
2. When the file is loaded, switch to the main thread.
3. Display the image (using UIImageView's setImage:).
The problem is there is a lag in the 3rd step: it takes a few seconds to load the file, and if it's a large image, I get a memory warning and a crash. So is there a way I can draw the image from top to bottom (the way a browser does)? I also need to preserve the quality of the image.
Is there any way to achieve this? Thanks in advance.
And if it's a large image, I get a memory warning and a crash.
It seems you are hitting a hard (memory-related) limit of your device by trying to load and decode the entire image into memory at once.
So is there a way I can draw the image from top to bottom (the way a browser does)?
You should try CATiledLayer, which provides a way to draw very large images without incurring a big memory hit.
Here you can find a tutorial about it.
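As a rough starting point, here is a minimal sketch of a CATiledLayer-backed view (the class name and the loadTile(for:) helper are placeholders; in a real app the tiles would come from pre-cut files or a tiling library):

```swift
import UIKit

// Minimal sketch: a view backed by CATiledLayer. UIKit calls draw(_:) once per
// visible tile, so only the tiles currently on screen ever get decoded.
class TiledImageView: UIView {
    override class var layerClass: AnyClass { CATiledLayer.self }

    override init(frame: CGRect) {
        super.init(frame: frame)
        let tiledLayer = layer as! CATiledLayer
        tiledLayer.tileSize = CGSize(width: 256, height: 256)
        tiledLayer.levelsOfDetail = 4        // extra detail levels when zoomed out
        tiledLayer.levelsOfDetailBias = 2    // extra detail levels when zoomed in
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func draw(_ rect: CGRect) {
        // `rect` covers a single tile; load and draw only that piece of the image.
        loadTile(for: rect)?.draw(in: rect)
    }

    private func loadTile(for rect: CGRect) -> UIImage? {
        // Placeholder: e.g. read a pre-cut tile file from disk keyed by the rect's origin.
        return nil
    }
}
```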
You could also take a look at the PhotoScrollerNetwork project:
blazingly fast tile rendering - visually much much faster than Apple's code (which uses png files in the file system)
you supply a single jpeg file or URL and this code does all the tiling for you, quickly and painlessly
UIImage will not decode its underlying image data until it is actually requested, which will happen on the main thread when you assign it to an image view. So although you are trying to load the image in a background thread, the work is being delayed and actually occurs on the main thread.
The workarounds to this issue are pretty hacky and usually involve writing the image to a new graphics context to force the decoding, and then creating a CGImageRef or UIImage from that. This all takes place on the background thread. By the time you ship the new image to the main thread, all the decoding has already taken place and you shouldn't see a delay. This question has some answers which demonstrate this technique.
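A minimal sketch of that force-decode workaround, assuming the image comes from a file (`path` and `imageView` are placeholders):

```swift
import UIKit

// Draw the image into an offscreen context so the expensive JPEG/PNG decode
// happens here, on a background queue, instead of on the main thread.
func forceDecoded(_ image: UIImage) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(at: .zero)
    return UIGraphicsGetImageFromCurrentImageContext() ?? image
}

// Usage: decode off the main thread, then hand the ready-to-display image over.
DispatchQueue.global(qos: .userInitiated).async {
    guard let image = UIImage(contentsOfFile: path) else { return }  // `path` is a placeholder
    let decoded = forceDecoded(image)
    DispatchQueue.main.async {
        imageView.image = decoded  // `imageView` is a placeholder UIImageView
    }
}
```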
Drawing images from top-to-bottom is not possible with the standard APIs provided by Apple (as far as I know). You would have to write your own streaming image decoder which would be a significant task.
You can use the SDWebImage framework, which gives you:
1. An asynchronous image downloader
2. Automatic image caching
3. Avoidance of duplicate downloads
You can download it from: https://github.com/rs/SDWebImage
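A minimal usage sketch (method names vary between SDWebImage versions; `imageView` and the URL are placeholders):

```swift
import SDWebImage
import UIKit

// SDWebImage downloads on a background queue and caches the result in memory
// and on disk; repeated requests for the same URL are not downloaded twice.
let url = URL(string: "https://example.com/large-photo.jpg")
imageView.sd_setImage(with: url, placeholderImage: UIImage(named: "placeholder"))
```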
As for the memory warning, it is not caused only by your application; it reflects the memory used by all of the applications running in the background, so try closing background applications.
Related
I have an animation which consists of 250 frames. Each frame is 1080x1920 resolution and in PNG format. I need to take all these frames, animate them with CAKeyframeAnimation and render them into a video using AVFoundation tools.
The issue arises when I try to create a values array for the CAKeyframeAnimation. Initializing 250 Full HD images causes quite a big memory spike, which the system detects, and it kills the app.
I've tried playing with autoreleasepool, but it doesn't seem to help at all (no wonder, because CAKeyframeAnimation needs to hold on to those images).
The problem goes even further: I actually have 10 such animations! And I need to switch between them quickly, because the app lets the user preview those animations dynamically (via AVSynchronizedLayer) before exporting everything to a video file.
So my question is: how can I load a big array of UIImage instances without the system killing the app because of a memory surge? Is there an asynchronous way of loading them?
how can I load a big array of UIImage instances without the system killing the app because of a memory surge?
You can't. What you're trying to do is wrong from the outset.
If the goal is to make a "sprite animation" by using keyframe animation to change a layer's content periodically, the animation should consist of a small number of small images.
If the goal is to make a video based on a succession of images, a keyframe animation is not how to do it in the first place.
I'm looking for a solution for animating about 50 images on a retina iPad, each 2048x1536 in size. I want to animate them on finger movement (changing the image in a UIImageView in sync with the touchesMoved event). The images load slowly and the animation freezes. I'm looking for any solution to this problem. Thanks.
There are a couple of issues that make this situation very hard to deal with. First, the memory usage of 50 full-screen images is very large. For some background on how much memory that actually requires, see this blog post Video and Memory usage on iOS devices. The second issue is CPU usage: a retina iPad has multiple CPU cores, but decoding huge PNG images still takes a lot of CPU cycles, and that will prevent the animations from running smoothly. So the only way you will get this to work well is to avoid decoding the image data at runtime, while also avoiding holding all of the decoded data in memory, because that would crash the device. The best solution is to decode all the data ahead of time and simply mmap() it, which makes it possible to blit image data into Core Graphics without actually having to copy the data. If you would like to use my library that does all that, it is linked at the bottom of the blog post.
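A hedged sketch of the mmap idea, assuming each frame has already been decoded once into a raw 32-bit BGRA file (the file layout is an assumption; the library linked in the blog post does this properly):

```swift
import CoreGraphics
import Foundation

// Wrap a memory-mapped file of pre-decoded BGRA pixels in a CGImage.
// Pages are faulted in on demand, so nothing is copied up front.
func mappedImage(fromRawBGRAFile url: URL, width: Int, height: Int) -> CGImage? {
    guard let data = try? Data(contentsOf: url, options: .alwaysMapped),
          let provider = CGDataProvider(data: data as CFData) else { return nil }
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                            CGBitmapInfo.byteOrder32Little.rawValue)
    return CGImage(width: width,
                   height: height,
                   bitsPerComponent: 8,
                   bitsPerPixel: 32,
                   bytesPerRow: width * 4,
                   space: CGColorSpaceCreateDeviceRGB(),
                   bitmapInfo: bitmapInfo,
                   provider: provider,
                   decode: nil,
                   shouldInterpolate: false,
                   intent: .defaultIntent)
}
```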
I'm working on an iPad-only iOS app that essentially downloads large, high quality images (JPEG) from Dropbox and shows the selected image in a UIScrollView and UIImageView, allowing the user to zoom and pan the image.
The app is mainly used for showing the images to potential clients who are interested in buying them as framed prints. The image is first shown, zoomed and panned to see whether the potential client likes it. If they do, they can decide whether they want to crop a specific area (while keeping to specific aspect ratios/sizes), and the final image (cropped or not) is then sent as an email attachment to production.
The problem I've been facing for a while now is that even though the app will only be running on new iPads (i.e. with more memory, etc.), I'm unable to find a way of handling the images so that the app doesn't get a memory warning and then crash.
Most of the images are 4256x2832, which brings the memory usage to at least 40MB per image. While I'm only displaying one image at a time, cropping (which is the main memory/crash problem at the moment) creates a new cropped image, which momentarily bumps the app's total RAM usage to about 120MB and causes a crash.
So in short: I'm looking for a way to manage very large images, have the ability to crop them and after cropping still have enough memory to send them as email attachments.
I've been thinking about implementing a singleton image manager, which all the views would use and which would only hold one big image at a time, but I'm not sure if that's the right way to go, or even if it'd help in any way.
One way to deal with this is to tile the image. You can save the large decompressed image to "disk" as a series of tiles, and as the user pans around pull out only the tiles you need to actually display. You only ever need 1 tile in memory at a time because you draw it to the screen, then throw it out and load the next tile. (You'll probably want to cache the visible tiles in memory, but that's an implementation detail. Even having the whole image as tiles may relieve memory pressure as you don't need one large contiguous block.) This is how applications like Photoshop deal with this situation.
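A rough sketch of the tile-writing step (the 512-pixel tile size, file names and output directory are arbitrary choices):

```swift
import UIKit

// Slice a large CGImage into fixed-size tiles and write each one to disk, so
// that later only the tiles that are actually visible need to be read back.
func writeTiles(of image: CGImage, to directory: URL, tileSize: Int = 512) throws {
    for y in stride(from: 0, to: image.height, by: tileSize) {
        for x in stride(from: 0, to: image.width, by: tileSize) {
            let rect = CGRect(x: x, y: y,
                              width: min(tileSize, image.width - x),
                              height: min(tileSize, image.height - y))
            guard let tile = image.cropping(to: rect) else { continue }
            let url = directory.appendingPathComponent("tile_\(x)_\(y).png")
            try UIImage(cgImage: tile).pngData()?.write(to: url)
        }
    }
}
```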
I ended up sort of solving the problem. Since I couldn't resize the original files in Dropbox (the client has their reasons), I went ahead and used BOSImageResizeOperation, which is essentially just a fast, thread-safe library for quickly resizing images.
Using this library, I noticed that images that previously took 40-60MB of memory each now only seem to take roughly half that. Additionally, the resizing is so quick that the original image gets released from memory fast enough that iOS doesn't issue a memory warning.
With this I've gotten further with the app, and I appreciate all the ideas, suggestions and comments. I'm hoping this will get the app done and I can get as far away from large image handling as possible, heh.
I want to allow the user to select a photo, without limiting the size, and then edit it.
My idea is to create a thumbnail of the large photo with the same size as the screen for editing, and then, when the editing is finished, use the large photo to make the same edit that was performed on the thumbnail.
When I use UIGraphicsBeginImageContext to create a thumbnail image, it causes a memory issue.
I know it's hard to edit the whole large image directly due to hardware limits, so I want to know if there is a way I can downsample the large image to less than 2048x2048 without memory issues.
I found that on Android there is a BitmapFactory class with an inSampleSize option that can downsample a photo. How can this be done on iOS?
You need to load the image with UIImage, which doesn't immediately decode the image data into memory, and then create a bitmap context at the size of the result you want (this determines the amount of memory used). Then iterate a number of times, extracting tiles from the original image with CGImageCreateWithImageInRect (this is where parts of the image data are loaded into memory) and drawing them into the destination context with CGContextDrawImage.
See this sample code from Apple.
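A hedged sketch of that strip-by-strip approach (the strip height and target size are arbitrary; how much of the source stays out of memory depends on how the source UIImage was created, as described above):

```swift
import UIKit

// Draw the huge source image into a smaller destination context one horizontal
// strip at a time, so only one strip's worth of source pixels is needed at once.
func downsample(_ source: UIImage, maxDimension: CGFloat) -> UIImage? {
    guard let cgImage = source.cgImage else { return nil }
    let sourceSize = CGSize(width: CGFloat(cgImage.width), height: CGFloat(cgImage.height))
    let scale = min(maxDimension / sourceSize.width, maxDimension / sourceSize.height, 1)
    let destSize = CGSize(width: sourceSize.width * scale, height: sourceSize.height * scale)

    UIGraphicsBeginImageContextWithOptions(destSize, true, 1)
    defer { UIGraphicsEndImageContext() }

    let stripHeight: CGFloat = 512
    var y: CGFloat = 0
    while y < sourceSize.height {
        let sourceRect = CGRect(x: 0, y: y,
                                width: sourceSize.width,
                                height: min(stripHeight, sourceSize.height - y))
        // CGImageCreateWithImageInRect (cropping(to:) in Swift) pulls in only this strip.
        guard let strip = cgImage.cropping(to: sourceRect) else { break }
        let destRect = CGRect(x: 0, y: y * scale,
                              width: destSize.width,
                              height: sourceRect.height * scale)
        UIImage(cgImage: strip).draw(in: destRect)
        y += stripHeight
    }
    return UIGraphicsGetImageFromCurrentImageContext()
}
```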
Large images don't fit in memory. So loading them into memory to then resize them doesn't work.
To work with very large images you have to tile them. There are lots of solutions out there already; for example, see if this can solve your problem:
https://github.com/dhoerl/PhotoScrollerNetwork
I implemented my own custom solution, but that was specific to our environment, where we had an image tiler running server-side already and I could just request specific tiles of large images (made a server, it's really cool).
The reason tiling works is that you basically only ever keep the visible pixels in memory, and there aren't that many of those. All tiles not currently visible are factored out to the disk cache, or flash-memory cache as it were.
Take a look at this work by Trevor Harmon. It improved my app's performance. I believe it will work for you too.
https://github.com/coryalder/UIImage_Resize
I am not quite sure whether it is beneficial to draw the visual elements of my app with Core Graphics instead of providing the images. In terms of memory preservation and runtime speed, which way is better?
In terms of memory preservation and runtime speed, which way is better?
+[UIImage imageNamed:] is the most efficient. It caches images, i.e. only one copy of an image is in memory, and the image is decoded (from its PNG, JPEG, TIFF, etc. data) when it is needed and kept around for future reuse. If you are worried about memory use, iOS will purge the UIImage cache if you are running low or when you go into the background.
Using Core Graphics to draw an image does not do any caching for you, unless you write the code to draw your image into a context, save the context as a bitmap, cache the bitmap and then reuse it later on. So you end up drawing the same thing over and over every time it is needed. For example, if you override UIView's -drawRect: to draw imagery, then during animations it will be called for every single frame (60 times a second). This needlessly burns CPU cycles and battery life.
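If you do draw with Core Graphics, a minimal sketch of the render-once-and-cache idea (the drawing itself is just an example; `badgeImageView` is a placeholder):

```swift
import UIKit

// Render a Core Graphics drawing once into a UIImage, then reuse that image
// instead of redrawing in draw(_:) every frame.
func makeBadgeImage(size: CGSize) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    defer { UIGraphicsEndImageContext() }
    UIColor.red.setFill()
    UIBezierPath(ovalIn: CGRect(origin: .zero, size: size)).fill()
    return UIGraphicsGetImageFromCurrentImageContext() ?? UIImage()
}

// Cache the result yourself; unlike imageNamed:, there is no built-in cache here.
let badge = makeBadgeImage(size: CGSize(width: 44, height: 44))
badgeImageView.image = badge
```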
Bottom line is it depends on what your app is and does.
If you don't need your images to change or animate much, then you should just use an image directly. Don't worry so much about performance unless you have something like 100 images in a single view controller.
If the iPhone can handle games like Need for Speed, running an app with various images is an easy task.
Hope this helps.