App crashed when displaying a large image in a UIImageView - ios

I set an image of 10000 × 10000 pixels on a UIImageView, loaded from the network with SDWebImage, and the app crashed because it allocated too much memory. I tried to resize the image after SDWebImage loaded it, so I added the code below:
UIGraphicsBeginImageContext(size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
// CGContextSetInterpolationQuality takes a CGInterpolationQuality enum, not a float
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
CGRect drawRect = CGRectMake(0, 0, size.width, size.height);
[self drawInRect:drawRect blendMode:kCGBlendModeNormal alpha:1]; // 'self' is the source UIImage (category method)
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Although the resulting image was smaller, my app crashed for the same reason.
It seems there is some rendering during the resize: memory rose to around 600 MB and then fell back to 87 MB shortly afterwards.
How can I resize an image without rendering it?
Displaying the image in a UIWebView does not seem to have this problem. How does that work?
Any help and suggestions would be highly appreciated.
Resolution:
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/
The resolution works for JPEG but not for PNG.

You can't unpack the image into memory because it's too big. This is what image tiling is for: you download a set of tiles for the image based on the part you're currently looking at (the zoom position and scale).
That is, if you're looking at the whole image you get one tile, which is zoomed out and therefore low quality and small in size. As you zoom in, you get back other small images which show less of the image's area.
The web view is likely using the image format to produce a relatively small image that is a scaled-down version of the whole thing, so it doesn't need to unpack the full image into memory. It can do this because it knows your image is 10,000x10,000 but is going to be displayed on the page at, for example, 300x300.
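If the goal is simply to downsize without ever decoding the full bitmap, ImageIO can build a thumbnail straight from the encoded data (a standard approach, not necessarily what the web view does). A minimal sketch, assuming cachedImagePath points at the downloaded file and capping the longest side at 1024 px:
#import <ImageIO/ImageIO.h>
// Build an image source from the file without decoding the full bitmap.
NSURL *fileURL = [NSURL fileURLWithPath:cachedImagePath]; // cachedImagePath: assumed local path to the downloaded image
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
NSDictionary *opts = @{
    (__bridge id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
    (__bridge id)kCGImageSourceCreateThumbnailWithTransform : @YES,
    (__bridge id)kCGImageSourceThumbnailMaxPixelSize : @1024 // longest side of the result
};
// Decodes only enough of the source to produce a <=1024 px thumbnail.
CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(src, 0, (__bridge CFDictionaryRef)opts);
UIImage *smallImage = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb);
CFRelease(src);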

Did you try UIImageJPEGRepresentation (or UIImagePNGRepresentation)?
You can make your image's data smaller with it.
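For example (note that this shrinks the encoded data, which helps storage and upload size; the decoded bitmap in memory still depends on the pixel dimensions):
NSData *jpegData = UIImageJPEGRepresentation(image, 0.5); // 0.5 = medium compression quality
UIImage *reEncoded = [UIImage imageWithData:jpegData];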

Related

Show image of 10000x8000 pixels in iOS app

I'm trying to show an image in my iOS app. The image is 10000x8000, which is much higher than the iPhone's screen resolution; if I add it to a UIImageView, the app receives memory warnings that lead to a crash. Can anyone give me advice about how to deal with this? Thanks a lot.
This is the sample app from Apple that displays a large image; it uses a tiling image view to scale the image and reuses tiles according to the current zoom.
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/Introduction/Intro.html
Or, if you simply want to scale down the image, you could use this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    // Draw the source image into a context of the target size and grab the result.
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
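A hypothetical call site for the helper above (ImageUtils is an assumed host class, not named in the original answer):
UIImage *working = [ImageUtils imageWithImage:hugeImage scaledToSize:CGSizeMake(1000, 1000)];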
This is just a side note and not an answer.
@AmitTandel's answer will surely help you, but
remember, you should not request this high-resolution image every time. Instead, implement logic that stores the image somewhere after reducing it using @Amit's answer. Then, when re-requesting the same image URL, first look in your storage directory; only if the image does not exist there should you request it again.
That said, this still requires downloading the very high-resolution image in the first place (which is of little actual use), so if you're getting the image from your own server, you can ask for reduced sizes. Servers can typically provide images at several sizes so that apps can show the right one in each place.
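A minimal sketch of that look-up-before-download idea; cacheKey and downloadAndShrinkImageAtURL: are hypothetical helpers, not from the original answer:
NSString *cachesDir = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES).firstObject;
NSString *path = [cachesDir stringByAppendingPathComponent:cacheKey]; // cacheKey: e.g. a hash of the image URL (assumed)
UIImage *image = [UIImage imageWithContentsOfFile:path];
if (image == nil) {
    image = [self downloadAndShrinkImageAtURL:url]; // hypothetical fetch-and-reduce helper
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
}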

iOS: Instruments shows imageio_png_data is 300x larger in size than its actual image size

I have an image that is only 28KB in size:
I'm adding it to my view using this code:
UIImageView *background = [UIImageView new];
background.frame = CGRectMake(0, 0, 1080, 1920);
background.image = [UIImage imageNamed:@"Submit.png"];
[self.view addSubview:background];
Now I'm profiling with Instruments Allocation and "Marking Generation" right before and right after the image is allocated:
Instruments indicates that it took 7.92MB to load the image into memory.
I'm seeing the same issue with other images as well.
Why is ImageIO_PNG_Data at 7.92MB when the image is only 28KB in size?
@matt and @dan did a good job of explaining why an uncompressed image can take up 300x the memory of the actual PNG file when displayed on screen. What makes this issue worse is that iOS caches these images and does NOT release them from the cache, EVER, even on memory warnings.
So here's a way to prevent image caching on iOS to save up a ton of memory, just use imageWithContentsOfFile instead of imageNamed:
Replace:
background.image = [UIImage imageNamed:@"Submit.png"];
With:
background.image = [UIImage imageWithContentsOfFile:[[[NSBundle mainBundle] bundlePath] stringByAppendingString:@"/Submit.png"]];
and now the ImageIO_PNG_Data's will be released when the view controller is dismissed.
It's all right here:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIImage_Class/#//apple_ref/occ/clm/UIImage/imageNamed:
If you have an image file that will only be displayed once and wish to
ensure that it does not get added to the system’s cache, you should
instead create your image using imageWithContentsOfFile:. This will
keep your single-use image out of the system image cache, potentially
improving the memory use characteristics of your app.
It's because a PNG is compressed data describing what the image looks like, so a PNG that is nothing but a solid color is tiny, because it is easy to describe. But the bitmap is the bitmap: just a grid of pixels, whose size depends purely on the dimensions of the image (which, in your case, are immense).
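The numbers line up if the PNG's pixel dimensions match the 1080x1920 frame used above (an assumption, since the question does not state them): 1080 × 1920 pixels × 4 bytes per RGBA pixel = 8,294,400 bytes ≈ 7.9 MB, which is essentially the 7.92 MB Instruments reports.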

Objective-C How does snapchat make the text on top of an image/video so sharp and not pixelated?

In my app, users can place text on top of images, like Snapchat, and then save the image to their device. I simply add the text view on top of the image and capture the composite using this code:
UIGraphicsBeginImageContext(imageView.layer.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* savedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But when I compare the text on my image to the text on a Snapchat image, it is significantly different. Snapchat's text on top of an image is significantly sharper than mine; mine looks very pixelated. Also, I am not compressing the image at all, just saving it as-is using ALAssetLibrary.
Thank You
When you use UIGraphicsBeginImageContext, it defaults to a 1x scale (i.e. non-retina resolution). You probably want:
UIGraphicsBeginImageContextWithOptions(imageView.layer.bounds.size, YES, 0);
Which will use the same scale as the screen (probably 2x). The final parameter is the scale of the resulting image; 0 means "whatever the screen is".
If your imageView is scaled to the size of the screen, then I think your jpeg will also be limited to that resolution. If setting the scale on UIGraphicsBeginImageContextWithOptions does not give you enough resolution, you can do your drawing in a larger offscreen image. Something like:
UIGraphicsBeginImageContext(imageSize);
[image drawInRect:CGRectMake(0, 0, imageSize.width, imageSize.height)];
// Scale the context so the screen-sized overlay layer fills the full-size image.
CGContextScaleCTM(UIGraphicsGetCurrentContext(), scale, scale);
[textOverlay.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
You need to set the "scale" value to scale the textOverlay view, which is probably at screen size, to the offscreen image size.
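A sketch of one way to compute that value, assuming the overlay view and the target image share an aspect ratio (otherwise compute x and y scales separately):
// Map the screen-sized overlay onto the full-resolution image.
CGFloat scale = imageSize.width / textOverlay.bounds.size.width;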
Alternatively, probably simpler, you can start with a larger UIImageView, but put it within another UIView to scale it to fit on screen. Do the same with your text overlay view. Then, your code for creating composite should work, at whatever resolution you choose for the UIImageView.

Drawing retina versus non-retina images

UIImage 1: Loaded from a file with the @2x modifier with size 400x400, thus UIImage 1 will report its size as 200x200
UIImage 2: Loaded from a file without the @2x modifier with size 400x400, thus UIImage 2 will report its size as 400x400
I then create 2 images from the above applying the code below to each
UIGraphicsBeginImageContextWithOptions(CGSizeMake(400,400), YES, 1.0);
[image drawInRect:CGRectMake(0, 0, 400, 400)];
UIImage *rescaledI = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Considering the above, can I expect the image quality for both resulting images to be exactly the same? (I am trying to determine if drawing a 200x200 retina image to a 400x400 non-retina context will degrade quality versus drawing the same image not loaded as a retina image)
Just use the current image's size.
UIImage *image1 = [UIImage imageNamed:@"myimage.png"];
//access width and height like this
image1.size.width;
image1.size.height;
UIGraphicsBeginImageContextWithOptions(CGSizeMake(image1.size.width, image1.size.height), YES, 1.0);
[image drawInRect:CGRectMake(0, 0, image1.size.width, image1.size.height)];
UIImage *rescaledI = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Of course, you should replace image1 with whatever image you are trying to get the size of; a switch or if statement should do the trick.
Never hardcode sizes, dimensions, locations, etc. Always pull that information dynamically by asking the image for its size. Then you can change the size of your image without having to hunt down hardcoded values across your application.
The image is always 400*400 pixels: the difference is that on a retina display those 400*400 pixels cover less space, exactly half (200*200 Core Graphics points). If you are not applying any transformation, the image will stay exactly the same.
The code you wrote renders the image as is because you are overriding the device scale factor and setting it to always 1 (1 pixel to 1 point).
You should use two images, one twice as big, if you want your image to cover the same amount of screen on both retina and non retina devices.
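To illustrate the point/pixel bookkeeping (values assume the two files described in the question; the file names here are hypothetical):
UIImage *retinaImage = [UIImage imageNamed:@"photo"]; // resolves photo@2x.png (400x400 px)
// retinaImage.size  -> {200, 200} points
// retinaImage.scale -> 2.0
// pixels = size * scale = 400x400

UIImage *plainImage = [UIImage imageNamed:@"photo2"]; // photo2.png (400x400 px)
// plainImage.size  -> {400, 400} points
// plainImage.scale -> 1.0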

Resize UIImage in Place

I have a category on UIImage (very popular code found on the web) to do various image manipulations.
- (UIImage *)imageScaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
One aspect I make heavy use of is scaling an image down. My app can take in quite large images and scales them down to a "working" size. However, there are still times when the app crashes due to memory, because the category creates a new scaled image from the original: the original HUGE image is still resident while the new, smaller (but still big) image is created.
So, my question is: is there a way to load this large original image and rescale it in place? That is, rescale the original without creating a new image and without allocating more memory?
Yes, and there is even a complete working Apple sample project (LargeImageDownsizing, linked above) that does this for you.
As far as I know there is no limitation on what size of image it can scale down. Of course, the larger the image, the more time-consuming the process is.
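For reference, a condensed sketch of the idea behind that sample: decode the source one strip at a time and draw each strip, scaled, into a smaller destination context, so the full-resolution bitmap never has to be resident all at once. The strip height and scale factor here are illustrative, and this relies on the CGImage decoding lazily from its data provider rather than being fully unpacked up front:
CGImageRef source = fullImage.CGImage;            // fullImage: the huge source UIImage (assumed)
size_t srcW = CGImageGetWidth(source);
size_t srcH = CGImageGetHeight(source);
CGFloat scale = 0.25;                             // e.g. downsize to 25%
CGSize dstSize = CGSizeMake(srcW * scale, srcH * scale);
size_t stripH = 512;                              // decode 512-pixel-tall strips

UIGraphicsBeginImageContext(dstSize);
for (size_t y = 0; y < srcH; y += stripH) {
    size_t h = MIN(stripH, srcH - y);
    // Crop out one strip of the source and draw it scaled into the destination.
    CGImageRef strip = CGImageCreateWithImageInRect(source, CGRectMake(0, y, srcW, h));
    [[UIImage imageWithCGImage:strip] drawInRect:CGRectMake(0, y * scale, dstSize.width, h * scale)];
    CGImageRelease(strip);                        // free each strip before decoding the next
}
UIImage *downsized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();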
