Resizing and compressing an image fetched from a URL - iOS

In my project I'm trying to compress images coming from the server. I used HJManager for the compression; it scales a UIImage down to a user-specified size, which is great, but it takes far too much app memory. For 30 images it takes almost 400 MB of live bytes, and the app crashes. I can paste my code if you want to see it, but what I want to know is the best way to compress an image from the server (from MBs down to KBs) without disturbing the UI.

Switch to SDWebImage. It is a much better caching library than HJManager, with a smaller memory footprint, and it is more modern: it uses block-based completion handlers. Last but not least, it is actively maintained, which means that when an issue does come up, the community patches it.
You can resize the image easily with a method like this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    // Draw the source image into a context of the target size,
    // then capture the result as a new UIImage.
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
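A sketch of how the two pieces could fit together, assuming a recent SDWebImage version (the sd_ prefix was added in 3.x; older releases call it setImageWithURL:completed:) and a hypothetical MyImageUtils class holding the resize helper above:

#import <SDWebImage/UIImageView+WebCache.h>

[imageView sd_setImageWithURL:imageURL
                    completed:^(UIImage *image, NSError *error,
                                SDImageCacheType cacheType, NSURL *url) {
    if (!image) return;
    // Downscale the wire-size image before displaying it; the target
    // size here is illustrative. The completion block runs on the
    // main queue, so assigning the image directly is safe.
    imageView.image = [MyImageUtils imageWithImage:image
                                      scaledToSize:CGSizeMake(320, 240)];
}];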

Related

App crashes when I display a large image in a UIImageView

I set an image of 10000 x 10000 pixels on a UIImageView, loaded from the network with SDWebImage, and the app crashed because it allocated too much memory. I tried to resize the image after SDWebImage had loaded it, so I added the code below:
// This runs in a UIImage category, so `self` is the image being resized;
// `size` and `drawRect` hold the target size and rectangle.
UIGraphicsBeginImageContext(size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
// CGContextSetInterpolationQuality takes a CGInterpolationQuality enum,
// not a float; 0.8 in the original silently becomes kCGInterpolationDefault.
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
[self drawInRect:drawRect blendMode:kCGBlendModeNormal alpha:1];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Although the resulting image was smaller, my app crashed for the same reason.
It seems some rendering happens during the resize: memory rises to around 600 MB and then falls back to 87 MB shortly afterwards.
How can I resize an image without rendering it?
Displaying the image in a UIWebView does not seem to have this problem. How does that work?
Any help and suggestions would be highly appreciated.
Resolution:
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/
The resolution works for JPEG but not for PNG.
You can't unpack the image into memory because it's too big. This is what image tiling is for: you download a set of tiles for the image based on the part you're currently looking at (the zoom position and scale).
That is, if you're looking at the whole image you get one tile, which is zoomed out and therefore low quality and small in size. As you zoom in, you get back other small images that show less of the image's area.
The web view is likely using the image format to download a relatively small image that is a scaled-down version of the whole thing, so it doesn't need to unpack the whole image into memory. It can do this because it knows your image is 10000x10000 but that it is going to be displayed on the page at, say, 300x300.
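If you want to experiment with tiling yourself, the core UIKit hook is backing a view with a CATiledLayer. The sketch below is a minimal, hypothetical skeleton (the class name and the tile-loading strategy are placeholders); Apple's LargeImageDownsizing sample linked above shows the complete approach.

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface TiledImageView : UIView
@end

@implementation TiledImageView

// Back the view with a CATiledLayer so UIKit requests tiles on demand.
+ (Class)layerClass {
    return [CATiledLayer class];
}

// Hypothetical tile lookup: in a real app these tiles would be pre-cut
// and loaded from disk or the network.
- (UIImage *)tileForRect:(CGRect)rect {
    return nil; // placeholder
}

// With a CATiledLayer, drawRect: is called once per tile (possibly on a
// background thread), and `rect` covers only that tile, so you only ever
// draw a small piece instead of decoding the whole 10000x10000 source.
- (void)drawRect:(CGRect)rect {
    UIImage *tile = [self tileForRect:rect];
    [tile drawInRect:rect];
}

@end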
Did you try UIImageJPEGRepresentation (or UIImagePNGRepresentation)?
You can make the image's file size smaller with it.
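For illustration, a re-encode might look like this (the 0.3 quality value is just an example); note this shrinks the encoded bytes on disk or on the network, not the decoded bitmap's memory footprint:

// Re-encode at lower JPEG quality (0.0 = max compression, 1.0 = best quality).
NSData *smallData = UIImageJPEGRepresentation(image, 0.3);
UIImage *smallImage = [UIImage imageWithData:smallData];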

Show an image of 10000x8000 pixels in an iOS app

I'm trying to show an image in my iOS app. The image is 10000x8000, which is much larger than the iPhone's screen resolution; if I add it to a UIImageView, the app receives memory warnings and crashes. Can anyone give me advice on how to deal with this? Thanks a lot.
This is the sample app from Apple that displays a large image; it uses a TileImageView to scale the image and reuses tiles according to the current zoom.
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/Introduction/Intro.html
Or if you simply want to scale down the image, you could use this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    // Same widely shared helper as above: draw into a context of the
    // target size and capture the result.
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
This is just a side note, not an answer.
@AmitTandel's answer will surely help you, but remember: you should not always request this high-resolution image. Instead, implement logic that stores the image somewhere after reducing it with @Amit's method. Then, when re-requesting the same image URL, first look in your storage directory; only if the image doesn't exist there should you request it again.
That said, you will still have to download this very-high-resolution image once (which is really of no use), so if you're getting these images from your own server, you can ask the backend team to serve reduced sizes. From what I've heard, servers can render images at different sizes so apps can show the appropriate variant in each place.
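A hypothetical sketch of that store-then-reuse flow (the file naming, the MyImageUtils resize helper, and the synchronous download are all placeholders, not part of the original answer):

- (UIImage *)reducedImageForURL:(NSURL *)url targetSize:(CGSize)targetSize {
    NSString *cachesDir = NSSearchPathForDirectoriesInDomains(
        NSCachesDirectory, NSUserDomainMask, YES).firstObject;
    // A string hash is a quick stand-in for a real cache key.
    NSString *fileName = [NSString stringWithFormat:@"%lu.jpg",
                          (unsigned long)url.absoluteString.hash];
    NSString *path = [cachesDir stringByAppendingPathComponent:fileName];

    // Serve the reduced copy from disk if we already have it.
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        return [UIImage imageWithContentsOfFile:path];
    }

    // Otherwise download once, shrink, and persist the small version.
    // (Blocking call; run this on a background queue in a real app.)
    NSData *data = [NSData dataWithContentsOfURL:url];
    UIImage *full = [UIImage imageWithData:data];
    UIImage *small = [MyImageUtils imageWithImage:full scaledToSize:targetSize];
    [UIImageJPEGRepresentation(small, 0.7) writeToFile:path atomically:YES];
    return small;
}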

How to find a UIImage bottleneck

I have an app that uses UIImage objects. Up to this point, I've been using image objects initialized using something like this:
UIImage *image = [UIImage imageNamed:imageName];
using an image in my app bundle. I've been adding functionality to allow users to use imagery from the camera or their library using UIImagePickerController. These images, obviously, can't be in my app bundle, so I initialize the UIImage object a different way:
UIImage *image = [UIImage imageWithContentsOfFile:pathToFile];
This is done after first resizing the image to a size similar to the other files in my app bundle, in both pixel dimensions and total bytes, both using the JPEG format (interestingly, PNG was much slower, even for the same file size). In other words, the file at pathToFile is similar in size to an image in the bundle (pixel dimensions match, and the compression was chosen so the byte count was similar).
The app goes through a loop making small pieces from the original image, among other things that are not relevant to this post. My issue is that going through the loop using an image created the second way takes much longer than using an image created the first way.
I realize the first method caches the image, but I don't think that's relevant, unless I'm not understanding how the caching works. If it is the relevant factor, how can I add caching to the second method?
The relevant portion of code that is causing the bottleneck is this:
[image drawInRect:self.imageSquare];
Here, self is a subclass of UIImageView. Its property imageSquare is simply a CGRect defining what gets drawn. This portion is the same for both methods. So why is the second method so much slower with a similarly sized UIImage object?
Is there something I could be doing differently to optimize this process?
EDIT: I changed access to the bundled image to imageWithContentsOfFile, and the time to perform the loop went from about 4 seconds to just over a minute. So it looks like I need to find some way to do the caching that imageNamed does, but for non-bundled files.
UIImage's imageNamed: doesn't simply cache the image; it caches an uncompressed image. The extra time was spent not on reading from local storage into RAM but on decompressing the image.
The solution was to create a new uncompressed UIImage object and use it for the time-sensitive portion of the code. The uncompressed object is discarded when that section of code is complete. For completeness, here is a copy of the class method that returns an uncompressed UIImage from a compressed one, thanks to another thread. Note that this assumes the data is backed by a CGImage, which is not always true for UIImage objects.
+ (UIImage *)decompressedImage:(UIImage *)compressedImage
{
    CGImageRef originalImage = compressedImage.CGImage;

    // Copying the data provider's bytes forces the image data to be decoded.
    CFDataRef imageData = CGDataProviderCopyData(
        CGImageGetDataProvider(originalImage));
    CGDataProviderRef imageDataProvider = CGDataProviderCreateWithCFData(imageData);
    CFRelease(imageData);

    // Rebuild an identical CGImage backed by the already-decoded bytes.
    CGImageRef image = CGImageCreate(CGImageGetWidth(originalImage),
                                     CGImageGetHeight(originalImage),
                                     CGImageGetBitsPerComponent(originalImage),
                                     CGImageGetBitsPerPixel(originalImage),
                                     CGImageGetBytesPerRow(originalImage),
                                     CGImageGetColorSpace(originalImage),
                                     CGImageGetBitmapInfo(originalImage),
                                     imageDataProvider,
                                     CGImageGetDecode(originalImage),
                                     CGImageGetShouldInterpolate(originalImage),
                                     CGImageGetRenderingIntent(originalImage));
    CGDataProviderRelease(imageDataProvider);

    UIImage *decompressedImage = [UIImage imageWithCGImage:image];
    CGImageRelease(image);
    return decompressedImage;
}
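A sketch of how it might be used in the time-sensitive section (the class name is hypothetical):

// Decompress once up front, outside the loop...
UIImage *compressed = [UIImage imageWithContentsOfFile:pathToFile];
UIImage *uncompressed = [MyImageUtils decompressedImage:compressed];
// ...then run the tight drawInRect: loop against `uncompressed`, and let
// it be released once this section finishes.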

Uploading NSData UIImageJPEGRepresentation too big

I'm uploading an image taken on the iPhone (converted to NSData) via FTP in an iOS application, using https://github.com/lloydsargent/BlackRaccoon.
dataImage = UIImageJPEGRepresentation(image, 0.5);
The problem is that the image is about 40 MB, and every time the app finishes uploading, Xcode crashes. How can I make the image smaller?
This method will allow you to resize an image in Objective-C programmatically.
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    // Opaque context at the device's screen scale; note the resulting
    // bitmap's pixel dimensions are newSize multiplied by that scale.
    UIGraphicsBeginImageContextWithOptions(newSize, YES, [UIScreen mainScreen].scale);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Basically, I recommend you resize the image and then upload it, if the issue is the API you are using. That said, using the NSURLConnection class methods I have been able to upload files larger than 40 MB. You can check out a post I made about how to use those methods to upload files with a PHP back end here:
Strange issue while posting image from iPhone
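A hedged sketch of the resize-then-upload preparation (the 1024x768 target, the 0.5 quality, and the MyImageUtils class are illustrative, not from the original answer):

// Shrink pixel dimensions first, then re-encode; both steps cut the payload.
UIImage *scaled = [MyImageUtils imageWithImage:originalImage
                                  scaledToSize:CGSizeMake(1024, 768)];
NSData *payload = UIImageJPEGRepresentation(scaled, 0.5);
// `payload` is now typically well under 1 MB instead of ~40 MB,
// a much safer size for the FTP upload.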

Resize UIImage in Place

I have a category on UIImage (very popular code found on the web) to do various image manipulations.
- (UIImage *)imageScaledToSize:(CGSize)newSize {
    // Draw the receiver into a context of the target size and capture it.
    UIGraphicsBeginImageContext(newSize);
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
One aspect I make heavy use of is scaling an image down. My app can take quite large images and scales them down to a "working" size. However, there are still times when the app crashes due to memory, because the category creates a new scaled image from the original: the original HUGE image is still resident while the new, smaller (but still big) image is created.
So my question is: is there a way to load this large original image and rescale it in place? That is, rescale the original without creating a new image and without allocating more memory?
Yes, and there is even a complete working Apple sample project that does this for you.
As far as I know there is no limit on the size of image it can scale down. Of course, the larger the image, the more time-consuming the process is.
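As a separate, hedged suggestion (not part of the original answer): ImageIO can downsample directly from a file on disk without ever unpacking the full-resolution bitmap into memory. A minimal sketch, assuming the large image sits at a known file URL:

#import <UIKit/UIKit.h>
#import <ImageIO/ImageIO.h>

+ (UIImage *)downsampledImageAtURL:(NSURL *)fileURL maxPixelSize:(CGFloat)maxPixelSize
{
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
    if (!source) return nil;

    NSDictionary *options = @{
        // Always build the thumbnail from the full image, capped at
        // maxPixelSize on the longer dimension.
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize),
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES, // honor EXIF orientation
    };
    CGImageRef scaled = CGImageSourceCreateThumbnailAtIndex(source, 0,
                            (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (!scaled) return nil;

    UIImage *result = [UIImage imageWithCGImage:scaled];
    CGImageRelease(scaled);
    return result;
}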
