How to shrink the image taken from camera to 320x320 resolution? [duplicate]

This question already has answers here:
Possible Duplicate:
What's the easiest way to resize/optimize an image size with the iPhone SDK?
Closed 10 years ago.
I want to change the resolution of an image taken from the camera to 320x320. Can anyone please tell me how to do it?
I already know how to take an image from the camera, so please tell me the rest, i.e. changing the resolution of the image.
Thanks in advance.

This method comes from this post: https://stackoverflow.com/a/613380/1648976
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    // Draw the source image into a context of the target size and grab the result.
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
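For the 320x320 case from the question, a call would look like this (assuming the method lives in a UIImage category and cameraImage is a hypothetical variable holding the picture you got from the picker):

    UIImage *squareImage = [UIImage imageWithImage:cameraImage scaledToSize:CGSizeMake(320, 320)];

Note that this stretches a non-square photo to fit; if you want 320x320 without distortion you would crop first.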
As far as storage of the image goes, the fastest image format to use on the iPhone is PNG, because iOS has optimizations for that format. However, if you want to store these images as JPEGs, you can take your UIImage and do the following:
NSData *dataForJPEGFile = UIImageJPEGRepresentation(theImage, 0.6);
This creates an NSData instance containing the raw bytes for a JPEG image at a 60% quality setting. The contents of that NSData instance can then be written to disk or cached in memory.
This converts the encoding, not the dimensions, so it won't give you exactly 320x320, but you can tweak the 0.6 to trade quality against file size.
If this is not what you want, please tell me more precisely.

To downsample an image correctly you must:
1) low-pass filter the original image, and then
2) decimate, or
3) resample.
If the original dimension is 640x320, it's enough to LP filter and then choose every other sample. That's decimation.
If the original dimension is e.g. 480x320, then one still has to LP filter, and then interpolate pixel values for output pixels that do not align exactly with the original pixels.
The LP filtering is crucial: without it, e.g. a very high resolution chessboard pattern will be resampled into noise or weird patterns, an effect called 'frequency aliasing'.
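On iOS you rarely write the low-pass filter yourself: Core Graphics applies appropriate filtering when you draw into a smaller context with a high interpolation quality. A minimal sketch, assuming `original` is a hypothetical variable holding the camera image:

    // High interpolation quality makes Core Graphics filter the source while
    // resampling, which avoids the aliasing artifacts described above.
    CGSize target = CGSizeMake(320, 320);
    UIGraphicsBeginImageContextWithOptions(target, NO, 1.0);
    CGContextSetInterpolationQuality(UIGraphicsGetCurrentContext(), kCGInterpolationHigh);
    [original drawInRect:CGRectMake(0, 0, target.width, target.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();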

Related

App crashes when displaying a large image in a UIImageView

I set a 10000 x 10000 pixel image on a UIImageView, loaded from the network with SDWebImage, and the app crashed because it allocated too much memory. I tried to resize the image after SDWebImage had loaded it, so I added the code below:
UIGraphicsBeginImageContext(size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
// Note: CGContextSetInterpolationQuality takes a CGInterpolationQuality constant, not a float.
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
[self drawInRect:CGRectMake(0, 0, size.width, size.height) blendMode:kCGBlendModeNormal alpha:1];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Although the resulting image was smaller, my app crashed for the same reason.
It seems there is some rendering during the resize: memory rose to about 600 MB and fell back to 87 MB a little while later.
How can I resize an image without rendering it?
It seems that displaying the image in a UIWebView does not have this problem. How does that work?
Any help and suggestions will be highly appreciated.
Resolution:
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/
The resolution above does work for JPEG but not for PNG.
You can't unpack the image into memory because it's too big. This is what image tiling is for: you download a set of tiles for the image based on the part you're currently looking at (the zoom position and scale).
I.e. if you're looking at the whole image, you get one tile that is zoomed out and therefore low quality and small in size. As you zoom in, you get other small images that show less of the image's area.
The web view is likely using the image format to download a relatively small image that is a scaled-down version of the whole thing, so it doesn't need to unpack the whole image into memory. It can do this because it knows your image is 10,000 x 10,000 but that it is going to be displayed on the page at 300 x 300 (for example).
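If you only need a scaled-down version rather than tiles, ImageIO can decode a downsampled image straight from the file, without unpacking the full-resolution bitmap into memory. A minimal sketch, assuming the image sits at a local file URL; the helper name is mine:

    #import <ImageIO/ImageIO.h>

    // Decode a downsampled version of a (possibly huge) image file; only the
    // reduced-size bitmap is ever allocated.
    static UIImage *DownsampledImageAtURL(NSURL *url, CGFloat maxPixelSize)
    {
        CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
        if (!source) return nil;
        NSDictionary *options = @{
            (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
            (id)kCGImageSourceCreateThumbnailWithTransform : @YES,  // respect EXIF orientation
            (id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize)
        };
        CGImageRef cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
        CFRelease(source);
        if (!cgImage) return nil;
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        return image;
    }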
Did you try using UIImageJPEGRepresentation (or UIImagePNGRepresentation)?
You can make the image's file size smaller with it (note this shrinks the encoded file, not the decoded pixel dimensions).

Show image of 10000x8000 pixels in iOS app

I'm trying to show an image in my iOS app. The image is 10000x8000, much higher than the iPhone's screen resolution. If I add it to a UIImageView, the app receives memory warnings and crashes. Can anyone give me advice about how to deal with this? Thanks a lot.
This is the sample app from Apple which displays a large image; it uses a tiling image view that scales the image and reuses tiles according to the current zoom:
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/Introduction/Intro.html
Or if you simply want to scale the image down, you could use this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
This is just a side note and not an answer.
@AmitTandel's answer will surely help you, but remember: you should not always request this high-resolution image. Instead, implement logic that stores the image somewhere after reducing it with @Amit's answer. Then, before re-requesting the same image URL, first look in your storage directory; only if the image does not exist there should you request it again.
That said, I'm not really suggesting you follow my answer, since you would still have to download the very high resolution image (which is of no use by itself). If you're getting this image from your server, you can ask the server team to give you reduced sizes for images. I've heard they can render images at different sizes to help apps show specific images in different places.
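A minimal sketch of that cache-then-reuse idea; the cache location, the file naming, and the imageWithImage:scaledToSize: category method are assumptions for illustration, not part of the original answers:

    // Hypothetical helper: serve a previously reduced image from disk, or
    // download, scale down once, persist the small version, and return it.
    - (void)loadScaledImageForURL:(NSURL *)url completion:(void (^)(UIImage *))completion
    {
        NSString *cachesDir = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES).firstObject;
        NSString *cachePath = [cachesDir stringByAppendingPathComponent:
                               [NSString stringWithFormat:@"scaled-%lu.png", (unsigned long)url.absoluteString.hash]];

        // 1. Reuse the stored reduced image if we already have it.
        UIImage *cached = [UIImage imageWithContentsOfFile:cachePath];
        if (cached) { completion(cached); return; }

        // 2. Otherwise download, reduce, and store the small version for next time.
        dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
            NSData *data = [NSData dataWithContentsOfURL:url];
            UIImage *original = [UIImage imageWithData:data];
            UIImage *scaled = [UIImage imageWithImage:original scaledToSize:CGSizeMake(320, 320)];
            [UIImagePNGRepresentation(scaled) writeToFile:cachePath atomically:YES];
            dispatch_async(dispatch_get_main_queue(), ^{ completion(scaled); });
        });
    }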

How to get NSData representation of UIGraphicsGetImageFromCurrentImageContext() [duplicate]

This question already has answers here:
convert UIImage to NSData
(7 answers)
Closed 7 years ago.
I'm taking a "snapshot" of the image context in UIGraphicsBeginImageContextWithOptions(UIScreen.mainScreen().bounds.size, true, 0) and eventually creating a UIImage using
var renderedImage = UIGraphicsGetImageFromCurrentImageContext()
However, I need to get the NSData representation of this UIImage without using UIImageJPEGRepresentation or UIImagePNGRepresentation (because these produce files that are way larger than the original UIImage). How can I do this?
"these produce files that are way larger than the original UIImage. How can I do this?"
Image files contain compressed data, while the bitmap backing a UIImage is raw, i.e. not compressed. Therefore the raw NSData will in almost all cases be larger when written to a file.
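For example, a full-screen snapshot of a 640 x 1136 retina display occupies 640 x 1136 x 4 bytes ≈ 2.9 MB as raw RGBA pixels, while the same snapshot encoded as JPEG is typically a small fraction of that.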
More info at another question: convert UIImage to NSData
I'm not sure what you mean by "way larger" than the original UIImage. The data backing the UIImage object is at least as big as the data you would get by converting it into a JPEG, and roughly comparable to what you would get by converting it to a PNG.
The rendered image will be twice the screen size in pixels, because passing a scale of 0 renders at the retina screen's scale.
You can avoid this and render the image at non-retina resolution by giving your image context a scale of 1:
UIGraphicsBeginImageContextWithOptions(UIScreen.mainScreen().bounds.size, true, 1)

UIImage resize performance and quality issue

I'm working with UIImage and, like everyone else, have to deal with retina and non-retina display adaptability. As far as I know, a retina display requires double the pixels.
I'm wondering if I could simply use one large image with the same width/height ratio and just resize it smaller to fit every device?
For example, say I made an original image of 200x200 pixels. Now I want to use it in the application at 20x20 and at 80x80 (two situations). Then I have to make four copies: img2020.png, img2020@2x.png, img8080.png and img8080@2x.png.
So if I want to use it in three situations with different sizes, I have to store six copies. Can I just use UIImage's resize function instead? I've tried a bit but cannot judge its quality and performance.
Any ideas? Thanks a lot :)
All the native APIs assume you use image.png and image@2x.png, so it may sometimes be difficult to use just one image and scale it depending on retina/non-retina. Moreover, using retina graphics on non-retina devices leads to heavier use of those devices' resources, causing battery drain. And of course, if you have many images, scaling will decrease the performance of your application. In other words, there are reasons to use the double set of images, and you should use it rather than scaling one large image.
You don't need to make six copies. You can use the 200x200 pixel image and set the image view's contentMode property to aspect fit. Or you can use the function below to change the size of images at run time.
- (UIImage *)Resize_Image:(UIImage *)image requiredHeight:(float)requiredHeight andWidth:(float)requiredWidth
{
    // Compute an aspect-fit size within the required bounds.
    float actualHeight = image.size.height;
    float actualWidth = image.size.width;
    if (actualWidth * requiredHeight < actualHeight * requiredWidth)
    {
        // Height is the limiting dimension: scale the width proportionally.
        actualWidth = requiredHeight * (actualWidth / actualHeight);
        actualHeight = requiredHeight;
    }
    else
    {
        // Width is the limiting dimension: scale the height proportionally.
        actualHeight = requiredWidth * (actualHeight / actualWidth);
        actualWidth = requiredWidth;
    }
    CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
    UIGraphicsBeginImageContext(rect.size);
    [image drawInRect:rect];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
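For the 80x80 case from the question, usage would look something like this (assuming the method is on the same class and `original` is a hypothetical variable holding the 200x200 source):

    UIImage *small = [self Resize_Image:original requiredHeight:80.0f andWidth:80.0f];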
I made some comparisons before. Leaving iOS to handle the resizing gives lower quality, sometimes really unacceptable.
When I feel lazy, my approach is to ship only the retina version, and if it looks bad, I create a low-res version.
If you're writing an iPhone-only app, most iPhones on the market have retina displays, so I don't think you should worry about the non-retina version. Just my opinion though.

Resize UIImage in Place

I have a category on UIImage (very popular code found on the web) to do various image manipulations.
- (UIImage *)imageScaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
One aspect that I make heavy use of is scaling an image down. My app can take quite large images and scales them down to a "working" size. However, there are still times when the app crashes due to memory, because the category creates a new scaled image from the original: the original HUGE image is still resident while the new, smaller (but still big) image is created.
So my question is: is there a way to load this large original image and rescale it in place? That is, rescale the original without creating a new image and without allocating more memory?
Yes, and there is even a complete working Apple sample project that does this for you.
As far as I know there is no limit on what size of image it can scale down. Of course, the larger the image, the more time-consuming the process is.
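The sample referred to is Apple's LargeImageDownsizing project (linked earlier on this page). Its core trick, sketched below under the assumption that the source is a file-backed, lazily decoded CGImage, is to draw the source into the small destination context one band at a time, so only one band's pixels need to be decoded at once. This is not literally "in place", but it keeps peak memory close to the destination size rather than the source size:

    // Simplified sketch of band-by-band downsizing (assumes destSize preserves
    // the source aspect ratio; band-seam rounding is ignored for brevity).
    - (UIImage *)downsizedImage:(UIImage *)source toSize:(CGSize)destSize
    {
        CGImageRef sourceRef = source.CGImage;
        size_t sourceWidth = CGImageGetWidth(sourceRef);
        size_t sourceHeight = CGImageGetHeight(sourceRef);
        CGFloat scale = destSize.height / (CGFloat)sourceHeight;

        UIGraphicsBeginImageContext(destSize);
        size_t bandHeight = 512;  // source rows per band; tune for your memory budget
        for (size_t y = 0; y < sourceHeight; y += bandHeight) {
            @autoreleasepool {
                size_t h = MIN(bandHeight, sourceHeight - y);
                // Crop one band; with a lazily decoded source, only these rows
                // are materialized when the band is drawn.
                CGImageRef band = CGImageCreateWithImageInRect(sourceRef,
                    CGRectMake(0, y, sourceWidth, h));
                [[UIImage imageWithCGImage:band]
                    drawInRect:CGRectMake(0, y * scale, destSize.width, h * scale)];
                CGImageRelease(band);
            }
        }
        UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return result;
    }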
