Drawing retina versus non-retina images - iOS

UIImage 1: Loaded from a file with the @2x modifier, with pixel size 400x400; thus UIImage 1 will report its size as 200x200.
UIImage 2: Loaded from a file without the @2x modifier, with pixel size 400x400; thus UIImage 2 will report its size as 400x400.
I then create two images from the above by applying the code below to each:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(400,400), YES, 1.0);
[image drawInRect:CGRectMake(0, 0, 400, 400)];
UIImage *rescaledI = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Considering the above, can I expect the image quality of both resulting images to be exactly the same? (I am trying to determine whether drawing a 200x200 retina image into a 400x400 non-retina context degrades quality compared with drawing the same image not loaded as a retina image.)

Just ask the current image for its size.
UIImage *image1 = [UIImage imageNamed:@"myimage.png"];
//access width and height like this
image1.size.width;
image1.size.height;
UIGraphicsBeginImageContextWithOptions(CGSizeMake(image1.size.width, image1.size.height), YES, 1.0);
[image1 drawInRect:CGRectMake(0, 0, image1.size.width, image1.size.height)];
UIImage *rescaledI = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Of course, you should replace image1 with whatever image you are trying to get the size of; a switch statement or an if statement should do the trick.
Never hardcode sizes, dimensions, locations, etc. Always pull that information dynamically by asking your image for its size. Then you can change the size of your image without having to hunt down hardcoded values in your application.

The image is always 400x400 pixels: the difference is that on a retina display those 400x400 pixels cover half the space (200x200 Core Graphics points). If you are not applying any transformation, the image will stay exactly the same.
The code you wrote renders the image as is because you are overriding the device scale factor and setting it to always 1 (1 pixel to 1 point).
You should use two images, one twice as big, if you want your image to cover the same amount of screen on both retina and non-retina devices.
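As a small illustration of that scale parameter (a sketch only; image is either UIImage from the question), passing 0 instead of 1.0 lets the context inherit the device scale, so the 400x400-point drawing is backed by an 800x800-pixel bitmap on a retina device:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(400, 400), YES, 0.0); // 0 = use the screen's scale
[image drawInRect:CGRectMake(0, 0, 400, 400)];
UIImage *deviceScaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSLog(@"%@ at scale %.0f", NSStringFromCGSize(deviceScaled.size), deviceScaled.scale); // "{400, 400} at scale 2" on a retina device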

Related

Combining two UIImageViews into one image stretches the images

When I try to combine two UIImageViews, the images are stretched. Here's the code I am using:
CGSize size = CGSizeMake(MAX(self.imgCapture.image.size.width, self.imgGallary.image.size.width), MAX(self.imgCapture.image.size.height, self.imgGallary.image.size.height));
UIGraphicsBeginImageContext(size);
[self.imgCapture.image drawInRect:CGRectMake(self.view.frame.origin.x, self.view.frame.origin.y, size.width/2, self.imgCapture.image.size.height)];
[self.imgGallary.image drawInRect:CGRectMake(self.view.frame.origin.x + (size.width/2), self.view.frame.origin.y, size.width/2, self.imgGallary.image.size.height)];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The first screenshot shows the two UIImageViews.
The second screenshot shows the result after combining them into one image: the images are stretched, and I want to keep the aspect ratio shown in screenshot 1.
The image is not "stretching", it is squeezing. It's a matter of simple arithmetic: your image context is the size of one image, so you are drawing both images at half width and they are squeezed horizontally. You need the image context to be the size of both images added together.
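A minimal corrected sketch (assuming the same image view properties as in the question; device scale ignored for brevity):
CGSize combinedSize = CGSizeMake(self.imgCapture.image.size.width + self.imgGallary.image.size.width, MAX(self.imgCapture.image.size.height, self.imgGallary.image.size.height));
UIGraphicsBeginImageContext(combinedSize);
// First image at its full natural width on the left...
[self.imgCapture.image drawInRect:CGRectMake(0, 0, self.imgCapture.image.size.width, self.imgCapture.image.size.height)];
// ...second image immediately to its right, also at full width, so nothing is squeezed.
[self.imgGallary.image drawInRect:CGRectMake(self.imgCapture.image.size.width, 0, self.imgGallary.image.size.width, self.imgGallary.image.size.height)];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();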

App crashes when I display a large image in a UIImageView

I set an image of 10000 x 10000 pixels on a UIImageView, loaded from the network by SDWebImage, and the app crashed because it allocated too much memory. I tried to resize the image after SDWebImage loaded it, so I added the code below:
UIGraphicsBeginImageContext(size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
CGContextSetInterpolationQuality(context, kCGInterpolationHigh); // takes a CGInterpolationQuality constant, not a float
[self drawInRect:drawRect blendMode:kCGBlendModeNormal alpha:1];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Although the resulting image was smaller, my app crashed for the same reason.
It seems that some rendering happens during the resize: memory rose to about 600 MB and fell back to about 87 MB a little while later.
How can I resize an image without rendering it?
Displaying the image in a UIWebView does not seem to have this problem. How does that work?
Any help and suggestions would be highly appreciated.
Resolution:
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/
The resolution above works for JPEG but not for PNG.
You can't unpack the image into memory because it's too big. This is what image tiling is for: you download a set of tiles for the image based on the part you're currently looking at (the zoom position and scale).
That is, if you're looking at the whole image you get one tile, which is zoomed out and therefore low quality and small in size. As you zoom in, you get back other small images which show less of the image's area.
The web view is likely using the image format to download a relatively small image size that is a scaled down version of the whole image, so it doesn't need to unpack the whole image to memory. It can do this because it knows your image is 10,000x10,000 but that it is going to be displayed on the page at 300x300 (for example).
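If tiling is more than you need, a commonly used alternative (not mentioned in the answers above; a sketch using the ImageIO framework, where the local file URL and the 1024-pixel target are assumptions) is to have ImageIO produce a downsampled thumbnail directly, so the full 10000x10000 bitmap is never decoded into memory:
#import <ImageIO/ImageIO.h>

NSURL *fileURL = [NSURL fileURLWithPath:localPath]; // localPath: hypothetical path of the downloaded file
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
NSDictionary *options = @{ (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                           (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
                           (id)kCGImageSourceThumbnailMaxPixelSize : @1024 }; // longest side of the result
CGImageRef smallCGImage = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
UIImage *smallImage = [UIImage imageWithCGImage:smallCGImage];
CGImageRelease(smallCGImage);
CFRelease(source);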
Did you try to use UIImageJPEGRepresentation (or UIImagePNGRepresentation)?
You can make the image data smaller with it.

Objective-C: How does Snapchat make the text on top of an image/video so sharp and not pixelated?

My app lets users place text on top of images, like Snapchat, and then save the image to their device. I simply add the text view on top of the image and capture the result using this code:
UIGraphicsBeginImageContext(imageView.layer.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* savedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But when I compare the text on my image to the text on a Snapchat image, it is significantly different: Snapchat's text on top of the image is much sharper than mine, which looks very pixelated. I am not compressing the image at all, just saving it as is using ALAssetsLibrary.
Thank You
When you use UIGraphicsBeginImageContext, it defaults to a 1x scale (i.e. non-retina resolution). You probably want:
UIGraphicsBeginImageContextWithOptions(imageView.layer.bounds.size, YES, 0);
Which will use the same scale as the screen (probably 2x). The final parameter is the scale of the resulting image; 0 means "whatever the screen is".
If your imageView is scaled to the size of the screen, then I think your jpeg will also be limited to that resolution. If setting the scale on UIGraphicsBeginImageContextWithOptions does not give you enough resolution, you can do your drawing in a larger offscreen image. Something like:
// Sketch: imageSize, image, scale, textOverlay and newImage are placeholders from the explanation above.
UIGraphicsBeginImageContext(imageSize);
[image drawInRect:CGRectMake(0, 0, imageSize.width, imageSize.height)];
// Scale the context so the screen-sized overlay fills the larger image,
// e.g. scale = imageSize.width / textOverlay.bounds.size.width (see below).
CGContextScaleCTM(UIGraphicsGetCurrentContext(), scale, scale);
[textOverlay.layer renderInContext:UIGraphicsGetCurrentContext()];
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
You need to set the "scale" value to scale the textOverlay view, which is probably at screen size, to the offscreen image size.
Alternatively, probably simpler, you can start with a larger UIImageView, but put it within another UIView to scale it to fit on screen. Do the same with your text overlay view. Then, your code for creating composite should work, at whatever resolution you choose for the UIImageView.
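A rough sketch of that alternative (all names and sizes are hypothetical): compose at the photo's full resolution, scale the composition down only for display, and render the full-size view when saving:
// Compose at full resolution.
UIImageView *composeView = [[UIImageView alloc] initWithImage:photo]; // photo: hypothetical full-size UIImage
composeView.frame = CGRectMake(0, 0, photo.size.width, photo.size.height);
[composeView addSubview:textLabel]; // text overlay laid out in full-resolution coordinates

// Scale it down inside a container purely for on-screen display.
UIView *container = [[UIView alloc] initWithFrame:self.view.bounds];
CGFloat fit = container.bounds.size.width / composeView.bounds.size.width;
composeView.transform = CGAffineTransformMakeScale(fit, fit);
[container addSubview:composeView];
[self.view addSubview:container];

// When saving, render the full-resolution view; its bounds (and the render) ignore the display transform.
UIGraphicsBeginImageContextWithOptions(composeView.bounds.size, NO, 1.0f);
[composeView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *savedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();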

iOS image sizes for iPad and iPhone

I have developed a small iOS app with an image named bg.png, which is sized
1024 x 768 for iPad.
I have many images that were created at iPad size. Now I need to support iPhone in this app. Do I need to create the same set of images again at iPhone size,
568 x 300 for iPhone,
or is there another way to do this?
Scaling down the iPad image assets will hurt the UX on iPhone. Also, images like the icon and splash screen usually contain the company logo, and scaling them down will spoil the look of the logo and the overall image. The better way is to create separate images for the iPhone form factor. Compress the PNG files using http://tinypng.org/ to keep the binary size low.
Cheers! Amar.
You can resize the image with the following code:
CGSize newSize = CGSizeMake(568, 300);
UIGraphicsBeginImageContext(newSize);
[yourIpadImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newIphoneImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
//UIGraphicsBeginImageContext(newSize);
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
This gives you the option to change the size of your image.
sathiamoorthy's solution is a difficult way of rescaling your image. You can do it simply by creating a UIImageView, initializing it with a UIImage, and then changing its frame, as in the sketch below.
Note that your image will look scaled/distorted that way.
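A minimal sketch of that (ipadImage is a hypothetical 1024x768 UIImage):
UIImageView *imageView = [[UIImageView alloc] initWithImage:ipadImage];
imageView.frame = CGRectMake(0, 0, 568, 300); // iPhone-sized frame; only the view is scaled, not the image data
imageView.contentMode = UIViewContentModeScaleToFill; // filling a different aspect ratio is what distorts the image
[self.view addSubview:imageView];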
Follow this:
Open the image in Preview.
Go to Tools > Adjust Size.
Put in whatever size you want.
Save the image under a different name.
Yes,
you should create duplicates and resize them for iPhone. Using the same images on iPhone will cause memory issues, because the images are unnecessarily big for iPhone.
Use any software to resize them, or you can do this with Preview as Nikita described above.
If you are doing this to create a universal app, you should suffix the iPad image file names with the ~ipad device modifier.
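For illustration (hypothetical file names; the scale modifier goes before the device modifier, and imageNamed: picks the right file at runtime):
// bg~iphone.png, bg@2x~iphone.png, bg~ipad.png, bg@2x~ipad.png
UIImage *bg = [UIImage imageNamed:@"bg"];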
Please visit this link; it may help you solve your issue.
It has some tips, such as:
proportional scaling,
resizing.
If you want your images to show up unscaled, you are going to need an additional image with the correct size.
So supporting iPad both with and without a retina screen would require one image of 768 x 1024 and one of 1536 x 2048. For the 3.5" iPhone you would need 960 x 640 for a retina screen or 480 x 320 for non-retina. For the iPhone 5 (4" screen, retina only) you would need 1136 x 640 (568 x 320 points).
If you use UIImage's imageNamed: method, there is help from Apple: on retina devices that method looks for the image you specified with the @2x suffix. So you can simply code:
UIImage *myImage = [UIImage imageNamed:@"myImage"];
If you make sure your project contains myImage.png for non-retina devices and myImage@2x.png for retina devices, the right image gets loaded at runtime.

UIImageView content mode and scale factor

I have a programmatically created UIImage, using this kind of code:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(128, 128), NO, 0.0f);
// Render in context
UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Since my context options specify a scale of 0, on retina devices it will be set to 2, and I can confirm that via the resulting UIImage's scale property.
Now, the problem is that this image's size is 128x128 at scale 2. When I put it into a UIImageView of size 64x64 with contentMode = Center, it renders my image outside the image view, presumably drawing into a 128x128 box without any scaling.
My understanding of retina graphics was that if an image has a scale factor of 2.0, then it should be rendered at half size, thus resulting in higher DPI.
So I was expecting the image view to render 64x64 image at retina quality. Where am I wrong?
The image will be rendered at the size you give it - 128 x 128. The scale factor means that you will have better rendered curves etc, but the image will still be 128 x 128 points. As stated in the documentation, the size parameter is:
The size (measured in points) of the new bitmap context. This represents the size of the image returned by the UIGraphicsGetImageFromCurrentImageContext function.
If you want a retina-quality 64x64 image, use a 64x64 size for your context.
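A minimal sketch of that fix (the drawing itself omitted):
UIGraphicsBeginImageContextWithOptions(CGSizeMake(64, 64), NO, 0.0f); // 64x64 points at screen scale
// ... render here in a 64x64-point coordinate space ...
UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext(); // reports size 64x64, scale 2 on a retina device
UIGraphicsEndImageContext();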
