Using images in UIView as background, retina screen - iOS

I want to set an image as the background of a custom UIView. I created the image in Photoshop at 100 by 100 pixels.
Then I did:
self.customView.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"Image.png"]];
And it worked just fine, except that I could see the pixels on my Retina display. Don't get me wrong, I saw that coming, because I set the image in Photoshop to 100x100 and the UIView to 100x100. Since the view's size in Xcode is in points rather than pixels, the image is displayed at double size on a Retina screen.
But how can I create an image, say 200x200, and then set it as the background of the 100x100 view so it is displayed at native resolution? I searched for something like a "scale to fit" function for the UIView's background pattern, but couldn't find anything.
So my question is, how do I provide an image as a "Retina @2x" image?
And how do I get it into my UIView?
Thanks in advance!

Here's your answer.
Basically, put two images in your project: "Image.png" and "Image@2x.png". The call [UIImage imageNamed:@"Image.png"] will pull the correct image based on the screen hardware.
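As a minimal sketch (the file and property names below are just placeholders reused from the question), the setup would look something like this; imageNamed: resolves the @2x variant on Retina screens automatically:

// Files in the app bundle:
//   Image.png     - 100x100 px, used on non-Retina screens
//   Image@2x.png  - 200x200 px, picked automatically on Retina screens

UIImage *patternImage = [UIImage imageNamed:@"Image"];   // extension is optional
NSLog(@"screen scale: %f, image scale: %f",
      [UIScreen mainScreen].scale, patternImage.scale);  // sanity check: both should be 2.0 on Retina
self.customView.backgroundColor = [UIColor colorWithPatternImage:patternImage];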

Related

Rendering a full sized image with UIGraphicsImageRenderer

Images taken with the device camera are larger than the screen of our iOS device.
Usually when we render an image in iOS, I see people setting their rendering size to the size of the main view or their image view,
as shown in this Stack Overflow answer.
Doing so makes the image smaller and lowers its quality.
If the logic follows, would it be possible to render the image at its full size by making the image view go beyond the device screen? In other words, the image view would be set to the size of the original, full-sized image.
Would doing this generate a rendered image with the same pixel size as the original image?
What have I done so far?
I have tested other image overlaying/editing apps and they seem to reduce the image's pixel size.
I used the methods provided in the link above and they also do the same.
Yes, it's definitely possible by embedding a UIImageView in a UIScrollView and using sizeToFit:
imageView.image = image;
[imageView sizeToFit];               // size the image view to the image's full (point) size
scrollView.contentSize = image.size; // let the scroll view pan across the whole image
You will then be able to pan across the image within the scroll view.
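If the goal is a rendered bitmap with the same pixel dimensions as the source photo, here is a minimal sketch with UIGraphicsImageRenderer, assuming the source is in a variable named image and any overlay drawing happens inside the block (names are placeholders, not from the question):

UIGraphicsImageRendererFormat *format = [UIGraphicsImageRendererFormat defaultFormat];
format.scale = image.scale; // keep the source's pixel density instead of the screen's

UIGraphicsImageRenderer *renderer =
    [[UIGraphicsImageRenderer alloc] initWithSize:image.size format:format];

UIImage *rendered = [renderer imageWithActions:^(UIGraphicsImageRendererContext *ctx) {
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    // ... draw overlays here at full resolution ...
}];

Because the renderer's size and scale match the source image, the output keeps the original pixel dimensions instead of being downsampled to the screen or image view size.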

Make UIButton image appear crisp-perfect on retina display

So I've done lots of reading on how to achieve perfect image quality on UIButton's .imageView property in iOS. Here's my example ->
I've got a UIButton at 24x24 points, as per the following lines:
myButton = [UIButton buttonWithType:UIButtonTypeCustom];
myButton.frame = CGRectMake(82, 8, 24, 24);
myButton.contentMode = UIViewContentModeCenter;
buttonImage = [UIImage imageNamed:@"myImage.png"];
ogImage = [buttonImage imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
I then have the original image sized at 46x46 pixels (twice 23x23; the button is 24x24 for an even size to prevent iOS auto-aliasing), and the @2x image at 92x92 pixels. I'm testing on an iPhone 6s (obviously a Retina display) and am still seeing some jaggedness on my UIButton's image. What am I doing wrong here? Am I still not understanding how to achieve perfect Retina quality?
Here's an image; I'm hoping it displays well as an example:
Not sure if this is going to help, but this is my personal preference. If, for example, I have a UIButton of size 24x24, I always generate 24x24, 48x48, and 72x72 images. I rely on Adobe Illustrator to create pixel-perfect images. Always check your images in pixel preview mode and make sure the edges are aligned with the pixels. If not, it can produce the artifacts that you see in Xcode.
If it's what you want in Illustrator, then it's what you get in Xcode.
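A minimal sketch of wiring those assets up (the asset names are placeholders, and the setImage:forState: call is standard UIKit rather than something from the answer above):

// Assets (placeholder names):
//   myImage.png     - 24x24 px  (1x)
//   myImage@2x.png  - 48x48 px  (2x)
//   myImage@3x.png  - 72x72 px  (3x)

myButton = [UIButton buttonWithType:UIButtonTypeCustom];
myButton.frame = CGRectMake(82, 8, 24, 24);
myButton.contentMode = UIViewContentModeCenter;

UIImage *buttonImage = [[UIImage imageNamed:@"myImage"]
    imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal];
[myButton setImage:buttonImage forState:UIControlStateNormal];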

Using a high-res image for a custom swipe mechanism in a UITableViewCell

I have a UITableViewController where the UITableViewCell is dynamic and inherits from a custom UITableViewCell class. I have introduced a swipe gesture on the cell to favourite it, but I'm having a few issues with the image that shows when the cell is swiped.
This is what it currently looks like:
On anything but the iPhone 5s, it looks pixelated and terrible. The image is currently a square image of size 50x50.
If I upped the size to 200x200, I get:
This looks less pixelated, but also at the same time looks absolutely terrible.
In my cellForRow, this is the code I am using to apply the image:
[self.rightUtilityButton sw_addUtilityButtonWithColor:[UIColor orangeColor]
                                                 icon:[UIImage imageNamed:@"Star.png"]];
I am using the SWTableViewCell open source code (https://github.com/CEWendel/SWTableViewCell) as my reference for this, but I can't find in the code where this size is actually set on the UITableViewCell custom image and I can't get in touch with the author.
Issues
I basically want to use the 200x200 image, but in the frame size of the 50x50 image. How can I force this? Perhaps I can put the 200x200 image into another UIImageView with a clear background? If that is the way, I'm not sure how to achieve it. Any guidance would be appreciated.
You should include multiple resolutions of the same image and use the @nx naming scheme.
For example: you have set the imageView to 50 x 50 points in Interface Builder or in code.
For that imageView, you should provide the following images:
star.png - 50x50 px (for screens with scale 1, such as the iPhone 3GS and iPad 2)
star@2x.png - 100x100 px (for screens with scale 2, such as the iPhone 4s through iPhone 6)
star@3x.png - 150x150 px (for screens with scale 3, such as the iPhone 6 Plus)
Then, simply set the image for the imageView as:
[imageView setImage:[UIImage imageNamed:@"star"]]; // don't use the extension
At this point, the app will automatically pick up the best image based on the current screen and show the right image.
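As an aside (this is not part of the answer above), if you would rather ship a single 200x200 px file and show it in the 50x50 pt frame, one possible sketch is to wrap the bitmap in a UIImage with an explicit scale; UIKit then downsamples it to the screen's scale when drawing:

// Hypothetical: treat a 200x200 px bitmap as a 50x50 pt image (scale 4)
UIImage *big = [UIImage imageNamed:@"Star"];                 // 200x200 px asset
UIImage *sized = [UIImage imageWithCGImage:big.CGImage
                                     scale:4.0               // 200 px / 50 pt
                               orientation:big.imageOrientation];
[self.rightUtilityButton sw_addUtilityButtonWithColor:[UIColor orangeColor]
                                                 icon:sized];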

iOS UIImageView - Top of Image Chopped Off?

I'm new to iOS development, and I'm using a UIImageView to display an image. I've made a 320x480 and a 640x960 image called "red.png" and "red@2x.png".
No matter how I scale or align the UIImageView, the image always gets chopped off halfway at the top.
Is there something I'm meant to do to combat this, as I thought those resolutions were correct?
The UIImageView is sized at 320x568 to fill the storyboard.
Thanks :)
My comment above is hard to read, so here it is as an answer:
myImageView.contentMode = UIViewContentModeScaleAspectFit;
Try changing the image name as follows:
self.colourImage.image = [UIImage imageNamed:@"red.png"];
You should not directly specify '@2x' in image names.
It is used automatically if a device has a Retina display.
That is why you need to create two sets of images, one for non-Retina and one for Retina (using @2x), and the OS selects the correct one for you.
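A minimal sketch combining both answers above (colourImage is the outlet from the question):

// red.png (320x480 px) and red@2x.png (640x960 px) both live in the bundle;
// imageNamed: picks the right one for the screen's scale.
self.colourImage.image = [UIImage imageNamed:@"red"];
self.colourImage.contentMode = UIViewContentModeScaleAspectFit; // show the whole image instead of cropping the top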

UIImage rendering not clear on iPad Mini

The following code block is used in my application to take a screenshot of the current screen of an iPad mini (768 x 1024):
UIImage *img;
UIGraphicsBeginImageContext(self.view.bounds.size);              // bitmap context at the view's point size
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()]; // draw the layer tree into the context
img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
In a different view controller, I present a UIScrollView with a width of 540 and a height of 290. I display the screen-capture UIImage in a UIImageView which I create programmatically with initWithFrame: using a rectangle 250 wide and 250 high. The content size of the scroll view is 768 by 250.
Now, running the application, I display four rectangles and screenshot the screen using the above block of code. Transitioning to the UIScrollView, the image is not clear (and by not clear, I mean some rectangles are missing sides while others are thicker). Is there a way to display the image more clearly? I know the image has to be scaled down from the original 768 by 1024 to 250 by 250. Could this be the problem? If so, what would be the best fix?
Edit:
Above is a screenshot of the image I want to capture.
Below is the UIImage in UIImageView within a UIScrollView:
Cast each coordinate to an int, or use CGRectIntegral to do that directly on a CGRect; fractional coordinates require anti-aliasing and make images blurry.
Try changing the content mode of your UIImageViews. If you use UIViewContentModeScaleAspectFill, you shouldn't see any extra space around the edges.
Update: From the screenshots you posted, it looks like this is just an effect of the built-in downscaling in UIKit. Try manually downscaling the image to fit using Core Graphics first. Alternatively, you might want to use something like the CILanczosScaleTransform Core Image filter (iOS 6+).
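A minimal sketch of the manual downscale, assuming the screenshot is in a variable named original and 250x250 is the target size (UIGraphicsBeginImageContextWithOptions wraps a Core Graphics bitmap context):

CGSize targetSize = CGSizeMake(250.0, 250.0);
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0); // 0.0 = use the screen's scale
[original drawInRect:CGRectMake(0.0, 0.0, targetSize.width, targetSize.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

You would then hand the pre-scaled image to the UIImageView instead of letting UIKit shrink the full 768x1024 capture on the fly.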
