bezierPathWithRoundedRect: gives bad result on Retina screen - iOS

I used a method to get rounded pictures in my iOS app, which works perfectly fine on the iPhone 3. My problem is that as soon as I try it on the iPhone 4 or above, the pictures come out at poor quality.
Is there any way I can rework my code to get high-resolution rounded pictures?
- (void)setRoundedView:(UIImageView *)imageView picture:(UIImage *)picture toDiameter:(float)newSize {
    UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, NO, 1.0);
    [[UIBezierPath bezierPathWithRoundedRect:imageView.bounds
                                cornerRadius:100.0] addClip];
    CGRect frame = imageView.bounds;
    frame.size.width = newSize;
    frame.size.height = newSize;
    [picture drawInRect:frame];
    imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
Many thanks for your help!

The issue is how you're defining your image context. You are specifying a scale of 1.0, but on Retina screens that should be 2.0.
Instead, you can pass 0.0 to default to the device's native scale:
UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, NO, 0.0);
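For reference, here is the method from the question with only that argument changed (a sketch; nothing else needs to move):
- (void)setRoundedView:(UIImageView *)imageView picture:(UIImage *)picture toDiameter:(float)newSize {
    // 0.0 makes UIKit use the device's main screen scale (1.0 on non-Retina, 2.0 on Retina)
    UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, NO, 0.0);
    [[UIBezierPath bezierPathWithRoundedRect:imageView.bounds
                                cornerRadius:100.0] addClip];
    CGRect frame = imageView.bounds;
    frame.size.width = newSize;
    frame.size.height = newSize;
    [picture drawInRect:frame];
    imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}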

Related

PNG image is not displaying after "reDraw"

I am using custom PNG images for the tabBarItem items of my UITabBarController.
But my PNG images are too big (64x64), so I use the method below to redraw each image in a smaller rect (for example, with the size parameter set to (25, 25)).
- (UIImage *)getSmallImage:(UIImage *)image inSize:(CGSize)size
{
    CGSize originalImageSize = image.size;
    CGRect newRect = CGRectMake(0, 0, size.width, size.height);
    float ratio = MAX(newRect.size.width / originalImageSize.width,
                      newRect.size.height / originalImageSize.height);
    UIGraphicsBeginImageContextWithOptions(newRect.size, NO, 0.0);
    UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:newRect cornerRadius:5.0];
    [path addClip];
    CGRect projectRect;
    projectRect.size.width = ratio * originalImageSize.width;
    projectRect.size.height = ratio * originalImageSize.height;
    //projectRect.origin.x = (newRect.size.width - projectRect.size.width) / 2.0;
    //projectRect.origin.y = (newRect.size.height - projectRect.size.height) / 2.0;
    // Draw the image on it
    [image drawInRect:projectRect];
    // Get the image from the image context
    UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
    // Cleanup image context resources
    UIGraphicsEndImageContext();
    return smallImage;
}
Every image I use is returned by this method. Everything was fine on the simulators, but those images were not displayed when I tested on my iPhone.
But if I abandon the method above and assign the image directly, like this: self.tabBarItem.image = [UIImage imageNamed:@"Input"]; then the images are shown correctly on my phone, only too big.
How can I fix this problem?
I'll answer this question myself.
After hours of debugging, here is the problem:
In the method given above, the origin property of the CGRect projectRect was never set.
After I set both origin.x and origin.y to 0, everything worked out.
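In code, the fix is just two assignments before the drawInRect: call (a sketch of the change described above):
projectRect.origin.x = 0;
projectRect.origin.y = 0;
// or, to center the drawn image inside newRect, use the commented-out lines from the question:
// projectRect.origin.x = (newRect.size.width - projectRect.size.width) / 2.0;
// projectRect.origin.y = (newRect.size.height - projectRect.size.height) / 2.0;
[image drawInRect:projectRect];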
Tip: every time you meet a WTF problem, be patient and test your code in different ways, because in 99.9% of such cases there is something wrong with your code instead of a bug in Xcode.
Though I still don't know why the code in my question works well on the simulators, I'll let it go; I guess that someday, when I become an expert, this kind of question will seem as easy as it is silly.

UIImage doesn't want to fit in my Slide Show

In order to make a slide show, I installed the KASlideShow pod, which is perfect. On the iPhone 5S, my slideshow took the full width (what I wanted), but when I test the app on the iPhone 6, it takes only 3/4 of the screen.
I tried to scale the UIImage using this code:
CGRect rect = CGRectMake(0, 0, width, height);
UIGraphicsBeginImageContext(rect.size);
[originialImage drawInRect:rect];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(finalImage);
UIImage *img = [UIImage imageWithData:imageData];
But it only scales to 3/4, even if I pass enormous values; it's as if there were a limit.
I tried the UIViewContentMode.ScaleAspectFill content mode, but it doesn't do anything.
I don't have any more ideas to try :/
Thanks for your help.

Place a full-sized image to fit the entire screen in a CALayer

I have an image (PNG) which must fill the entire screen of my app. I'm using CALayers and doing everything programmatically. This sounds like something that should be trivial, but I still can't get it to work. I have two versions of the image: a Retina version (2048px x 1536px) and a non-Retina version (1024px x 768px). The image is listed as a universal image in the asset catalogue.
The code is simple enough, I think:
// CREATE FULL SCREEN CALAYER
CALayer *myLayer = [[CALayer alloc] init];
[myLayer setBounds:CGRectMake(0, 0, bounds.size.width, bounds.size.height)];
[myLayer setPosition:CGPointMake(bounds.size.width/2, bounds.size.height/2)];
[self.view.layer addSublayer:myLayer];
// LOAD THE IMAGE INTO THE LAYER —— AM EXPECTING IT TO FILL THE LAYER
UIImage *layerImage = [UIImage imageNamed:@"infoScreen"];
CGImageRef image = [layerImage CGImage];
[myLayer setContents:(__bridge id)image];
[myLayer setContentsGravity:kCAGravityCenter]; /* IT WORKS FINE IF I USE setContentsGravity:kCAGravityResizeAspectFill */
This code works fine on a non-Retina iPad. However, on the Retina iPad, the image is always loaded at twice the actual size (so it appears zoomed in). I'm using the Simulator and iOS 8. What am I doing wrong?
Begin your image processing with
func UIGraphicsBeginImageContextWithOptions(size: CGSize, opaque: Bool, scale: CGFloat)
The last parameter of the above function determines the scaling of the graphics. You can set this value by retrieving the scale property of the main screen. In Swift I would do it this way:
var screen = UIScreen.mainScreen()
var scale = screen.scale
Hope it helps.
Edit: code for doing this in Swift; you can modify it to suit your needs.
UIGraphicsBeginImageContextWithOptions(rect.size, true, 0.0)
var ctx : CGContextRef = UIGraphicsGetCurrentContext()
yourImage.drawInRect(rect) // 'yourImage' stands for whatever UIImage you are rendering
var result = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
I had this same problem; it was solved by setting the contentsScale value on the CALayer. For some reason the default scale on CALayers is always 1.0, even on Retina devices.
i.e.
layer.contentsScale = [UIScreen mainScreen].scale;
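Applied to the layer from the question above, that is one extra line before the contents are assigned (a sketch reusing the question's myLayer and image variables):
myLayer.contentsScale = [UIScreen mainScreen].scale;
[myLayer setContents:(__bridge id)image];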
Also, if you're drawing a shape using CAShapeLayer and wondering why its edges look a little jagged on Retina devices, try:
shapeLayer.rasterizationScale = [UIScreen mainScreen].scale;
shapeLayer.shouldRasterize = YES;

iOS: renderInContext and Landscape orientation issue

I'm trying to save the currently shown views on my iOS device for a certain app, and this works properly. But I've got a problem as soon as I try to save a UIImageView in landscape orientation.
See the following image that describes my problem:
I'm using Auto Layout for this app, and it runs on both iPhone and iPad. It seems like the image view is always saved as if it were in portrait mode, and I'm a little bit stuck right now.
This is the code I use:
CGSize frameSize = self.view.frame.size;
if (UIInterfaceOrientationIsLandscape(self.interfaceOrientation)) {
    frameSize = CGSizeMake(self.view.frame.size.height, self.view.frame.size.width);
}
UIGraphicsBeginImageContextWithOptions(frameSize, NO, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGFloat scale = CGRectGetWidth(self.view.frame) / CGRectGetWidth(self.view.bounds);
CGContextScaleCTM(ctx, scale, scale);
[self.view.layer renderInContext:ctx];
[self.delegate photoSaved:UIGraphicsGetImageFromCurrentImageContext()];
UIGraphicsEndImageContext();
Looking forward to your help!
I still have no idea what your exact issue is, but using your screenshot code produces a slightly strange image (not rotated or anything, just too small). Can you try this code instead, please:
+ (UIImage *)imageFromView:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, .0f);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
Other than that, you must understand there is a big difference between UIImage and CGImage: a UIImage includes the orientation while a CGImage does not. Image transformations usually work on the CGImage, and reading its width or height discards the orientation, which means a CGImage has flipped dimensions when its orientation is not up (UIImageOrientationUp). Usually, when dealing with such images, you create a CGImage from the context and then use [UIImage imageWithCGImage:ref scale:1.0f orientation:originalOrientation]. Only if you wish to explicitly rotate the image so it has no orientation (that is, UIImageOrientationUp) do you need to rotate and translate the image and draw it onto the context.
Anyway, these orientation issues are largely fixed by now: UIImagePNGRepresentation respects the orientation, and the UIImage constructor from a CGImage mentioned above is what used to be missing in the past, if I remember correctly.
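A minimal sketch of that constructor, assuming sourceImage is the original UIImage and newRef is a CGImage produced from a bitmap context (both names are placeholders):
// Rebuild a UIImage from a CGImage while preserving the original scale and orientation
UIImage *rebuilt = [UIImage imageWithCGImage:newRef
                                       scale:sourceImage.scale
                                 orientation:sourceImage.imageOrientation];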

How to save UIView content (screenshot) without reducing its quality

I have a UIView and I want to save its content to an image. I did that successfully using UIGraphicsBeginImageContext and UIGraphicsGetImageFromCurrentImageContext(), but the problem is that the image quality is reduced. Is there a way to take a screenshot / save the UIView content to an image without reducing its quality?
Here's a snippet of my code:
UIGraphicsBeginImageContext(self.myView.frame.size);
[self.myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Try the following code:
UIGraphicsBeginImageContextWithOptions(YourView.bounds.size, NO, 0);
[YourView drawViewHierarchyInRect:YourView.bounds afterScreenUpdates:YES];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This code works awesome...!!!
Mital Solanki’s answer is a fine solution, but the root of the problem is that your view is on a Retina screen (so it has a scale factor of 2) and you are creating a graphics context with a scale factor of 1. The documentation for UIGraphicsBeginImageContext states:
This function is equivalent to calling the UIGraphicsBeginImageContextWithOptions function with the opaque parameter set to NO and a scale factor of 1.0.
Instead use UIGraphicsBeginImageContextWithOptions with a scale of 0, which is equivalent to passing a scale of [[UIScreen mainScreen] scale].
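In other words (a sketch, where size stands for whatever size you are capturing), the two calls below create an identically scaled context on a Retina device:
// Passing 0.0 lets UIKit pick the main screen's scale...
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
// ...which is the same as writing the screen scale explicitly:
// UIGraphicsBeginImageContextWithOptions(size, NO, [[UIScreen mainScreen] scale]);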
