I have some code where I'm grabbing an image of the screen, and then cropping it based on some boundary values:
UIGraphicsBeginImageContextWithOptions(self.mainView.bounds.size, NO, 0.0);
[self.mainView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *comicImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIGraphicsBeginImageContextWithOptions(CGSizeMake(boundary.width, boundary.height), NO, 0.0);
[comicImage drawAtPoint:CGPointMake(-boundary.xMin, -boundary.yMin)];
comicImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
What I'm wondering is, am I producing a poor quality image with low resolution using the above method? Is there a better way to go about doing this?
Maybe you can use the following code:
UIGraphicsBeginImageContextWithOptions(size, NO, [UIScreen mainScreen].scale);
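Expanded into the question's capture-and-crop flow, that might look like the sketch below (mainView and boundary are the question's own names; note that a scale of 0.0, as in the question, is documented to behave the same as passing the main screen's scale explicitly):
CGFloat screenScale = [UIScreen mainScreen].scale;
// Render the full view at the screen's scale so retina pixels are preserved.
UIGraphicsBeginImageContextWithOptions(self.mainView.bounds.size, NO, screenScale);
[self.mainView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *comicImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Crop by drawing the full image, offset, into a smaller context at the same scale.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(boundary.width, boundary.height), NO, screenScale);
[comicImage drawAtPoint:CGPointMake(-boundary.xMin, -boundary.yMin)];
comicImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();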
Related
In my application, I need to be able to share a screenshot of a UIView. This is the code I am using to take the screenshot:
CGRect viewFrame = [view bounds];
UIGraphicsBeginImageContextWithOptions(viewFrame.size, YES, 0.0);
CGContextRef context = UIGraphicsGetCurrentContext();
[view.layer renderInContext:context];
UIImage *viewScreenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
I'm not satisfied with the quality of the image produced. Here is an example. I'm not fond of the noise/blurriness around the text and the image.
Can anyone tell me how to improve the quality, please? Or is this just how it is?
The code fragment you provided is correct, and it will produce a perfect image without artifacts. However, you have to use a lossless format when saving it to a file; the blurriness you see in your image is caused by lossy compression.
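For example, assuming the screenshot from your snippet is in viewScreenshot, PNG keeps it lossless while JPEG introduces exactly this kind of artifact (the file name here is just a placeholder):
NSString *pngPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"screenshot.png"];
// Lossless: every pixel of the rendered screenshot is preserved.
NSData *pngData = UIImagePNGRepresentation(viewScreenshot);
[pngData writeToFile:pngPath atomically:YES];
// Lossy: JPEG compression is what produces the noise/blur around text.
NSData *jpegData = UIImageJPEGRepresentation(viewScreenshot, 0.8);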
I agree with the other answers. Just a quick note: in iOS 7+ you can use the following code instead of yours; it is much faster:
UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0f);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIImage * snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return snapshotImage;
In order to make a slideshow, I installed the KASlideShow pod, which is perfect. On the iPhone 5S, my slideshow took the full width (what I wanted), but when I test the app on the iPhone 6, it takes up only 3/4 of the screen.
I tried to scale the UIImage using this method:
CGRect rect = CGRectMake(0, 0, width, height);
UIGraphicsBeginImageContext(rect.size);
[originalImage drawInRect:rect];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(finalImage);
UIImage *img = [UIImage imageWithData:imageData];
But it only scales to 3/4, even if I pass enormous values. It's as if there were a limit.
I tried using the UIViewContentMode.ScaleAspectFill attribute, but it doesn't do anything.
I don't have any more ideas to try :/
Thanks for your help.
I have a UIView and I want to save its content to an image. I successfully did that using UIGraphicsBeginImageContext and UIGraphicsGetImageFromCurrentImageContext(), but the problem is that the image quality is reduced. Is there a way to take a screenshot / save a UIView's content to an image without reducing its quality?
Here's a snippet of my code:
UIGraphicsBeginImageContext(self.myView.frame.size);
[self.myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Try the following code:
UIGraphicsBeginImageContextWithOptions(YourView.bounds.size, NO, 0);
[YourView drawViewHierarchyInRect:YourView.bounds afterScreenUpdates:YES];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This code works great.
Mital Solanki’s answer is a fine solution, but the root of the problem is that your view is on a retina screen (so has a scale factor of 2) and you are creating a graphics context with a scale factor of 1. The documentation for UIGraphicsBeginImageContext states:
This function is equivalent to calling the UIGraphicsBeginImageContextWithOptions function with the opaque parameter set to NO and a scale factor of 1.0.
Instead use UIGraphicsBeginImageContextWithOptions with a scale of 0, which is equivalent to passing a scale of [[UIScreen mainScreen] scale].
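Applied to your snippet, a minimal sketch (keeping your myView property name) would be:
// A scale of 0.0 means "use the screen's scale", so the snapshot keeps its retina resolution.
UIGraphicsBeginImageContextWithOptions(self.myView.bounds.size, NO, 0.0);
[self.myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();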
I would like to take an image and duplicate it. Then increase it by 105% and overlay it on the original image.
What is the correct way to do this on iOS?
This is your basic code for drawing the image and then saving it as an image again:
- (UIImage *)renderImage:(UIImage *)image atSize:(CGSize)size
{
UIGraphicsBeginImageContext(size);
[image drawInRect:CGRectMake(0.0, 0.0, size.width, size.height)];
// draw anything else into the context
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
Where it says "draw anything else into the context" you can draw the image at a reduced size by setting the appropriate rect to draw in. Then, call the renderImage method with whatever size you want the full image to be. You can use CGContextSetAlpha to set the transparency.
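As a rough sketch of the 105% overlay described in the question, following the answer's ordering (enlarged copy underneath, original drawn on top; swap the two draws if you want the enlarged copy on top). The helper name overlayScaledCopyOfImage: and the 0.5 alpha are just illustrative assumptions:
- (UIImage *)overlayScaledCopyOfImage:(UIImage *)image
{
    CGSize originalSize = image.size;
    CGSize enlargedSize = CGSizeMake(originalSize.width * 1.05, originalSize.height * 1.05);

    UIGraphicsBeginImageContextWithOptions(enlargedSize, NO, 0.0);

    // The enlarged (105%) copy fills the whole context.
    [image drawInRect:CGRectMake(0.0, 0.0, enlargedSize.width, enlargedSize.height)];

    // The original, centred, is drawn on top with some transparency.
    CGContextSetAlpha(UIGraphicsGetCurrentContext(), 0.5);
    [image drawInRect:CGRectMake((enlargedSize.width - originalSize.width) / 2.0,
                                 (enlargedSize.height - originalSize.height) / 2.0,
                                 originalSize.width,
                                 originalSize.height)];

    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}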
As the title says, I need to take a cropped snapshot of my app.
I want to cut a little bit (20%) off the top of the screenshot. I already have code that takes a snapshot and sends it to Facebook, and it works, but it captures the whole screen. How can I tell my code to ignore the top 20% of the screen, maybe using the width and height? I also looked at some questions on Stack Overflow and managed to slide my screenshot so I got rid of the unwanted part at the top, but then a huge white area appeared at the bottom, so that didn't solve my problem.
Here is my snapshot code:
UIGraphicsBeginImageContext(self.ekran.bounds.size);
[self.ekran.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Here is a method to crop the image that accepts any frame to crop against:
- (UIImage *)cropImage:(UIImage *)imageToCrop toRect:(CGRect)rect
{
CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return cropped;
}
Use it as follows:
UIGraphicsBeginImageContext(self.ekran.bounds.size);
[self.ekran.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGFloat imgHeight = resultingImage.size.height;
CGFloat imgWidth = resultingImage.size.width;
// Create a frame that keeps the bottom 80% of the image (i.e. crops off the top 20%)
CGRect imageFrame = CGRectMake(0, imgHeight * 0.2, imgWidth, imgHeight * 0.8);
resultingImage = [self cropImage:resultingImage toRect:imageFrame];
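One caveat: CGImageCreateWithImageInRect works in the image's pixel coordinates. With the scale-1 context above, points and pixels match, but if the snapshot were taken with UIGraphicsBeginImageContextWithOptions and a scale of 0 (as suggested in the earlier answers), the crop rect would need to be multiplied by the image's scale first, roughly like this:
// Only needed when the snapshot was rendered at screen scale:
CGFloat scale = resultingImage.scale;
CGRect pixelFrame = CGRectMake(imageFrame.origin.x * scale,
                               imageFrame.origin.y * scale,
                               imageFrame.size.width * scale,
                               imageFrame.size.height * scale);
resultingImage = [self cropImage:resultingImage toRect:pixelFrame];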