iOS screenshot warning

I need to take a screenshot of some charts in my app.
I'm using the following code:
CGRect screenRect = [[UIScreen mainScreen] bounds];
UIGraphicsBeginImageContext(screenRect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[[UIColor blackColor] set];
CGContextFillRect(ctx, screenRect);
[self.view.layer renderInContext:ctx];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But on [self.view.layer renderInContext:ctx]; I get the warning: Instance method '-renderInContext:' not found (return type defaults to 'id').
So, what am I missing to avoid this warning and successfully take my screenshot?
Thanks a lot!

You need to #import <QuartzCore/QuartzCore.h>
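renderInContext: is declared on CALayer in the QuartzCore framework, so the import is what silences the warning. For reference, a minimal sketch of the same capture with the import in place (the method name captureScreen is just an illustration):

#import <QuartzCore/QuartzCore.h>

- (UIImage *)captureScreen
{
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    UIGraphicsBeginImageContext(screenRect.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Fill with black first so any transparent areas come out black
    [[UIColor blackColor] set];
    CGContextFillRect(ctx, screenRect);
    // renderInContext: is declared in QuartzCore; without the import the
    // compiler only sees an unknown selector and emits the warning
    [self.view.layer renderInContext:ctx];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}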

Related

UIGraphicsBeginImageContextWithOptions causes black edges

UIGraphicsBeginImageContextWithOptions(size, YES, [UIScreen mainScreen].scale);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSaveGState(context);
[[UIColor whiteColor] set];
CGContextFillRect(context, CGRectMake(0, 0, size.width, size.height));
CGContextRestoreGState(context);
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
I am using the code above to generate an image. When I call UIGraphicsBeginImageContextWithOptions(size, YES, [UIScreen mainScreen].scale) with the opaque parameter set to YES, I get an image with a black edge on the right and bottom.
I have no idea why the black edge appears. Why does this happen, and how can I avoid it?
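For what it's worth, one common cause (an assumption here, since the question doesn't show how size is computed) is a non-integral size: an opaque context rounds its pixel dimensions up, and the sliver the white fill never covers stays black. A minimal sketch that rounds the size up to whole points before creating the context:

// Assumption: size may have fractional values; round it up so the
// white fill covers every pixel of the opaque context.
CGSize integralSize = CGSizeMake(ceil(size.width), ceil(size.height));

UIGraphicsBeginImageContextWithOptions(integralSize, YES, [UIScreen mainScreen].scale);
CGContextRef context = UIGraphicsGetCurrentContext();
[[UIColor whiteColor] set];
CGContextFillRect(context, CGRectMake(0, 0, integralSize.width, integralSize.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();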

Stretched UIView background gets cut off during screenshot

So, I am taking a screenshot of a subclassed UIView that I save into the device's photo stream.
Problem:
The problem is that I use resizableImageWithCapInsets to add a stretched background to my UIView, but this background gets cut off on the right side and I have no idea why. If someone could help me out it would be highly appreciated.
I add the stretched background to my UIView the following way:
[diagramBase addSubview:[self addTileBackgroundOfSize:diagramBase.frame
                                              andType:@"ipad_diagram_border.png"]];
Which calls this method:
- (UIImageView *)addTileBackgroundOfSize:(CGRect)frame
                                 andType:(NSString *)type
{
    frame.origin.x = 0.0f;
    frame.origin.y = 0.0f;
    UIImageView *backgroundView = [[UIImageView alloc] initWithFrame:frame];
    UIImage *image = [UIImage imageNamed:type];
    UIEdgeInsets insets = UIEdgeInsetsMake(10.0f, 10.0f, 10.0f, 10.0f);
    UIImage *backgroundImage = [image resizableImageWithCapInsets:insets];
    backgroundView.image = backgroundImage;
    return backgroundView;
}
The actual screenshot is taken with this method (RINDiagram is the name of my subclassed UIView, which I am taking a screenshot of). The rotation is in there because I need the image rotated when I save it, but I commented that part out and it is not what makes the background act weird.
- (UIImage *)createSnapshotOfView:(RINDiagram *)view
{
    CGRect rect = [view bounds];
    rect.size.height = rect.size.height - 81.0f;
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                     scale:1.0
                                               orientation:UIImageOrientationLeft];
    return finalImage;
}
I use Xcode 5.1 and everything is done programmatically (no storyboard and such). The base SDK is iOS 7.1.
If you're targeting iOS 7+ you can use the new drawViewHierarchyInRect:afterScreenUpdates: and related methods, which Apple says are very performant.
Even if you're currently targeting iOS 6, it's worth a try on an iOS 7 device to see whether you get the same problem.
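A minimal sketch of that approach (iOS 7+ only; the method name snapshotOfView: is just an illustration):

- (UIImage *)snapshotOfView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, YES, 0.0f);
    // afterScreenUpdates:YES waits for pending layout and drawing,
    // which often fixes partially rendered or cut-off content
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}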
Try using the correct scale?
UIImage *finalImage = [[UIImage alloc] initWithCGImage: capturedScreen.CGImage
scale: [[UIScreen mainScreen] scale]
orientation: UIImageOrientationLeft];
Use a different UIViewContentMode?
UIViewContentModeScaleToFill -> check if you can see the edges
UIViewContentModeScaleAspectFit -> check if you can see the edges, even if position is incorrect
UIViewContentModeScaleAspectFill -> check for edge right side
The reason you get an image that is cut off on the right side is this line:
UIImage *finalImage = [[UIImage alloc] initWithCGImage: capturedScreen.CGImage
scale: 1.0
orientation: UIImageOrientationLeft];
You set the image orientation to left, so the context treats the left side as the top side. And because you subtract from the height, the result is that the right side gets cut off.
As for the rotation, I added some code to yours. Hope it is helpful.
- (UIImage *)createSnapshotOfView:(UIView *)view
{
    CGRect rect = [view bounds];
    rect.size.height = rect.size.height - 81.0f;
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    view.transform = CGAffineTransformMakeRotation(M_PI_2);
    [view.layer renderInContext:context];
    UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                     scale:1.0
                                               orientation:UIImageOrientationLeft];
    view.transform = CGAffineTransformMakeRotation(0);
    return finalImage;
}
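For completeness, a usage sketch of the save-to-photo-stream step the question mentions (the call site and the diagramView property are assumptions, not shown in the original post):

UIImage *snapshot = [self createSnapshotOfView:self.diagramView];
// Saves the snapshot into the device's photo stream (Camera Roll)
UIImageWriteToSavedPhotosAlbum(snapshot, nil, NULL, NULL);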
UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * data = UIImagePNGRepresentation(image);
[data writeToFile:@"foo.png" atomically:YES];
For Retina displays, change the first line to this:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
else
    UIGraphicsBeginImageContext(self.window.bounds.size);
Adjust the size to fit your needs; hope this helps.
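If you want to keep the PNG on disk, a sketch of writing it into the app's Documents directory instead of a bare relative path (the file name screenshot.png is just an example, not from the original answer):

NSData *data = UIImagePNGRepresentation(image);
// A bare relative path resolves against the current working directory,
// which is not writable in a sandboxed app; build an absolute path instead.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                              NSUserDomainMask, YES) firstObject];
NSString *path = [documentsDir stringByAppendingPathComponent:@"screenshot.png"];
[data writeToFile:path atomically:YES];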

iOS screen capture is blurry for some reason

I'm trying to capture the screen inside my app. It used to work just fine, but now I only get a blurry image.
This is my code:
- (UIImage *)takeScreenShotWithFrame:(CGRect)frame
{
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        UIGraphicsBeginImageContextWithOptions(frame.size, NO, [UIScreen mainScreen].scale);
    } else {
        UIGraphicsBeginImageContext(frame.size);
    }
    [self.view drawViewHierarchyInRect:frame afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
What am I doing wrong?
Thanks in advance!
Instead of:
[self.view drawViewHierarchyInRect:frame afterScreenUpdates:NO];
Try:
CGContextRef context = UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:context];
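If it helps, a sketch of the whole method with that substitution (passing 0.0 as the scale so the main screen's scale is picked up automatically; this is my variation, not the original answer verbatim):

- (UIImage *)takeScreenShotWithFrame:(CGRect)frame
{
    // 0.0 means "use the device's screen scale", so no respondsToSelector: check is needed
    UIGraphicsBeginImageContextWithOptions(frame.size, NO, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.view.layer renderInContext:context];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}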

Take screenshot of a particular view on iPhone with camera preview

CGRect screenRect = [[UIScreen mainScreen] bounds];
UIGraphicsBeginImageContext(screenRect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[[UIColor whiteColor] set];
CGContextFillRect(ctx, screenRect);
[self.view.layer renderInContext:ctx];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
Now I'm using this to take a screenshot. It works well, but the camera preview shows up black. I need to take a screenshot including the camera preview on both iOS 6 and iOS 7. Any suggestion is greatly appreciated.

Setting Context for ios screenshots

I want to take a screenshot of a specific part of the screen. I have set up a method to take a shot of the whole screen, but I want a specific section.
I know I have to change this code:
CGSize imageSize = [[UIScreen mainScreen] bounds].size;
But when I tried using CGRectMake(50, 50, 400, 400) instead of [[UIScreen mainScreen] bounds].size, it gives an error... Why?
Try this (note that UIGraphicsBeginImageContext takes a CGSize, not a CGRect, so pass rect.size):
CGRect rect = CGRectMake(50,50, 400, 400);
UIGraphicsBeginImageContext(rect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextFillRect(ctx, rect);
[self.view.layer renderInContext:ctx];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Hope it helps you.
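Note that the snippet above still renders the layer from its origin, so it captures the top-left 400x400 points rather than the area starting at (50, 50). If the goal is that specific region, a sketch that shifts the context before rendering (an addition, not part of the original answer):

CGRect rect = CGRectMake(50.0f, 50.0f, 400.0f, 400.0f);
UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Shift the context so the view's point (50, 50) lands at the
// origin of the 400x400 image.
CGContextTranslateCTM(ctx, -rect.origin.x, -rect.origin.y);
[self.view.layer renderInContext:ctx];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();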
