UINavigationBar appears gray in screenshot - ios

I am taking a screenshot of my app's current view using the code below. I have a UIViewController embedded inside a UINavigationController. The method is being called in the UIViewController.
In the screenshot, the navigation bar is colored gray even though the barTintColor is set to another color. Why is this happening?
- (UIImage *)generateScreenshot {
    CGFloat scale = [[UIScreen mainScreen] scale];
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.view.layer renderInContext:context];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return viewImage;
}

The problem is most likely that you are rendering a view that doesn't include the navigation bar. Replace [self.view.layer renderInContext:context]; with
[[appDelegate window].layer renderInContext:context];
Hope it helps. :)
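For reference, a minimal sketch of the full method rendering the window instead of self.view, assuming the view controller's view is already installed in a window:
- (UIImage *)generateScreenshotIncludingNavigationBar {
    // Render the whole window so the navigation bar is part of the capture
    UIWindow *window = self.view.window; // assumption: the view is currently on screen
    CGFloat scale = [[UIScreen mainScreen] scale];
    UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, scale);
    [window.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *windowImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return windowImage;
}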

Related

How to take a picture of a UIView?

I have an app where I want to take a screenshot of a UIView while I am in another view. I have 4 views on one of my pages that can be toggled using a segmented control. But I want my view in the second segment to be displayed when I am in the first segment. Is there any way that I can do this?
This is what I have used so far:
UIGraphicsBeginImageContext(self.view.bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:context];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGRect rect = CGRectMake(0, 120, 320, 560);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], rect);
Screenshot = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return Screenshot;
This gets me the screenshot of the view I currently am in.
What I want to do is to take a screenshot of the second view while in the first view.
This is what I changed:
UIGraphicsBeginImageContext(View2.bounds.size);
For some reason this is not working. Can you tell me why and how to fix it?
You need to change [self.view.layer renderInContext:context]; to [View2.layer renderInContext:context];
UIGraphicsBeginImageContext(View2.bounds.size); only specifies the size of the context you render into, not which view gets rendered!
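For reference, a minimal sketch of the corrected snippet, assuming View2 is a view you hold a reference to (for example an outlet) and that it already has its final size:
UIGraphicsBeginImageContext(View2.bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[View2.layer renderInContext:context]; // render the second view's layer, not self.view's
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();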

Stretched UIView background gets cut off during screenshot

So, I am taking a screenshot of a subclassed UIView that I save into the device's photo stream.
Problem:
The problem is that I use resizableImageWithCapInsets to add a stretched background to my UIView, but this background gets cut off on the right side and I have no idea why. If someone could help me out it would be highly appreciated.
I add the stretched background to my UIView the following way:
[diagramBase addSubview:[self addTileBackgroundOfSize:diagramBase.frame
                                              andType:@"ipad_diagram_border.png"]];
Which calls this method:
- (UIImageView *)addTileBackgroundOfSize:(CGRect)frame
                                 andType:(NSString *)type
{
    frame.origin.x = 0.0f;
    frame.origin.y = 0.0f;
    UIImageView *backgroundView = [[UIImageView alloc] initWithFrame:frame];
    UIImage *image = [UIImage imageNamed:type];
    UIEdgeInsets insets = UIEdgeInsetsMake(10.0f, 10.0f, 10.0f, 10.0f);
    UIImage *backgroundImage = [image resizableImageWithCapInsets:insets];
    backgroundView.image = backgroundImage;
    return backgroundView;
}
The actual screenshot is taken with this method (RINDiagram is the name of my subclassed UIView, which I am taking a screenshot of). The rotation is in there because I need the image rotated when I save it; I commented out that part, and it is not what makes the background act weird.
- (UIImage *)createSnapshotOfView:(RINDiagram *)view
{
    CGRect rect = [view bounds];
    rect.size.height = rect.size.height - 81.0f;
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                     scale:1.0
                                               orientation:UIImageOrientationLeft];
    return finalImage;
}
I use Xcode 5.1 and everything is done programmatically (no storyboard and such). The base SDK is iOS 7.1.
If you're targeting iOS 7+ you can use the new drawViewHierarchyInRect:afterScreenUpdates: and related methods, which Apple says are very performant.
Even if you're also targeting iOS 6, give it a try on iOS 7 to see whether you get the same problem.
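A minimal sketch of the iOS 7+ approach, assuming view is the diagram view you want to capture:
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0f);
// afterScreenUpdates:YES waits for pending layout and drawing before capturing
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();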
Try using the correct scale?
UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                 scale:[[UIScreen mainScreen] scale]
                                           orientation:UIImageOrientationLeft];
Use a different UIViewContentMode?
UIViewContentModeScaleToFill -> check whether you can see the edges
UIViewContentModeScaleAspectFit -> check whether you can see the edges, even if the position is incorrect
UIViewContentModeScaleAspectFill -> check the right-side edge
The reason you get an image cut off on the right side is this line:
UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                 scale:1.0
                                               orientation:UIImageOrientationLeft];
You set the image orientation to left, so the context treats the left side as the top side. Because you also subtracted from the height, the result is that the right side gets cut off.
About the rotation, I added some code to yours. Hope it is helpful.
- (UIImage *)createSnapshotOfView:(UIView *)view
{
    CGRect rect = [view bounds];
    rect.size.height = rect.size.height - 81.0f;
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    view.transform = CGAffineTransformMakeRotation(M_PI_2);
    [view.layer renderInContext:context];
    UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                     scale:1.0
                                               orientation:UIImageOrientationLeft];
    view.transform = CGAffineTransformMakeRotation(0);
    return finalImage;
}
UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImagePNGRepresentation(image);
[data writeToFile:@"foo.png" atomically:YES];
For Retina displays, change the first line to this:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
else
    UIGraphicsBeginImageContext(self.window.bounds.size);
Adjust the size as needed; hope this helps.
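Note that writeToFile:atomically: needs a writable absolute path, so a bare @"foo.png" will generally fail on a device. A minimal sketch that builds a path in the app's Documents directory (the file name is just an example):
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"foo.png"];
[data writeToFile:filePath atomically:YES];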

CALayer renderInContext draws a blank image

This code works:
UIGraphicsBeginImageContextWithOptions(aRect.size, NO, 0.0);
[self.view drawViewHierarchyInRect:aRect afterScreenUpdates:YES];
anImageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
but I need to support iOS 5 and 6. My Googling says this code ought to work:
UIGraphicsBeginImageContextWithOptions(aRect.size, NO, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
anImageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
but the image is blank. How do I take a snapshot of a view in iOS 5 and 6?
The solution was to scale the view to fit the bounds of the graphics context. Most examples I found assume that the source view and the destination context are the same size. The graphics context I was using was much smaller than the view being snapshotted, so it was actually just clipping a transparent corner of the view.
UIGraphicsBeginImageContextWithOptions(aRect.size, NO, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGFloat scale = CGRectGetWidth(aRect) / CGRectGetWidth(self.view.bounds);
CGContextScaleCTM(ctx, scale, scale);
[self.view.layer renderInContext:ctx];
anImageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Taking a screenshot of a UIView that has a subview with CATransform3DMakeRotation

I am trying to generate a screenshot of a UIView that has a subview with CATransform3DMakeRotation. The screenshot is generated, but it doesn't contain the rotation.
Is it possible to achieve this?
(Images of the actual view and the captured screenshot omitted.)
I am using the following call to flip the view horizontally...
currentView.layer.transform = CATransform3DConcat(currentView.layer.transform,CATransform3DMakeRotation(M_PI, 0.0, 1.0, 0.0f));
Code for taking the screenshot:
+ (UIImage *)imageWithView:(UIView *)view
{
    CGSize screenDimensions = view.bounds.size;
    // Create a graphics context with the target size
    // (the last parameter takes the screen scale into account)
    UIGraphicsBeginImageContextWithOptions(screenDimensions, NO, 0);
    // Render the view into the new context
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
The "renderInContext" only works for Affine transform. So convert the 3D transform into affine transform like this
currentView.layer.affineTransform = CATransform3DGetAffineTransform(CATransform3DConcat(currentView.layer.transform,CATransform3DMakeRotation(M_PI, 0.0, 1.0, 0.0f)));
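Putting it together, a minimal sketch; MyScreenshotHelper is a hypothetical class exposing the imageWithView: method shown above, and currentView is the flipped subview:
// Collapse the 3D rotation into an affine transform so renderInContext: can draw it
CATransform3D rotated = CATransform3DConcat(currentView.layer.transform,
                                            CATransform3DMakeRotation(M_PI, 0.0, 1.0, 0.0f));
currentView.layer.affineTransform = CATransform3DGetAffineTransform(rotated);
// Then take the snapshot as before
UIImage *snapshot = [MyScreenshotHelper imageWithView:currentView.superview];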
Try this code:
CGSize newSize = CGSizeMake(yourview.frame.size.width, yourview.frame.size.height);
UIGraphicsBeginImageContextWithOptions(newSize, YES, 2.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
// renderInContext: already triggers the view's drawing; never call drawRect: directly
[yourview.layer renderInContext:context];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This might work, try it:
CGRect grabRect = CGRectMake(40, 40, 300, 200);
// for Retina displays
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
    UIGraphicsBeginImageContextWithOptions(grabRect.size, NO, [UIScreen mainScreen].scale);
} else {
    UIGraphicsBeginImageContext(grabRect.size);
}
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(ctx, -grabRect.origin.x, -grabRect.origin.y);
[self.view.layer renderInContext:ctx];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
I have achieved this in one of my applications with a little tweak: first I capture a screenshot of the whole screen, then I crop it to the frame I need. Here is a sample from my app:
- (UIImage *)croppedPhoto
{
    [imgcropRectangle setHidden:TRUE];
    UIGraphicsBeginImageContext(self.view.frame.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Create a bitmap image from the original image data,
    // using the rectangle to specify the desired crop area
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], self.imgcropRectangle.frame);
    UIImage *result = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    [imgcropRectangle setHidden:FALSE];
    return result;
}
Here imgcropRectangle is the UIImageView that defines my desired rectangle, so I use its frame to crop from the full screen down to the desired output. Hope it will help you :)
Try rendering view.layer.presentationLayer instead of view.layer
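A minimal sketch of that idea, assuming view is mid-animation or mid-transform so its presentation layer reflects the current state:
CALayer *layerToRender = view.layer.presentationLayer ?: view.layer;
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
[layerToRender renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();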
Use this, and check the view's subviews before passing it in:
+ (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContext(CGSizeMake(view.frame.size.width, view.frame.size.height));
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return viewImage;
}

Interpolation issue after renderInContext:UIGraphicsGetCurrentContext(), iOS

I have a few UIViews butted edge to edge. The views completely cover the superview. It looks great on screen, but when rendered the adjoining edges are visible, that is to say a line appears between them. Since the views look perfect on screen, I imagine it must be interpolation of the views' pixels that causes this.
Anyone know how to fix this?
The image below is a render. On the device or simulator the lines would not be visible.
Render code:
- (void)renderImage {
    CGSize renderSize = CGSizeMake(masterView.frame.size.width, masterView.frame.size.height);
    UIGraphicsBeginImageContext(renderSize);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);
    CGContextConcatCTM(context, [[masterView layer] affineTransform]);
    [[masterView layer] renderInContext:UIGraphicsGetCurrentContext()];
    renderedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGContextRestoreGState(context);
    UIImageWriteToSavedPhotosAlbum(renderedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    masterView.transform = CGAffineTransformIdentity;
}
Core Graphics attempts to anti-alias your views. You need to tell it not to do that.
Consider the following example, which renders self, a UIView, as a UIImage without anti-aliasing:
UIGraphicsBeginImageContextWithOptions(self.bounds.size, YES, [[UIScreen mainScreen] scale]);
CGContextSetAllowsAntialiasing(UIGraphicsGetCurrentContext(), FALSE);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
try:
UIGraphicsBeginImageContextWithOptions(renderSize, NO, [[UIScreen mainScreen] scale]);
instead of:
UIGraphicsBeginImageContext(renderSize);
