Snapshot is not working - iOS

I am new to mobile programming. I am working on H264 video rendering in an iOS application using the VideoToolbox framework. It has a feature to take a snapshot while rendering the video, but whenever I take a snapshot, I get a black screen only.
I tried these methods to capture the rendered video, but each returns a black screen only:
1. renderInContext
2. drawViewHierarchyInRect
3. snapshotViewAfterScreenUpdates
//snapshot coding
UIGraphicsBeginImageContextWithOptions (self.view.bounds.size, YES, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
mImageView.image = snapshotImage;
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(snapshotImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);

Check this out; the following chunk of code works for me to take a snapshot of the screen:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
UIGraphicsBeginImageContextWithOptions(APP_DELEGATE.window.bounds.size, NO, [[UIScreen mainScreen] scale]);
else
UIGraphicsBeginImageContext(APP_DELEGATE.window.bounds.size);
[APP_DELEGATE.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * data = UIImagePNGRepresentation(image);
I guess it will help you. Let me know if it does.

I've not worked with video yet, but a simple snapshot of a UIView with subviews on it works fine:
+ (UIImage *)makeSnapShot:(UIView *)view image:(UIImageView *)imageView
{
CGFloat offset_x = /*your_value*/;
CGFloat offset_y = /*your_value*/;
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGRect rect = CGRectMake(offset_x, offset_y, imageView.bounds.size.width, imageView.bounds.size.height);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], rect);
image = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return image;
}

Not sure if this is what you're looking for, but if you need to get a snapshot of the VTDecompressionSession, you can send the CVImageBuffer that you get from the decodeFrame callback into this method to get a UIImage. You can also add your CIContext to the parameters list instead of using the temporaryContext.
+ (UIImage *) UIImageFromCVImageBufferRef:(CVImageBufferRef)imageBuf
{
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuf];
CIContext *temporaryContext = [CIContext contextWithOptions:nil];
CGImageRef videoImage = [temporaryContext
createCGImage:ciImage
fromRect:CGRectMake(0, 0,
CVPixelBufferGetWidth(imageBuf),
CVPixelBufferGetHeight(imageBuf))];
UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
CGImageRelease(videoImage);
return image;
}
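Not part of the original answer, but a Swift sketch of the same conversion may help (the names sharedCIContext and image(from:) are assumptions; imageBuf is the CVImageBuffer handed to your VTDecompressionSession output callback, and reusing one CIContext is cheaper than creating a temporary one per frame):
import UIKit
import CoreImage
import CoreVideo

// One shared context, reused across frames
let sharedCIContext = CIContext(options: nil)

func image(from imageBuf: CVImageBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: imageBuf)
    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(imageBuf),
                      height: CVPixelBufferGetHeight(imageBuf))
    // Render the CIImage into a CGImage we can wrap in a UIImage
    guard let cgImage = sharedCIContext.createCGImage(ciImage, from: rect) else { return nil }
    return UIImage(cgImage: cgImage)
}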

func takeScreenshot(_ shouldSave: Bool = true) {
var screenshotImage :UIImage?
let layer = UIApplication.shared.keyWindow!.layer
let scale = UIScreen.main.scale
UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale)
self.view.drawHierarchy(in: self.view.bounds, afterScreenUpdates: true)
screenshotImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
if let image = screenshotImage, shouldSave {
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
}

Related

Default iPhone screenshot programmatically

I tried two different ways to create a screenshot, but unfortunately they don't work as I need: I have an RMMapView that is blank in the screenshot. When I take a screenshot manually on my device it works perfectly, and the map view is on the screen. So I would like to achieve the same result programmatically. Is it possible the way I tried, or what is the right way to do that (to reproduce that type of screenshot)?
- (UIImage *) takeScreenshot {
//1. version
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
//2. version
UIGraphicsBeginImageContext(self.view.bounds.size);
CGContextRef context=UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:context];
UIImage *image=UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
Actually, you can use a method called takeSnapshot on an RMMapView:
UIImage *image = [self.mapView takeSnapshot];
[Update]
Can you try this method instead
CGSize size = self.view.bounds.size;
CGRect cropRect = self.mapView.bounds;
UIGraphicsBeginImageContext(size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * mapImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGImageRef imageRef = CGImageCreateWithImageInRect(mapImage.CGImage, cropRect);
UIImage * cropImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
UIImageWriteToSavedPhotosAlbum(cropImage, nil, nil, nil);
Good luck

How to programmatically take a screenshot in Sprite Kit?

I've been reading resolved questions on how to programmatically take a screenshot, but I can't seem to get what I've read to work in Sprite Kit. For instance:
This question How to take a screenshot programmatically
UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * data = UIImagePNGRepresentation(image);
[data writeToFile:@"foo.png" atomically:YES];
UPDATE April 2011: for retina display, change the first line into this:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
else
UIGraphicsBeginImageContext(self.window.bounds.size);
gave me this information, which I tested in my game, but it didn't work because window isn't recognized in an SKScene. I tried replacing it with scene, but that didn't work. Any suggestions?
I also tried this:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, scale);
[self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Which I found in this question: ios Sprite Kit screengrab?
But it didn't seem to work either because it didn't recognize scale, or bounds in the second line.
This is the best solution to take a screenshot in Sprite Kit:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return viewImage;
func screenShot() {
UIGraphicsBeginImageContextWithOptions(UIScreen.mainScreen().bounds.size, false, 0)
self.view!.drawViewHierarchyInRect(self.view!.bounds, afterScreenUpdates: true)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
let savedImage = image // the screenshot is stored in savedImage
}
screenShot() // Calls the function and takes a screenshot
// Full-screen shot function. Hope this works well in SpriteKit.
func screenShot() -> UIImage {
UIGraphicsBeginImageContext(CGSizeMake(frame.size.width, frame.size.height))
self.view?.drawViewHierarchyInRect(frame, afterScreenUpdates: true)
let screenShot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return screenShot
}
You can take a screenshot of any view:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *gameOverScreenImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
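Not from the original answers, but since this question is specifically about Sprite Kit: SKView can render a scene into a texture directly, which avoids UIKit snapshotting altogether. A minimal Swift sketch (assuming skView is your SKView and scene is its presented SKScene):
import SpriteKit
import UIKit

func spriteKitSnapshot(of scene: SKScene, in skView: SKView) -> UIImage? {
    // texture(from:) renders the node tree into an SKTexture (iOS 9+)
    guard let texture = skView.texture(from: scene) else { return nil }
    return UIImage(cgImage: texture.cgImage())
}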

Cropped iOS Image comes back too large

Been trying to fix this problem all day to no avail.
Pretty much, I'm taking a screenshot of the view, then trying to crop out the first 50px and a footer. The problem is that when I do this, the result is a little blown up and quality is lost. Here's what I wrote, which I think handles retina:
-(UIImage *)takeSnapShotAndReturn{
//Take screenshot of whole view
if([[UIScreen mainScreen] respondsToSelector:@selector(scale)]){
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size,NO,[UIScreen mainScreen].scale);
}
else{
UIGraphicsBeginImageContext(self.view.window.bounds.size);
}
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
combinedImage = [self cropOutArea:image withRectangle:CGRectMake(0, 50, 320, 467)];
UIImageWriteToSavedPhotosAlbum(combinedImage, nil, nil, nil);
UIGraphicsEndImageContext();
return image;
}
-(UIImage *)cropOutArea:(UIImage*)image withRectangle:(CGRect)rectangle{
if(image.scale > 1){
rectangle = CGRectMake(rectangle.origin.x * image.scale,
rectangle.origin.y * image.scale,
rectangle.size.width * image.scale,
rectangle.size.height * image.scale);
}
CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, rectangle);
UIImage *result = [UIImage imageWithCGImage:imageRef scale:image.scale orientation:image.imageOrientation];
CGImageRelease(imageRef);
return result;
}
I find cropping extremely confusing!
I'm not sure EXACTLY what you're trying to do, but this may be it .....
-(UIImage *)simplishTopCropAndTo640:(UIImage *)fromImage
// moderately optimised!
{
float shortDimension = fminf(fromImage.size.width, fromImage.size.height);
// 1.use CGImageCreateWithImageInRect to take only the top square...
// 2. use drawInRect (or CGContextDrawImage, same) to scale...
CGRect topSquareOfOriginalRect =
CGRectMake(0,0, shortDimension,shortDimension);
// NOT fromImage.size.width,fromImage.size.width);
CGImageRef topSquareIR = CGImageCreateWithImageInRect(
fromImage.CGImage, topSquareOfOriginalRect);
CGSize size = CGSizeMake( 640,640 );
CGRect sized = CGRectMake(0.0f, 0.0f, size.width, size.height);
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
CGContextRef cc = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(cc, kCGInterpolationLow);
CGContextTranslateCTM(cc, 0, size.height);
CGContextScaleCTM(cc, 1.0, -1.0);
CGContextDrawImage(cc, sized, topSquareIR );
// arguably, those three lines more simply...
//[[UIImage imageWithCGImage:topSquareIR] drawInRect:sized];
CGImageRelease(topSquareIR);
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
result =
[UIImage imageWithCGImage:result.CGImage
scale:result.scale
orientation: fromImage.imageOrientation];
//consider...something like...
//[UIImage imageWithCGImage:cgimg
// scale:3 orientation:fromImage.imageOrientation];
return result;
}
Consider also this valuable category .....
-(UIImage *)ordinaryCrop:(CGRect)toRect
{
// crops any image, to any rect. you can't beat that
CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], toRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return cropped;
}
Finally, if you're using the camera, don't forget "the most useful code in the universe!": iOS UIImagePickerController result image orientation after upload.
Hope it helps somehow
Try setting this BOOL property before releasing result in cropOutArea.
result.layer.masksToBounds = YES

Taking a screenshot of a UIView that has a subview with CATransform3DMakeRotation

I am trying to generate a screenshot of a UIView that has a subview with CATransform3DMakeRotation applied. The screenshot is generated, but it doesn't contain the rotation.
Is it possible to achieve this?
(The original post included images of the actual view and the resulting screenshot.)
Using the following call to flip the view horizontally:
currentView.layer.transform = CATransform3DConcat(currentView.layer.transform,CATransform3DMakeRotation(M_PI, 0.0, 1.0, 0.0f));
Code for taking screen shot
+ (UIImage *) imageWithView:(UIView *)view
{
CGSize screenDimensions = view.bounds.size;
// Create a graphics context with the target size
// (last parameter takes scale into account)
UIGraphicsBeginImageContextWithOptions(screenDimensions, NO, 0);
// Render the view to a new context
CGContextRef context = UIGraphicsGetCurrentContext();
[view.layer renderInContext:context];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
}
The "renderInContext" only works for Affine transform. So convert the 3D transform into affine transform like this
currentView.layer.affineTransform = CATransform3DGetAffineTransform(CATransform3DConcat(currentView.layer.transform,CATransform3DMakeRotation(M_PI, 0.0, 1.0, 0.0f)));
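Not part of the original answer, but the same conversion in Swift might look like this (assuming currentView is the flipped view, applied just before rendering):
currentView.layer.setAffineTransform(
    CATransform3DGetAffineTransform(
        CATransform3DConcat(currentView.layer.transform,
                            CATransform3DMakeRotation(.pi, 0, 1, 0))))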
Try this code
CGSize newSize = CGSizeMake(yourview.frame.size.width , yourview.frame.size.height);
UIGraphicsBeginImageContextWithOptions(newSize,YES,2.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
[yourview.layer renderInContext:context];
// note: do not call -drawRect: directly; renderInContext: above already draws the layer
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This might work, try it:
CGRect grabRect = CGRectMake(40,40,300,200);
//for retina displays
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
UIGraphicsBeginImageContextWithOptions(grabRect.size, NO, [UIScreen mainScreen].scale);
} else {
UIGraphicsBeginImageContext(grabRect.size);
}
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(ctx, -grabRect.origin.x, -grabRect.origin.y);
[self.view.layer renderInContext:ctx];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
I have achieved this in one of my applications with a little tweak: first I capture the whole screen's screenshot, then crop it with the frame I need. Here is a sample from my app.
- (UIImage *) croppedPhoto
{
[imgcropRectangle setHidden:TRUE];
UIGraphicsBeginImageContext(self.view.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Create bitmap image from original image data,
// using rectangle to specify desired crop area
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], self.imgcropRectangle.frame);
UIImage *result = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
[imgcropRectangle setHidden:FALSE];
return result;
}
Here imgcropRectangle is the UIImageView that defines my desired rectangle, so I use its frame to crop the full screenshot down to the desired output. Hope it helps :)
Try rendering view.layer.presentationLayer instead of view.layer
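A minimal Swift sketch of that suggestion (assuming view is the transformed view you want to capture; presentation() reflects the current on-screen Core Animation state):
UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
if let context = UIGraphicsGetCurrentContext() {
    // Fall back to the model layer if no presentation layer exists yet
    (view.layer.presentation() ?? view.layer).render(in: context)
}
let snapshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()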
Use this, and check the view's subviews before passing it in:
+ (UIImage *) imageWithView:(UIView *)view
{
UIGraphicsBeginImageContext(CGSizeMake(view.frame.size.width, view.frame.size.height));
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return viewImage;
}

How to take a screenshot programmatically on iOS

I want a screenshot of the image on the screen saved into the saved photo library.
To account for retina displays, use the following code snippet:
#import <QuartzCore/QuartzCore.h>
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
} else {
UIGraphicsBeginImageContext(self.window.bounds.size);
}
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image);
if (imageData) {
[imageData writeToFile:@"screenshot.png" atomically:YES];
} else {
NSLog(@"error while taking screenshot");
}
The method below also works for OpenGL content:
//iOS7 or above
- (UIImage *) screenshot {
CGSize size = CGSizeMake(your_width, your_height);
UIGraphicsBeginImageContextWithOptions(size, NO, [UIScreen mainScreen].scale);
CGRect rec = CGRectMake(0, 0, your_width, your_height);
[_viewController.view drawViewHierarchyInRect:rec afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
[self.myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImageJPEGRepresentation(image, 1.0 ); //you can use PNG too
[imageData writeToFile:@"image1.jpeg" atomically:YES];
- (UIImage*) getGLScreenshot {
NSInteger myDataLength = 320 * 480 * 4;
// allocate array and read pixels into it.
GLubyte *buffer = (GLubyte *) malloc(myDataLength);
glReadPixels(0, 0, 320, 480, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
// gl renders "upside down" so swap top to bottom into new array.
// there's gotta be a better way, but this works.
GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
for(int y = 0; y <480; y++)
{
for(int x = 0; x <320 * 4; x++)
{
buffer2[(479 - y) * 320 * 4 + x] = buffer[y * 4 * 320 + x];
}
}
free(buffer); // buffer has been copied into buffer2, so it can be freed now
// make data provider with data.
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);
// prep the ingredients
int bitsPerComponent = 8;
int bitsPerPixel = 32;
int bytesPerRow = 4 * 320;
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
// make the cgimage
CGImageRef imageRef = CGImageCreate(320, 480, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
// then make the uiimage from that
UIImage *myImage = [UIImage imageWithCGImage:imageRef];
// balance the Create calls above (buffer2 still leaks unless you pass a
// release callback to CGDataProviderCreateWithData)
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpaceRef);
CGImageRelease(imageRef);
return myImage;
}
- (void)saveGLScreenshotToPhotosAlbum {
UIImageWriteToSavedPhotosAlbum([self getGLScreenshot], nil, nil, nil);
}
Source.
IN SWIFT
func captureScreen() -> UIImage
{
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, false, 0);
self.view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return image
}
As of iOS 10, this gets a bit simpler. UIKit comes with UIGraphicsImageRenderer, which allows you to
... accomplish drawing tasks, without having to handle configuration such as color depth and image scale, or manage Core Graphics contexts
Apple Docs - UIGraphicsImageRenderer
So you can now do something like this:
let renderer = UIGraphicsImageRenderer(size: someView.bounds.size)
let image = renderer.image(actions: { context in
someView.drawHierarchy(in: someView.bounds, afterScreenUpdates: true)
})
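If what you ultimately want is PNG data (for example to write to disk), the renderer can produce it directly; a small follow-up sketch using the same renderer and someView as above:
let pngData = renderer.pngData { _ in
    someView.drawHierarchy(in: someView.bounds, afterScreenUpdates: true)
}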
Many of the answers here worked for me in most cases. But when trying to take a snapshot of an ARSCNView, I was only able to do it using the method described above. Although it might be worth noting that at this time, ARKit is still in beta and Xcode is in beta 4
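Worth noting (not part of the original answer): ARSCNView also exposes its own snapshot() method that returns a UIImage of the rendered AR content. A short sketch, assuming arView is your ARSCNView:
import ARKit

func arSnapshot(of arView: ARSCNView) -> UIImage {
    // Captures the currently rendered AR frame
    return arView.snapshot()
}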
See this post; it looks like you can use UIGetScreenImage() for now.
This will save a screenshot and also return it.
-(UIImage *)capture{
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); // if you need to save
return image;
}
I think the following snippet will help if you want to capture the full screen (except for the status bar). Just replace AppDelegate with your app delegate class name if necessary.
- (UIImage *)captureFullScreen {
AppDelegate *_appDelegate = (AppDelegate *)[UIApplication sharedApplication].delegate;
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
// for retina-display
UIGraphicsBeginImageContextWithOptions(_appDelegate.window.bounds.size, NO, [UIScreen mainScreen].scale);
[_appDelegate.window drawViewHierarchyInRect:_appDelegate.window.bounds afterScreenUpdates:NO];
} else {
// non-retina-display
UIGraphicsBeginImageContext(_appDelegate.window.bounds.size);
[_appDelegate.window drawViewHierarchyInRect:_appDelegate.window.bounds afterScreenUpdates:NO];
}
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
I couldn't find an answer with Swift 3 implementation. So here it goes.
static func screenshotOf(window: UIWindow) -> UIImage? {
UIGraphicsBeginImageContextWithOptions(window.bounds.size, true, UIScreen.main.scale)
guard let currentContext = UIGraphicsGetCurrentContext() else {
return nil
}
window.layer.render(in: currentContext)
guard let image = UIGraphicsGetImageFromCurrentImageContext() else {
UIGraphicsEndImageContext()
return nil
}
UIGraphicsEndImageContext()
return image
}
This works with Swift 4.2. The screenshot will be saved to the photo library, but don't forget to add NSPhotoLibraryAddUsageDescription to your Info.plist:
@IBAction func takeScreenshot(_ sender: UIButton) {
//Start full Screenshot
print("full Screenshot")
UIGraphicsBeginImageContext(card.frame.size)
view.layer.render(in: UIGraphicsGetCurrentContext()!)
var sourceImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(sourceImage!, nil, nil, nil)
//Start partial Screenshot
print("partial Screenshot")
UIGraphicsBeginImageContext(card.frame.size)
sourceImage?.draw(at: CGPoint(x:-25,y:-100)) //the screenshot starts at -25, -100
var croppedImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(croppedImage!, nil, nil, nil)
}
I'm answering this question because it's highly viewed, there are many answers out there, and both Swift and Obj-C are involved.
Disclaimer: This is not my code, nor my answers; this is only to help people who land here find a quick answer. There are links to the original answers to give credit where credit is due! Please honor the original answers with a +1 if you use them!
Using QuartzCore
#import <QuartzCore/QuartzCore.h>
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
} else {
UIGraphicsBeginImageContext(self.window.bounds.size);
}
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image);
if (imageData) {
[imageData writeToFile:@"screenshot.png" atomically:YES];
} else {
NSLog(@"error while taking screenshot");
}
In Swift
func captureScreen() -> UIImage
{
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, false, 0);
self.view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return image
}
Note: As is the nature of programming, updates may be needed, so please edit or let me know! Also, if I failed to include an answer/method worth including, feel free to let me know as well!
For iOS 7.0 or above:
If you want to take a snapshot of a view, say myView, you can do it with a single line:
[myView snapshotViewAfterScreenUpdates:NO];
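Note that snapshotViewAfterScreenUpdates: returns a lightweight UIView, not image data. If you need an actual UIImage, you still have to draw the hierarchy into a context; a minimal Swift sketch, assuming myView is the view you want to capture:
UIGraphicsBeginImageContextWithOptions(myView.bounds.size, false, 0)
myView.drawHierarchy(in: myView.bounds, afterScreenUpdates: false)
let snapshotImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()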
Get Screenshot From View
-(UIImage *)getScreenshotImage {
if ([[UIScreen mainScreen] scale] == 2.0) {
UIGraphicsBeginImageContextWithOptions(self.view.frame.size, FALSE, 2.0);
} else {
UIGraphicsBeginImageContextWithOptions(self.view.frame.size, FALSE, 1.0);
}
[self.view.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return result;
}
Save Image to Photos
UIImageWriteToSavedPhotosAlbum(YOUR_IMAGE, nil, nil, nil);
How-To
UIImageWriteToSavedPhotosAlbum([self getScreenshotImage], nil, nil, nil);
Two options are available at the site below:
OPTION 1: using UIWindow (tried and works perfectly)
// create graphics context with screen size
CGRect screenRect = [[UIScreen mainScreen] bounds];
UIGraphicsBeginImageContext(screenRect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[[UIColor blackColor] set];
CGContextFillRect(ctx, screenRect);
// grab reference to our window
UIWindow *window = [UIApplication sharedApplication].keyWindow;
// transfer content into our context
[window.layer renderInContext:ctx];
UIImage *screengrab = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
OPTION 2: using UIView
// grab reference to the view you'd like to capture
UIView *wholeScreen = self.splitViewController.view;
// define the size and grab a UIImage from it
UIGraphicsBeginImageContextWithOptions(wholeScreen.bounds.size, wholeScreen.opaque, 0.0);
[wholeScreen.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screengrab = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
For retina screens (as in DenNukem's answer):
// grab reference to our window
UIWindow *window = [UIApplication sharedApplication].keyWindow;
// create graphics context with screen size
CGRect screenRect = [[UIScreen mainScreen] bounds];
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
UIGraphicsBeginImageContextWithOptions(screenRect.size, NO, [UIScreen mainScreen].scale);
} else {
UIGraphicsBeginImageContext(screenRect.size);
}
[window.layer renderInContext:UIGraphicsGetCurrentContext()];
for more detail:
http://pinkstone.co.uk/how-to-take-a-screeshot-in-ios-programmatically/
Get Screenshot From View :
- (UIImage *)takeSnapshotView {
CGRect rect = [myView bounds];//Here you can change your view with myView
UIGraphicsBeginImageContextWithOptions(rect.size,YES,0.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
[myView.layer renderInContext:context];
UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return capturedScreen;//capturedScreen is the image of your view
}
Hope, this is what you're looking for. Any concern get back to me. :)
Another option is to use the Automation tool in Instruments. You write a script to put the screen into whatever state you want, then take the shots. Here is the script I used for one of my apps. Obviously, the details of the script will be different for your app.
var target = UIATarget.localTarget();
var app = target.frontMostApp();
var window = app.mainWindow();
var picker = window.pickers()[0];
var wheel = picker.wheels()[2];
var buttons = window.buttons();
var button1 = buttons.firstWithPredicate("name == 'dateButton1'");
var button2 = buttons.firstWithPredicate("name == 'dateButton2'");
function setYear(picker, year) {
var yearName = year.toString();
var yearWheel = picker.wheels()[2];
yearWheel.selectValue(yearName);
}
function setMonth(picker, monthName) {
var wheel = picker.wheels()[0];
wheel.selectValue(monthName);
}
function setDay(picker, day) {
var wheel = picker.wheels()[1];
var name = day.toString();
wheel.selectValue(name);
}
target.delay(1);
setYear(picker, 2015);
setMonth(picker, "July");
setDay(picker, 4);
button1.tap();
setYear(picker, 2015);
setMonth(picker, "December");
setDay(picker, 25);
target.captureScreenWithName("daysShot1");
var nButtons = buttons.length;
UIALogger.logMessage(nButtons + " buttons");
for (var i=0; i<nButtons; i++) {
UIALogger.logMessage("button " + buttons[i].name());
}
var tabBar = window.tabBars()[0];
var barButtons = tabBar.buttons();
var nBarButtons = barButtons.length;
UIALogger.logMessage(nBarButtons + " buttons on tab bar");
for (var i=0; i<nBarButtons; i++) {
UIALogger.logMessage("button " + barButtons[i].name());
}
var weeksButton = barButtons[1];
var monthsButton = barButtons[2];
var yearsButton = barButtons[3];
target.delay(2);
weeksButton.tap();
target.captureScreenWithName("daysShot2");
target.delay(2);
monthsButton.tap();
target.captureScreenWithName("daysShot3");
target.delay(2);
yearsButton.tap();
target.delay(2);
button2.tap();
target.delay(2);
setYear(picker, 2018);
target.delay(2);
target.captureScreenWithName("daysShot4");
Just a small contribution: I've done this with a button, but pressing it means the button is captured in its pressed state, so I unhighlight it first.
- (IBAction)screenShot:(id)sender {
// Unpress screen shot button
screenShotButton.highlighted = NO;
// create graphics context with screen size
CGRect screenRect = [[UIScreen mainScreen] bounds];
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
} else {
UIGraphicsBeginImageContext(self.view.bounds.size);
}
CGContextRef ctx = UIGraphicsGetCurrentContext();
[[UIColor blackColor] set];
CGContextFillRect(ctx, screenRect);
// grab reference to our window
UIWindow *window = [UIApplication sharedApplication].keyWindow;
// transfer content into our context
[window.layer renderInContext:ctx];
UIImage *screengrab = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// save screengrab to Camera Roll
UIImageWriteToSavedPhotosAlbum(screengrab, nil, nil, nil);
}
I got the main body of the code from:
http://pinkstone.co.uk/how-to-take-a-screeshot-in-ios-programmatically/
where I used option 1; option 2 didn't seem to work for me. I added the adjustments for Retina screen sizes from this thread, and the unhighlighting of the screenShotButton. The view I'm using it on is a storyboarded screen of buttons and labels, with several UIViews added later programmatically.
In Swift you can use following code.
if UIScreen.mainScreen().respondsToSelector(Selector("scale")) {
UIGraphicsBeginImageContextWithOptions(self.window!.bounds.size, false, UIScreen.mainScreen().scale)
}
else{
UIGraphicsBeginImageContext(self.window!.bounds.size)
}
self.window?.layer.renderInContext(UIGraphicsGetCurrentContext()!)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
Swift 4:
func makeImage(withView view: UIView) -> UIImage? {
let rect = view.bounds
UIGraphicsBeginImageContextWithOptions(rect.size, true, 0)
guard let context = UIGraphicsGetCurrentContext() else {
assertionFailure()
return nil
}
view.layer.render(in: context)
guard let image = UIGraphicsGetImageFromCurrentImageContext() else {
assertionFailure()
return nil
}
UIGraphicsEndImageContext()
return image
}
