iOS programmatically taking a screenshot messes up the image - ios

I want to take a screenshot every time the user holds the screen for one second and then open up an email window to let the user send the image. But I have a strange problem: the image gets messed up if a gradient image is present (a UIImage with a PNG gradient loaded into it - I will explain).
So I created a UILongPressGestureRecognizer, set its minimumPressDuration to 1.0f and added it to the main view as the gesture recognizer: [self.view addGestureRecognizer:myRecognizer]. The recognizer calls a method, let's say shareClicked, in which I want to capture the current screen and pop up the email composer with that image in it. Here is how I do it:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
else
    UIGraphicsBeginImageContext(self.view.bounds.size);

[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

MFMailComposeViewController *mailComposer = [[MFMailComposeViewController alloc] init];
mailComposer.mailComposeDelegate = self;
[mailComposer setSubject:@"Share feature"];
[mailComposer setMessageBody:@"I'm sharing this with you because it's cool." isHTML:NO];
[mailComposer addAttachmentData:UIImagePNGRepresentation(image) mimeType:@"image/png" fileName:@"image"];
[self presentViewController:mailComposer animated:YES completion:NULL];
BUT, there is a problem. There is a table view on the main view, and in its third cell there is the gradient I mentioned earlier. I'm not sure how to explain it, but it's something like this:
As visible in the image, there is a gradient at the top and another one a bit lower. The top one renders normally while the second one causes a strange problem. The gradient is not actually blue; it's white fading to transparent. Here is an image of the gradient on a black surface:
Here is a screenshot from the simulator showing how it should look:
And finally, here is how it gets rendered and displayed in the mail composer:
What am I doing wrong? What is the problem? It's not the simulator's fault, because the same thing happens on a real device. It's iOS 7, if that makes any difference.
Just to be clear, the top gradient is the same but radial, and it renders perfectly. Both gradients are subviews of the table view's cell. How can I fix it? Hopefully we find a solution, as I did spend some time "crafting" this question :)

Have a look at the recommended solution https://developer.apple.com/library/ios/qa/qa1703/_index.html#//apple_ref/doc/uid/DTS40010193, as UIGetScreenImage is no longer allowed. A sketch of that technique follows.
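The gist of the linked Q&A, paraphrased from Apple's sample (it renders every window of the app, applying each window's transform, so the result matches what is actually on screen):

CGSize imageSize = [[UIScreen mainScreen] bounds].size;
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
else
    UIGraphicsBeginImageContext(imageSize);

CGContextRef context = UIGraphicsGetCurrentContext();
// iterate over every window, skipping windows that belong to external screens
for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
    if (![window respondsToSelector:@selector(screen)] || [window screen] == [UIScreen mainScreen]) {
        CGContextSaveGState(context);
        // position and transform the context so the window lands where it appears on screen
        CGContextTranslateCTM(context, [window center].x, [window center].y);
        CGContextConcatCTM(context, [window transform]);
        CGContextTranslateCTM(context,
                              -[window bounds].size.width * [[window layer] anchorPoint].x,
                              -[window bounds].size.height * [[window layer] anchorPoint].y);
        [[window layer] renderInContext:context];
        CGContextRestoreGState(context);
    }
}
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();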

Related

Translucent Modal ViewController - how to handle rotation

I would like to display a UIViewController modally and be able to see a blurred version of the view that presented it.
Following a number of similar questions such as this:
iOS 7 Translucent Modal View Controller
I have added a background to my controller's view that is based on the captured view of the presenting controller. The problem I am facing is that my app supports multiple orientations and when the modal view is presented and rotated, the underlying background image no longer matches.
I tried grabbing a fresh snapshot of the presenting viewController in didRotateFromInterfaceOrientation: of the modal viewController, but it appears that the UI of the presenting viewController is not being updated and the resulting image is still the wrong orientation. Is there any way to force redrawing of a view that is being hidden by the modal one?
After long consideration, I have come up with a passable way to handle it. How well it works depends a bit on the type of content in the presenting viewController.
The general idea is to take not one, but two screenshots before presenting a new viewController - one for portrait, one for landscape. This is achieved by changing the frames of the top viewController and navigation bar (if applicable) to emulate a different orientation, taking the screenshot of the result, and changing it back. The user never sees this change on device, but the screen grab still displays a new orientation.
The exact code will depend on where you are calling it from, but the main logic is the same. My implementation runs from AppDelegate because it is reused by several subclasses of UIViewController.
The following is the code that will grab the appropriate screenshots.
// get references to the views you need a screenshot of
// this may vary depending on your app hierarchy
UIView *container = [self.window.subviews lastObject]; // UILayoutContainerView
UIView *subview = container.subviews[0];               // UINavigationTransitionView
UIView *navbar = container.subviews[1];                // UINavigationBar
CGSize originalSubviewSize = subview.frame.size;
CGSize originalNavbarSize = navbar.frame.size;

// compose the current view of the navbar and subview
UIImage *currentComposed = [self composeForeground:navbar withBackground:subview];

// "rotate" the subview by swapping its width and height
subview.frame = CGRectMake(subview.frame.origin.x, subview.frame.origin.y, originalSubviewSize.height, originalSubviewSize.width);
// the navbar has to match the width of the subview; its height remains the same
navbar.frame = CGRectMake(navbar.frame.origin.x, navbar.frame.origin.y, originalSubviewSize.height, originalNavbarSize.height);

// compose the rotated view
UIImage *rotatedComposed = [self composeForeground:navbar withBackground:subview];

// change the frames back to normal
subview.frame = CGRectMake(subview.frame.origin.x, subview.frame.origin.y, originalSubviewSize.width, originalSubviewSize.height);
navbar.frame = CGRectMake(navbar.frame.origin.x, navbar.frame.origin.y, originalNavbarSize.width, originalNavbarSize.height);

// assign the variables depending on the actual orientation
UIImage *landscape;
UIImage *portrait;
if (originalSubviewSize.height > originalSubviewSize.width) {
    // current orientation is portrait
    portrait = currentComposed;
    landscape = rotatedComposed;
} else {
    // current orientation is landscape
    portrait = rotatedComposed;
    landscape = currentComposed;
}

CustomTranslucentViewController *vc = [CustomTranslucentViewController new];
vc.backgroundSnap = portrait;
vc.backgroundSnapLandscape = landscape;
[rootVC presentViewController:vc animated:YES completion:nil]; // rootVC: the presenting (root) view controller
The method composeForeground:withBackground: is a convenience method that generates an appropriate background image from two input views (navigation bar + view controller). Aside from composing the two views together, it does a bit more magic to make the result look more natural when the presented viewController rotates. Specifically, it extends the screenshot to a 1024x1024 square and fills the extra space with a mirrored copy of the composed image. In many cases this looks good enough once blurred, since the animation of the views redrawing for the orientation change is not available.
- (UIImage *)composeForeground:(UIView *)frontView withBackground:(UIView *)backView {
    UIGraphicsBeginImageContextWithOptions(backView.frame.size, NO, 0);
    [backView.layer renderInContext:UIGraphicsGetCurrentContext()];
    // translation is necessary to account for the extra 20 points taken up by the status bar
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), frontView.frame.origin.x, frontView.frame.origin.y);
    [frontView.layer renderInContext:UIGraphicsGetCurrentContext()];
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), -frontView.frame.origin.x, -frontView.frame.origin.y);
    // this is the core image; we could stop here if we did not need the fancy mirrored tiling
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // add mirrored sections
    CGFloat addition = 256; // 1024 - 768
    if (newImage.size.height > newImage.size.width) {
        // portrait, add a mirrored image on the right
        UIImage *horizMirror = [[UIImage alloc] initWithCGImage:newImage.CGImage scale:newImage.scale orientation:UIImageOrientationUpMirrored];
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(newImage.size.width + addition, newImage.size.height), NO, 0);
        [horizMirror drawAtPoint:CGPointMake(newImage.size.width, 0)];
    } else {
        // landscape, add a mirrored image at the bottom
        UIImage *vertMirror = [[UIImage alloc] initWithCGImage:newImage.CGImage scale:newImage.scale orientation:UIImageOrientationDownMirrored];
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(newImage.size.width, newImage.size.height + addition), NO, 0);
        [vertMirror drawAtPoint:CGPointMake(0, newImage.size.height)];
    }

    // combine the mirrored extension with the original image
    [newImage drawAtPoint:CGPointZero];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // for iOS 6, crop off the top 20 points (the status bar)
    if (SYSTEM_VERSION_LESS_THAN(@"7")) {
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(newImage.size.width, newImage.size.height - 20), NO, 0);
        [newImage drawAtPoint:CGPointMake(0, -20)];
        newImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
    return newImage;
}
The resulting landscape and portrait images can be blurred and tinted as desired, and set as the background for the presented viewController. Use the willRotateToInterfaceOrientation:duration: method of this viewController to select the appropriate image, as sketched below.
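Something like this in the presented viewController, for example (a sketch; backgroundImageView is an assumed property holding the snapshot's image view):

// Sketch: swap the snapshot when the presented controller is about to rotate.
// backgroundImageView is an assumed property; backgroundSnap and
// backgroundSnapLandscape are the properties set before presenting.
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                duration:(NSTimeInterval)duration
{
    if (UIInterfaceOrientationIsLandscape(toInterfaceOrientation)) {
        self.backgroundImageView.image = self.backgroundSnapLandscape;
    } else {
        self.backgroundImageView.image = self.backgroundSnap;
    }
}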
Note: I have tried to reduce the amount of work done on images and graphics contexts as much as possible, but there is still a slight delay when generating the background (around 30-90 ms per composeForeground:withBackground: iteration, depending on the content, on a vintage slow iPad 2). If you know of a way to further optimize or simplify the above solution, please share!

iOS UIImagePickerController annoying rotation (display picture) issue

I'm writing an app that uses UIImagePickerController.
The task is very simple: the user needs to take a picture of a thing placed inside an overlay (the overlay is just a circle) and then display it. It works fine when the user holds the iPhone vertically: the picture is cropped and displayed fine. The problem is when the user rotates the iPhone horizontally. I suppose the camera is rotating, and I can't display the photo as I'd like.
Any ideas how to remove the autorotation?
Device orientations in my target -> General is Portrait only.
I tried overriding the autorotation methods in UIImagePickerController and it didn't work.
Any idea how to fix it? Or any tricky method to display the image correctly?
Best regards,
David.
EDIT: The solution for this is to use AVCam and write your "own" class to take the picture; see the sketch below.
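For reference, a minimal sketch of that approach using the iOS 7 era AVCaptureStillImageOutput (the variable names are mine; the key point is locking the connection's videoOrientation so the capture never autorotates):

#import <AVFoundation/AVFoundation.h>

// Minimal still-capture sketch with a locked orientation (assumed names).
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) [session addInput:input];

AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
[session addOutput:stillOutput];
[session startRunning];

// later, when the user taps the shutter:
AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
connection.videoOrientation = AVCaptureVideoOrientationPortrait; // lock to portrait, no autorotation
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:^(CMSampleBufferRef buffer, NSError *err) {
    if (!buffer) return;
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
    UIImage *photo = [UIImage imageWithData:jpegData];
    // crop/display photo under your circular overlay here
}];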
I think this works:
- (void)imagePickerController:(UIImagePickerController *)pickerinput didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [pickerinput dismissViewControllerAnimated:YES completion:nil];
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImageView *yourImgView = [[UIImageView alloc] initWithImage:image];
    // please add this: rotate the image view a quarter turn to compensate
    yourImgView.transform = CGAffineTransformMakeRotation(M_PI/2);
    // (remember to add yourImgView to your view hierarchy to display it)
}
Try this:
// rebuild the UIImage with an explicit scale and orientation
_img = [UIImage imageWithCIImage:_image scale:_imgScale orientation:_imgOrient];
_imgOrient = _img.imageOrientation;
_imgScale = _img.scale;
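A related trick that often helps with picker photos (a sketch, not from the answers above): redraw the image once so the EXIF orientation flag is baked into the pixels, making later display and cropping orientation-independent.

// Sketch: normalize a picked image by redrawing it, which bakes the
// orientation flag into the actual pixel data.
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *normalized = image;
if (image.imageOrientation != UIImageOrientationUp) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}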

need a very tiny (rectangular in shape) overlay over UIImagePickerController, and then crop the image accordingly - UPDATED

In my application, I need the user to take a snap of only a 10-letter word (using an overlay, which should be right in the centre of the UIImagePicker's screen), and then I need to show him that image (only the part of the image covered by that rectangle). So, I need to crop that image according to the overlay.
Here, I have taken a picture using UIImagePickerController. Now, I want to see the dimensions of the image I have taken:
UIImage *imageToProcess = [info objectForKey:UIImagePickerControllerOriginalImage];
NSLog(@"image width %f", imageToProcess.size.width);
NSLog(@"image height %f", imageToProcess.size.height);
I see the following result on the console. But how is this possible? The dimensions of the image exceed the dimensions of the iPhone screen (which is 320 x 568):
UsingTesseractOCR[524:60b] image width 2448.000000
2013-12-17 16:02:18.962 UsingTesseractOCR[524:60b] image height 3264.000000
Can anybody help me out here? I have gone through several questions here, but did not understand how to do it.
Please help.
Refer to this sample code for image capturing and cropping:
https://github.com/kishikawakatsumi/CropImageSample
For creating the overlay, first create a custom view (with the full dimensions of the camera preview) and give it a transparent background image with just a rectangle drawn on it. Use this view as the overlay view.
myview = [[UIImageView alloc] init];
// why 431? because height = height of device - height of the tab bar
// holding the picker's camera controls at the bottom
// (for iPhone 4: 480 - 49)
myview.frame = CGRectMake(0, 0, 320, 431);
myview.backgroundColor = [UIColor clearColor];
myview.opaque = NO;
myview.image = [UIImage imageNamed:@"A45Box.png"];
myview.userInteractionEnabled = YES;
Note that you should create the background image with appropriate dimensions. You could also draw the rectangle programmatically, but this is a much easier way.
Secondly, regarding your cropping issue, you have to get your hands dirty. Try these links for help (a minimal cropping sketch follows the list):
https://github.com/iosdeveloper/ImageCropper
https://github.com/barrettj/BJImageCropper
https://github.com/ardalahmet/SSPhotoCropperViewController
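For reference, the core cropping step looks roughly like this (a sketch; cropRect is an assumed input, the overlay rectangle already scaled into the image's pixel coordinate system, which is the mapping the libraries above handle for you):

// Sketch: crop a UIImage to a rect given in the image's pixel coordinates.
// Remember the camera image (e.g. 2448 x 3264) is much larger than the
// screen, so the overlay rect must be scaled up before cropping.
CGImageRef croppedRef = CGImageCreateWithImageInRect(imageToProcess.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef
                                            scale:imageToProcess.scale
                                      orientation:imageToProcess.imageOrientation];
CGImageRelease(croppedRef);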

Blurring a UIView that contains the keyboard with content beneath it (iOS 7)

I have a UIView containing a UITableView that extends beneath the keyboard. The content in the table view is bright enough that it is clearly visible behind the keyboard. I'm attempting to take a screenshot of the entire view in order to blur it, using the following code:
- (UIImage *)screenshotFromView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
However, the returned image does not render the keyboard as transparent. This presents an odd transition when going from the non-blurred view to the blurred view, since there is clearly content behind the keyboard before the transition to the blurred image.
Is it possible to take a screenshot of the entire screen, without use of private APIs, while still keeping the transparency of the keyboard + the status bar?
I ran into exactly the same problem recently, so I know exactly what you want. I wanted the whole UI, including the keyboard, blurred behind a message, and the keyboard is not included by any regular screenshot method. My cure is the following code:
- (UIImage *)screenShotWithKeyboard:(UIView *)viewToShoot
{
    // the keyboard lives in its own window (UITextEffectsWindow)
    UIWindow *keyboard = nil;
    for (UIWindow *window in [[UIApplication sharedApplication] windows])
    {
        if ([[window description] hasPrefix:@"<UITextEffectsWin"])
        {
            keyboard = window;
            break;
        }
    }

    // define the dimensions of the screenshot you want to take (the entire screen in this case)
    CGSize size = [[UIScreen mainScreen] bounds].size;

    // create the screenshot
    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // delete the following line if you only want the keyboard
    [[viewToShoot layer] renderInContext:context];
    if (keyboard != nil)
        [[keyboard layer] renderInContext:context];
    UIImage *screenImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenImg;
}
I got the idea from an article by Aran Balkan; I broke it down to one method and tested it on iOS 7, where it seems to work for me. The article is worth reading, as he explains the tricks behind it a little. If you only want the actual keyboard as an image, comment out the line I marked in the code. With that keyboard image you can do your blur stuff yourself.
The code is far from perfect, but I think you got the idea.
Two things at the end:
First, I am a newcomer to Objective-C and iOS development, so if you find any problematic bugs, a comment to improve this answer is very welcome.
Second, I wrote this code today for my app and I do not know yet whether it violates any developer rules for iOS. At the moment I do not see any problems, but I will investigate further, as I want to release my app with this graphic trick. I will keep this post updated. Until then, as with point one, I would highly appreciate any comments regarding this issue.
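A quick usage sketch (assuming the method above plus a blur helper such as the UIImage+ImageEffects category from the answers below):

// Hypothetical usage: capture the screen including the keyboard, blur it,
// and lay the result over the current UI.
UIImage *snapshot = [self screenShotWithKeyboard:self.view];
UIImage *blurred = [snapshot applyLightEffect]; // assumed: UIImage+ImageEffects is in the project
UIImageView *overlay = [[UIImageView alloc] initWithImage:blurred];
overlay.frame = self.view.bounds;
[self.view addSubview:overlay];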
Have you considered using UIKeyboardAppearanceDark? Currently the default value of keyboardAppearance corresponds to UIKeyboardAppearanceLight, so it may not be suited to your use case.
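For instance (one line; myTextField is a placeholder):

myTextField.keyboardAppearance = UIKeyboardAppearanceDark; // a dark keyboard may blend better with blurred content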

Blur screen with iOS 7's snapshot API

I believe the NDA is down, so I can ask this question. I have a UIView subclass:
BlurView *blurredView = ((BlurView *)[self.view snapshotViewAfterScreenUpdates:NO]);
blurredView.frame = self.view.frame;
[self.view addSubview:blurredView];
It does its job so far in capturing the screen, but now I want to blur that view. How exactly do I go about this? From what I've read, I need to capture the current contents of the view (context?!), convert it to a CIImage (no?), then apply a CIGaussianBlur to it and draw it back on the view.
How exactly do I do that?
P.S. The view is not animated, so it should be OK performance wise.
EDIT: Here is what I have so far. The problem is that I can't capture the snapshot into a UIImage; I get a black screen. But if I add the view as a subview directly, I can see the snapshot is there.
// Snapshot
UIView *view = [self.view snapshotViewAfterScreenUpdates:NO];
// Convert to UIImage
UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Apply the UIImage to a UIImageView
BlurView *blurredView = [[BlurView alloc] initWithFrame:CGRectMake(0, 0, 500, 500)];
[self.view addSubview:blurredView];
blurredView.imageView.image = img;
// Black screen -.-
BlurView.m:
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        self.imageView = [[UIImageView alloc] init];
        self.imageView.frame = CGRectMake(20, 20, 200, 200);
        [self addSubview:self.imageView];
    }
    return self;
}
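(For the record, the CIImage/CIGaussianBlur route asked about above looks roughly like this - a sketch, assuming img is a valid, non-black UIImage:)

#import <CoreImage/CoreImage.h>

// Sketch of the Core Image blur pipeline (img is an assumed valid UIImage).
CIContext *ciContext = [CIContext contextWithOptions:nil];
CIImage *inputImage = [CIImage imageWithCGImage:img.CGImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:inputImage forKey:kCIInputImageKey];
[blurFilter setValue:@8.0 forKey:kCIInputRadiusKey];
CIImage *outputImage = [blurFilter valueForKey:kCIOutputImageKey];
// crop back to the original extent; Gaussian blur expands the image's edges
CGImageRef cgResult = [ciContext createCGImage:outputImage fromRect:inputImage.extent];
UIImage *blurredImage = [UIImage imageWithCGImage:cgResult];
CGImageRelease(cgResult);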
Half of this question didn't get answered, so I thought it worth adding.
The problem with UIScreen's
- (UIView *)snapshotViewAfterScreenUpdates:(BOOL)afterUpdates
and UIView's
- (UIView *)resizableSnapshotViewFromRect:(CGRect)rect
                       afterScreenUpdates:(BOOL)afterUpdates
                            withCapInsets:(UIEdgeInsets)capInsets
is that you can't derive a UIImage from them - the 'black screen' problem.
In iOS 7, Apple provides a third piece of API for extracting UIImages, a method on UIView:
- (BOOL)drawViewHierarchyInRect:(CGRect)rect
             afterScreenUpdates:(BOOL)afterUpdates
It is not as fast as snapshotView, but not bad compared to renderInContext (in the example provided by Apple, it is five times faster than renderInContext and three times slower than snapshotView).
Example use:
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then to get a blurred version
UIImage* lightImage = [newImage applyLightEffect];
where applyLightEffect is one of those Blur category methods on Apple's UIImage+ImageEffects category mentioned in the accepted answer (the enticing link to this code sample in the accepted answer doesn't work, but this one will get you to the right page: the file you want is iOS_UIImageEffects).
The main reference is to WWDC2013 session 226, Implementing Engaging UI on iOS
By the way, there is an intriguing note in Apple's reference docs for renderInContext that hints at the black screen problem:
Important: The OS X v10.5 implementation of this method does not support the entire Core Animation composition model. QCCompositionLayer, CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or a mask values. Future versions of OS X may add support for rendering these layers and properties.
The note hasn't been updated since 10.5, so I guess 'future versions' may still be a while off, and we can add our new CASnapshotLayer (or whatever) to the list.
Sample code from WWDC: iOS_UIImageEffects.
There is a UIImage category named UIImage+ImageEffects.
Here is its API:
- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius
                       tintColor:(UIColor *)tintColor
           saturationDeltaFactor:(CGFloat)saturationDeltaFactor
                       maskImage:(UIImage *)maskImage;
For legal reasons I can't show the implementation here, but a demo project is included. It should be pretty easy to get started with.
To summarize how to do this with foundry's sample code, use the following:
I wanted to blur the entire screen just slightly, so for my purposes I'll use the main screen bounds.
CGRect screenCaptureRect = [UIScreen mainScreen].bounds;
UIView *viewWhereYouWantToScreenCapture = [[UIApplication sharedApplication] keyWindow];
//screen capture code
UIGraphicsBeginImageContextWithOptions(screenCaptureRect.size, NO, [UIScreen mainScreen].scale);
[viewWhereYouWantToScreenCapture drawViewHierarchyInRect:screenCaptureRect afterScreenUpdates:NO];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//blur code
UIColor *tintColor = [UIColor colorWithWhite:1.0 alpha:0];
UIImage *blurredImage = [capturedImage applyBlurWithRadius:1.5 tintColor:tintColor saturationDeltaFactor:1.2 maskImage:nil];
//or use [capturedImage applyLightEffect] but I thought that was too much for me
//use blurredImage in whatever way you so desire!
Notes on the screen capture part
The 2nd argument of UIGraphicsBeginImageContextWithOptions() is opaqueness. It should be NO unless nothing in your view has an alpha other than 1. If you pass YES, the capture will ignore transparency values, so it will be faster but will probably be wrong.
The 3rd argument is the scale. You probably want to pass in the device's scale, as I did, to be sure to differentiate between retina and non-retina. I haven't fully tested this, but 0.0f also works: it uses the scale of the device's main screen.
drawViewHierarchyInRect:afterScreenUpdates: - watch out what you pass for the screen-updates BOOL. I tried to do this right before backgrounding, and if I didn't pass NO the app would go crazy with glitches when I returned to the foreground. You might be able to get away with YES, though, if you're not leaving the app.
Notes on blurring
I have a very light blur here. Changing the blurRadius will make it blurrier, and you can change the tint color and alpha to make all sorts of other effects.
Also you need to add a category for the blur methods to work...
How to add the UIImage+ImageEffects category
You need to download the category UIImage+ImageEffects for the blur to work. Download it here after logging in: https://developer.apple.com/downloads/index.action?name=WWDC%202013
Search for "UIImageEffects" and you'll find it. Just pull out the 2 necessary files and add them to your project. UIImage+ImageEffects.h and UIImage+ImageEffects.m.
Also, I had to enable modules in my build settings because my project wasn't created with Xcode 5. To do this, go to your target's build settings, search for "modules", and make sure that "Enable Modules" and "Link Frameworks Automatically" are both set to Yes, or you'll get compiler errors with the new category.
Good luck blurring!
Check WWDC 2013 sample application "running with a snap".
The blurring is there implemented as a category.

Resources