iPhone: Capture iOS Camera with Overlay View

In my application I am capturing a picture through the camera with an overlay view. The overlay contains a custom button through which I want to capture the whole screen. The overlay view is transparent in the areas where I want to capture the camera image. I am doing it like this:
- (IBAction)capture:(id)sender
{
    [self setBackgroundColor:[UIColor clearColor]];
    UIGraphicsBeginImageContext(self.frame.size);
    // Render the overlay's presentation layer into the image context.
    [self.layer.presentationLayer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}
It captures the overlay view, but where the overlay is transparent (the area where the camera preview should show through) it captures black instead of the photo. Can anyone tell me what I am doing wrong?

I found that screen capture is one way to capture the camera view together with the overlay, but the preview layer did not appear in the screen-captured video (when recording video). Look at the MyAVControllerDemo code to get a clear idea; I used IAScreenCaptureView to capture video, or a simple snapshot. It is now working properly.

Use the AVFoundation framework to fix your problem.
See this link: http://code4app.net/ios/Camera-With-AVFoundation/5003cb1d6803fa9a2c000000
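As a rough illustration of that approach, here is a minimal sketch (the session and stillOutput properties and the method names are illustrative assumptions, not taken from the linked demo). The idea is that the live feed is shown by an AVCaptureVideoPreviewLayer inserted behind the transparent overlay, and the photo is captured from the session itself rather than with renderInContext:, which cannot see the camera feed:
#import <AVFoundation/AVFoundation.h>

- (void)setupCamera
{
    self.session = [[AVCaptureSession alloc] init];

    // Feed the back camera into the session.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    self.stillOutput = [[AVCaptureStillImageOutput alloc] init];
    [self.session addOutput:self.stillOutput];

    // Show the live feed behind the transparent parts of the overlay.
    AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    preview.frame = self.view.bounds;
    [self.view.layer insertSublayer:preview atIndex:0];

    [self.session startRunning];
}

- (void)captureStill
{
    AVCaptureConnection *connection = [self.stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
            NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
            UIImage *photo = [UIImage imageWithData:data];
            // Composite `photo` with the overlay here if needed, then save.
            UIImageWriteToSavedPhotosAlbum(photo, nil, nil, nil);
        }];
}
Note that AVCaptureStillImageOutput matches the era of this thread; on iOS 10 and later you would use AVCapturePhotoOutput instead.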

Related

How do you screenshot an ARKit camera view?

I'm looking for a way to capture a screenshot or video of the ARScene without the buttons and other subviews.
Traditional screen capture records everything visible on the screen, including UI controls.
ARSCNView has a built-in snapshot method that returns a UIImage of the ARScene.
So in Objective-C you could do:
UIImage *image = [sceneView snapshot];
and in Swift:
let image = sceneView.snapshot()

Screenshot the top most layer/view

I have a UIView which contains another UIView with a UIImage.
dView = [[drawView alloc] initWithFrame:myUIView.frame];
[dView newMaskWithColor:[UIColor colorWithPatternImage:chosenImage]];
[myUIView addSubview:dView];
By using this code I erased a part of it (screenshot omitted).
I then added a layer behind this view and presented another image in that layer:
[myUIView.layer insertSublayer:_videoPreviewLayer below:dView.layer];
A screenshot taken manually on the device shows the expected result, but when I take a screenshot of the same view programmatically, the newly added video preview layer does not appear in it, and I don't know why. Here is my screenshot method:
- (void)takeSnapShot
{
    // Tried capturing the top-most view in the key window:
    UIView *topView = [[[[UIApplication sharedApplication] keyWindow] subviews] lastObject];

    // Capture the screenshot and save it to the camera roll.
    UIGraphicsBeginImageContext(self.myUIView.frame.size);
    [self.myUIView.layer renderInContext:UIGraphicsGetCurrentContext()]; // tried this
    [self.dView.layer renderInContext:UIGraphicsGetCurrentContext()];    // tried this
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];     // tried this
    [topView.layer renderInContext:UIGraphicsGetCurrentContext()];       // tried this
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
}
I call this method on a button tap in my view controller.
I even tried capturing the window, as suggested in this answer, but I still don't get the result I get from screenshotting manually.
Is there any way to capture the top-most view that is shown to the user?
Check out this post: Why don't added sublayers show up in screenshot?
It seems that renderInContext: cannot capture the contents of a video preview layer such as AVCaptureVideoPreviewLayer.
When you want to take a screenshot, try storing a frame from the preview layer in a UIImageView; then you should be able to manually put that frame underneath your mask and just screenshot the superview.
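A rough sketch of that idea, assuming an AVCaptureSession is already running (FrameGrabber and latestFrame are hypothetical names): attach an AVCaptureVideoDataOutput to the session with the grabber as its sample buffer delegate, keep the latest frame around as a UIImage, and drop it into a UIImageView below dView right before rendering the superview:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

@interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (atomic, strong) UIImage *latestFrame;
@end

@implementation FrameGrabber
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert each video frame to a UIImage via Core Image.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    self.latestFrame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}
@end
Right before taking the snapshot, assign grabber.latestFrame to a UIImageView inserted below dView, then render the superview as before.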

Screenshot programmatically and blur in Swift

I want to take a screenshot of my app programmatically. The code works fine, but I have a UIVisualEffectView with a blur effect, and the screenshot gives me the image without the blur!
How can I make the screenshot include the blur as well?
UIGraphicsBeginImageContextWithOptions(fullView.bounds.size, true, 1)
fullView.layer.render(in: UIGraphicsGetCurrentContext()!)
let viewImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(viewImage!, nil, nil, nil)
To retain the UIVisualEffectView blur when taking an image of the screen programmatically, you must capture the entire screen.
Here is the more technical explanation provided by Apple:
Many effects require support from the window that hosts the UIVisualEffectView. Attempting to take a snapshot of only the UIVisualEffectView will result in a snapshot that does not contain the effect. To take a snapshot of a view hierarchy that contains a UIVisualEffectView, you must take a snapshot of the entire UIWindow or UIScreen that contains it. - Apple Documentation
After taking a screenshot, a blur effect can be added programmatically. Here's an example using the GPUImage library, which can be installed with CocoaPods. A similar effect can also be achieved with native Apple frameworks, as sketched after the snippet below.
GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:snapShot];
GPUImageiOSBlurFilter *blurFilter = [[GPUImageiOSBlurFilter alloc] init];
[blurFilter setBlurRadiusInPixels:4];
[picture addTarget:blurFilter];
[blurFilter useNextFrameForImageCapture];
[picture processImage];
UIImage *processed = [blurFilter imageFromCurrentFramebuffer];
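For the native route, here is a minimal Core Image sketch (assuming snapShot is the UIImage captured above; CIGaussianBlur is one of the built-in filters):
#import <CoreImage/CoreImage.h>

CIImage *input = [CIImage imageWithCGImage:snapShot.CGImage];
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:input forKey:kCIInputImageKey];
[blur setValue:@4.0 forKey:kCIInputRadiusKey];

// Crop back to the original extent, since the blur grows the image edges.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:blur.outputImage fromRect:input.extent];
UIImage *blurred = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);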
I had this same problem. I had a view controller with a blur effect applied to a background image and some non-blurred views on top, and I wanted to take a screenshot of that controller.
As highlighted by @user3721428, trying to capture the view doesn't work. To my surprise, rendering the window's layer in a context doesn't work either.
What worked was getting a snapshot view from UIScreen:
let view = UIScreen.main.snapshotView(afterScreenUpdates: true)
...and then grabbing an image from that view using a method like the one in @neave's answer here.
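A minimal Objective-C sketch of that last step; note that a snapshot view can render blank unless it is attached to a visible window, so it is added to the key window temporarily here:
UIView *snapshot = [[UIScreen mainScreen] snapshotViewAfterScreenUpdates:YES];
UIWindow *window = [[UIApplication sharedApplication] keyWindow];
[window addSubview:snapshot];

// Render the snapshot view into a UIImage.
UIGraphicsBeginImageContextWithOptions(snapshot.bounds.size, NO, 0);
[snapshot drawViewHierarchyInRect:snapshot.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

[snapshot removeFromSuperview];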

Capture Screen on Camera

Hello everyone. I want to capture a screenshot while the camera is shown. The scenario: I am adding an overlay view to the camera, and when the user adjusts the camera and taps the capture button, I want to generate an image of what is on screen. I've tried taking a screenshot using this code, but only the overlay is captured, not the photo; the camera area is blank.
I've also seen this answer, but it only captures the image, not the overlay view.
You can take the image received from the UIImagePickerController (the one received in the imagePickerController:didFinishPickingMediaWithInfo: delegate method) and merge it with your overlay view like this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *cameraImage = ...;  // the image captured by the camera
    UIImage *overlayImage = ...; // your overlay
    UIImage *computedImage = nil;

    // Draw the camera image, then the overlay on top of it.
    UIGraphicsBeginImageContextWithOptions(cameraImage.size, NO, 0.0f);
    [cameraImage drawInRect:CGRectMake(0, 0, cameraImage.size.width, cameraImage.size.height)];
    [overlayImage drawInRect:CGRectMake(0, 0, overlayImage.size.width, overlayImage.size.height)];
    computedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_async(dispatch_get_main_queue(), ^{
        // Don't forget to go back to the main thread to access the UI again.
    });
});
EDIT: I added the dispatch_async calls to avoid blocking the UI.
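For context, here is a minimal sketch of where that merge would run; mergeImage:withOverlay: and overlayImage are hypothetical names wrapping the snippet above:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // The full-resolution photo the camera captured.
    UIImage *cameraImage = info[UIImagePickerControllerOriginalImage];

    // Merge with the overlay (hypothetical helper wrapping the code above).
    [self mergeImage:cameraImage withOverlay:self.overlayImage];

    [picker dismissViewControllerAnimated:YES completion:nil];
}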

Need a very tiny (rectangular) overlay over UIImagePickerController, then crop the image accordingly - UPDATED

In my application, I need the user to take a snap of only a 10-letter word (using an overlay rectangle, which should be right in the centre of the UIImagePicker screen), and then I need to show him that image (only the part covered by the rectangle). So I need to crop the image according to the overlay.
Here, I have taken a picture using UIImagePickerController. Now I want to see the dimensions of the image I have taken:
UIImage *imageToprocess = [info objectForKey:UIImagePickerControllerOriginalImage];
NSLog(@"image width %f", imageToprocess.size.width);
NSLog(@"image height %f", imageToprocess.size.height);
I see the following result in the console. But how is this possible? The dimensions of the image exceed the dimensions of the iPhone screen (320 x 568):
UsingTesseractOCR[524:60b] image width 2448.000000
2013-12-17 16:02:18.962 UsingTesseractOCR[524:60b] image height 3264.000000
Can anybody help me out here? I have gone through several questions here, but did not understand how to do it. Please help.
Refer to this sample code for image capturing and cropping:
https://github.com/kishikawakatsumi/CropImageSample
To create the overlay, first create a custom view (with the full dimensions of the camera preview) and give it a transparent background image containing just the rectangle. Use this view as the overlay view.
myview = [[UIImageView alloc] init];
// 431 = device height minus the height of the toolbar at the bottom
// that holds the picker's camera controls (480 - 49 on iPhone 4).
myview.frame = CGRectMake(0, 0, 320, 431);
myview.backgroundColor = [UIColor clearColor];
myview.opaque = NO;
myview.image = [UIImage imageNamed:@"A45Box.png"];
myview.userInteractionEnabled = YES;
Note that you should create the background image with the appropriate dimensions. You could also draw the rectangle programmatically, but this way is much easier.
Secondly, regarding your cropping issue, you will have to get your hands dirty. Try these links for help, and see the sketch after them:
https://github.com/iosdeveloper/ImageCropper
https://github.com/barrettj/BJImageCropper
https://github.com/ardalahmet/SSPhotoCropperViewController
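As a starting point for the cropping itself, here is a minimal sketch. Assumptions: a portrait photo, an overlayRect given in the 320-pt-wide preview's coordinate space, and EXIF orientation ignored for brevity. The photo is much larger than the screen because the camera returns the full sensor resolution (e.g. 2448 x 3264), so the overlay rect has to be scaled up before cropping:
UIImage *photo = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect overlayRect = CGRectMake(60, 200, 200, 40); // hypothetical overlay rectangle in points

// Scale the rect from screen points up to image pixels.
CGFloat scale = photo.size.width / 320.0;
CGRect cropRect = CGRectMake(overlayRect.origin.x * scale,
                             overlayRect.origin.y * scale,
                             overlayRect.size.width * scale,
                             overlayRect.size.height * scale);

CGImageRef croppedRef = CGImageCreateWithImageInRect(photo.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                       scale:photo.scale
                                 orientation:photo.imageOrientation];
CGImageRelease(croppedRef);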
