Capture Screen on Camera - iOS

Hello everyone. I want to capture a screenshot while the camera is shown. The scenario: I'm adding an overlay view to the camera, and when the user adjusts the camera and taps the capture button, I want to generate an image of what is on screen. I've tried taking a screenshot using this code, but only the overlay is captured, not the camera image; the camera area is blank.
I've also seen this answer, but it only captures the camera image, not the overlay view.

You can take the image received from the UIImagePickerController (the one passed to the imagePickerController:didFinishPickingMediaWithInfo: delegate method) and merge it with your overlay view like this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *cameraImage = ...;  // the image captured by the camera
    UIImage *overlayImage = ...; // your overlay, rendered as a UIImage
    UIImage *computedImage = nil;

    UIGraphicsBeginImageContextWithOptions(cameraImage.size, NO, 0.0f);
    [cameraImage drawInRect:CGRectMake(0, 0, cameraImage.size.width, cameraImage.size.height)];
    [overlayImage drawInRect:CGRectMake(0, 0, overlayImage.size.width, overlayImage.size.height)];
    computedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_async(dispatch_get_main_queue(), ^{
        // don't forget to go back to the main thread to access the UI again
    });
});
EDIT: I added some dispatch_async calls to avoid blocking the UI.
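If your overlay is a UIView rather than a UIImage, you first need to render it into a UIImage before the merge. A minimal sketch of one way to do that (run on the main thread, since it touches the view hierarchy; overlayView is a placeholder name):

// Render the overlay UIView into a UIImage so it can be drawn over the camera image.
// Do this on the main thread before dispatching the merge to a background queue.
UIGraphicsBeginImageContextWithOptions(overlayView.bounds.size, NO, 0.0f);
[overlayView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *overlayImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();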

Related

Screenshot the top most layer/view

I have a UIView which contains another UIView with a UIImage.
dView = [[drawView alloc] initWithFrame:myUIView.frame];
[dView newMaskWithColor:[UIColor colorWithPatternImage:chosenImage]];
[myUIView addSubview:dView];
By using this code, I erased part of it, and it looks like this now:
Then I added a layer behind this view and presented another image in that layer.
[myUIView.layer insertSublayer:_videoPreviewLayer below:dView.layer];
Now it looks like this (this is a screenshot taken manually on the device):
When I try to screenshot the above view programmatically, the result is:
I don't know why the newly added video preview layer doesn't appear in the screenshot. Here is my screenshot method:
- (void)takeSnapShot {
    UIView *topView = [[[[UIApplication sharedApplication] keyWindow] subviews] lastObject]; // tried this too

    // Capture the screenshot of the UIImageView and save it in the camera roll
    UIGraphicsBeginImageContext(self.myUIView.frame.size);
    [self.myUIView.layer renderInContext:UIGraphicsGetCurrentContext()]; // tried this
    [self.dView.layer renderInContext:UIGraphicsGetCurrentContext()];    // tried this
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];     // tried this
    [topView.layer renderInContext:UIGraphicsGetCurrentContext()];       // tried this
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
}
I am calling this method on a button click in my view controller.
I even tried capturing the window as suggested in this answer, but I still don't get the result I get from taking a screenshot manually.
Is there any way to capture the topmost view that is shown to the user?
Check out this post: Why don't added sublayers show up in screenshot?
It seems that layers like the video preview layer aren't captured by the renderInContext: method.
When you want to capture a screenshot, try storing a frame from the camera feed in a UIImageView; then you should be able to manually put that frame underneath your mask and just screenshot the superview.
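A rough sketch of one way to grab such a frame, assuming you attach an AVCaptureVideoDataOutput to your capture session and make yourself its sample buffer delegate (latestFrameView is a hypothetical UIImageView placed underneath the mask):

#import <AVFoundation/AVFoundation.h>

// AVCaptureVideoDataOutputSampleBufferDelegate callback: convert each camera
// frame to a UIImage and show it in an image view sitting under the mask.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil]; // reuse this in real code
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    dispatch_async(dispatch_get_main_queue(), ^{
        self.latestFrameView.image = frame; // hypothetical UIImageView under the mask
    });
}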

Video stream in AVSampleBufferDisplayLayer doesn't show up in screenshot

I've been using the new Video Toolbox methods to take an H.264 video stream and display it in a view controller using AVSampleBufferDisplayLayer. This all works as intended and the stream looks great. However, when I try to take a screenshot of the entire view, the contents of the AVSampleBufferDisplayLayer (i.e. the decompressed video stream) do not show up in the snapshot. The snapshot shows all the other UI buttons/labels/etc., but it only shows the background color of the AVSampleBufferDisplayLayer (which I had set to bright blue), not the live video feed.
In the method below (inspired by this post) I take the SampleBuffer from my stream and queue it to be displayed on the AVSampleBufferDisplayLayer. Then I call my method imageFromLayer: to get the snapshot as a UIImage. (I then either display that UIImage in the UIImageView imageDisplay, or I save it to the device's local camera roll to verify what the UIImage looks like. Both methods yield the same result.)
- (void)h264VideoFrame:(CMSampleBufferRef)sample
{
    [self.AVSampleDisplayLayer enqueueSampleBuffer:sample];
    dispatch_sync(dispatch_get_main_queue(), ^(void) {
        UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer];
        [self.imageDisplay setImage:snapshot];
    });
}
Here I simply take the contents of the AVSampleBufferDisplayLayer and attempt to convert it to a UIImage. If I pass the entire screen into this method as the layer, all other UI elements like labels/buttons/images will show up except for the AVDisplayLayer. If I pass in just the AVDisplayLayer, I get a solid blue image (since the background color is blue).
- (UIImage *)imageFromLayer:(CALayer *)layer
{
    UIGraphicsBeginImageContextWithOptions([layer frame].size, YES, 1.0);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    //UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);
    UIGraphicsEndImageContext();
    return outputImage;
}
I've tried using UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer.presentationLayer]; and .modelLayer, but that didn't help. I've tried queueing the sample buffer and waiting before taking a snapshot, I've tried messing with the opacity and xPosition of the AVDisplayLayer... I've even tried setting different values for the CMTimebase of the AVDisplayLayer. Any hints are appreciated!
Also, according to this post and this post, other people are having similar trouble with snapshots in iOS 8.
I fixed this by switching from AVSampleBufferDisplayLayer to VTDecompressionSession. In the VTDecompressionSession's didDecompress output callback, I send the decompressed image (CVImageBufferRef) into the following method to get a screenshot of the video stream and turn it into a UIImage.
- (void)screenshotOfVideoStream:(CVImageBufferRef)imageBuffer
{
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext
                             createCGImage:ciImage
                             fromRect:CGRectMake(0, 0,
                                                 CVPixelBufferGetWidth(imageBuffer),
                                                 CVPixelBufferGetHeight(imageBuffer))];
    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
    [self doSomethingWithOurUIImage:image];
    CGImageRelease(videoImage);
}
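For context, a sketch of how the decompression output callback could hand the decoded buffer to that method. The callback name and the assumption that the refCon passed at session creation is the view controller are illustrative, not from the original answer:

#import <VideoToolbox/VideoToolbox.h>

// Hypothetical VTDecompressionSession output callback. Assumes the
// decompressionOutputRefCon was set to the view controller when the
// session was created with VTDecompressionSessionCreate.
static void didDecompress(void *decompressionOutputRefCon,
                          void *sourceFrameRefCon,
                          OSStatus status,
                          VTDecodeInfoFlags infoFlags,
                          CVImageBufferRef imageBuffer,
                          CMTime presentationTimeStamp,
                          CMTime presentationDuration)
{
    if (status != noErr || imageBuffer == NULL) {
        return;
    }
    MyViewController *controller = (__bridge MyViewController *)decompressionOutputRefCon;
    [controller screenshotOfVideoStream:imageBuffer];
}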

Is there a way to take a screenshot of my app without the background image?

I have an application where I created a button to screenshot the app to the camera roll. I want to be able to hide the background image when the user takes a screenshot.
- (IBAction)screenShot:(id)sender {
    UIGraphicsBeginImageContext(sshot.frame.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
}
I have this as my background
[self.view setBackgroundColor:[UIColor colorWithPatternImage:[UIImage imageNamed:@"back5.png"]]];
Is it possible to take a screenshot of the app where the background is no longer the image but just a white color? I would also want to hide an image view when taking the screenshot.
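A minimal sketch of one common approach (not from the original post; logoImageView is a hypothetical image view to hide): temporarily swap the background for white and hide the view, render, then restore everything:

- (IBAction)screenShot:(id)sender {
    // Temporarily replace the pattern-image background with plain white
    // and hide the image view that shouldn't appear in the capture.
    UIColor *originalBackground = self.view.backgroundColor;
    self.view.backgroundColor = [UIColor whiteColor];
    self.logoImageView.hidden = YES; // hypothetical image view

    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.0f);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Restore the original appearance.
    self.view.backgroundColor = originalBackground;
    self.logoImageView.hidden = NO;

    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
}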

GPUImage output image is missing in screen capture

I am trying to capture a portion of the screen to post the image on social media.
I am using the following code to capture the screen.
- (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
The above code works fine for capturing the screen.
Problem:
My UIView contains a GPUImageView with the filtered image. When I try to capture the screen using the above code, that particular GPUImageView portion does not contain the filtered image.
I am using GPUImageSwirlFilter with a static image (no camera). I have also tried
UIImage *outImage = [swirlFilter imageFromCurrentFramebuffer];
but it's not returning an image.
Note: The following is working code, which gives perfect output of the swirl effect, but I want the same image in a UIImage object.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    GPUImageSwirlFilter *swirlFilter = [[GPUImageSwirlFilter alloc] init];
    swirlLevel = 4;
    [swirlFilter setAngle:(float)swirlLevel/10];
    UIImage *inputImage = [UIImage imageNamed:gi.wordImage];
    GPUImagePicture *swirlSourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage];
    inputImage = nil;
    [swirlSourcePicture addTarget:swirlFilter];
    dispatch_async(dispatch_get_main_queue(), ^{
        [swirlFilter addTarget:imgSwirl];
        [swirlSourcePicture processImage];
        // This works perfectly and I have the filtered image in my imgSwirl.
        // But I want the filtered image in a UIImage to use elsewhere,
        // e.g. for posting on social media.
        sharingImage = [swirlFilter imageFromCurrentFramebuffer]; // This also returns nothing.
    });
});
1) Am I doing something wrong with GPUImage's imageFromCurrentFramebuffer?
2) Why doesn't the screen capture code include the GPUImageView portion in the output image?
3) How do I get the filtered image in a UIImage?
First, -renderInContext: won't work with a GPUImageView, because a GPUImageView renders using OpenGL ES. -renderInContext: does not capture from CAEAGLLayers, which are used to back views presenting OpenGL ES content.
Second, you're probably getting a nil image in the latter code because you've forgotten to set -useNextFrameForImageCapture on your filter before triggering -processImage. Without that, your filter won't hang on to its backing framebuffer long enough to capture an image from it. This is due to a recent change in the way that framebuffers are handled in memory (although this change did not seem to get communicated very well).
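Applied to the code in the question, the capture path would look something like this (a sketch based on the answer, reusing the question's variable names):

// On the main queue, after wiring up the targets:
[swirlFilter addTarget:imgSwirl];
[swirlFilter useNextFrameForImageCapture]; // keep the framebuffer around for capture
[swirlSourcePicture processImage];
sharingImage = [swirlFilter imageFromCurrentFramebuffer]; // now returns the filtered UIImage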

iPhone: Capture iOS Camera with Overlay View

In my application I am capturing a picture through the camera with an overlay view, and in the overlay view there is a custom button through which I want to capture the whole screen. The overlay view is transparent at the points where I want to capture the image. I am doing it like this:
- (IBAction)captue:(id)sender
{
    [self setBackgroundColor:[UIColor clearColor]];
    UIGraphicsBeginImageContext(self.frame.size);
    [self.layer.presentationLayer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}
It is capturing the image of the overlay view, but in the camera area (where the overlay is transparent and I want the camera view to show) it captures black instead of the photo. Can anyone tell me what I am doing wrong?
I found that screen capture is one way to capture the camera view with an overlay, but I didn't get the preview layer in the screen-captured video (in the video-recording case). Look at the MyAVControllerDemo code to get a clearer idea; I used IAScreenCaptureView to capture video, or a simple snapshot, and it now works properly.
Use the AVFoundation framework to fix your problem.
I referred to this link: http://code4app.net/ios/Camera-With-AVFoundation/5003cb1d6803fa9a2c000000
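For reference, a rough sketch of that AVFoundation route (identifiers like _stillImageOutput are assumptions, not from the linked demo): capture the still image yourself, then composite the overlay onto it as in the first answer above.

// _stillImageOutput is a hypothetical AVCaptureStillImageOutput already added
// to a running AVCaptureSession.
AVCaptureConnection *connection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (error != nil || imageDataSampleBuffer == NULL) {
        return;
    }
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *cameraImage = [UIImage imageWithData:jpegData];
    // Merge cameraImage with the overlay image as shown in the first answer.
}];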

Resources