How can I change image quality when using AVCaptureSessionPreset - iOS

I'm looking for some help with AVFoundation. Currently I am following a step-by-step guide on how to capture a still image from this topic: How to save photos taken using AVFoundation to Photo Album?
My question is: how can I lower the quality of my saved image while also using AVCaptureSessionPreset640x480? I would like to halve the image quality, or, if there is another way to make the saved image as small a file as possible (perhaps 320x280), that could be better than adjusting the actual quality.
I don't know if anyone else has asked this before, but I have been searching the web for the past couple of days and cannot find an answer. Below is my code.
#import "ViewController.h"
#import <ImageIO/ImageIO.h>
@interface ViewController ()
@end
@implementation ViewController
@synthesize imagePreview;
@synthesize iImage;
@synthesize stillImageOutput;
-(IBAction)captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection)
{
break;
}
}
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
NSLog(#"attachements: %#", exifAttachments);
} else {
NSLog(#"no attachments");
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
//NSData *data2 = [NSData dataWithData:UIImageJPEGRepresentation(image, 0.5f)];
UIImage *image = [[UIImage alloc] initWithData:imageData];
self.iImage.image = image;
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}];
}
-(void)viewDidAppear:(BOOL)animated
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPreset640x480;
CALayer *viewLayer = self.imagePreview.layer;
NSLog(#"viewLayer = %#", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.imagePreview.bounds;
[self.imagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end

Try using a different preset; each one gives you a different image resolution.
According to the Apple documentation, for the iPhone 4 (back camera) you will get the following image resolutions for these session presets:
AVCaptureSessionPresetHigh : 1280x720
AVCaptureSessionPresetMedium : 480x360
AVCaptureSessionPresetLow : 192x144
AVCaptureSessionPreset640x480 : 640x480
AVCaptureSessionPreset1280x720 : 1280x720
AVCaptureSessionPresetPhoto : 2592x1936. This preset is not supported for video output.
Hope this will help.
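If the goal is a smaller file rather than (or in addition to) a smaller resolution, one option is to re-compress the captured JPEG before saving. A minimal sketch, assuming the capture handler from the question (the 0.5 compression quality is an arbitrary example value; writing the raw bytes via ALAssetsLibrary preserves the compression, whereas UIImageWriteToSavedPhotosAlbum would re-encode the image):
#import <AssetsLibrary/AssetsLibrary.h>

// Inside the captureStillImageAsynchronouslyFromConnection: completion handler:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];

// Re-compress at a lower JPEG quality (0.5 is an example; lower means a smaller file).
NSData *smallData = UIImageJPEGRepresentation(image, 0.5f);

// Write the compressed bytes directly so the compression survives the save.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:smallData metadata:nil completionBlock:nil];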

You need to set the kCVPixelBufferWidthKey and kCVPixelBufferHeightKey options on the AVCaptureStillImageOutput object to get a resolution of your choice. This width/height will override the session preset's width/height. A minimal sample follows (add error checking):
_session = [[AVCaptureSession alloc] init];
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError * error;
_sessionInput = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithDouble:320.0], (id)kCVPixelBufferWidthKey,
[NSNumber numberWithDouble:280.0], (id)kCVPixelBufferHeightKey,
[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
nil];
[_stillImageOutput setOutputSettings:options];
[_session beginConfiguration ];
[_session addInput:_sessionInput];
[_session addOutput:_stillImageOutput];
[_session setSessionPreset:AVCaptureSessionPresetPhoto];
_avConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[ _session commitConfiguration ];
.............
- (void) start
{
[self.session startRunning];
}
.............
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:self.avConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
NSLog(#"%d : %d", height, width);
}];
Note: I have only tried this on a Mac; ideally it should work on iOS too. Also, try to maintain a sensible aspect ratio.
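One caveat if you try this on iOS: with these settings the completion handler hands you a raw 32BGRA pixel buffer, not JPEG data, so jpegStillImageNSDataRepresentation: no longer applies and you have to build the UIImage yourself. A rough sketch of the usual Core Graphics conversion (error handling omitted):
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

// Wrap the BGRA bytes in a CGImage, then a UIImage.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
    colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef cgImage = CGBitmapContextCreateImage(context);
UIImage *image = [UIImage imageWithCGImage:cgImage];

CGImageRelease(cgImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);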

Related

Video Frame from camera using AVFoundation Framework in iOS?

I have never worked with the AVFoundation framework. I want to get video frames from the back camera and process those frames. Can anyone help me? Your experience will be appreciated. Thanks.
You can use the following code to start a camera session with AVFoundation in order to capture a still image:
AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.frameForCapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Then, in order to actually capture the image, you can use a button with the following code:
- (IBAction)takePhoto:(id)sender {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [UIImage imageWithData:imageData];
}
}];
}
Then you can do whatever you want with the captured image.
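Note that the code above captures a single still image. If you actually want a continuous stream of video frames to process, the usual tool is AVCaptureVideoDataOutput with a sample-buffer delegate. A minimal sketch, assuming it is added to the same session (the queue name is arbitrary):
// The view controller adopts AVCaptureVideoDataOutputSampleBufferDelegate.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
videoOutput.alwaysDiscardsLateVideoFrames = YES;
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}

// Elsewhere in the class; invoked once per frame on the queue above:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process the pixel buffer here (lock its base address first if you read the bytes).
}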

Objective-C AVCaptureDevice Front Camera

I have followed a tutorial that walked through making a simple custom camera app, which is almost exactly what I need. I actually have two issues I need to change, but I will focus on the first one for now.
The following code uses the back camera, but I need it changed so that I can use the front camera. I will link the video I sourced it from to give credit; I followed what one of the commenters said about using the front camera, but the answer didn't help at all.
https://www.youtube.com/watch?v=Xv1FfqVy-KM
I'm not great at coding, but I'm trying to learn. Any help would be appreciated! Many thanks.
@interface ViewController ()
@end
@implementation ViewController
AVCaptureSession *session;
AVCaptureStillImageOutput *StillImageOutput;
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)viewWillAppear:(BOOL)animated {
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = frameforcapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
StillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[StillImageOutput setOutputSettings:outputSettings];
[session addOutput:StillImageOutput];
[session startRunning];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)takephoto:(id)sender {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in StillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection = connection;
break;
}
}
}
[StillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [UIImage imageWithData:imageData];
imageView.image = image;
}
}];
}
@end
This code returns an AVCaptureDevice instance for the default device of the given media type:
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
Change it to:
....
AVCaptureDevice *inputDevice = nil;
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for(AVCaptureDevice *camera in devices) {
if([camera position] == AVCaptureDevicePositionFront) { // is front camera
inputDevice = camera;
break;
}
}
......
- (IBAction)btnCameraClicked:(id)sender {
AVCaptureDevicePosition desiredPosition;
if (isUsingFrontFacingCamera)
desiredPosition = AVCaptureDevicePositionBack;
else
desiredPosition = AVCaptureDevicePositionFront;
for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
if ([d position] == desiredPosition) {
[[previewLayer session] beginConfiguration];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
for (AVCaptureInput *oldInput in [[previewLayer session] inputs]) {
[[previewLayer session] removeInput:oldInput];
}
[[previewLayer session] addInput:input];
[[previewLayer session] commitConfiguration];
break;
}
}
isUsingFrontFacingCamera = !isUsingFrontFacingCamera;
}
//Declare in .h file
AVCaptureVideoPreviewLayer *previewLayer;
BOOL isUsingFrontFacingCamera;

Camera capturing taller image than previewed

There seem to be a number of questions about this issue here. Since most of them don't provide code snippets, I am electing to show all my code. The problem: the picture taken is taller than the preview. My preview is w=288 by h=288, but the picture that is returned is w=2448 by h=3264. Why is the returned image not square when my preview is square?
AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;
-(void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
session=[[AVCaptureSession alloc]init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer * previewLayout = [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
[previewLayout setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayout =[[self view]layer];
[rootLayout setMasksToBounds:YES];
CGRect frame = self.frameForCapture.frame;
[previewLayout setFrame:frame];
[rootLayout insertSublayer:previewLayout atIndex:0];
stillImageOutput=[[AVCaptureStillImageOutput alloc]init];
NSDictionary *outputSettings =[[NSDictionary alloc]initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
}
- (IBAction)takePhoto:(id)sender
{
AVCaptureConnection *videoConnection= nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection=connection;
break;
}
}
if (videoConnection) {
break;
}
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer != NULL) { // i.e. it does exist
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *originalImage =[UIImage imageWithData:imageData ];
NSLog(#"Image w = %f; h = %f",originalImage.size.width,originalImage.size.height);
CGSize size = self.frameForCapture.frame.size;
UIGraphicsBeginImageContext(size);
[originalImage drawInRect:CGRectMake(0,0,size.width,size.width)];
UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.imageView.image=image;
[self.imageView setHidden:NO];
[self.frameForCapture setHidden:YES];
}
}];
}
I have been playing around with CGImageCreateWithImageInRect to get the preview to be what I finally save and use in my ImageView. But so far no luck. One of the places I have been looking is Cropping an UIImage. Any ideas how I might fix this issue?
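One approach that often works is to crop the full-resolution photo down to the region the aspect-fill preview actually showed, using CGImageCreateWithImageInRect as mentioned. A rough sketch for a square preview (note that the rect is in CGImage pixel coordinates, and you may still need to account for the UIImage's orientation):
// Crop the centre square out of the full-size photo.
CGFloat side = MIN(originalImage.size.width, originalImage.size.height);
CGRect cropRect = CGRectMake((originalImage.size.width - side) / 2.0,
                             (originalImage.size.height - side) / 2.0,
                             side, side);
CGImageRef croppedRef = CGImageCreateWithImageInRect(originalImage.CGImage, cropRect);
UIImage *squareImage = [UIImage imageWithCGImage:croppedRef
                                           scale:originalImage.scale
                                     orientation:originalImage.imageOrientation];
CGImageRelease(croppedRef);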

Issues aligning UIButton in UIView

I am trying to align a UIButton within my ViewController. I can't tell what the specific name of my UIButton is, so I think that's why I'm having a hard time moving it as a subview. Here is my code:
-(void)viewDidAppear:(BOOL)animated
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
CALayer *viewLayer = self.vImagePreview.layer;
NSLog(#"viewLayer = %#", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
//Modify to fit screen
CGAffineTransform translate = CGAffineTransformMakeTranslation(0.0, 0.0);
CGAffineTransform scale = CGAffineTransformMakeScale(1.333333, 1.333333);
CGAffineTransform rotate = CGAffineTransformMakeRotation(DEGREES_RADIANS(180));
CGAffineTransform transform = CGAffineTransformConcat(translate, scale);
transform = CGAffineTransformConcat(transform, rotate);
self.vImagePreview.transform = transform;
[self.vImagePreview sendSubviewToFront:UIButton];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[_stillImageOutput setOutputSettings:outputSettings];
[session addOutput:_stillImageOutput];
[session startRunning];
}
-(IBAction)captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in _stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection)
{
break;
}
}
NSLog(#"about to request a capture from: %#", _stillImageOutput);
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
NSLog(#"attachements: %#", exifAttachments);
} else {
NSLog(#"no attachments");
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
self.vImage.image = image;
//Saved Photo modifications go here
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}];
}
@end
and my headers, just in case:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AVKit/AVKit.h>
#import <ImageIO/CGImageProperties.h>
#define DEGREES_RADIANS(angle) ((angle) / 180.0 * M_PI)
@interface ViewController : UIViewController
@property(nonatomic, retain) IBOutlet UIView *vImagePreview;
@property(nonatomic, retain) AVCaptureStillImageOutput *stillImageOutput;
@property(nonatomic, retain) IBOutlet UIImageView *vImage;
@end
Some clarification:
My button is on my storyboard. The button should trigger the captureNow method, taking a picture. I just can't figure out how to get it to appear over the preview layer.
Since your UIButton is on the storyboard, connect it to an IBOutlet (for example, self.captureButton) and, instead of using
[self.vImagePreview sendSubviewToFront:UIButton];
use this line to bring the button to the front:
[self.view bringSubviewToFront:self.captureButton];
Call it just before starting the session, at the end of the method.
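For reference, the assumed outlet declaration in the header would look something like this (captureButton is a hypothetical name; connect it to the storyboard button):
@property(nonatomic, retain) IBOutlet UIButton *captureButton;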

AVCaptureStillImageOutput never calls completion handler

The following code doesn't work. What's wrong?
AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput * videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
AVCaptureSession * captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPresetMedium;
if (![captureSession canAddInput:videoInput])
NSLog(#"Can't add input");
[captureSession addInput:videoInput];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[self.stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
if (![captureSession canAddOutput:videoInput])
NSLog(#"Can't add output");
[captureSession addOutput:self.stillImageOutput];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput.connections lastObject]
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
NSLog(#"!!!");
if (imageDataSampleBuffer == NULL)
{
NSLog(#"%#", error);
return;
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
self.imageView.image = image;
}];
// Creating preview layer
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.previewLayer.frame = self.view.layer.bounds;
[self.view.layer addSublayer:self.previewLayer];
[captureSession startRunning];
AVCaptureVideoPreviewLayer works nicely, but AVCaptureStillImageOutput does not call the completion handler at all...
You need to set up and start your session in one method,
then have a separate capture method:
/////////////////////////////////////////////////
////
//// Utility to find front camera
////
/////////////////////////////////////////////////
-(AVCaptureDevice *) frontFacingCameraIfAvailable{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionFront){
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if (!captureDevice){
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
/////////////////////////////////////////////////
////
//// Setup Session, attach Video Preview Layer
//// and Capture Device, start running session
////
/////////////////////////////////////////////////
-(void) setupCaptureSession {
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[self.view.layer addSublayer:captureVideoPreviewLayer];
NSError *error = nil;
AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
[session addOutput:self.stillImageOutput];
[session startRunning];
}
/////////////////////////////////////////////////
////
//// Method to capture Still Image from
//// Video Preview Layer
////
/////////////////////////////////////////////////
-(void) captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
NSLog(#"about to request a capture from: %#", self.stillImageOutput);
__weak typeof(self) weakSelf = self;
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[weakSelf displayImage:image];
}];
}
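For example, you might wire these up so the session is built once and the capture runs on demand (method names from the snippet above; displayImage: is assumed to be your own method for showing the result):
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupCaptureSession]; // build and start the session once
}

- (IBAction)takePhoto:(id)sender {
    [self captureNow]; // capture on demand, e.g. from a button
}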
This works well:
- (void)viewDidLoad
{
[super viewDidLoad];
AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput * videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
AVCaptureSession * captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPresetMedium;
[captureSession addInput:videoInput];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[self.stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
[captureSession addOutput:self.stillImageOutput];
[captureSession startRunning];
// Creating preview layer
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.previewLayer.frame = self.view.layer.bounds;
[self.view.layer insertSublayer:self.previewLayer atIndex:0];
}
- (void)timerFired:(NSTimer *)timer
{
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput.connections lastObject]
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
NSLog(#"!!!");
if (imageDataSampleBuffer == NULL)
NSLog(#"%#", error);
NSData * imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage * image = [[UIImage alloc] initWithData:imageData];
self.imageView.image = image;
}];
}
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(timerFired:) userInfo:nil repeats:YES];
}
