iOS AVFoundation image capture too slow

I am creating the session and AVCaptureStillImageOutput this way:
__imageCaptureSession = [[AVCaptureSession alloc] init];
__imageCaptureSession.sessionPreset = AVCaptureSessionPresetPhoto;
__imageCapture = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[__imageCapture setOutputSettings:outputSettings];
[__imageCaptureSession addInput:input];
[__imageCaptureSession addOutput:__imageCapture];
input is the back camera. To take the photo I use the code below. The problem is that taking a picture takes too long. Is it possible to solve this problem?
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in __imageCapture.connections)
{
    for (AVCaptureInputPort *port in [connection inputPorts])
    {
        if ([[port mediaType] isEqual:AVMediaTypeVideo])
        {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}
[__imageCapture captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // metadata
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    NSMutableDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(NSDictionary *)exifAttachments];
    // image
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    // send notification
    NSDictionary *dict = [NSDictionary dictionaryWithObjectsAndKeys:[image retain], kImageKey,
                          [metadata retain], kMetadataKey,
                          nil];
    [metadata release];
    [[NSNotificationCenter defaultCenter] postNotificationName:kTakePhotoNotification object:nil userInfo:dict];
    [__imageCaptureSession stopRunning];
    [image release];
}];
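A common contributor to the delay is doing setup work at capture time. Two things that usually help: start the session well before the shutter is pressed (e.g. in viewDidLoad), and resolve the video connection once instead of looping over connections on every shot. A minimal sketch, assuming a hypothetical __videoConnection ivar and iOS 5+ for connectionWithMediaType::

- (void)warmUpCapture
{
    // Start the session ahead of time so the capture call does no startup work.
    if (![__imageCaptureSession isRunning]) {
        [__imageCaptureSession startRunning];
    }
    // __videoConnection is a hypothetical ivar added here for caching;
    // connectionWithMediaType: replaces the nested loop above.
    __videoConnection = [__imageCapture connectionWithMediaType:AVMediaTypeVideo];
}

If speed matters more than resolution, a lower sessionPreset than AVCaptureSessionPresetPhoto will also shorten the still-image pipeline.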

Related

iOS custom camera (take photo) shows a bug

This bug appears only occasionally, and I have no idea how to fix it. Please help me.
The main code is here.
If you want to see the complete project, you can check this link:
https://github.com/liman123/custom-camera
The bug: when I take a photo, the image shown on screen is distorted!
- (void)setupConfiguration
{
    _captureSession = [[AVCaptureSession alloc] init];
    _captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

    _capturedImageView = [[UIImageView alloc] init];
    _capturedImageView.frame = self.view.frame; // just to even it out
    _capturedImageView.backgroundColor = [UIColor clearColor];
    _capturedImageView.userInteractionEnabled = YES;
    _capturedImageView.contentMode = UIViewContentModeScaleAspectFill;

    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _captureVideoPreviewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:_captureVideoPreviewLayer];

    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    if (devices.count > 0) {
        _captureDevice = devices[0];
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:_captureDevice error:&error];
        [_captureSession addInput:input];

        _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [_stillImageOutput setOutputSettings:outputSettings];
        [_captureSession addOutput:_stillImageOutput];
    }
}
- (void)captureButtonClick
{
    _isCapturingImage = YES;
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in _stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *capturedImage = [[UIImage alloc] initWithData:imageData scale:1];
            _isCapturingImage = NO;
            _capturedImageView.image = capturedImage;
            _selectedImage = capturedImage;
            imageData = nil;
            [self.view addSubview:_imageSelectedView];
        }
    }];
}
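One thing worth checking (a guess, not a confirmed diagnosis): captureStillImageAsynchronouslyFromConnection: does not guarantee that its completion handler runs on the main queue, and touching UIKit from another queue can produce exactly this kind of intermittent glitch. A sketch of the UI lines from the handler above, hopped onto the main queue:

// Inside the completion handler, after creating capturedImage:
dispatch_async(dispatch_get_main_queue(), ^{
    _capturedImageView.image = capturedImage;
    _selectedImage = capturedImage;
    [self.view addSubview:_imageSelectedView];
});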

Video Frame from camera using AVFoundation Framework in iOS?

I have never worked with the AVFoundation framework. I want to get video frames from the back camera and process those frames. Any help based on your experience would be appreciated. Thanks.
You can use the following code to start a camera session with AVFoundation in order to capture a still image:
AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.frameForCapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Then, in order to actually capture the image, you can use a button with the following code:
- (IBAction)takePhoto:(id)sender {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
        }
    }];
}
Then you can do whatever you want with the captured image.
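Since the question asks for live frames rather than a single still, here is a minimal sketch using AVCaptureVideoDataOutput (the queue label is an assumption; the rest reuses the session from the snippet above):

// Add a video data output next to the session setup above.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}

// The class must adopt AVCaptureVideoDataOutputSampleBufferDelegate;
// every frame then lands in this callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // process pixelBuffer here
}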

Get BLACK photo when capture with AVFoundation

I am building a customised view that does both photo capture and video recording (two features in one view).
Init view:
_session = [[AVCaptureSession alloc] init];
_session.sessionPreset = AVCaptureSessionPreset640x480;

_captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_session];
_captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
_captureVideoPreviewLayer.frame = _previewView.layer.bounds; // parent of layer
[_previewView.layer addSublayer:_captureVideoPreviewLayer];

AVCaptureDevice *videoDevice = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo][0];
if ([videoDevice isFlashAvailable] && videoDevice.flashActive && [videoDevice lockForConfiguration:nil]) {
    videoDevice.flashMode = AVCaptureFlashModeOff;
    [videoDevice unlockForConfiguration];
}

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (!input) {
    if (_delegate && [_delegate respondsToSelector:@selector(customCameraViewDidLoadFailed)]) {
        [_delegate customCameraViewDidLoadFailed];
    }
}
if ([_session canAddInput:input]) {
    [_session addInput:input];
}

AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (!audioDeviceInput) {
    if (_delegate && [_delegate respondsToSelector:@selector(customCameraViewDidLoadFailed)]) {
        [_delegate customCameraViewDidLoadFailed];
    }
}
if ([_session canAddInput:audioDeviceInput]) {
    [_session addInput:audioDeviceInput];
}

_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[_stillImageOutput setOutputSettings:outputSettings];
if ([_session canAddOutput:_stillImageOutput]) {
    [_session addOutput:_stillImageOutput];
}

_movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([_session canAddOutput:_movieFileOutput])
{
    [_session addOutput:_movieFileOutput];
    AVCaptureConnection *connection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    }
}

[_session startRunning];
In the capture method:
_session.sessionPreset = AVCaptureSessionPreset640x480;
_isCapturing = YES;

AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in _stillImageOutput.connections)
{
    for (AVCaptureInputPort *port in [connection inputPorts])
    {
        if ([[port mediaType] isEqual:AVMediaTypeVideo])
        {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) {
        break;
    }
}

[_loadingView startAnimating];
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *capturedImage = [[UIImage alloc] initWithData:imageData scale:1];
    _isCapturing = NO;
    imageData = nil;
    [_loadingView stopAnimating];
    if (_delegate && [_delegate respondsToSelector:@selector(customCameraView:didFinishCaptureImage:)]) {
        [_delegate customCameraView:self didFinishCaptureImage:capturedImage];
    }
}];
The problem I'm facing: the first photo or video I capture always comes out BLACK.
Can you help me solve this problem?
Thanks a lot!
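One plausible cause (an assumption, since no answer is recorded here): a capture requested before the session has actually started delivering frames comes back black. A sketch that gates the shutter on the session-start notification, using a hypothetical _captureButton ivar:

[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStartRunningNotification
                                                  object:_session
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    _captureButton.enabled = YES; // hypothetical button ivar, kept disabled until the session is running
}];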

AVCaptureSession rotation

I'm trying to get my app to create a UIImage the correct way round.
Most of my code is taken from Apple examples...
@interface CameraManager () <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) CIContext *context;
@property (nonatomic, strong) AVCaptureDevice *rearCamera;
@end
@implementation CameraManager
- (id)init {
    if ((self = [super init])) {
        self.context = [CIContext contextWithOptions:nil];
        [self setupCamera];
        [self addStillImageOutput];
    }
    return self;
}
- (void)setupCamera
{
    self.session = [[AVCaptureSession alloc] init];
    [self.session beginConfiguration];
    [self.session setSessionPreset:AVCaptureSessionPresetPhoto];

    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    self.rearCamera = nil;
    for (AVCaptureDevice *device in devices) {
        if (device.position == AVCaptureDevicePositionBack) {
            self.rearCamera = device;
            break;
        }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:self.rearCamera error:&error];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES];
    NSDictionary *options = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    [dataOutput setVideoSettings:options];
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:dataOutput];

    [self.session commitConfiguration];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // grab the pixel buffer
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    // create a CIImage from it, rotate it and zero the origin
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    if ([[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeLeft) {
        image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(M_PI)];
    }
    CGPoint origin = [image extent].origin;
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];
    // set it as the contents of the UIImageView
    CGImageRef cgImage = [self.context createCGImage:image fromRect:[image extent]];
    UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"image" object:uiImage];
    CGImageRelease(cgImage);
}
- (void)addStillImageOutput
{
    [self setStillImageOutput:[[AVCaptureStillImageOutput alloc] init]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];

    // Note: this loop runs before the output is added to the session, so
    // `connections` is still empty and `videoConnection` goes unused here.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [self.stillImageOutput connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [[self session] addOutput:[self stillImageOutput]];
}
- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:image];
    }];
}
This is my camera manager class code.
I am displaying the preview of the camera using the OutputSampleBufferDelegate (for various reasons).
I'm using the session output to "take a photo".
The method captureStillImage is the bit I'm trying to fix.
The photos are taken with the device in LandscapeLeft orientation (the interface is also LandscapeLeft).
The previews all show the correct way around and the exif data shows the width and height the correct way around too. (X = 3264, Y = 2448).
But when I display the UIImage it is rotated 90 degrees counter-clockwise. The aspect ratio of the image is correct (everything looks fine, circles are still circles); only the rotation is wrong.
I have found several categories that claim to fix this.
I have also found several StackOverflow questions with answers that also claim to fix it.
None of these worked.
Does anyone know how to rotate this thing the right way around?
What I usually do is add the following code before calling captureStillImageAsynchronouslyFromConnection:
if ([videoConnection isVideoOrientationSupported]) {
    // UIDeviceOrientation and AVCaptureVideoOrientation are distinct enums; the raw
    // cast lines up only for the four portrait/landscape values, so guard the rest.
    UIDeviceOrientation deviceOrientation = [UIDevice currentDevice].orientation;
    if (UIDeviceOrientationIsValidInterfaceOrientation(deviceOrientation)) {
        [videoConnection setVideoOrientation:(AVCaptureVideoOrientation)deviceOrientation];
    }
}
Maybe you should try setting the image orientation after receiving the image data in the captureStillImageAsynchronouslyFromConnection completion block:
UIImage *image = [[UIImage alloc] initWithData:imageData];
image = [[UIImage alloc] initWithCGImage:image.CGImage scale:1.0f orientation:UIImageOrientationDown];
The orientation issue occurs with the front camera, so check the device type and generate a new image accordingly; this will solve the orientation issue:
- (void)capture:(void (^)(UIImage *))handler
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *capturedImage = [UIImage imageWithData:imageData];
            // devices[1] is the front camera; re-wrap its image with a mirrored orientation
            if (self.captureDevice == [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo][1]) {
                capturedImage = [[UIImage alloc] initWithCGImage:capturedImage.CGImage scale:1.0f orientation:UIImageOrientationLeftMirrored];
            }
            handler(capturedImage);
        }
    }];
}
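A call site for this helper might look like the following (previewImageView is an assumed outlet, not part of the answer above):

[self capture:^(UIImage *image) {
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = image; // assumed outlet; UIKit work back on the main queue
    });
}];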

How to get shutter speed, aperture, and ISO values on iPhone

I have written code like this:
- (IBAction)startCapture
{
    // session object
    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = CGRectMake(0, 10, 320, 200); // or: self.view.frame
    [self.view.layer addSublayer:previewLayer];

    NSError *error = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // input object
    AVCaptureDeviceInput *inputDevice = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    [captureSession addInput:inputDevice];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];
    [captureSession startRunning];
}
- (IBAction)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        NSLog(@"exif Attachments: %@", exifAttachments);
        if (exifAttachments)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            self.vImage.image = image;
            // Do something with the attachments.
        }
        else
            NSLog(@"no attachments");
    }];
}
This captures the images, but I also want to know the shutter speed, ISO value, and aperture at capture time. How can I find out these values? I tried the following:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    NSDictionary *exifDict = (NSDictionary *)exifAttachments;
    NSLog(@"\n exif data = %@", exifDict);

    CFNumberRef aperaturevalue = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifApertureValue, NULL);
    NSNumber *num = (NSNumber *)aperaturevalue;
    NSLog(@"\n AperatureValue : %@", num);

    CFNumberRef shutter = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifShutterSpeedValue, NULL);
    NSNumber *shunum = (NSNumber *)shutter;
    NSLog(@"\n shuttervalue : %@", shunum);

    CFArrayRef isoRef = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifISOSpeedRatings, NULL);
    NSArray *iso = (NSArray *)isoRef;
    NSLog(@"Iso value : %@", iso);
}];
but it is giving output like this:
exif data = {
ApertureValue = "2.970853605202583";
ExposureMode = 0;
ExposureProgram = 2;
FNumber = "2.8";
Flash = 32;
MeteringMode = 1;
SceneType = 1;
SensingMethod = 2;
WhiteBalance = 0;
}
2011-06-23 14:35:14.955 CameraExample[1464:307]
AperatureValue : (null)
2011-06-23 14:35:14.981 CameraExample[1464:307]
shuttervalue : (null)
2011-06-23 14:35:14.999 CameraExample[1464:307] Iso value : (null)
Try adding this code in your block:
CFDictionaryRef exifDictRef = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
NSDictionary *exifDict = (NSDictionary *)exifDictRef;
for (id key in exifDict) {
    NSLog(@"key = %@, value = %@", key, [exifDict objectForKey:key]);
}
You should find the values you are looking for under these keys:
kCGImagePropertyExifShutterSpeedValue (the value is an NSNumber)
kCGImagePropertyExifApertureValue (the value is an NSNumber)
kCGImagePropertyExifISOSpeedRatings (the value is an NSArray)
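In other words, these values live inside the EXIF dictionary attachment, not as top-level attachments of the sample buffer, which is why the direct CMGetAttachment calls above logged null. A sketch of pulling them out of exifDict:

// These keys are entries inside the kCGImagePropertyExifDictionary attachment.
NSNumber *shutterSpeed = [exifDict objectForKey:(NSString *)kCGImagePropertyExifShutterSpeedValue];
NSNumber *aperture = [exifDict objectForKey:(NSString *)kCGImagePropertyExifApertureValue];
NSArray *isoRatings = [exifDict objectForKey:(NSString *)kCGImagePropertyExifISOSpeedRatings];
NSLog(@"shutter = %@, aperture = %@, ISO = %@", shutterSpeed, aperture, [isoRatings firstObject]);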
