iOS: AVFoundation Image Capture Dark

I'm developing an app that captures photos with my iPad's front camera.
The photos come out very dark.
Does anyone have an idea how to fix this issue, please?
Here is my code with some explanation:
1) I initialize my capture session
- (void)viewDidAppear:(BOOL)animated {
    captureSession = [[AVCaptureSession alloc] init];
    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera;
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            frontCamera = device;
        }
    }
    if ([frontCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        NSError *error = nil;
        if ([frontCamera lockForConfiguration:&error]) {
            frontCamera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
            frontCamera.focusMode = AVCaptureFocusModeAutoFocus;
            [frontCamera unlockForConfiguration];
        }
    }
    NSError *error = nil;
    AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    [captureSession addInput:frontFacingCameraDeviceInput];
    [captureSession setSessionPreset:AVCaptureSessionPresetHigh];
    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [captureSession addOutput:captureVideoOutput];
    [captureSession addOutput:captureImageOutput];
}
2) When the user presses the Record button, it starts a timer and previews the camera feed in a preview layer
- (IBAction)but_record:(UIButton *)sender {
    MainInt = 4;
    timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countup) userInfo:nil repeats:YES];
    previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    CGRect rect = CGRectMake(0, 0, self.aView.bounds.size.width, self.aView.bounds.size.height);
    previewLayer.frame = rect;
    [self.aView.layer addSublayer:previewLayer];
    [captureSession startRunning];
}
3) At the end of the timer, the photo is taken and saved
- (void)countup {
    MainInt -= 1;
    if (MainInt == 0) {
        [timer invalidate];
        timer = nil;
        [captureSession stopRunning];
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in captureImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }
        [captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            stillImage = [[UIImage alloc] initWithData:imageData];
        }];
        [captureSession startRunning];
        [captureSession stopRunning];
    }
}
4) Finally, when the user presses the Save button, the image is saved to a specific album
- (IBAction)but_save:(UIButton *)sender {
    UIImage *img = stillImage;
    [self.library saveImage:img toAlbum:@"mySpecificAlbum" withCompletionBlock:^(NSError *error) {
    }];
}
In fact, all the code works properly, but the resulting images are very dark...

This was happening to me as well and it turned out I was trying to capture too soon and the camera didn't have enough time to stabilize. I had to add about 0.5 seconds of delay before the pictures would be normal brightness.
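A minimal sketch of that workaround, assuming captureImageOutput and videoConnection are set up as in the question:
// Hedged sketch: wait ~0.5 s after startRunning before capturing, so
// auto-exposure has time to settle (names taken from the question's code).
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    [captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        stillImage = [[UIImage alloc] initWithData:imageData];
    }];
});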
HTH

I had the same issue on a 5th-generation iPod touch running iOS 7, but not on a 4th-generation iPod touch running iOS 6.1.
I found that the fix is to show a preview of the camera:
// Setup camera preview image
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_previewImage.layer addSublayer:previewLayer];
As instructed at https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW22
Note: I did not investigate accomplishing this without a preview.
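A slightly fuller sketch of the same workaround; giving the layer a frame is my assumption here (a layer without a frame has zero size), and _previewImage is the container view from the snippet above:
previewLayer.frame = _previewImage.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;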

Related

iOS: How to merge an image onto the camera preview layer?

I want to overlay a PNG image (a blue rect) on the camera preview layer. I get the preview image from this function:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image1 = [self ImageFromSampleBuffer:sampleBuffer];
    NSData *imageData1;
}
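The ImageFromSampleBuffer: helper used above isn't shown; a hypothetical implementation for the 32BGRA format configured below could look like this:
- (UIImage *)ImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Sketch only: assumes kCVPixelFormatType_32BGRA output, as set in SetPreview.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // BGRA pixel data maps to little-endian 32-bit with premultiplied alpha first.
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    return image;
}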
And this is the function that sets up the preview on a UIImageView:
- (void)SetPreview
{
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;
    // Add device
    AVCaptureDevice *device = [self cameraWithPosition:AVCaptureDevicePositionFront];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    if (!input) {
        NSLog(@"NO Input");
    }
    [session addInput:input];
    // Output
    output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];
    output.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    // Preview layer
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    cameraView = self.view;
    previewLayer.frame = CGRectMake(cameraView.bounds.origin.x + 5, cameraView.bounds.origin.y + 5, cameraView.bounds.size.width - 10, cameraView.bounds.size.height - 10);
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [_vImage.layer addSublayer:previewLayer];
    timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(snapshot) userInfo:nil repeats:YES];
    [session startRunning];
}
How can I implement this feature?
I solved this problem by putting the "Back Img" image on top of the view.
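For reference, a minimal sketch of that approach; the image name and frame are hypothetical, and _vImage is the view hosting the preview layer above:
UIImageView *overlay = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Back Img"]];
overlay.frame = CGRectMake(40, 40, 100, 60); // hypothetical position for the blue rect
[_vImage addSubview:overlay]; // added after the preview layer, so it draws on top of it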

Why doesn't autofocus work?

I am developing a camera-related app. I have succeeded in capturing the camera preview. However, when I tried to enable autofocus, it didn't work. I tried both AVCaptureFocusModeContinuousAutoFocus and AVCaptureFocusModeAutoFocus; neither of them worked.
By the way, I tested on an iPhone 6s.
My ViewController.h file:
#import <UIKit/UIKit.h>
#include <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController
{
    AVCaptureSession *cameraCaptureSession;
    AVCaptureVideoPreviewLayer *cameraPreviewLayer;
}

- (void)initializeCaptureSession;

@end
My ViewController.m file:
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
[self initializeCaptureSession];
}
- (void) initializeCaptureSession
{
//Attempt to initialize AVCaptureDevice with back camera
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionBack)
{
captureDevice = device;
break;
}
}
//If camera is accessible by capture session
if (captureDevice)
{
if ([captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]){
NSError *error;
if ([captureDevice lockForConfiguration:&error]){
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
[captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
[captureDevice unlockForConfiguration];
}else{
NSLog(#"Error: %#", error);
}
}
//Allocate camera capture session
cameraCaptureSession = [[AVCaptureSession alloc] init];
cameraCaptureSession.sessionPreset = AVCaptureSessionPresetMedium;
//Configure capture session input
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
[cameraCaptureSession addInput:videoIn];
//Configure capture session output
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
[videoOut setAlwaysDiscardsLateVideoFrames:YES];
[cameraCaptureSession addOutput:videoOut];
//Bind preview layer to capture session data
cameraPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:cameraCaptureSession];
CGRect layerRect = self.view.bounds;
cameraPreviewLayer.bounds = self.view.bounds;
cameraPreviewLayer.position = CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect));
cameraPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
//Add preview layer to UIView layer
[self.view.layer addSublayer:cameraPreviewLayer];
//Begin camera capture
[cameraCaptureSession startRunning];
}
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
[cameraCaptureSession stopRunning];
}
#end
Here is my code in Swift. It's tested on my iPhone 5s and works only with the back camera; I don't know about the iPhone 6s.
if let device = captureDevice {
    try! device.lockForConfiguration()
    if device.isFocusModeSupported(.autoFocus) {
        device.focusMode = .autoFocus
    }
    device.unlockForConfiguration()
}
I think you need to remove
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
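Or, as a hedged alternative, guard it, since setting a point of interest on a device that doesn't support it raises an exception:
if ([captureDevice lockForConfiguration:&error]) {
    // Only set the point of interest where the device supports it.
    if (captureDevice.isFocusPointOfInterestSupported) {
        captureDevice.focusPointOfInterest = CGPointMake(0.5f, 0.5f);
    }
    if ([captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        captureDevice.focusMode = AVCaptureFocusModeAutoFocus;
    }
    [captureDevice unlockForConfiguration];
}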
Try something like this:
if ([self.session canAddInput:videoDeviceInput]) {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentDevice];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
    [self.session addInput:videoDeviceInput];
    self.videoDeviceInput = videoDeviceInput;
}
Then:
- (void)subjectAreaDidChange:(NSNotification *)notification
{
    CGPoint devicePoint = CGPointMake(0.5, 0.5);
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
And then:
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async(self.sessionQueue, ^{
        AVCaptureDevice *device = self.videoDeviceInput.device;
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
            // Call -set(Focus/Exposure)Mode: to apply the new point of interest.
            if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
                device.focusPointOfInterest = point;
                device.focusMode = focusMode;
            }
            if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
                device.exposurePointOfInterest = point;
                device.exposureMode = exposureMode;
            }
            device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
            [device unlockForConfiguration];
        } else {
            NSLog(@"Could not lock device for configuration: %@", error);
        }
    });
}
This is code from a project I worked on. Feel free to ask if you have any doubts.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *err = nil;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&err];
if (err == nil) {
    if ([captureSession canAddInput:videoIn]) {
        [captureSession addInput:videoIn];
    } else {
        NSLog(@"Failed to add video input");
    }
} else {
    NSLog(@"Error: %@", err);
}
[captureSession startRunning];
previewLayer.frame = [self.view bounds];
[self.view.layer addSublayer:previewLayer];
I am not sure what happened, but with the code above the camera gives me a clear preview.

iOS - AVCaptureDevice - Autofocus & Exposure with camera capture

I have been implementing a custom camera using AVCaptureDevice, which requires autofocus and exposure to work nicely. I am using the following code for camera initialization:
- (void)initializeCamera {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusAuthorized) { // authorized
        [self.captureVideoPreviewLayer removeFromSuperlayer];
        self.captureSession = [[AVCaptureSession alloc] init];
        self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
        [self removeDeviceObserverForFocus];
        self.captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        [self addDeviceObserverForFocus];
        NSError *error = nil;
        [self.captureDevice lockForConfiguration:nil]; // you must lock before setting torch mode
        [self.captureDevice setSubjectAreaChangeMonitoringEnabled:YES];
        [self.captureDevice unlockForConfiguration];
        // Capture layer
        self.captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        self.captureVideoPreviewLayer.bounds = CGRectMake(0, 0, CGRectGetWidth([UIScreen mainScreen].bounds), CGRectGetHeight([UIScreen mainScreen].bounds));
        self.captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(self.captureVideoPreviewLayer.bounds), CGRectGetMidY(self.captureVideoPreviewLayer.bounds));
        [self.captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        self.captureVideoPreviewLayer.connection.enabled = YES;
        [self.viewCamera.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];
        // Capture input
        self.captureInput = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDevice error:&error];
        if (!self.captureInput) {
            [self capturePhoto];
        } else {
            if ([self.captureSession canAddInput:self.captureInput]) {
                [self.captureSession addInput:self.captureInput];
            }
        }
        self.captureOutput = [[AVCaptureStillImageOutput alloc] init];
        [self.captureOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
        [self.captureSession addOutput:self.captureOutput];
        // THIS LINE
        [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
        // Set up metadata capture
        AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
        CGRect visibleMetadataOutputRect = [self.captureVideoPreviewLayer metadataOutputRectOfInterestForRect:self.viewCamera.bounds];
        metadataOutput.rectOfInterest = visibleMetadataOutputRect;
        [self.captureSession addOutput:metadataOutput];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.captureSession startRunning];
        });
    }
    else if (status == AVAuthorizationStatusNotDetermined) { // not determined
        // Try to get permission
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            [self performSelectorOnMainThread:@selector(initializeCamera) withObject:nil waitUntilDone:NO];
        }];
    }
}
- (void)removeDeviceObserverForFocus {
    @try {
        while ([self.captureDevice observationInfo] != nil) {
            [self.captureDevice removeObserver:self forKeyPath:@"adjustingFocus"];
        }
    }
    @catch (NSException *exception) {
        NSLog(@"Exception");
    }
    @finally {
    }
}

- (void)addDeviceObserverForFocus {
    [self.captureDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingFocus"]) {
        BOOL adjustingFocus = [[change objectForKey:NSKeyValueChangeNewKey] isEqualToNumber:[NSNumber numberWithInt:1]];
        if (adjustingFocus) {
            [self showFocusSquareAtPoint:self.viewCamera.center];
        }
    }
}
To monitor focus as the camera moves, I do the following:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(avCaptureDeviceSubjectAreaDidChangeNotification:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:nil];

#pragma mark - AVCaptureDeviceSubjectAreaDidChangeNotification

- (void)avCaptureDeviceSubjectAreaDidChangeNotification:(NSNotification *)notification {
    CGPoint devicePoint = CGPointMake(0.5, 0.5);
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
    [self showFocusSquareAtPoint:self.viewCamera.center];
}
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async(dispatch_get_main_queue(), ^{
        AVCaptureDevice *device = self.captureDevice;
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
            // Call -set(Focus/Exposure)Mode: to apply the new point of interest.
            if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
                device.focusPointOfInterest = point;
                device.focusMode = focusMode;
            }
            if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
                device.exposurePointOfInterest = point;
                device.exposureMode = exposureMode;
            }
            device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
            [device unlockForConfiguration];
        } else {
            NSLog(@"Could not lock device for configuration: %@", error);
        }
    });
}
Everything works as expected when I use [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
If I change the camera preset to something else, like AVCaptureSessionPresetHigh, autofocus and exposure don't work as expected.
Has anyone come across this situation?
Thank you for your help.
Are you trying to take a picture or record video? The High preset is for video, and exposure and focus work differently there (I believe). Here is info on the different presets in the docs: AVCaptureSessionPresets.
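If stills are the goal, keeping the Photo preset is the safer choice; a minimal sketch, checking support before switching:
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    // Photo preset is tuned for still capture; High is tuned for video.
    self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
}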

I need to use both AVCaptureVideoDataOutput and AVCaptureMetadataOutput

I'm writing an app that needs to look at the raw video (custom edge detection, etc.) and use the metadata barcode reader.
Even though AVCaptureSession has an addOutput: method rather than a setOutput: method, that's effectively how it behaves: the first one in wins.
If I add AVCaptureVideoDataOutput first, its delegate gets called.
If I add AVCaptureMetadataOutput first, its delegate gets called.
Has anyone figured out a way around this, short of removing the other output every other frame?
I was able to add both AVCaptureVideoDataOutput and AVCaptureMetadataOutput.
NSError *error = nil;
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];
// Select a video device and make an input
AVCaptureDevice *captureDevice;
AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
// Find the front-facing camera
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if ([device position] == desiredPosition) {
        captureDevice = device;
        break;
    }
}
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (!error) {
    [self.captureSession beginConfiguration];
    // Add the input to the session
    if ([self.captureSession canAddInput:deviceInput]) {
        [self.captureSession addInput:deviceInput];
    }
    AVCaptureMetadataOutput *metadataOutput = [AVCaptureMetadataOutput new];
    if ([self.captureSession canAddOutput:metadataOutput]) {
        [self.captureSession addOutput:metadataOutput];
        self.metaDataOutputQueue = dispatch_queue_create("MetaDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [metadataOutput setMetadataObjectsDelegate:self queue:self.metaDataOutputQueue];
        [metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    }
    self.videoDataOutput = [AVCaptureVideoDataOutput new];
    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [self.videoDataOutput setVideoSettings:rgbOutputSettings];
        [self.videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [self.videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
        [[self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
    }
    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}
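For completeness, the two delegate callbacks the code above wires up; this assumes the class adopts AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureMetadataOutputObjectsDelegate:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Raw BGRA frames arrive here on videoDataOutputQueue (custom edge detection, etc.).
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    // Detected QR codes arrive here on metaDataOutputQueue.
}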

Method to toggle camera inputs does not work in reverse

I have a video preview layer which shows the video feed from a given input. I want the user to be able to switch cameras with a single button. The button I created switches from the initial back-facing camera to the front-facing camera, but it does not work in reverse.
Here is the method:
- (IBAction)CameraToggleButtonPressed:(id)sender {
    if (_captureSession) {
        NSLog(@"Toggle camera");
        NSError *error;
        AVCaptureInput *videoInput = [self videoInput];
        AVCaptureInput *newVideoInput;
        AVCaptureDevicePosition position = [[VideoInputDevice device] position];
        [_captureSession beginConfiguration];
        [_captureSession removeInput:_videoInput];
        [_captureSession removeInput:_audioInput];
        [_captureSession removeOutput:_movieOutput];
        [_captureSession removeOutput:_stillImageOutput];
        // Get the new input
        AVCaptureDevice *newCamera = nil;
        if (((AVCaptureDeviceInput *)videoInput).device.position == AVCaptureDevicePositionBack) {
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
        newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:newCamera error:&error];
        if (!newVideoInput || error) {
            NSLog(@"Error creating capture device input: %@", error.localizedDescription);
        } else {
            [self.captureSession addInput:newVideoInput];
            [self.captureSession addInput:self.audioInput];
            [self.captureSession addOutput:self.movieOutput];
            [self.captureSession addOutput:self.stillImageOutput];
        }
        [_captureSession commitConfiguration];
    }
}
Why does the button not work in reverse?
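For context, the cameraWithPosition: helper called above isn't shown in the question; a typical implementation (mirroring the device loop used in the other snippets on this page) would be:
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    // Hypothetical helper: return the first device matching the requested position.
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}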
