Camera takes dark image on iPhone - iOS

I am working on a media app, and for capturing images I am using AVCapture. Everything works fine except for one issue: when I take 5 images in a row with the camera, the first picture usually comes out darker than the others.
My capture code is below:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[device lockForConfiguration:nil];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
[session beginConfiguration];
// Create device input and add to current session
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
[session addInput:input];
// Create video output and add to current session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
[session commitConfiguration];
[device unlockForConfiguration];
Does anyone know where my issue comes from?
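No answer was posted for this one, but a common cause of this symptom (an assumption, not confirmed in the thread) is triggering the first capture before auto-exposure has settled. A minimal sketch of waiting on adjustingExposure, which is KVO-observable on AVCaptureDevice, before taking the first shot:
// Assumed fix: observe the device and delay the first capture
// until it reports that exposure adjustment has finished.
[device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingExposure"] && ![(AVCaptureDevice *)object isAdjustingExposure]) {
        // Exposure has settled; safe to trigger the first capture here.
    }
}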

Related

iOS get Camera orientation

On iOS, is it possible to get the camera orientation (in degrees) like on Android?
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info);
int degrees = info.orientation;
I tried:
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *camera in devices) {
    AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session addInput:deviceInput];
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [session addOutput:output];
    AVCaptureConnection *connection = [output connectionWithMediaType:AVMediaTypeMetadata];
    NSLog(@"%ld", (long)connection.videoOrientation);
}
but it returns "0" for all devices.
Thanks
AVCaptureConnection has the videoOrientation property.
You can also try using the device orientation if you need to transform the photo output somehow. Here you can find some instructions on how to work with the device orientation if you need it.
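There is no direct equivalent of Android's info.orientation, but as a rough illustration you can derive degrees from UIDeviceOrientation yourself. The mapping below is an assumption (it may need adjusting for the front camera), not an AVFoundation API:
// Hypothetical helper: map the current UIDeviceOrientation to degrees,
// loosely mirroring Android's Camera.CameraInfo.orientation.
- (int)cameraRotationDegrees {
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationLandscapeLeft:      return 90;
        case UIDeviceOrientationLandscapeRight:     return 270;
        case UIDeviceOrientationPortraitUpsideDown: return 180;
        default:                                    return 0; // portrait or unknown
    }
}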

Use Bluetooth device to record audio in AVCaptureSession

I am using Apple's RosyWriter sample to record video and audio. Now I need to record audio using a Bluetooth headset, but it is not working for me. I am doing the following:
captureSession = [[AVCaptureSession alloc] init];
/*
 * Create audio connection
 */
if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"7.0")) {
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionAllowBluetooth error:nil];
    captureSession.usesApplicationAudioSession = YES;
    captureSession.automaticallyConfiguresApplicationAudioSession = YES;
}
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioIn = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];
if ([captureSession canAddInput:audioIn])
    [captureSession addInput:audioIn];
I have also followed this question.
I was able to do this by changing the audio output settings as below:
_audioCompressionSettings = [[audioOut recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];
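For context, those recommended settings are intended to be passed to an AVAssetWriterInput; a minimal sketch, assuming audioOut is the session's AVCaptureAudioDataOutput as in RosyWriter:
// Ask the audio data output for settings matching the current audio route
// (including a Bluetooth input), then build the writer input from them.
NSDictionary *settings = [audioOut recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
AVAssetWriterInput *audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:settings];
audioWriterInput.expectsMediaDataInRealTime = YES;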

My application crashes during barcode scanning on iOS versions above 9.3

Hi, I am trying to develop an app for 1D and 2D barcode scanning. It works well on iOS 9.3 with Xcode 7.3, but when I run the same application on iOS 10 with Xcode 8.2, the application crashes on the line below. Please help with it.
[_session addOutput:_output];
- (void)setupCaptureSession {
    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input) {
        [_session addInput:_input];
    } else {
        NSLog(@"Error: %@", error);
    }
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output]; // here the application crashes
    _output.metadataObjectTypes = [_output availableMetadataObjectTypes];
    _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _prevLayer.frame = _previewView.bounds;
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [_previewView.layer addSublayer:_prevLayer];
    //[_session startRunning];
    [_previewView bringSubviewToFront:_highlightView];
}
Hi, I just commented out the following line of code:
[_previewView.layer addSublayer:_prevLayer];
and added the line below instead, and the app works without error:
[_previewView.layer insertSublayer:_prevLayer atIndex:0];
Did you add a Camera Usage Description to your plist file? If not, take a look at this blog.
iOS 10 requires more privacy around the use of hardware input sources.
Your app works with a camera, so you need to provide an additional explanation of why it needs one.
So, go to your Info.plist file and add an additional key-value pair there.
For the key, choose Privacy - Camera Usage Description.
For the value, add a string like: App needs a camera to take amazing photos, scan barcodes, etc.
To be sure everything is OK, go to the iOS Settings app and check that the camera toggle is switched on for your application.
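For reference, the raw Info.plist entry looks like this (the description string is just an example):
<key>NSCameraUsageDescription</key>
<string>App needs a camera to take amazing photos, scan barcodes, etc.</string>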

Camera feed slow to load with AVCaptureSession on iOS - How can I speed it up?

Right now I'm trying to allow users to take pictures in my app without using UIImagePickerController. I'm using AVCaptureSession and all the related classes to load a camera feed as a sublayer on a full-screen view I have on one of my view controllers. The code works, but unfortunately the camera is very slow to load; it usually takes 2-3 seconds. Here is my code:
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
// Check size-based configurations are supported before setting them
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh])
    [session setSessionPreset:AVCaptureSessionPresetHigh];
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720])
    [session setSessionPreset:AVCaptureSessionPreset1280x720];
CALayer *viewLayer = self.liveCameraFeed.layer;
//NSLog(@"viewLayer = %@", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = viewLayer.bounds;
[viewLayer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device;
if (isFront) {
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
} else {
    device = [self frontCamera];
}
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
[session addInput:audioInput];
NSError *error = nil;
input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    //NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
[session startRunning];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
Is there any way to speed it up? I've already tried loading it on another thread using Grand Central Dispatch and NSThread, and though that stopped the app from freezing, it made the camera take even longer to load. Any help is appreciated.
In my case, I needed to wait for the session to start running:
dispatch_async(queue) {
    self.session.startRunning()
    dispatch_async(dispatch_get_main_queue()) {
        self.delegate?.cameraManDidStart(self)
        let layer = AVCaptureVideoPreviewLayer(session: self.session)
    }
}
Waiting for AVCaptureSession's startRunning function was my solution too. You can run startRunning on a global async queue and then add your AVCaptureVideoPreviewLayer on the main thread.
Swift 4 sample:
DispatchQueue.global().async {
    self.captureSession.startRunning()
    DispatchQueue.main.async {
        let videoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    }
}
You can load the AVCaptureSession at the time of viewWillAppear. That works for me. When I switch from another view to the view with the AVCaptureSession, I see the camera running immediately.
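A minimal sketch of that approach, assuming the session has already been configured elsewhere (startRunning is dispatched off the main thread because it blocks):
// Start the preconfigured session as the view is about to appear,
// so the feed is already live when the screen becomes visible.
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        if (!self.session.isRunning) {
            [self.session startRunning];
        }
    });
}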
For anyone interested, the solution I came up with was preloading the camera on a different thread and keeping it open.
I tried all the above methods, but it was not as fast as Instagram or Facebook, so I loaded the AVCaptureDevice, AVCaptureVideoPreviewLayer, and AVCaptureSession on the parent screen and passed them as parameters to the child screen. It loads very rapidly.
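A sketch of that hand-off with hypothetical names (CameraViewController and initWithSession: are assumptions, not from the answer):
// Parent screen: configure and start the session ahead of time,
// then hand the running session to the child screen.
self.preloadedSession = [[AVCaptureSession alloc] init];
// ... add inputs/outputs here ...
[self.preloadedSession startRunning];

CameraViewController *child = [[CameraViewController alloc] initWithSession:self.preloadedSession];
[self.navigationController pushViewController:child animated:YES];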

I want to throttle video capture frame rate in AVCapture framework

I am trying to throttle my video capture framerate for my application, as I have found that it is impacting VoiceOver performance.
At the moment, it captures frames from the video camera, and then processes the frames using OpenGL routines as quickly as possible. I would like to set a specific framerate in the capture process.
I was expecting to be able to do this by using videoMinFrameDuration or minFrameDuration, but this seems to make no difference to performance. Any ideas?
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
{
    if ([device position] == AVCaptureDevicePositionBack)
    {
        backFacingCamera = device;
        // SET SOME OTHER PROPERTIES
    }
}
// Create the capture session
captureSession = [[AVCaptureSession alloc] init];
// Add the video input
NSError *error = nil;
videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:backFacingCamera error:&error] autorelease];
// Add the video frame output
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
// Start capturing
if ([backFacingCamera supportsAVCaptureSessionPreset:AVCaptureSessionPreset1920x1080])
{
    [captureSession setSessionPreset:AVCaptureSessionPreset1920x1080];
    captureDeviceWidth = 1920;
    captureDeviceHeight = 1080;
#if defined(VA_DEBUG)
    NSLog(@"Video AVCaptureSessionPreset1920x1080");
#endif
}
// else fall back to a lower preset
// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(1, 2);
else
    videoOutput.minFrameDuration = CMTimeMake(1, 2);
if ([captureSession canAddInput:videoInput])
    [captureSession addInput:videoInput];
if ([captureSession canAddOutput:videoOutput])
    [captureSession addOutput:videoOutput];
if (![captureSession isRunning])
    [captureSession startRunning];
Any ideas? Am I missing something? Is this the best way to throttle?
AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(1, 2);
else
    videoOutput.minFrameDuration = CMTimeMake(1, 2);
Mike Ullrich's answer worked up until iOS 7. These two methods are unfortunately deprecated in iOS 7. You have to set activeVideo{Min|Max}FrameDuration on the AVCaptureDevice itself. Something like:
int fps = 30; // Change this value
AVCaptureDevice *device = ...; // Get the active capture device
[device lockForConfiguration:nil];
[device setActiveVideoMinFrameDuration:CMTimeMake(1, fps)];
[device setActiveVideoMaxFrameDuration:CMTimeMake(1, fps)];
[device unlockForConfiguration];
Turns out you need to set both videoMinFrameDuration and videoMaxFrameDuration for either one to work.
e.g.:
[conn setVideoMinFrameDuration:CMTimeMake(1, 1)];
[conn setVideoMaxFrameDuration:CMTimeMake(1, 1)];
