Method to toggle camera inputs does not work in reverse - iOS

I have a video preview layer which shows the video feed from a given input. I want the user to be able to switch camera with single button. The button I have created will switch the camera from the initial back facing camera to the front facing camera but does not work in reverse.
Here is the method:
- (IBAction)CameraToggleButtonPressed:(id)sender {
    if (_captureSession) {
        NSLog(@"Toggle camera");
        NSError *error;
        AVCaptureInput *videoInput = [self videoInput];
        AVCaptureInput *newVideoInput;
        AVCaptureDevicePosition position = [[VideoInputDevice device] position];
        [_captureSession beginConfiguration];
        [_captureSession removeInput:_videoInput];
        [_captureSession removeInput:_audioInput];
        [_captureSession removeOutput:_movieOutput];
        [_captureSession removeOutput:_stillImageOutput];
        //Get new input
        AVCaptureDevice *newCamera = nil;
        if (((AVCaptureDeviceInput *)videoInput).device.position == AVCaptureDevicePositionBack) {
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
        newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:newCamera error:&error];
        if (!newVideoInput || error) {
            NSLog(@"Error creating capture device input: %@", error.localizedDescription);
        } else {
            [self.captureSession addInput:newVideoInput];
            [self.captureSession addInput:self.audioInput];
            [self.captureSession addOutput:self.movieOutput];
            [self.captureSession addOutput:self.stillImageOutput];
        }
        [_captureSession commitConfiguration];
    }
}
Why does the button not work in reverse?
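For reference, a common cause of one-way toggles like this is that the stored input ivar is never updated after the switch, so the position check sees stale state on the second press. Below is a hedged sketch of a switch method that re-queries the session's current input each time (it assumes the cameraWithPosition: helper referenced in the question, and a videoInput property):

```objc
// Sketch only: derive the current camera from the session itself,
// so the check never relies on a possibly stale ivar.
- (void)switchCamera {
    AVCaptureDeviceInput *current = nil;
    for (AVCaptureDeviceInput *input in self.captureSession.inputs) {
        if ([input.device hasMediaType:AVMediaTypeVideo]) {
            current = input;
            break;
        }
    }
    AVCaptureDevicePosition newPosition =
        (current.device.position == AVCaptureDevicePositionBack)
            ? AVCaptureDevicePositionFront
            : AVCaptureDevicePositionBack;

    AVCaptureDevice *newCamera = [self cameraWithPosition:newPosition]; // helper from the question
    NSError *error = nil;
    AVCaptureDeviceInput *newInput =
        [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:&error];
    if (!newInput) {
        NSLog(@"Error creating capture device input: %@", error.localizedDescription);
        return;
    }

    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:current];
    if ([self.captureSession canAddInput:newInput]) {
        [self.captureSession addInput:newInput];
        self.videoInput = newInput; // keep the ivar in sync for the next toggle
    } else {
        [self.captureSession addInput:current]; // restore on failure
    }
    [self.captureSession commitConfiguration];
}
```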

Related

AVCaptureSession Low Preview Quality

My app uses video recording, and so I use the AVCaptureSession. When I see the Capture Session preview, however, I notice that the quality is lower, especially when pointing the camera over text on a television or computer screen. How can I increase the quality of my Capture Session so that it will display text on computer screens more clearly? This is the part of my code that deals with video quality.
self.CaptureSession = [[AVCaptureSession alloc] init];
self.CaptureSession.automaticallyConfiguresApplicationAudioSession = NO;
[self.CaptureSession setSessionPreset:AVCaptureSessionPresetHigh];

//----- ADD INPUTS -----
//ADD VIDEO INPUT
self.VideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([self.VideoDevice hasTorch] == YES) {
    self.flashSet.hidden = NO;
    self.flashOverlayObject.hidden = NO;
}
[self.VideoDevice lockForConfiguration:nil];
[self.VideoDevice setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionNear];
[self.VideoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[self.VideoDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
[self.VideoDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
if (self.VideoDevice.lowLightBoostSupported) {
    self.VideoDevice.automaticallyEnablesLowLightBoostWhenAvailable = YES;
}
[self.VideoDevice unlockForConfiguration];

if (self.VideoDevice) {
    NSError *error;
    self.VideoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:self.VideoDevice error:&error];
    if (!error) {
        if ([self.CaptureSession canAddInput:self.VideoInputDevice])
            [self.CaptureSession addInput:self.VideoInputDevice];
    }
}
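One avenue worth checking: the session preset caps the capture resolution, and since iOS 7 you can bypass presets by setting the device's activeFormat directly. The sketch below is an assumption about what might help here, not a confirmed fix; it selects the highest-resolution format the device offers:

```objc
// Sketch only: pick the device format with the most pixels instead of
// relying solely on AVCaptureSessionPresetHigh.
AVCaptureDeviceFormat *best = nil;
for (AVCaptureDeviceFormat *format in self.VideoDevice.formats) {
    if (!best) { best = format; continue; }
    CMVideoDimensions dims =
        CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    CMVideoDimensions bestDims =
        CMVideoFormatDescriptionGetDimensions(best.formatDescription);
    if ((int64_t)dims.width * dims.height > (int64_t)bestDims.width * bestDims.height) {
        best = format;
    }
}
if (best && [self.VideoDevice lockForConfiguration:nil]) {
    // Setting activeFormat overrides the session preset (iOS 7+).
    self.VideoDevice.activeFormat = best;
    [self.VideoDevice unlockForConfiguration];
}
```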

I need to use both AVCaptureVideoDataOutput and AVCaptureMetadataOutput

I'm writing an app that needs to look at the raw video (custom edge detection etc.) and also use the metadata barcode reader.
Even though AVCaptureSession has an addOutput: method rather than a setOutput: method, it behaves as if there were only one slot - the first output added wins.
If I add the AVCaptureVideoDataOutput first, its delegate gets called.
If I add the AVCaptureMetadataOutput first, its delegate gets called.
Has anyone figured out a way around this, short of removing one output every other frame?
I was able to add both AVCaptureVideoDataOutput and AVCaptureMetadataOutput.
NSError *error = nil;
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];

// Select a video device, make an input
AVCaptureDevice *captureDevice;
AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
// Find the front facing camera
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if ([device position] == desiredPosition) {
        captureDevice = device;
        break;
    }
}

AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (!error) {
    [self.captureSession beginConfiguration];

    // add the input to the session
    if ([self.captureSession canAddInput:deviceInput]) {
        [self.captureSession addInput:deviceInput];
    }

    AVCaptureMetadataOutput *metadataOutput = [AVCaptureMetadataOutput new];
    if ([self.captureSession canAddOutput:metadataOutput]) {
        [self.captureSession addOutput:metadataOutput];
        self.metaDataOutputQueue = dispatch_queue_create("MetaDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [metadataOutput setMetadataObjectsDelegate:self queue:self.metaDataOutputQueue];
        [metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    }

    self.videoDataOutput = [AVCaptureVideoDataOutput new];
    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [self.videoDataOutput setVideoSettings:rgbOutputSettings];
        [self.videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [self.videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
        [[self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
    }

    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}

iOS : AVFoundation Image Capture Dark

I'm developing an app which captures photos from my iPad's front camera.
The photos come out very dark.
Does anyone have an idea how to fix this issue, please?
Here is my code with some explanations:
1) I initialize my capture session
- (void)viewDidAppear:(BOOL)animated {
    captureSession = [[AVCaptureSession alloc] init];
    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera;
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            frontCamera = device;
        }
    }
    if ([frontCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        NSError *error = nil;
        if ([frontCamera lockForConfiguration:&error]) {
            frontCamera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
            frontCamera.focusMode = AVCaptureFocusModeAutoFocus;
            [frontCamera unlockForConfiguration];
        }
    }
    NSError *error = nil;
    AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    [captureSession addInput:frontFacingCameraDeviceInput];
    [captureSession setSessionPreset:AVCaptureSessionPresetHigh];
    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [captureSession addOutput:captureVideoOutput];
    [captureSession addOutput:captureImageOutput];
}
2) When the user presses the Record button, it starts a timer and previews the content of the camera in a preview layer
- (IBAction)but_record:(UIButton *)sender {
    MainInt = 4;
    timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countup) userInfo:nil repeats:YES];
    previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    CGRect rect = CGRectMake(0, 0, self.aView.bounds.size.width, self.aView.bounds.size.height);
    previewLayer.frame = rect;
    [self.aView.layer addSublayer:previewLayer];
    [captureSession startRunning];
}
3) At the end of the timer, the photo is taken and saved
- (void)countup {
    MainInt -= 1;
    if (MainInt == 0) {
        [timer invalidate];
        timer = nil;
        [captureSession stopRunning];
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in captureImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }
        [captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            stillImage = [[UIImage alloc] initWithData:imageData];
        }];
        [captureSession startRunning];
        [captureSession stopRunning];
    }
}
4) Finally, when the user presses the save button, the image is saved to a specific album
- (IBAction)but_save:(UIButton *)sender {
    UIImage *img = stillImage;
    [self.library saveImage:img toAlbum:@"mySpecificAlbum" withCompletionBlock:^(NSError *error) {
        // handle error
    }];
}
In fact, all the code works properly but the resulting images are very dark...
This was happening to me as well and it turned out I was trying to capture too soon and the camera didn't have enough time to stabilize. I had to add about 0.5 seconds of delay before the pictures would be normal brightness.
HTH
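As a rough sketch of that workaround, the capture call from the question could be deferred with dispatch_after (the 0.5 s figure is the answerer's estimate, not a documented value):

```objc
// Sketch only: give auto-exposure ~0.5 s to settle before capturing.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        // ... same image handling as in the question ...
    }];
});
```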
I had the same issue on a 5th-generation iPod touch running iOS 7, but not on a 4th-generation iPod touch with iOS 6.1.
I found that the fix is to show a preview of the camera:
// Setup camera preview image
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_previewImage.layer addSublayer:previewLayer];
As instructed at https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW22
Note: I did not investigate accomplishing this without a preview

IOS Toggle AVFoundation Camera

In my App I'm capturing images using AVFoundation
I made a button to switch between front and back cameras but it won't work.
Here's the code I used :
if (captureDevice.position == AVCaptureDevicePositionFront) {
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionBack) {
            NSError *error;
            AVCaptureDeviceInput *newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
            [captureSesion beginConfiguration];
            for (AVCaptureDeviceInput *oldInput in [captureSesion inputs]) {
                [captureSesion removeInput:oldInput];
            }
            if ([captureSesion canAddInput:newDeviceInput]) {
                [captureSesion addInput:newDeviceInput];
            }
            [captureSesion commitConfiguration];
            break;
        }
    }
}
THX.
If your captureSession's sessionPreset is not compatible with the camera you're switching to it will fail the canAddInput test. I always reset to AVCaptureSessionPresetHigh before toggling cameras then try to switch it to whatever preset I have preferred. Here's the code I use:
- (void)toggleCamera {
    AVCaptureDevicePosition newPosition = self.currentCameraPossition == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    AVCaptureDevice *device = [self videoDeviceWithPosition:newPosition];
    AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:nil];
    [_captureSession beginConfiguration];
    [_captureSession removeInput:self.deviceInput];
    // Always reset the preset before testing canAddInput, because the old preset can cause it to return NO
    [_captureSession setSessionPreset:AVCaptureSessionPresetHigh];
    if ([_captureSession canAddInput:deviceInput]) {
        [_captureSession addInput:deviceInput];
        self.deviceInput = deviceInput;
        self.currentCameraPossition = newPosition;
    } else {
        [_captureSession addInput:self.deviceInput];
    }
    if ([device supportsAVCaptureSessionPreset:self.sessionPreset]) {
        [_captureSession setSessionPreset:self.sessionPreset];
    }
    if ([device lockForConfiguration:nil]) {
        [device setSubjectAreaChangeMonitoringEnabled:YES];
        [device unlockForConfiguration];
    }
    [_captureSession commitConfiguration];
}
I have seen issues with toggle code not working if it is not run on the main thread. Can you try wrapping your code with the following block:
dispatch_async(dispatch_get_main_queue(), ^{
// Your camera toggle code goes here
});

iOS 5 rear camera preview in non full-screen mode

just wondering if this is possible:
I've been looking at various solutions for displaying the camera preview; and while doing so in full-screen mode is relatively straight-forward, what I'd like to do is to have it scaled to 50% of the screen and presented side by side with a graphic (not an overlay, but a separate graphic to the left of the camera preview which takes up equal space). Basically the purpose is to allow the user to compare the camera preview with the graphic.
So, what I need to know is:
a) is it possible to scale the camera preview to a lower resolution
b) can it share the screen on an iPad with another graphic which isn't an overlay
c) if a and b are true, is there any example source I might be pointed to please?
Thanks!
You can just use the following code:
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.opaque = YES;
previewLayer.contentsScale = self.view.contentScaleFactor;
previewLayer.frame = self.view.bounds;
previewLayer.needsDisplayOnBoundsChange = YES;
[self.view.layer addSublayer:previewLayer];
To change the preview size, just adjust the line that sets previewLayer.frame to whatever frame you need.
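For the side-by-side layout asked about in the question, a minimal sketch might look like the following (the "comparison" asset name and the aView container are placeholders; the preview layer simply scales the feed, so no lower-resolution capture is needed):

```objc
// Sketch only: graphic on the left half, camera preview on the right half.
CGFloat halfWidth = self.view.bounds.size.width / 2.0;
CGFloat height = self.view.bounds.size.height;

// Left half: a plain UIImageView showing the comparison graphic.
UIImageView *comparisonView =
    [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, halfWidth, height)];
comparisonView.image = [UIImage imageNamed:@"comparison"]; // placeholder asset name
[self.view addSubview:comparisonView];

// Right half: the preview layer, scaled down to 50% of the screen width.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = CGRectMake(halfWidth, 0, halfWidth, height);
[self.view.layer addSublayer:previewLayer];
```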
You can create captureSession with this code
captureSession = [[AVCaptureSession alloc] init];
if (!captureSession) {
    NSLog(@"Failed to create video capture session");
    return NO;
}
[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPreset640x480;

// AVCaptureDevice's position property is read-only, so pick the front camera by iterating the devices
AVCaptureDevice *videoDevice = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {
        videoDevice = device;
        break;
    }
}
if (!videoDevice) {
    NSLog(@"Couldn't create video capture device");
    [captureSession release];
    captureSession = nil;
    return NO;
}

if ([videoDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
    NSError *deviceError = nil;
    if ([videoDevice lockForConfiguration:&deviceError]) {
        [videoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Couldn't lock device for configuration");
    }
}

NSError *error;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (!videoIn) {
    NSLog(@"Couldn't create video capture device input: %@ - %@", [error localizedDescription], [error localizedFailureReason]);
    [captureSession release];
    captureSession = nil;
    return NO;
}
if (![captureSession canAddInput:videoIn]) {
    NSLog(@"Couldn't add video capture device input");
    [captureSession release];
    captureSession = nil;
    return NO;
}
[captureSession addInput:videoIn];
[captureSession commitConfiguration];
