I am trying to switch this code to use the front camera - iOS

I am new to programming and am trying to change this code to use the front camera instead of defaulting to the back one. I'm not sure what I need to change to make this work properly.
Here is my sample code:
-(void)viewWillAppear:(BOOL)animated{
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.frameForCapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
}

To use the front camera, replace this:
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
With this:
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *frontCamera;
for (AVCaptureDevice *dev in devices) {
if (dev.position == AVCaptureDevicePositionFront) {
frontCamera = dev;
break;
}
}
if (!frontCamera) {
NSLog(@"No front camera found!");
// Handle the missing front camera here, e.g. fall back to the default device
}
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
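If you are building against iOS 10 or later, note that devicesWithMediaType: is deprecated there; a minimal sketch of the equivalent lookup (assuming the built-in wide-angle camera is the one you want) would be:

// iOS 10+: ask for the front-facing built-in wide-angle camera directly.
AVCaptureDevice *frontCamera = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                                                                  mediaType:AVMediaTypeVideo
                                                                   position:AVCaptureDevicePositionFront];
NSError *error = nil;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];

The loop over devicesWithMediaType: shown above still works on older deployment targets.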

Related

Video Frame from camera using AVFoundation Framework in iOS?

I have never worked with the AVFoundation framework. I want to get video frames from the back camera and process those frames. Can anyone help me? Your experience would be appreciated. Thanks
You can use the following code to start a camera session with AVFoundation in order to capture a still image:
AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.frameForCapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Then, in order to actually capture the image, you can use a button with the following code:
- (IBAction)takePhoto:(id)sender {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [UIImage imageWithData:imageData];
}
}];
}
Then you can do whatever you want with the captured image.
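For example, here is a minimal sketch of the completion handler extended to write the photo to the camera roll (just one assumption about what you might want to do with it; newer iOS versions require photo library permission for this):

if (imageDataSampleBuffer != NULL) {
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [UIImage imageWithData:imageData];
    // Save the photo to the user's camera roll.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}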

Custom Camera View Not Working on iOS 8/Xcode 6

So I had this code working in another app of mine to take photos in a custom camera view when I had iOS 7 on my phone and Xcode 5.1. Now, on iOS 8 and Xcode 6, the camera works but I can't see the live view of the camera in my leftVertical UIView. Here's my code; I would appreciate any help. Thanks!
#import <AVFoundation/AVFoundation.h>
session = [[AVCaptureSession alloc] init];
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
[session setSessionPreset:AVCaptureSessionPreset352x288];
else
[session setSessionPreset:AVCaptureSessionPreset352x288];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.leftVertical.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
//////////////////////////
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Try this: in -viewWillAppear:, start the camera capture on the main thread, like this:
dispatch_async(dispatch_get_main_queue(), ^{
if (![session isRunning])
{
[session startRunning];
}
});
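As a companion (a sketch, not part of the original answer, assuming session is the same instance variable), you would typically stop the session when the view goes away:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    // Stop the capture session on the main queue, mirroring how it was started.
    dispatch_async(dispatch_get_main_queue(), ^{
        if ([session isRunning]) {
            [session stopRunning];
        }
    });
}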

Rotating AV video preview area on iPad

I'd like to use the video preview layer on the iPhone and the iPad, but on the iPad it doesn't rotate and it has the wrong height.
Here is the code I have:
- (void)capture
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input)
{
NSLog(@"Error: %@", error);
return;
}
[session addInput:input];
//Turn on point autofocus for middle of view
[device lockForConfiguration:&error];
CGPoint point = CGPointMake(0.5,0.5);
[device setFocusPointOfInterest:point];
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device unlockForConfiguration];
//Add the metadata output device
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
NSLog(@"%lu", (unsigned long)output.availableMetadataObjectTypes.count);
for (NSString *s in output.availableMetadataObjectTypes)
NSLog(@"%@", s);
//You should check here to see if the session supports these types; if they aren't supported you'll get an exception
output.metadataObjectTypes = @[AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeQRCode];
output.rectOfInterest = self.livevideo.bounds;
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.layer.bounds;
newCaptureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.livevideo.layer insertSublayer:newCaptureVideoPreviewLayer above:self.livevideo.layer];
[session startRunning];
}
And:
- (void)captureFront
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input)
{
NSLog(@"Error: %@", error);
return;
}
[session addInput:input];
//Turn on point autofocus for middle of view
[device lockForConfiguration:&error];
CGPoint point = CGPointMake(0.5,0.5);
[device setFocusPointOfInterest:point];
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device unlockForConfiguration];
//Add the metadata output device
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
NSLog(@"%lu", (unsigned long)output.availableMetadataObjectTypes.count);
for (NSString *s in output.availableMetadataObjectTypes)
NSLog(@"%@", s);
//You should check here to see if the session supports these types; if they aren't supported you'll get an exception
output.metadataObjectTypes = @[AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeQRCode];
output.rectOfInterest = self.livevideo.bounds;
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.layer.bounds;
newCaptureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[session startRunning];
}
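One way to address the rotation (a sketch, assuming iOS 6 or later, where the preview layer exposes its connection) is to set the connection's videoOrientation to match the interface orientation right after creating the layer; note also that captureFront above never inserts newCaptureVideoPreviewLayer into any view's layer, so that method shows no preview as written:

// Sketch: keep the preview upright by matching the connection to the interface orientation.
AVCaptureConnection *previewConnection = newCaptureVideoPreviewLayer.connection;
if ([previewConnection isVideoOrientationSupported]) {
    switch (self.interfaceOrientation) {
        case UIInterfaceOrientationLandscapeLeft:
            previewConnection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIInterfaceOrientationLandscapeRight:
            previewConnection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            previewConnection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            previewConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
            break;
    }
}
[self.livevideo.layer addSublayer:newCaptureVideoPreviewLayer];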

Get Camera Preview to AVCaptureVideoPreviewLayer

I was trying to get the camera input to show on a preview layer view.
self.cameraPreviewView is tied to a UIView in IB
Here is my current code, which I put together from the AV Foundation Programming Guide, but the preview never shows:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
NSLog(@"Couldn't create video capture device");
}
[session addInput:input];
// Create video preview layer and add it to the UI
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
UIView *view = self.cameraPreviewView;
CALayer *viewLayer = [view layer];
newCaptureVideoPreviewLayer.frame = view.bounds;
[viewLayer addSublayer:newCaptureVideoPreviewLayer];
self.cameraPreviewLayer = newCaptureVideoPreviewLayer;
[session startRunning];
After some trial and error, and looking at Apple's AVCam sample code, I wrapped the preview layer code and [session startRunning] in a Grand Central Dispatch block like so, and it started working:
dispatch_async(dispatch_get_main_queue(), ^{
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
UIView *view = self.cameraPreviewView;
CALayer *viewLayer = [view layer];
newCaptureVideoPreviewLayer.frame = view.bounds;
[viewLayer addSublayer:newCaptureVideoPreviewLayer];
self.cameraPreviewLayer = newCaptureVideoPreviewLayer;
[session startRunning];
});
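A related detail from the AVCam sample (a sketch, not part of the answer above): -startRunning blocks until the session is actually running, so AVCam keeps the layer work on the main queue but starts the session on a private serial queue:

// Keep UI work on the main queue, but start the (blocking) session off it.
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
dispatch_async(sessionQueue, ^{
    [session startRunning];
});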
Here is my code; it works perfectly for me, so you can refer to it:
- (void)initCapture
{
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
if (!captureInput) {
return;
}
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
/* Frames are delivered to the captureOutput:didOutputSampleBuffer:fromConnection: delegate method */
[captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];
self.captureSession = [[AVCaptureSession alloc] init];
NSString *preset = nil;
if (!preset) {
preset = AVCaptureSessionPresetMedium;
}
self.captureSession.sessionPreset = preset;
if ([self.captureSession canAddInput:captureInput]) {
[self.captureSession addInput:captureInput];
}
if ([self.captureSession canAddOutput:captureOutput]) {
[self.captureSession addOutput:captureOutput];
}
//handle prevLayer
if (!self.captureVideoPreviewLayer) {
self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
}
//if you want to adjust the previewlayer frame, here!
self.captureVideoPreviewLayer.frame = self.view.bounds;
self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer: self.captureVideoPreviewLayer];
[self.captureSession startRunning];
}
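The code above only wires up the delegate; a minimal sketch of the corresponding delegate method (assuming the kCVPixelFormatType_32BGRA setting chosen in initCapture, and that the class conforms to AVCaptureVideoDataOutputSampleBufferDelegate) that turns each frame into a UIImage could look like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Grab the pixel buffer for this frame and lock it for reading.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Wrap the BGRA data in a bitmap context and make a UIImage out of it.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    // ... process the frame here ...
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}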

Camera Preview does not fill screen

This is how I am setting up the camera preview in landscape mode on my iPad 3:
- (void)viewDidAppear:(BOOL)animated
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPreset640x480;
CALayer *viewLayer = self.vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
}
In the storyboard, I set the view to fill the entire screen; however, the preview ends up not filling it, and the image is also rotated 90 degrees. How can I change this so the camera preview fills the entire screen?
Thanks.
Rotate your layer:
switch (self.interfaceOrientation)
{
case UIInterfaceOrientationLandscapeRight:
[captureVideoPreviewLayer setAffineTransform:CGAffineTransformMakeRotation(-M_PI / 2)];
break;
case UIInterfaceOrientationLandscapeLeft:
[captureVideoPreviewLayer setAffineTransform:CGAffineTransformMakeRotation(M_PI / 2)];
break;
}
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
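The switch above only runs when the layer is first added; if the interface can rotate while the camera is up, you would re-apply the transform on rotation. Here is a sketch using the same pre-iOS 8 interfaceOrientation-style API as the answer (and assuming captureVideoPreviewLayer is kept in an instance variable rather than a local):

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                         duration:(NSTimeInterval)duration
{
    // Re-apply the rotation so the preview stays upright after the interface rotates.
    switch (toInterfaceOrientation)
    {
        case UIInterfaceOrientationLandscapeRight:
            [captureVideoPreviewLayer setAffineTransform:CGAffineTransformMakeRotation(-M_PI / 2)];
            break;
        case UIInterfaceOrientationLandscapeLeft:
            [captureVideoPreviewLayer setAffineTransform:CGAffineTransformMakeRotation(M_PI / 2)];
            break;
        default:
            [captureVideoPreviewLayer setAffineTransform:CGAffineTransformIdentity];
            break;
    }
    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
}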
