AVFoundation camera not visible - iOS

I have just started with AVFoundation and I'm trying to capture an image. The problem is that I'm able to capture the image but unable to see the camera preview in my view. Here is my code:
session = [AVCaptureSession new];
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh])
    [session setSessionPreset:AVCaptureSessionPresetHigh]; // specifies photo quality
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
[session addInput:deviceInput];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:_cameraView.frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [AVCaptureStillImageOutput new];
[stillImageOutput setOutputSettings:[[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];
[session addOutput:stillImageOutput];
[session startRunning];
Is there anything wrong with my code?

Try [rootLayer addSublayer:previewLayer]; (note that rootLayer is already a CALayer, so there is no -layer call on it)
For me it works!
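
If that alone does not fix it, here is a minimal sketch of the same idea, assuming _cameraView is the container view from the question: attach the preview layer directly to that view's layer and size it with the view's bounds, since a frame taken from _cameraView.frame is expressed in the superview's coordinate space.
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
// bounds, not frame: a layer's frame is interpreted in its
// superlayer's (here: _cameraView's layer's) coordinate space
[previewLayer setFrame:_cameraView.bounds];
[_cameraView.layer addSublayer:previewLayer];
This also avoids inserting the layer at index 0 of the root layer, where an opaque subview can sit on top of it and hide the preview.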

Related

Video Frame from camera using AVFoundation Framework in iOS?

I have never worked with the AVFoundation framework. I want to get video frames from the back camera and process those frames. Can anyone help me? Your experience will be appreciated. Thanks
You can use the following code to start a camera session with AVFoundation in order to capture a still image:
AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.frameForCapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Then, in order to actually capture the image, you can use a button with the following code:
- (IBAction)takePhoto:(id)sender {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
        }
    }];
}
Then you can do whatever you want with the captured image.
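For example, a minimal sketch that saves the captured image to the photo library from inside the completion handler (UIImageWriteToSavedPhotosAlbum is the plain UIKit call; the nil/NULL arguments skip the save-completion callback):
// inside the completion handler, after creating the UIImage:
UIImage *image = [UIImage imageWithData:imageData];
if (image) {
    // Writes the image to the user's Camera Roll.
    UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);
}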

I am trying to switch this code to use the front camera

I am new to programming and am trying to change this code to use the front camera instead of defaulting to the back one. I'm not sure what I need to change to make this work properly.
Here is my sample code:
- (void)viewWillAppear:(BOOL)animated {
    session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.frameForCapture.frame;
    [previewLayer setFrame:frame];
    [rootLayer insertSublayer:previewLayer atIndex:0];
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];
    [session startRunning];
}
Replace this:
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
With this:
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *frontCamera;
for (AVCaptureDevice *dev in devices) {
    if (dev.position == AVCaptureDevicePositionFront) {
        frontCamera = dev;
        break;
    }
}
if (!frontCamera) {
    NSLog(@"No front camera found!");
    // Handle the missing front camera here
}
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
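Then add the new input to the session the same way as before. A sketch of the remaining wiring, assuming you may be switching cameras on an already-configured session:
// Remove any existing camera input first, then attach the front camera.
for (AVCaptureInput *oldInput in session.inputs) {
    [session removeInput:oldInput];
}
if (input && [session canAddInput:input]) {
    [session addInput:input];
}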

Custom Camera View Not Working on iOS 8/Xcode 6

So I had this code working in another app of mine, taking photos with a custom camera view, when I had iOS 7 on my phone and Xcode 5.1. Now, on iOS 8 and Xcode 6, the camera works but I can't see the live view of the camera in my leftVertical UIView. Here's my code; I would appreciate any help. Thanks!
#import <AVFoundation/AVFoundation.h>
session = [[AVCaptureSession alloc] init];
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
    [session setSessionPreset:AVCaptureSessionPreset352x288];
else
    [session setSessionPreset:AVCaptureSessionPreset352x288];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.leftVertical.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
//////////////////////////
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Try this: in -viewWillAppear:, start the camera capture on the main thread, like this:
dispatch_async(dispatch_get_main_queue(), ^{
    if (![session isRunning]) {
        [session startRunning];
    }
});
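
One more iOS 8-specific thing worth checking, offered as an additional suggestion: camera access now requires user authorization, and starting the session before access is granted can leave the preview black. A minimal sketch of the check:
// Ask for camera permission before starting the session. If the user
// has already answered, the completion handler fires immediately.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            [session startRunning];
        } else {
            NSLog(@"Camera access was denied by the user");
        }
    });
}];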

Custom camera works, live preview does not

I'm attempting to build a custom camera app.
Tasks:
Shows a live image preview.
Allows you to tap anywhere on the screen to capture an image.
Displays the resulting image above the live image preview.
Currently everything works except for task 1. The view that contains my live image preview retains its background color (or remains transparent when no background color is set), even though the camera is operational and an image is displayed when the screen is tapped. Any ideas? I've referred to this previous discussion and I think I'm covering all my bases: AVFoundation camera preview layer not working
What am I missing?
- (void)viewWillAppear:(BOOL)animated {
    session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }
    AVCaptureVideoPreviewLayer *previewlayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewlayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.frameforCapture.frame;
    [previewlayer setFrame:frame];
    [rootLayer insertSublayer:previewlayer atIndex:0];
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];
    [session startRunning];
    //....
}
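One likely culprit, offered as a guess since no accepted fix is shown here: in -viewWillAppear: auto layout has not necessarily run yet, so self.frameforCapture.frame can still be CGRectZero, which gives the preview layer no visible area even though capture itself works. A sketch that re-applies the frame once layout has happened (this assumes previewlayer is promoted to an instance variable so it is reachable from the other method):
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // By this point the container view's frame is final.
    previewlayer.frame = self.frameforCapture.frame;
}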

Output from video recording with AVFoundation

I am recording a video using AVFoundation, with the code below.
- (IBAction)record:(id)sender
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetHigh];
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput])
        [session addInput:deviceInput];
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:CGRectMake(-70, 0, rootLayer.bounds.size.height, rootLayer.bounds.size.height)];
    [rootLayer insertSublayer:previewLayer atIndex:0];
    [session startRunning];
}
I can see the video on my preview layer. How can I get the frames as NSData, so that I can send them to a server instantaneously while recording, for a live broadcast?
You will need to create an AVCaptureVideoDataOutput, connect it to your session, and then implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
Inside this callback, sampleBuffer holds the data for one frame.
Again, I recommend you look at the RosyWriter sample app, as it has an example of getting at a frame's pixel data.
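A minimal sketch of that wiring; the queue label, the BGRA pixel format, and the raw-bytes-to-NSData conversion are illustrative choices, and a real live broadcast would encode frames (e.g. to H.264) rather than send raw pixels:
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}

// AVCaptureVideoDataOutputSampleBufferDelegate callback:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t length = CVPixelBufferGetDataSize(pixelBuffer);
    // Raw BGRA bytes for one frame; large, so compress before uploading.
    NSData *frameData = [NSData dataWithBytes:baseAddress length:length];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    // ... hand frameData (or an encoded version) to your network code ...
}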
