I'm attempting to build a custom camera app.
Tasks:
1. Shows a live image preview.
2. Allows you to tap anywhere on the screen to capture an image.
3. Displays the resulting image above the live image preview.
Currently everything works except for task 1. The view that should contain my live preview just keeps its background color (or stays transparent when no background color is set), even though the camera is operational and an image is displayed when the screen is tapped. Any ideas? I've referred to this previous discussion and I think I'm covering all my bases: AVFoundation camera preview layer not working
What am I missing?
-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    // Set up a capture session at photo quality.
    session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Attach the default camera as the video input.
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    // Insert the preview layer behind everything else, sized to frameforCapture.
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.frameforCapture.frame;
    [previewLayer setFrame:frame];
    [rootLayer insertSublayer:previewLayer atIndex:0];

    // JPEG still-image output for tap-to-capture.
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
    //....
}
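For reference, a minimal sketch of how the tap-to-capture side (tasks 2 and 3) typically looks with this setup; handleTap: and self.imageView are assumed names, not from the original post:

// Sketch of tap-to-capture using the stillImageOutput configured above.
// handleTap: and self.imageView are assumed names.
- (IBAction)handleTap:(UITapGestureRecognizer *)sender {
    AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (!videoConnection) {
        return;
    }
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            // The output is configured for JPEG, so the sample buffer
            // converts directly to NSData and then to a UIImage.
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            dispatch_async(dispatch_get_main_queue(), ^{
                self.imageView.image = image;
            });
        }
    }];
}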
Related
I have made a view controller on my storyboard and added two buttons to it. I am starting an AVCaptureSession on my view controller's view like this:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

CALayer *viewLayer = self.view.layer;
NSLog(@"viewLayer = %@", viewLayer);

// Preview layer fills the whole view.
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.view.layer.bounds;
[self.view.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
[session startRunning];

// JPEG still-image output.
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
My problem is that my buttons get hidden because the preview layer takes up the whole screen. I want the preview layer to fill the whole screen, but I also want the buttons for capturing and for closing the session to stay visible. How can I go about it?
Try moving your buttons to the front of the view controller's view hierarchy with:
[self.view bringSubviewToFront:button];
Do it after you've added captureVideoPreviewLayer as a sublayer.
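A minimal sketch of that ordering, with captureButton and closeButton as assumed names for your two storyboard buttons:

// Add the preview layer first, then reorder the buttons above it.
// captureButton and closeButton are assumed outlet names.
[self.view.layer addSublayer:captureVideoPreviewLayer];
[self.view bringSubviewToFront:self.captureButton];
[self.view bringSubviewToFront:self.closeButton];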
I have just started with AVFoundation and I'm trying to capture an image. The problem is that I'm able to capture the image but unable to see the camera feed in my view. Here is my code:
session = [AVCaptureSession new];
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    [session setSessionPreset:AVCaptureSessionPresetHigh]; // specifies photo quality
}

AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
[session addInput:deviceInput];

// Preview layer sized to the cameraView outlet, inserted behind other layers.
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:_cameraView.frame];
[rootLayer insertSublayer:previewLayer atIndex:0];

// JPEG still-image output.
stillImageOutput = [AVCaptureStillImageOutput new];
[stillImageOutput setOutputSettings:[[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];
[session addOutput:stillImageOutput];

[session startRunning];
Is there anything wrong with my code?
Try [rootLayer addSublayer:previewLayer]; (rootLayer is already a CALayer, so there is no layer to call on it). addSublayer: appends the preview layer above the existing sublayers, whereas insertSublayer:atIndex:0 puts it at the very bottom, where opaque subviews can cover it.
It works for me!
I had this code working in another app of mine to take photos with a custom camera view when I had iOS 7 on my phone and Xcode 5.1. Now, on iOS 8 and Xcode 6, the camera works but I can't see the live view of the camera in my leftVertical UIView. Here's my code; I would appreciate any help. Thanks!
#import <AVFoundation/AVFoundation.h>
session = [[AVCaptureSession alloc] init];

// Note: both branches currently set the same preset.
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
    [session setSessionPreset:AVCaptureSessionPreset352x288];
} else {
    [session setSessionPreset:AVCaptureSessionPreset352x288];
}

AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}

// Preview layer sized to the leftVertical outlet, inserted behind other layers.
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.leftVertical.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];

// JPEG still-image output.
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];

[session startRunning];
Try this: in -viewWillAppear:, start the camera capture on the main thread, like this:
dispatch_async(dispatch_get_main_queue(), ^{
    if (![session isRunning]) {
        [session startRunning];
    }
});
I need my app to display a live preview of what the front camera is filming, and this video stream's dimensions need to match the screen size of the device.
But there are black bars appearing at the top and bottom of the preview view...
How can I capture and preview video with the dimensions of the full screen? Note that I don't want to resize the video!
Below is my code:
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

// Host view for the preview layer, covering the whole screen.
UIView *cameraPreviewView = [[UIView alloc] initWithFrame:self.view.frame];
[self.view addSubview:cameraPreviewView];

// Video data output delivering frames to a background queue.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];

AVCaptureVideoPreviewLayer *cameraPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
cameraPreviewLayer.frame = cameraPreviewView.bounds;
[cameraPreviewView.layer addSublayer:cameraPreviewLayer];
[cameraPreviewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[cameraPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];

[session startRunning];
I am recording a video using AVFoundation, with the code below.
-(IBAction)record:(id)sender
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetHigh];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    // Preview layer offset left and sized square to the root layer's height.
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:CGRectMake(-70, 0, rootLayer.bounds.size.height, rootLayer.bounds.size.height)];
    [rootLayer insertSublayer:previewLayer atIndex:0];

    [session startRunning];
}
I can see the video on my preview layer. How can I get this as NSData, so that I can send it to a server instantaneously while recording, for a live broadcast?
You will need to create an AVCaptureVideoDataOutput, connect it to your session, and then implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
Inside this callback, sampleBuffer holds the data of the frame.
Again, I recommend you look at the RosyWriter sample app, as it has an example of getting the frame's pixel data.
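As a rough sketch of that pattern (not code from the answer above; it assumes a BGRA pixel format on the output, and sendFrameToServer: is a hypothetical placeholder for your own networking code):

#import <AVFoundation/AVFoundation.h> // also pulls in CoreMedia/CoreVideo

// Where the session is configured: ask for BGRA pixel buffers so the
// frame bytes are easy to read in the callback.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("videoQueue", NULL)];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}

// Delegate callback, invoked once per captured frame on the queue above.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    // Copy the raw pixel bytes out; AVFoundation reuses the buffer as
    // soon as this callback returns.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    NSData *frameData = [NSData dataWithBytes:baseAddress length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    // sendFrameToServer: is a hypothetical placeholder; compress and
    // stream the data with your own networking code.
    [self sendFrameToServer:frameData];
}

Raw BGRA frames are large, so for a real broadcast you would compress them (for example to H.264) before sending; RosyWriter shows the buffer-handling side of this.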