I'm working on an app that allows the user to record video in realtime while some HUD info is displayed in a preview layer.
I've got a lot of this worked out, but one problem I have is that the video itself is rotated by 90 degrees when the iPad is in landscape mode: when the home button is on the left, the video is rotated 90 degrees CCW, and when the home button is on the right, it's rotated 90 degrees CW. The video should be upright regardless of which landscape orientation I'm using. This [SO message] seems to tally pretty closely to what I'm dealing with, but the solution relies on deprecated APIs.
In my main controller:
[self setCaptureManager:[[[CaptureSessionManager alloc] init] autorelease]];
[[self captureManager] addVideoInput];
[[self captureManager] addVideoPreviewLayer];
CGRect layerRect = [[[self view] layer] bounds];
[[[self captureManager] previewLayer] setBounds:layerRect];
[[[self captureManager] previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                                              CGRectGetMidY(layerRect))];
[[[self view] layer] addSublayer:[[self captureManager] previewLayer]];
CaptureSessionManager.h
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
@interface CaptureSessionManager : NSObject {
}

@property(retain) AVCaptureVideoPreviewLayer *previewLayer;
@property(retain) AVCaptureSession *captureSession;

- (void)addVideoPreviewLayer;
- (void)addVideoInput;

@end
CaptureSessionManager.m
#import "CaptureSessionManager.h"
@implementation CaptureSessionManager

@synthesize captureSession;
@synthesize previewLayer;

#pragma mark Capture Session Configuration
- (id)init {
    if ((self = [super init])) {
        // Note the autorelease: the retain property takes ownership.
        [self setCaptureSession:[[[AVCaptureSession alloc] init] autorelease]];
        if ([captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
            captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        }
    }
    return self;
}
- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}
- (void)addVideoInput {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:videoIn])
                [[self captureSession] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");
        }
        else
            NSLog(@"Couldn't create video input");
    }
    else
        NSLog(@"Couldn't create video capture device");
}
- (void)dealloc {
    [[self captureSession] stopRunning];

    [previewLayer release], previewLayer = nil;
    [captureSession release], captureSession = nil;

    [super dealloc];
}

@end
If the answers in the question you mentioned are not working, you can achieve it another way: use CATransform3DMakeRotation to rotate your preview layer according to the orientation of the device.
For example, for UIDeviceOrientationLandscapeLeft you can rotate your preview layer like this:
preview.transform = CATransform3DMakeRotation(-M_PI/2, 0, 0, 1);
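For the opposite landscape orientation the sign of the angle presumably flips. A minimal sketch of handling both cases (my own illustration, not from the original answer; `preview` is the AVCaptureVideoPreviewLayer, and you would re-apply this whenever the device rotates):

UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
if (orientation == UIDeviceOrientationLandscapeLeft) {
    preview.transform = CATransform3DMakeRotation(-M_PI/2, 0, 0, 1);
} else if (orientation == UIDeviceOrientationLandscapeRight) {
    preview.transform = CATransform3DMakeRotation(M_PI/2, 0, 0, 1);
} else {
    preview.transform = CATransform3DIdentity; // portrait: no rotation needed
}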
Found that the videoOrientation attribute of the preview layer connection needed to be set.
Here's the change to my addVideoPreviewLayer method...
- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
}
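Since the original problem involves both landscape modes, note that a hardcoded LandscapeRight only covers one of them. A sketch of picking the value from the current interface orientation instead (my own variation; it would need to be re-run when the interface rotates):

- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    AVCaptureConnection *connection = [[self previewLayer] connection];
    if ([connection isVideoOrientationSupported]) {
        // The AVCaptureVideoOrientation landscape constants are named after the
        // home-button side, matching UIInterfaceOrientation's landscape cases.
        UIInterfaceOrientation ui = [[UIApplication sharedApplication] statusBarOrientation];
        connection.videoOrientation = (ui == UIInterfaceOrientationLandscapeLeft)
            ? AVCaptureVideoOrientationLandscapeLeft
            : AVCaptureVideoOrientationLandscapeRight;
    }
}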
Related
I am trying to implement side-by-side camera views for a VR application. I found some useful info at this site:
How to show 2 camera preview side by side? [For cardboard apps]
but how do I create it on iOS? I am trying to create the same behaviour with AVFoundation. This is my current code:
CameraViewController
#import <AVFoundation/AVFoundation.h>
#import "CameraViewController.h"
@interface CameraViewController ()

@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayerLeft;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayerRight;

@end

@implementation CameraViewController
#pragma mark - lazy instantiation

- (AVCaptureSession *)captureSession {
    if (!_captureSession)
        _captureSession = [[AVCaptureSession alloc] init];
    return _captureSession;
}

- (AVCaptureStillImageOutput *)stillImageOutput {
    if (!_stillImageOutput)
        _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    return _stillImageOutput;
}

- (AVCaptureVideoPreviewLayer *)previewLayerLeft {
    if (!_previewLayerLeft)
        _previewLayerLeft = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    return _previewLayerLeft;
}

- (AVCaptureVideoPreviewLayer *)previewLayerRight {
    if (!_previewLayerRight)
        _previewLayerRight = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    return _previewLayerRight;
}
#pragma mark - view controller's life cycle methods

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];

    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:&error];
    if ([self.captureSession canAddInput:deviceInput])
        [self.captureSession addInput:deviceInput];

    [self.previewLayerLeft setVideoGravity:AVLayerVideoGravityResizeAspect];
    [self.previewLayerRight setVideoGravity:AVLayerVideoGravityResizeAspect];

    CALayer *rootLayer = [self.view layer];
    [rootLayer setMasksToBounds:YES];
    NSLog(@"%@", NSStringFromCGRect([rootLayer frame]));

    if ([self.previewLayerLeft.connection isVideoOrientationSupported])
        [self.previewLayerLeft.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight]; // home button on right. Refer to .h, not doc
    [self.previewLayerLeft setFrame:CGRectMake(0, 0, [rootLayer frame].size.width / 2, [rootLayer frame].size.height)];

    if ([self.previewLayerRight.connection isVideoOrientationSupported])
        [self.previewLayerRight.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight]; // home button on right. Refer to .h, not doc
    [self.previewLayerRight setFrame:CGRectMake([rootLayer frame].size.width / 2, 0, [rootLayer frame].size.width / 2, [rootLayer frame].size.height)];

    [rootLayer insertSublayer:self.previewLayerLeft atIndex:0];
    [rootLayer insertSublayer:self.previewLayerRight atIndex:1];

    [self.stillImageOutput setOutputSettings:[[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];
    if ([self.captureSession canAddOutput:self.stillImageOutput])
        [self.captureSession addOutput:self.stillImageOutput];

    [self.captureSession startRunning];
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
}
My iPad app can only be used in landscape orientations. After looking at what feels like dozens of Stack Overflow answers, I am still unable to take a photo the right way up (the metadata corrects it, but that's only honored on Macs). When the device is physically in LandscapeLeft it returns an upside-down photo, whereas in LandscapeRight it's correct.
As far as I can tell, setting VideoOrientation for the preview layer provides me the correct orientation when previewing the photo.
_capturePreview = [[ISCapturePreview alloc] init];
[self addSubview:_capturePreview];

// Session
self.session = [AVCaptureSession new];
[self.session setSessionPreset:AVCaptureSessionPresetPhoto];

// Capture device
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;

// Device input
self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([self.session canAddInput:self.deviceInput])
    [self.session addInput:self.deviceInput];

UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];

// Preview
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
[previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[previewLayer.connection setVideoOrientation:(AVCaptureVideoOrientation)orientation];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

CALayer *rootLayer = [[self capturePreview] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:CGRectMake(0, 0, 1024, 600)];
[rootLayer insertSublayer:previewLayer atIndex:0];

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([self.session canAddOutput:stillImageOutput])
{
    [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
    [self.session addOutput:stillImageOutput];
    [self setStillImageOutput:stillImageOutput];
}

[self.session startRunning];
And the output code is here; setting the videoOrientation in it has had no impact in my tests.
- (void)takePhotoButtonWasPressed:(id)sender {
    [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self capturePreview] layer] connection] videoOrientation]];

    // Capture a still image.
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
        }
    }];
}
I currently cannot test the code myself, but if you set the available orientations (in the project settings) to both landscapeLeft and landscapeRight, iOS might be handling the orientation itself, so try not setting the orientation at all. Furthermore, if the device's orientation changes after the view has loaded, you will be using a stale value, so consider adding an observer to check for changes. In addition, Apple says you should use UITraitCollection and UITraitEnvironment instead of UIInterfaceOrientation starting with iOS 8.
I hope this helps you :)
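A sketch of the observer idea mentioned above (my own illustration; `previewLayer` stands in for your AVCaptureVideoPreviewLayer, and the registration is assumed to happen once, e.g. in viewDidLoad):

// Register once, e.g. in viewDidLoad:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(orientationDidChange:)
                                             name:UIApplicationDidChangeStatusBarOrientationNotification
                                           object:nil];

- (void)orientationDidChange:(NSNotification *)note {
    AVCaptureConnection *connection = self.previewLayer.connection;
    if (![connection isVideoOrientationSupported])
        return;
    // Map the interface orientation onto the capture orientation explicitly.
    switch ([[UIApplication sharedApplication] statusBarOrientation]) {
        case UIInterfaceOrientationLandscapeLeft:
            connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft; break;
        case UIInterfaceOrientationLandscapeRight:
            connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight; break;
        case UIInterfaceOrientationPortraitUpsideDown:
            connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown; break;
        default:
            connection.videoOrientation = AVCaptureVideoOrientationPortrait; break;
    }
}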
My app has a button that should take a screenshot of whatever is on the screen by calling "takeSnapShot" (see code below).
I have two views and one of them is a picker view that I use for the camera, so in the screen I see the image coming from the camera and next to it another view with images.
The thing is that I capture the screen but it doesn't capture the image coming from the camera.
Also, I think I render the view.layer, but the debugger keeps saying:
"Snapshotting a view that has not been rendered results in an empty
snapshot. Ensure your view has been rendered at least once before
snapshotting or snapshot after screen updates."
Any ideas? Thanks!
This is the code:
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
UIImage *capturaPantalla;
-(void)takeSnapShot:(id)sender{
UIGraphicsBeginImageContext(picker.view.window.bounds.size);
[picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
capturaPantalla = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(capturaPantalla,nil,nil,nil);
}
I did a similar function before and used AVCaptureSession to make it.
// 1. Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];

// 2. Setup the preview view
[[self previewView] setSession:session];

// 3. Check for device authorization
[self checkDeviceAuthorizationStatus];

dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
============================================
My countdown snapshot function:
- (void)takePhoto
{
    [timer1 invalidate];

    dispatch_async([self sessionQueue], ^{
        // 4. Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // 5. Flash set to Auto for Still Capture
        [iSKITACamViewController setFlashMode:flashMode forDevice:[[self videoDeviceInput] device]];

        AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
        CGFloat maxScale = stillImageConnection.videoMaxScaleAndCropFactor;
        if (effectiveScale > 1.0f && effectiveScale < maxScale)
        {
            stillImageConnection.videoScaleAndCropFactor = effectiveScale;
        }

        [self playSound];

        // 6. Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                if (iPhone5 && [self.modeLabel.text isEqualToString:@"PhotoMode"]) {
                    UIImage *image1 = [image crop:CGRectMake(0, 0, image.size.width, image.size.height)];
                    [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image1 CGImage] orientation:(ALAssetOrientation)[image1 imageOrientation] completionBlock:nil];
                } else {
                    [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
                }
                self.priveImageView.image = image;
            }
        }];
    });
}
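Coming back to the original goal (a screenshot that includes the camera image), the captured still can then be composited with a render of the overlay views. A hypothetical helper along those lines, assuming `overlayView` is the non-camera view; drawViewHierarchyInRect:afterScreenUpdates: (iOS 7+) draws after screen updates, which should also avoid the "has not been rendered" warning that plain renderInContext: produced:

- (UIImage *)compositeCameraImage:(UIImage *)cameraImage withOverlay:(UIView *)overlayView {
    UIGraphicsBeginImageContextWithOptions(overlayView.bounds.size, NO, 0.0);
    // Draw the camera still first, then the UI hierarchy on top of it.
    [cameraImage drawInRect:overlayView.bounds];
    [overlayView drawViewHierarchyInRect:overlayView.bounds afterScreenUpdates:YES];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}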
I have an application with AVCaptureSession which worked correctly on previous iOS versions, but when I ran it on a device with iOS 8 it crashed sporadically, and the problem wasn't solved. The exception occurs in [session addInput:input];. Please advise how to resolve it; please verify my code below. The error is:
Printing description of error: Error Domain=AVFoundationErrorDomain
Code=-11852 "Cannot use Back Camera" UserInfo=0x17c076e0
{NSLocalizedDescription=Cannot use Back Camera,
AVErrorDeviceKey=,
NSLocalizedFailureReason=This app is not authorized to use Back
Camera.}
#import "CameraViewController.h"
#import "MAImagePickerControllerAdjustViewController.h"
#import "PopupViewController.h"
#import "MAImagePickerFinalViewController.h"
#implementation CameraViewController
#synthesize vImagePreview;
#synthesize vImage;
#synthesize stillImageOutput;
#synthesize lFrameCount;
#synthesize session;
#synthesize device;
#synthesize oneOff;
#synthesize captureManager = _captureManager;
#synthesize flashButton = _flashButton;
#synthesize vImage1;
#synthesize vImage2;
#synthesize vImage3;
#synthesize vImage4;
#synthesize vImage5;
#synthesize vImage6;
/////////////////////////////////////////////////////////////////////
#pragma mark - UI Actions
/////////////////////////////////////////////////////////////////////
- (IBAction)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        NSUserDefaults *standardUserDefaults = [NSUserDefaults standardUserDefaults];
        NSString *val1 = nil;
        if (standardUserDefaults)
        {
            val1 = [standardUserDefaults objectForKey:@"clickTypeTwo"];
        }

        if ([val1 isEqualToString:@"cameraType"])
        {
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                [session stopRunning];
            });

            FinalViewController *finalView;
            if ([[UIScreen mainScreen] bounds].size.height == 568)
                finalView = [[FinalViewController alloc] initWithNibName:IS_IPAD() ? @"FinalViewController_iPad" : @"FinalViewController" bundle:nil];
            else
                finalView = [[FinalViewController alloc] initWithNibName:IS_IPAD() ? @"FinalViewController_iPad" : @"FinalViewController" bundle:nil];
            finalView.sourceImage = image;
            //finalView.imageFrameEdited = YES;

            CATransition *transition = [CATransition animation];
            transition.duration = 0.4;
            transition.type = kCATransitionFade;
            transition.subtype = kCATransitionFromBottom;
            [self.navigationController.view.layer addAnimation:transition forKey:kCATransition];
            [self.navigationController pushViewController:finalView animated:NO];
        }
        else
        {
            [session stopRunning];

            AdjustViewController *adjustViewController;
            if ([[UIScreen mainScreen] bounds].size.height == 568)
                adjustViewController = [[AdjustViewController alloc] initWithNibName:IS_IPAD() ? @"AdjustViewController_iPad" : @"AdjustViewController" bundle:nil];
            else
                adjustViewController = [[AdjustViewController alloc] initWithNibName:IS_IPAD() ? @"AdjustViewController_iPad" : @"AdjustViewController" bundle:nil];
            adjustViewController.sourceImage = image;

            CATransition *transition = [CATransition animation];
            transition.duration = 0.4;
            transition.type = kCATransitionFade;
            transition.subtype = kCATransitionFromBottom;
            [self.navigationController.view.layer addAnimation:transition forKey:kCATransition];
            [self.navigationController pushViewController:adjustViewController animated:NO];
        }
    }];
}
- (void)cropImageViewControllerDidFinished:(UIImage *)image {
    FinalViewController *finalView;
    if ([[UIScreen mainScreen] bounds].size.height == 568)
        finalView = [[MAImagePickerFinalViewController alloc] initWithNibName:IS_IPAD() ? @"FinalViewController_iPad" : @"FinalViewController" bundle:nil];
    else
        finalView = [[MAImagePickerFinalViewController alloc] initWithNibName:IS_IPAD() ? @"FinalViewController_iPad" : @"FinalViewController" bundle:nil];
    finalView.sourceImage = image;
    //finalView.imageFrameEdited = YES;

    CATransition *transition = [CATransition animation];
    transition.duration = 0.4;
    transition.type = kCATransitionFade;
    transition.subtype = kCATransitionFromBottom;
    [self.navigationController.view.layer addAnimation:transition forKey:kCATransition];
    [self.navigationController pushViewController:finalView animated:NO];
}
/////////////////////////////////////////////////////////////////////
#pragma mark - Video Frame Delegate
/////////////////////////////////////////////////////////////////////
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    //NSLog(@"got frame");
    iFrameCount++;

    // Update Display
    // We are running in the context of the capture session. To update the UI in real time, we have to do this in the context of the main thread.
    NSString *frameCountString = [[NSString alloc] initWithFormat:@"%4.4d", iFrameCount];
    [lFrameCount performSelectorOnMainThread:@selector(setText:) withObject:frameCountString waitUntilDone:YES];

    //NSLog(@"frame count %d", iFrameCount);
}

- (IBAction)showLeftSideBar
{
    //[self dismissModalViewControllerAnimated:YES];
    if ([[SidebarViewController share] respondsToSelector:@selector(showSideBarControllerWithDirection:)]) {
        [[SidebarViewController share] showSideBarControllerWithDirection:SideBarShowDirectionLeft];
    }
}

- (IBAction)showRightSideBar:(id)sender
{
}
- (IBAction)flipCamera:(id)sender
{
    AVCaptureDevicePosition desiredPosition;
    if (isUsingFrontFacingCamera)
        desiredPosition = AVCaptureDevicePositionBack;
    else
        desiredPosition = AVCaptureDevicePositionFront;

    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            [[self session] beginConfiguration];
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
            for (AVCaptureInput *oldInput in [[self session] inputs]) {
                [[self session] removeInput:oldInput];
            }
            [[self session] addInput:input];
            [[self session] commitConfiguration];
            break;
        }
    }
    isUsingFrontFacingCamera = !isUsingFrontFacingCamera;
}

BOOL isUsingFrontFacingCamera;
/////////////////////////////////////////////////////////////////////
#pragma mark - Guts
/////////////////////////////////////////////////////////////////////
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Release any cached data, images, etc. that aren't in use.
}

/////////////////////////////////////////////////////////////////////
#pragma mark - View lifecycle
/////////////////////////////////////////////////////////////////////

- (void)viewDidLoad
{
    [super viewDidLoad];
}

- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
}

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
}
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    flashIsOn = YES;

    /////////////////////////////////////////////////////////////////////////////
    // Create a preview layer that has a capture session attached to it.
    // Stick this preview layer into our UIView.
    /////////////////////////////////////////////////////////////////////////////
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset640x480;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);
    // viewLayer.frame = CGRectMake(-70, 150, 480, 336);
    // UIGraphicsBeginImageContextWithOptions(CGSizeMake(400, 400), NO, 1);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    CGRect bounds = vImagePreview.layer.bounds;
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.bounds = bounds;
    captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    //[self addVideoInputFrontCamera:YES]; // set to YES for Front Camera, NO for Back camera
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    /////////////////////////////////////////////////////////////
    // OUTPUT #1: Still Image
    /////////////////////////////////////////////////////////////
    // Add an output object to our session so we can get a still image.
    // We retain a handle to the still image output and use this when we capture an image.
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    /////////////////////////////////////////////////////////////
    // OUTPUT #2: Video Frames
    /////////////////////////////////////////////////////////////
    // Create a video frame output that will send each frame to our delegate.
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    //captureOutput.minFrameDuration = CMTimeMake(1, 3); // deprecated in iOS 5

    // We need to create a queue to funnel the frames to our delegate.
    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Set the video output to store frames in BGRA (it is supposed to be faster).
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    // let's try some different keys,
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];
    [session addOutput:captureOutput];

    /////////////////////////////////////////////////////////////
    // Start the capture session.
    [session startRunning];

    /////////////////////////////////////////////////////////////////////////////
    // Initialize the frame counter.
    iFrameCount = 0;
}
- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
}

- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    [session stopRunning];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    // Return YES for supported orientations
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
    } else {
        return YES;
    }
}
- (IBAction)cancelButton:(id)sender {
}

- (IBAction)flashOn:(id)sender {
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {
        if ([device hasTorch] && [device hasFlash]) {
            [device lockForConfiguration:nil];
            if (flashIsOn) {
                [device setTorchMode:AVCaptureTorchModeOn];
                [device setFlashMode:AVCaptureFlashModeOn];
                oneOff.text = @"On";
                [_flashButton setImage:[UIImage imageNamed:@"flash-on-button"]];
                _flashButton.accessibilityLabel = @"Disable Camera Flash";
                flashIsOn = NO; // define as a variable/property if you need to know status
            } else {
                [_flashButton setImage:[UIImage imageNamed:@"flash-off-button"]];
                _flashButton.accessibilityLabel = @"Enable Camera Flash";
                oneOff.text = @"Off";
                [device setTorchMode:AVCaptureTorchModeOff];
                [device setFlashMode:AVCaptureFlashModeOff];
                flashIsOn = YES;
            }
            [device unlockForConfiguration];
        }
    }
}
- (void)dealloc {
    [[self session] stopRunning];
    [super dealloc];
}

- (void)storeFlashSettingWithBool:(BOOL)flashSetting
{
    [[NSUserDefaults standardUserDefaults] setBool:flashSetting forKey:kCameraFlashDefaultsKey];
    [[NSUserDefaults standardUserDefaults] synchronize];
}

@end
Please check your device settings.
Go to Settings ---> Privacy ---> Camera ---> find the entry for your app ---> turn it on.
Run the app. It works.
Cheers
Change your dealloc method:
[self.captureSession removeInput:self.videoInput];
[self.captureSession removeOutput:self.videoOutput];
self.captureSession = nil;
self.videoOutput = nil;
self.videoInput = nil;
We had a problem with this today. Essentially, from iOS 8.0.2 and above, access to the camera requires the privacy setting for the Camera (not the Camera Roll); once this was enabled, the code worked.
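You can also detect this case in code before configuring the session. A minimal sketch using the AVFoundation authorization API introduced in iOS 7 (where you call it is up to your app):

AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (status == AVAuthorizationStatusNotDetermined) {
    // First run: this call triggers the system permission prompt.
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (granted) {
            // Safe to add the camera input now (dispatch back onto your session queue).
        }
    }];
} else if (status == AVAuthorizationStatusDenied || status == AVAuthorizationStatusRestricted) {
    // This is where "Cannot use Back Camera" comes from: direct the user to Settings.
}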
I saw the same error in my app today. I am handling it with an alert that contains a Settings button shortcut to the app's privacy settings:
do {
    let captureInput:AVCaptureDeviceInput = try AVCaptureDeviceInput(device: self.device)
    ...
} catch let error as NSError {
    let alert = UIAlertController(title: error.localizedDescription, message: error.localizedFailureReason, preferredStyle: .Alert)
    let settingsAction = UIAlertAction(title: "Settings", style: .Default) { (action) in
        UIApplication.sharedApplication().openURL(NSURL(string: UIApplicationOpenSettingsURLString)!)
    }
    alert.addAction(settingsAction)
    self.presentViewController(alert, animated: true, completion: nil)
}
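Note that UIApplicationOpenSettingsURLString is available from iOS 8 and deep-links to the app's own page in Settings, which is where the Camera privacy switch lives.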
I want to have a preview of the camera in 2 views so I made a class CameraSingleton:
CameraSingleton.h:
@property (retain) AVCaptureVideoPreviewLayer *previewLayer;
@property (retain) AVCaptureSession *session;

- (void)addVideoPreviewLayer;
- (void)addVideoInput;
CameraSingleton.m:
@implementation CameraSingleton

@synthesize session;
@synthesize previewLayer;

- (id)init {
    if ((self = [super init])) {
        [self setSession:[[AVCaptureSession alloc] init]];
    }
    return self;
}

- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

- (void)addVideoInput {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([[self session] canAddInput:videoIn])
                [[self session] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");
        }
        else
            NSLog(@"Couldn't create video input");
    }
    else
        NSLog(@"Couldn't create video capture device");
}
In both ViewControllers is this method implemented:
- (void)showPreview {
    [cameraSingleton addVideoInput];
    [cameraSingleton addVideoPreviewLayer];

    CGRect layerRect = [[self.mainView layer] bounds];
    [cameraSingleton.previewLayer setBounds:layerRect];
    [cameraSingleton.previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                                          CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:cameraSingleton.previewLayer];
}
but I don't see the preview. I checked the IBOutlet for mainView and read Apple's documentation but I can't figure out the problem. Any suggestions?
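Two things stand out in the code shown. First, [session startRunning] is never called, and without a running session the preview layer stays blank. Second, a CALayer can only belong to one superlayer at a time, so sharing a single previewLayer between two view controllers means the second addSublayer: silently steals it from the first; each view should get its own AVCaptureVideoPreviewLayer attached to the shared session (a session supports multiple preview layers, as the side-by-side example earlier on this page shows). A minimal sketch of showPreview along those lines (my own suggestion, untested against this exact project):

- (void)showPreview {
    [cameraSingleton addVideoInput];

    // Each controller gets its own layer; only the session is shared.
    AVCaptureVideoPreviewLayer *layer =
        [AVCaptureVideoPreviewLayer layerWithSession:cameraSingleton.session];
    [layer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    layer.frame = self.mainView.bounds;
    [self.mainView.layer addSublayer:layer];

    if (![cameraSingleton.session isRunning])
        [cameraSingleton.session startRunning];
}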