I have a view controller to which I have added a full-screen UIView. In that UIView I run an AVCaptureSession that lets me capture photos. My view controller opens fine in portrait mode but opens abruptly in landscape mode. The code is as follows:
- (void)viewDidLoad {
[super viewDidLoad];
self.view.backgroundColor = [UIColor whiteColor];
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(orientationDidChange:) name:UIDeviceOrientationDidChangeNotification object:nil];
self.camera.userInteractionEnabled = YES;
UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinchGesture:)];
pinchGesture.delegate = self;
[self.camera addGestureRecognizer:pinchGesture];
[self.navigationController setNavigationBarHidden:YES animated:YES];
}
Here, camera is the UIView property of my UIViewController. Then:
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
CALayer *viewLayer = self.camera.layer;
NSLog(@"viewLayer = %@", viewLayer);
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.camera.layer.bounds;
[self.camera.layer addSublayer: captureVideoPreviewLayer];
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
[session startRunning];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
isUsingFlash = NO;
isUsingFrontFacingCamera = NO;
effectiveScale = 1.0;
}
My view opens wrong in landscape mode; once I rotate to portrait it becomes fine, and on rotating back to landscape it works well. It just does not launch properly in landscape mode. Why? Here is where I set the root view controller:
sb = [UIStoryboard storyboardWithName:@"Main" bundle:nil];
cameraFirstController = [sb instantiateViewControllerWithIdentifier:@"FirstViewController"];
cameraFirstController.delegate = self;
nav = [[CustomNavigationController alloc] initWithRootViewController:cameraFirstController];
[self.viewController presentViewController:nav animated:YES completion:nil];
It seems the problem is the order of calls when you set up the window. You need to call makeKeyAndVisible before you assign the rootViewController:
self.window = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
[self.window makeKeyAndVisible];
self.window.rootViewController = self.YourViewController;
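Putting it together, a minimal application:didFinishLaunchingWithOptions: with this ordering might look like the following sketch (yourViewController is a placeholder for whatever controller you use as root):

```objc
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    self.window = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
    // Make the window key and visible FIRST, so the root view controller
    // sees the correct interface orientation when it is assigned.
    [self.window makeKeyAndVisible];
    self.window.rootViewController = self.yourViewController; // placeholder property
    return YES;
}
```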
Related
I want to add a PNG image (a blue rectangle) on top of the camera preview layer. I get the preview image from this function:
-(void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
UIImage * image1 = [self ImageFromSampleBuffer:sampleBuffer];
NSData *imageData1;
}
And this function sets up the preview that is shown in a UIImageView:
-(void) SetPreview
{
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
//Add device;
AVCaptureDevice*device = nil;
device = [self cameraWithPosition:AVCaptureDevicePositionFront];
AVCaptureDeviceInput*input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
if(!input)
{
NSLog(@"NO Input");
}
[session addInput:input];
//Output
output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
//Preview layer
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
cameraView = self.view;
previewLayer.frame =CGRectMake(cameraView.bounds.origin.x+5, cameraView.bounds.origin.y+5, cameraView.bounds.size.width - 10, cameraView.bounds.size.height-10);
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[_vImage.layer addSublayer:previewLayer];
timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(snapshot) userInfo:nil repeats:YES];
[session startRunning];
}
How can I implement this feature?
I solved this problem by putting a "Back Img" image view on the view.
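As a rough sketch of that approach (names like overlayImage and the asset name are placeholders, not from the original code): add a UIImageView on top of the view that hosts the preview layer, since subviews added afterwards are drawn above the preview layer:

```objc
// Assumes the AVCaptureVideoPreviewLayer has already been added to self.view.layer.
UIImage *overlayImage = [UIImage imageNamed:@"blue_rect.png"]; // hypothetical asset name
UIImageView *overlayView = [[UIImageView alloc] initWithImage:overlayImage];
overlayView.frame = CGRectMake(20, 20, 100, 60); // position the rectangle over the preview
[self.view addSubview:overlayView];              // drawn above the preview layer
```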
I have been trying for a very long time and could not get it to work. Basically, I would like to display the live camera feed in the background, behind my labels and buttons. Here is the code I am working with to make the camera appear:
- (void)viewDidLoad {
[super viewDidLoad];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.bounds;
[self.view addSublayer:newCaptureVideoPreviewLayer.view];
[self.view sendSubviewToBack:newCaptureVideoPreviewLayer.view];
[session startRunning];
}
I do not know how to place it behind the labels in viewDidLoad. Any help would be much appreciated!
You just need to put the preview in the background by using sendSubviewToBack:, which moves the specified subview so that it appears behind its siblings. For more detail you can check Apple's AVCam example.
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:newCaptureVideoPreviewLayer];
[session startRunning];
Or insert the preview layer at the back of the view's layer hierarchy, so it sits behind the layers of the labels and buttons:
[self.view.layer insertSublayer:newCaptureVideoPreviewLayer atIndex:0];
I have an application with AVCaptureSession which worked correctly on previous iOS versions, but when I ran it on a device with iOS 8 the application crashed sporadically, and I have not been able to solve the problem. The exception occurs at [session addInput:input];. Please verify my code below and advise how to resolve the error:
Printing description of error:
Error Domain=AVFoundationErrorDomain Code=-11852 "Cannot use Back Camera" UserInfo=0x17c076e0 {NSLocalizedDescription=Cannot use Back Camera, AVErrorDeviceKey=, NSLocalizedFailureReason=This app is not authorized to use Back Camera.}
#import "CameraViewController.h"
#import "MAImagePickerControllerAdjustViewController.h"
#import "PopupViewController.h"
#import "MAImagePickerFinalViewController.h"
@implementation CameraViewController
@synthesize vImagePreview;
@synthesize vImage;
@synthesize stillImageOutput;
@synthesize lFrameCount;
@synthesize session;
@synthesize device;
@synthesize oneOff;
@synthesize captureManager = _captureManager;
@synthesize flashButton = _flashButton;
@synthesize vImage1;
@synthesize vImage2;
@synthesize vImage3;
@synthesize vImage4;
@synthesize vImage5;
@synthesize vImage6;
/////////////////////////////////////////////////////////////////////
#pragma mark - UI Actions
/////////////////////////////////////////////////////////////////////
-(IBAction) captureNow
{
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
NSLog(@"about to request a capture from: %@", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments)
{
// Do something with the attachments.
NSLog(@"attachments: %@", exifAttachments);
}
else
NSLog(@"no attachments");
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
NSUserDefaults *standardUserDefaults = [NSUserDefaults standardUserDefaults];
NSString *val1 = nil;
if (standardUserDefaults)
{
val1 = [standardUserDefaults objectForKey:@"clickTypeTwo"];
}
if ([val1 isEqualToString:@"cameraType"])
{
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[session stopRunning];
});
FinalViewController *finalView;
if ([[UIScreen mainScreen] bounds].size.height == 568)
finalView = [[FinalViewController alloc] initWithNibName:IS_IPAD()?@"FinalViewController_iPad":@"FinalViewController" bundle:nil];
else
finalView = [[FinalViewController alloc] initWithNibName:IS_IPAD()?@"FinalViewController_iPad":@"FinalViewController" bundle:nil];
finalView.sourceImage = image;
//finalView.imageFrameEdited = YES;
CATransition* transition = [CATransition animation];
transition.duration = 0.4;
transition.type = kCATransitionFade;
transition.subtype = kCATransitionFromBottom;
[self.navigationController.view.layer addAnimation:transition forKey:kCATransition];
[self.navigationController pushViewController:finalView animated:NO];
}
else
{
[session stopRunning];
AdjustViewController *adjustViewController;
if ([[UIScreen mainScreen] bounds].size.height == 568)
adjustViewController = [[AdjustViewController alloc] initWithNibName:IS_IPAD()?@"AdjustViewController_iPad":@"AdjustViewController" bundle:nil];
else
adjustViewController = [[AdjustViewController alloc] initWithNibName:IS_IPAD()?@"AdjustViewController_iPad":@"AdjustViewController" bundle:nil];
adjustViewController.sourceImage = image;
CATransition* transition = [CATransition animation];
transition.duration = 0.4;
transition.type = kCATransitionFade;
transition.subtype = kCATransitionFromBottom;
[self.navigationController.view.layer addAnimation:transition forKey:kCATransition];
[self.navigationController pushViewController:adjustViewController animated:NO];
}
}];
}
-(void)cropImageViewControllerDidFinished:(UIImage *)image{
FinalViewController *finalView;
if ([[UIScreen mainScreen] bounds].size.height == 568)
finalView = [[MAImagePickerFinalViewController alloc] initWithNibName:IS_IPAD()?@"FinalViewController_iPad":@"FinalViewController" bundle:nil];
else
finalView = [[MAImagePickerFinalViewController alloc] initWithNibName:IS_IPAD()?@"FinalViewController_iPad":@"FinalViewController" bundle:nil];
finalView.sourceImage = image;
//finalView.imageFrameEdited = YES;
CATransition* transition = [CATransition animation];
transition.duration = 0.4;
transition.type = kCATransitionFade;
transition.subtype = kCATransitionFromBottom;
[self.navigationController.view.layer addAnimation:transition forKey:kCATransition];
[self.navigationController pushViewController:finalView animated:NO];
}
/////////////////////////////////////////////////////////////////////
#pragma mark - Video Frame Delegate
/////////////////////////////////////////////////////////////////////
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
//NSLog(@"got frame");
iFrameCount++;
// Update Display
// We are running in the context of the capture session. To update the UI in real time, we have to do this on the main thread.
NSString *frameCountString = [[NSString alloc] initWithFormat:@"%4.4d", iFrameCount];
[lFrameCount performSelectorOnMainThread:@selector(setText:) withObject:frameCountString waitUntilDone:YES];
//NSLog(@"frame count %d", iFrameCount);
}
- (IBAction)showLeftSideBar
{
//[self dismissModalViewControllerAnimated:YES];
if ([[SidebarViewController share] respondsToSelector:@selector(showSideBarControllerWithDirection:)]) {
[[SidebarViewController share] showSideBarControllerWithDirection:SideBarShowDirectionLeft];
}
}
- (IBAction)showRightSideBar:(id)sender
{
}
- (IBAction)flipCamera:(id)sender
{
AVCaptureDevicePosition desiredPosition;
if (isUsingFrontFacingCamera)
desiredPosition = AVCaptureDevicePositionBack;
else
desiredPosition = AVCaptureDevicePositionFront;
for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
if ([d position] == desiredPosition) {
[[self session] beginConfiguration];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
for (AVCaptureInput *oldInput in [[self session] inputs]) {
[[self session] removeInput:oldInput];
}
[[self session] addInput:input];
[[self session] commitConfiguration];
break;
}
}
isUsingFrontFacingCamera = !isUsingFrontFacingCamera;
}
BOOL isUsingFrontFacingCamera;
/////////////////////////////////////////////////////////////////////
#pragma mark - Guts
/////////////////////////////////////////////////////////////////////
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
/////////////////////////////////////////////////////////////////////
#pragma mark - View lifecycle
/////////////////////////////////////////////////////////////////////
- (void)viewDidLoad
{
[super viewDidLoad];
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
}
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
flashIsOn=YES;
/////////////////////////////////////////////////////////////////////////////
// Create a preview layer that has a capture session attached to it.
// Stick this preview layer into our UIView.
/////////////////////////////////////////////////////////////////////////////
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPreset640x480;
CALayer *viewLayer = self.vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);
// viewLayer.frame = CGRectMake(-70, 150, 480, 336);
// UIGraphicsBeginImageContextWithOptions(CGSizeMake(400, 400), NO, 1);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
CGRect bounds=vImagePreview.layer.bounds;
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.bounds=bounds;
captureVideoPreviewLayer.position=CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
//[self addVideoInputFrontCamera:YES]; // set to YES for Front Camera, No for Back camera
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
/////////////////////////////////////////////////////////////
// OUTPUT #1: Still Image
/////////////////////////////////////////////////////////////
// Add an output object to our session so we can get a still image
// We retain a handle to the still image output and use this when we capture an image.
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
/////////////////////////////////////////////////////////////
// OUTPUT #2: Video Frames
/////////////////////////////////////////////////////////////
// Create Video Frame Outlet that will send each frame to our delegate
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
//captureOutput.minFrameDuration = CMTimeMake(1, 3); // deprecated in IOS5
// We need to create a queue to funnel the frames to our delegate
dispatch_queue_t queue;
queue = dispatch_queue_create("cameraQueue", NULL);
[captureOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
// Set the video output to store frame in BGRA (It is supposed to be faster)
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
// let's try some different keys,
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];
[session addOutput:captureOutput];
/////////////////////////////////////////////////////////////
// start the capture session
[session startRunning];
/////////////////////////////////////////////////////////////////////////////
// initialize frame counter
iFrameCount = 0;
}
- (void)viewWillDisappear:(BOOL)animated
{
[super viewWillDisappear:animated];
}
- (void)viewDidDisappear:(BOOL)animated
{
[super viewDidDisappear:animated];
[session stopRunning];
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
} else {
return YES;
}
}
- (IBAction)cancelButton:(id)sender{
}
- (IBAction)flashOn:(id)sender{
Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
if (captureDeviceClass != nil) {
if ([device hasTorch] && [device hasFlash]){
[device lockForConfiguration:nil];
if (flashIsOn) {
[device setTorchMode:AVCaptureTorchModeOn];
[device setFlashMode:AVCaptureFlashModeOn];
oneOff.text = @"On";
[_flashButton setImage:[UIImage imageNamed:@"flash-on-button"] forState:UIControlStateNormal];
_flashButton.accessibilityLabel = @"Disable Camera Flash";
flashIsOn = NO; //define as a variable/property if you need to know status
} else {
[_flashButton setImage:[UIImage imageNamed:@"flash-off-button"] forState:UIControlStateNormal];
_flashButton.accessibilityLabel = @"Enable Camera Flash";
oneOff.text = @"Off";
[device setTorchMode:AVCaptureTorchModeOff];
[device setFlashMode:AVCaptureFlashModeOff];
flashIsOn = YES;
}
[device unlockForConfiguration];
}
}
}
- (void)dealloc {
[[self session] stopRunning];
[super dealloc];
}
- (void)storeFlashSettingWithBool:(BOOL)flashSetting
{
[[NSUserDefaults standardUserDefaults] setBool:flashSetting forKey:kCameraFlashDefaultsKey];
[[NSUserDefaults standardUserDefaults] synchronize];
}
@end
Please check your device settings. Go to Settings → Privacy → Camera, find the entry for your app, and turn it on. Run the app again; it should work. Cheers.
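You can also detect this case in code. On iOS 7 and later, AVCaptureDevice exposes the authorization status, so a sketch like the following (the log wording is just an example) lets you handle the denial before the session fails:

```objc
AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (status == AVAuthorizationStatusNotDetermined) {
    // First launch: ask the user for camera access.
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (granted) {
            // Safe to configure and start the AVCaptureSession (dispatch to the main queue for UI work).
        }
    }];
} else if (status == AVAuthorizationStatusDenied || status == AVAuthorizationStatusRestricted) {
    // The user (or a policy) has blocked camera access; direct them to Settings.
    NSLog(@"Camera access denied; ask the user to enable it in Settings > Privacy > Camera.");
}
```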
Change your dealloc method to tear the session down:
[self.captureSession removeInput:self.videoInput];
[self.captureSession removeOutput:self.videoOutput];
self.captureSession = nil;
self.videoOutput = nil;
self.videoInput = nil;
We had a problem with this today. Essentially, from iOS 8.0.2 and above, access to the camera requires the privacy setting for the Camera itself, not the camera roll; once this was enabled, the code worked.
I saw the same error in my app today. I am handling it with an alert that contains a Settings button shortcut to the app's privacy settings:
do {
let captureInput:AVCaptureDeviceInput = try AVCaptureDeviceInput(device: self.device)
...
} catch let error as NSError {
let alert = UIAlertController(title:error.localizedDescription, message:error.localizedFailureReason, preferredStyle:.Alert)
let settingsAction = UIAlertAction(title: "Settings", style: .Default) { (action) in
UIApplication.sharedApplication().openURL(NSURL(string:UIApplicationOpenSettingsURLString)!)
}
alert.addAction(settingsAction)
self.presentViewController(alert,animated:true,completion:nil)
}
I am developing a custom camera application for iOS 7 with Xcode 5. The view controller class is as follows:
#import "ADMSImageUploader.h"
#import "ADMSViewController.h"
@interface ADMSImageUploader ()
@end
@implementation ADMSImageUploader
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self) {
// Custom initialization
}
return self;
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view.
FrontCamera = NO;
[self initializeCamera];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (void)viewDidAppear:(BOOL)animated {
}
//AVCaptureSession to show live video feed in view
- (void) initializeCamera {
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[self.imagePreview.layer setMasksToBounds:YES];
// CGSize landscapeSize;
// landscapeSize.width = self.imagePreview.bounds.size.width;
// landscapeSize.height = self.view.bounds.size.width;
//CGRect rect=[[UIScreen mainScreen]bounds];
captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, self.imagePreview.frame.size.width, self.imagePreview.frame.size.height);
connection = captureVideoPreviewLayer.connection;
orientation = AVCaptureVideoOrientationLandscapeRight;
//connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
[captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
// if ([[UIDevice currentDevice] orientation] == UIInterfaceOrientationPortrait ) {
// captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, self.imagePreview.frame.size.width, self.imagePreview.frame.size.height);
// connection = captureVideoPreviewLayer.connection;
// orientation = AVCaptureVideoOrientationPortrait;
// //connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
// [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
//
// }else{
// captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, self.imagePreview.frame.size.width, self.imagePreview.frame.size.height);
// connection = captureVideoPreviewLayer.connection;
// orientation = AVCaptureVideoOrientationLandscapeRight;
// //connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
// [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
//
// }
[self.imagePreview.layer addSublayer:captureVideoPreviewLayer];
NSArray *devices = [AVCaptureDevice devices];
AVCaptureDevice *frontCamera;
AVCaptureDevice *backCamera;
for (AVCaptureDevice *device in devices) {
NSLog(@"Device name: %@", [device localizedName]);
if ([device hasMediaType:AVMediaTypeVideo]) {
if ([device position] == AVCaptureDevicePositionBack) {
NSLog(@"Device position : back");
backCamera = device;
}
else {
NSLog(@"Device position : front");
frontCamera = device;
}
}
else{
if (!([device position] == AVCaptureDevicePositionBack)) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Cannot take photos using rear camera"
message:@"Your device does not support this feature."
delegate:nil
cancelButtonTitle:@"OK"
otherButtonTitles:nil];
[alert show];
}
}
}
if (!FrontCamera) {
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
if (!input) {
NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
}
if (FrontCamera) {
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
if (!input) {
NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
}
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[_stillImageOutput setOutputSettings:outputSettings];
[session addOutput:_stillImageOutput];
[session startRunning];
}
- (BOOL)shouldAutorotate{
return YES;
}
//-(void)openCamera:(NSString *)dealerId:(NSString *)inventoryId
//{
//
//}
- (void)viewWillAppear:(BOOL)animated
{
[self.navigationController setNavigationBarHidden:YES animated:animated];
[super viewWillAppear:animated];
_uploadButtonBehaviour.hidden = YES;
_discardButtonBehaviour.hidden = YES;
}
- (void)viewWillDisappear:(BOOL)animated
{
[self.navigationController setNavigationBarHidden:NO animated:animated];
[super viewWillDisappear:animated];
}
- (NSUInteger)supportedInterfaceOrientations
{
return UIInterfaceOrientationMaskLandscape;
}
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation
{
captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, self.imagePreview.frame.size.width, self.imagePreview.frame.size.height);
[captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
connection = captureVideoPreviewLayer.connection;
// connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
[connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
}
-(UIInterfaceOrientation)preferredInterfaceOrientationForPresentation
{
return UIInterfaceOrientationLandscapeRight;
}
- (IBAction)captureButton:(id)sender {
}
- (IBAction)exitButton:(id)sender {
[self.navigationController popToRootViewControllerAnimated:YES];
}
- (IBAction)uploadButton:(id)sender {
}
- (IBAction)discardButton:(id)sender {
}
@end
I want to start this view controller in landscape mode only. I am using a navigation controller to move between view controllers. When my previous view controller is in portrait orientation, the camera preview opens, but when I rotate the device to landscape the preview image gets stretched. When my previous view controller is in landscape orientation, the camera preview appears inverted. I request your help with this issue.
I once had the same problem. I solved it by rotating the captured image manually with CGAffineTransformMakeScale, after detecting the device orientation:
if ([[UIDevice currentDevice] orientation] == UIDeviceOrientationLandscapeRight) {
CIImage *c = [[CIImage alloc] initWithImage:imageFromCamera];
c = [c imageByApplyingTransform:CGAffineTransformTranslate(CGAffineTransformMakeScale(-1, -1), 0, c.extent.size.height)];
imageFromCamera = [UIImage imageWithCGImage:[[CIContext contextWithOptions:nil] createCGImage:c fromRect:c.extent]];
}
If the rotation isn't right, change the values passed to CGAffineTransformMakeScale.
//AVCaptureSession to show live video feed in view
- (void) initializeCamera
{
session = [[AVCaptureSession alloc] init];
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
session.sessionPreset = AVCaptureSessionPresetPhoto;
devicesArray = [AVCaptureDevice devices];
for (AVCaptureDevice *device in devicesArray){
if ([device hasMediaType:AVMediaTypeVideo]){
if ([device position] == AVCaptureDevicePositionBack){
NSLog(@"Device name: %@", [device localizedName]);
NSLog(@"Device position : back");
backCamera = device;
backCameraCheck = YES;
}
}
}
if (backCameraCheck) {
NSError *error = nil;
inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
if (!inputDevice){
NSLog(@"ERROR: trying to open camera: %@", error);
}else{
[session addInput:inputDevice];
}
}else{
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Cannot take photos using rear camera"
message:@"Your device does not support this feature."
delegate:nil
cancelButtonTitle:@"OK"
otherButtonTitles:nil];
[alert show];
}
output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
connection = [output connectionWithMediaType:AVMediaTypeVideo];
connection = captureVideoPreviewLayer.connection;
if ([[UIDevice currentDevice] orientation] == UIDeviceOrientationPortrait) {
captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, self.imagePreview.frame.size.width, self.imagePreview.frame.size.height);
if ([connection isVideoOrientationSupported]){
orientation = AVCaptureVideoOrientationPortrait;
[connection setVideoOrientation:orientation];
}
}else{
NSLog(@"view width = %f %f", self.imagePreview.frame.size.width, self.imagePreview.frame.size.height);
captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, self.imagePreview.frame.size.height,self.imagePreview.frame.size.width);
if ([connection isVideoOrientationSupported]){
orientation = AVCaptureVideoOrientationLandscapeRight;
[connection setVideoOrientation:orientation];
}
}
[self.imagePreview.layer setMasksToBounds:YES];
[self.imagePreview.layer addSublayer:captureVideoPreviewLayer];
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[_stillImageOutput setOutputSettings:outputSettings];
[session addOutput:_stillImageOutput];
[session startRunning];
}else {
// Handle the failure.
}
}
I have a UIImagePickerController that can be used for uploading profile images in my social networking application.
It works fine when it is used alone, i.e. no other camera interfering.
In another view, I am using AVCaptureSession and AVCaptureVideoPreviewLayer to embed the camera view inside the view. Here users can upload various photos that they have captured.
This also works fine when it is used alone, i.e. no other camera interfering.
(This is a Tab-Bar Application)
Whenever the AVCaptureVideoPreviewLayer is active and I enter the view with the UIImagePickerController, the image picker takes a very long time to load, and sometimes it just freezes.
This is how I initialise the AVSession/AVCapturePreviewLayer:
self.session = [[AVCaptureSession alloc] init];
self.session.sessionPreset = AVCaptureSessionPreset640x480;
self.captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
self.captureVideoPreviewLayer.frame = self.cameraView.bounds;
[self.captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[self.cameraView.layer addSublayer:self.captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(@"ERROR: trying to open camera: %@", error);
}else
[self.session addInput:input];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
[self.session addOutput:self.stillImageOutput];
This is how I initialise the UIImagePickerController:
self.picker = [[UIImagePickerController alloc] init];
self.picker.delegate = self;
self.picker.allowsEditing = YES;
self.picker.sourceType = UIImagePickerControllerSourceTypeCamera;
[self presentViewController:self.picker animated:YES completion:NULL];
Why does the UIImagePickerController take forever to load when the previewLayer is active in another view?
How can I reduce the loading time for the UIImagePickerController?
OK, it seems that calling [self.session stopRunning] in viewDidDisappear: fixes this issue for me.
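In other words, tying the session to the view's lifecycle keeps the two camera consumers from competing for the device. A minimal sketch, assuming the session property from the question is named self.session:

```objc
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    if (![self.session isRunning]) {
        [self.session startRunning];   // resume the preview when this view returns
    }
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    [self.session stopRunning];        // release the camera so UIImagePickerController can use it
}
```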