In my app I'm capturing images using AVFoundation. I made a button to switch between the front and back cameras, but it won't work.
Here's the code I used:
if (captureDevice.position == AVCaptureDevicePositionFront) {
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionBack) {
            NSError *error;
            AVCaptureDeviceInput *newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
            [captureSesion beginConfiguration];
            for (AVCaptureDeviceInput *oldInput in [captureSesion inputs]) {
                [captureSesion removeInput:oldInput];
            }
            if ([captureSesion canAddInput:newDeviceInput]) {
                [captureSesion addInput:newDeviceInput];
            }
            [captureSesion commitConfiguration];
            break;
        }
    }
}
Thanks.
If your capture session's sessionPreset is not compatible with the camera you're switching to, it will fail the canAddInput: test. I always reset to AVCaptureSessionPresetHigh before toggling cameras, then try to switch to whatever preset I prefer. Here's the code I use:
- (void)toggleCamera {
    AVCaptureDevicePosition newPosition = self.currentCameraPosition == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    AVCaptureDevice *device = [self videoDeviceWithPosition:newPosition];
    AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:nil];

    [_captureSession beginConfiguration];
    [_captureSession removeInput:self.deviceInput];
    // Always reset the preset before testing canAddInput:, because an incompatible preset will cause it to return NO
    [_captureSession setSessionPreset:AVCaptureSessionPresetHigh];
    if ([_captureSession canAddInput:deviceInput]) {
        [_captureSession addInput:deviceInput];
        self.deviceInput = deviceInput;
        self.currentCameraPosition = newPosition;
    } else {
        [_captureSession addInput:self.deviceInput];
    }

    if ([device supportsAVCaptureSessionPreset:self.sessionPreset]) {
        [_captureSession setSessionPreset:self.sessionPreset];
    }

    if ([device lockForConfiguration:nil]) {
        [device setSubjectAreaChangeMonitoringEnabled:YES];
        [device unlockForConfiguration];
    }
    [_captureSession commitConfiguration];
}
I have seen issues with toggle code not working when it is not run on the main thread. Can you try wrapping your code in the following block:
dispatch_async(dispatch_get_main_queue(), ^{
    // Your camera toggle code goes here
});
I am developing a camera-related app, and I have succeeded in capturing the camera preview. However, when I try to set autofocus in my app, it doesn't work. I tried both AVCaptureFocusModeContinuousAutoFocus and AVCaptureFocusModeAutoFocus; neither of them worked.
By the way, I tested on an iPhone 6s.
My ViewController.h file:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController
{
    AVCaptureSession *cameraCaptureSession;
    AVCaptureVideoPreviewLayer *cameraPreviewLayer;
}

- (void)initializeCaptureSession;

@end
My ViewController.m file:
#import "ViewController.h"

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self initializeCaptureSession];
}

- (void)initializeCaptureSession
{
    // Attempt to initialize AVCaptureDevice with back camera
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    for (AVCaptureDevice *device in videoDevices) {
        if (device.position == AVCaptureDevicePositionBack) {
            captureDevice = device;
            break;
        }
    }

    // If camera is accessible by capture session
    if (captureDevice)
    {
        if ([captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            NSError *error;
            if ([captureDevice lockForConfiguration:&error]) {
                [captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
                [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
                [captureDevice unlockForConfiguration];
            } else {
                NSLog(@"Error: %@", error);
            }
        }

        // Allocate camera capture session
        cameraCaptureSession = [[AVCaptureSession alloc] init];
        cameraCaptureSession.sessionPreset = AVCaptureSessionPresetMedium;

        // Configure capture session input
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
        [cameraCaptureSession addInput:videoIn];

        // Configure capture session output
        AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
        [videoOut setAlwaysDiscardsLateVideoFrames:YES];
        [cameraCaptureSession addOutput:videoOut];

        // Bind preview layer to capture session data
        cameraPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:cameraCaptureSession];
        CGRect layerRect = self.view.bounds;
        cameraPreviewLayer.bounds = self.view.bounds;
        cameraPreviewLayer.position = CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect));
        cameraPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

        // Add preview layer to UIView layer
        [self.view.layer addSublayer:cameraPreviewLayer];

        // Begin camera capture
        [cameraCaptureSession startRunning];
    }
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
    [cameraCaptureSession stopRunning];
}

@end
Here is my code in Swift. It's tested on my iPhone 5s and works, but only with the back camera; I don't know about the iPhone 6s.
if let device = captureDevice {
    try! device.lockForConfiguration()
    if device.isFocusModeSupported(.autoFocus) {
        device.focusMode = .autoFocus
    }
    device.unlockForConfiguration()
}
I think you need to remove
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
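For example, here is a minimal sketch of the question's own focus block with just that one line removed (everything else left as it was):
// Same lock/configure block as in the question, minus setFocusPointOfInterest:
if ([captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error: %@", error);
    }
}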
Try something like this:
if ([self.session canAddInput:videoDeviceInput]) {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentDevice];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
    [self.session addInput:videoDeviceInput];
    self.videoDeviceInput = videoDeviceInput;
}
Then:
- (void)subjectAreaDidChange:(NSNotification *)notification
{
    CGPoint devicePoint = CGPointMake(0.5, 0.5);
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
And Then:
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async(self.sessionQueue, ^{
        AVCaptureDevice *device = self.videoDeviceInput.device;
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
            // Call -set(Focus/Exposure)Mode: to apply the new point of interest.
            if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
                device.focusPointOfInterest = point;
                device.focusMode = focusMode;
            }
            if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
                device.exposurePointOfInterest = point;
                device.exposureMode = exposureMode;
            }
            device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
            [device unlockForConfiguration];
        }
        else {
            NSLog(@"Could not lock device for configuration: %@", error);
        }
    });
}
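As a usage example, tap-to-focus can reuse the same method. This is only a sketch: it assumes the class also has an AVCaptureVideoPreviewLayer property named previewLayer and a tap gesture recognizer wired to it, neither of which appears in the original code.
// Hypothetical tap handler: converts a tap on the preview into a device point
// and forwards it to focusWithMode:exposeWithMode:atDevicePoint:monitorSubjectAreaChange: above.
- (void)handleTapToFocus:(UITapGestureRecognizer *)gesture
{
    CGPoint layerPoint = [gesture locationInView:gesture.view];
    // Maps view coordinates into the (0,0)-(1,1) device coordinate space.
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:layerPoint];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
}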
This is code from a project I worked on. Feel free to ask if you have any doubts.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *err = nil;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&err];
if (err == nil) {
    if ([captureSession canAddInput:videoIn]) {
        [captureSession addInput:videoIn];
    } else {
        NSLog(@"Failed to add video input");
    }
} else {
    NSLog(@"Error: %@", err);
}

[captureSession startRunning];
previewLayer.frame = [self.view bounds];
[self.view.layer addSublayer:previewLayer];
I'm not sure what happened, but with the code above the camera gives me a clear preview.
My app uses video recording, so I use an AVCaptureSession. When I look at the capture session preview, however, I notice that the quality is low, especially when pointing the camera at text on a television or computer screen. How can I increase the quality of my capture session so that it will display text on computer screens more clearly? This is the part of my code that deals with video quality:
self.CaptureSession = [[AVCaptureSession alloc] init];
self.CaptureSession.automaticallyConfiguresApplicationAudioSession = NO;
[self.CaptureSession setSessionPreset:AVCaptureSessionPresetHigh];

//----- ADD INPUTS -----
// ADD VIDEO INPUT
self.VideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([self.VideoDevice hasTorch] == YES) {
    self.flashSet.hidden = NO;
    self.flashOverlayObject.hidden = NO;
}
[self.VideoDevice lockForConfiguration:nil];
[self.VideoDevice setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionNear];
[self.VideoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[self.VideoDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
[self.VideoDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
if (self.VideoDevice.lowLightBoostSupported) {
    self.VideoDevice.automaticallyEnablesLowLightBoostWhenAvailable = YES;
}
[self.VideoDevice unlockForConfiguration];

if (self.VideoDevice)
{
    NSError *error;
    self.VideoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:self.VideoDevice error:&error];
    if (!error)
    {
        if ([self.CaptureSession canAddInput:self.VideoInputDevice])
            [self.CaptureSession addInput:self.VideoInputDevice];
    }
}
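One avenue worth trying here (a sketch, not a verified fix for this setup): instead of AVCaptureSessionPresetHigh, explicitly request the highest-resolution preset the session accepts. The preset constants below are standard AVFoundation values (AVCaptureSessionPreset3840x2160 requires iOS 9 and a supporting device); whether a higher resolution actually makes on-screen text readable in this case is an assumption.
// Sketch: prefer the highest-resolution preset the session can accept,
// falling back to the existing High preset otherwise.
if ([self.CaptureSession canSetSessionPreset:AVCaptureSessionPreset3840x2160]) {
    [self.CaptureSession setSessionPreset:AVCaptureSessionPreset3840x2160];
} else if ([self.CaptureSession canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
    [self.CaptureSession setSessionPreset:AVCaptureSessionPreset1920x1080];
} else {
    [self.CaptureSession setSessionPreset:AVCaptureSessionPresetHigh];
}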
I have a video preview layer which shows the video feed from a given input. I want the user to be able to switch cameras with a single button. The button I have created switches from the initial back-facing camera to the front-facing camera, but it does not work in reverse.
Here is the method:
- (IBAction)CameraToggleButtonPressed:(id)sender {
    if (_captureSession) {
        NSLog(@"Toggle camera");
        NSError *error;
        AVCaptureInput *videoInput = [self videoInput];
        AVCaptureInput *newVideoInput;
        AVCaptureDevicePosition position = [[VideoInputDevice device] position];

        [_captureSession beginConfiguration];
        [_captureSession removeInput:_videoInput];
        [_captureSession removeInput:_audioInput];
        [_captureSession removeOutput:_movieOutput];
        [_captureSession removeOutput:_stillImageOutput];

        // Get new input
        AVCaptureDevice *newCamera = nil;
        if (((AVCaptureDeviceInput *)videoInput).device.position == AVCaptureDevicePositionBack)
        {
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
        }
        else
        {
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }

        newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:newCamera error:&error];
        if (!newVideoInput || error)
        {
            NSLog(@"Error creating capture device input: %@", error.localizedDescription);
        }
        else
        {
            [self.captureSession addInput:newVideoInput];
            [self.captureSession addInput:self.audioInput];
            [self.captureSession addOutput:self.movieOutput];
            [self.captureSession addOutput:self.stillImageOutput];
        }
        [_captureSession commitConfiguration];
    }
}
Why does the button not work in reverse?
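One thing that stands out in the code above (an observation only, since the property declarations aren't shown): self.videoInput is never updated to point at newVideoInput, so on the second press the position check still sees the original back-camera input and switches to the front camera again. A minimal sketch of that idea, assuming videoInput is the property behind [self videoInput] / _videoInput:
// Sketch: after successfully adding the new input, remember it so the next
// toggle inspects the camera that is actually active.
[self.captureSession addInput:newVideoInput];
self.videoInput = newVideoInput;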
I am using the video recording setup from AVCam. Autofocus works fine before video recording starts, but the camera keeps refocusing after recording has started, which is annoying behavior. I have disabled video stabilization on the AVCaptureConnection, but it makes no difference:
AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoStabilizationSupported]) {
    if (SYSTEM_VERSION_LESS_THAN(@"8.0")) {
        // For iOS 7
        [connection setEnablesVideoStabilizationWhenAvailable:NO];
    } else {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
    }
}
Please guide me: how can I disable autofocus during video recording?
I called the following method from viewDidLoad to achieve the required functionality:
- (void)disableAutoFocus {
    NSArray *devices = [AVCaptureDevice devices];
    NSError *error;
    for (AVCaptureDevice *device in devices) {
        if (([device hasMediaType:AVMediaTypeVideo]) &&
            ([device position] == AVCaptureDevicePositionBack)) {
            [device lockForConfiguration:&error];
            if ([device isFocusModeSupported:AVCaptureFocusModeLocked]) {
                device.focusMode = AVCaptureFocusModeLocked;
                NSLog(@"Focus locked");
            }
            [device unlockForConfiguration];
        }
    }
}
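Since the question only wants autofocus disabled while recording, an alternative is to lock focus when recording starts and restore continuous autofocus when it stops. This is just a sketch: the method names are mine, and the videoDeviceInput property is assumed from the AVCam sample.
// Hypothetical helpers: call the first when recording starts, the second when it stops.
- (void)lockFocusForRecording {
    AVCaptureDevice *device = [[self videoDeviceInput] device];
    if ([device lockForConfiguration:nil]) {
        if ([device isFocusModeSupported:AVCaptureFocusModeLocked]) {
            device.focusMode = AVCaptureFocusModeLocked;
        }
        [device unlockForConfiguration];
    }
}

- (void)restoreContinuousFocus {
    AVCaptureDevice *device = [[self videoDeviceInput] device];
    if ([device lockForConfiguration:nil]) {
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        }
        [device unlockForConfiguration];
    }
}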
I know that the only way to turn the flash on and keep it on, on the iPhone 4, is by turning the video camera on. I'm not too sure of the code, though. Here is what I am trying:
- (IBAction)turnTorchOn {
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
    if (videoInput) {
        [captureSession addInput:videoInput];
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoOutput setSampleBufferDelegate:self queue:dispatch_get_current_queue()];
        [captureSession addOutput:videoOutput];
        [captureSession startRunning];
        videoCaptureDevice.torchMode = AVCaptureTorchModeOn;
    }
}
Does anybody know if this would work, or am I missing anything? (I don't have an iPhone 4 yet to test on; I'm just trying out some of the new APIs.)
Thanks
Here's a shorter version you can now use to turn the light on or off:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device hasTorch]) {
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOn]; // use AVCaptureTorchModeOff to turn off
    [device unlockForConfiguration];
}
UPDATE: (March 2015)
With iOS 6.0 and later, you can control the brightness or level of the torch using the following method:
- (void)setTorchToLevel:(float)torchLevel
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch]) {
        [device lockForConfiguration:nil];
        if (torchLevel <= 0.0) {
            [device setTorchMode:AVCaptureTorchModeOff];
        }
        else {
            if (torchLevel >= 1.0)
                torchLevel = AVCaptureMaxAvailableTorchLevel;
            BOOL success = [device setTorchModeOnWithLevel:torchLevel error:nil];
        }
        [device unlockForConfiguration];
    }
}
You may also want to monitor the return value (success) from setTorchModeOnWithLevel:. You may get a failure if you try to set the level too high and the torch is overheating. In that case setting the level to AVCaptureMaxAvailableTorchLevel will set the level to the highest level that is allowed given the temperature of the torch.
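For instance, here is a small sketch of that fallback (the method name is mine, not from the original answer); it expects a level in (0, 1] and retries at the maximum allowed level when the requested one is rejected:
// Sketch: try the requested level, then fall back to the maximum the torch
// currently allows (e.g. when it is overheating). torchLevel must be > 0.
- (BOOL)setTorchToLevelOrMaximum:(float)torchLevel
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    BOOL success = NO;
    if ([device hasTorch] && [device lockForConfiguration:nil]) {
        success = [device setTorchModeOnWithLevel:torchLevel error:nil];
        if (!success) {
            // Per the note above, the max-available level is accepted at whatever
            // brightness the current torch temperature allows.
            success = [device setTorchModeOnWithLevel:AVCaptureMaxAvailableTorchLevel error:nil];
        }
        [device unlockForConfiguration];
    }
    return success;
}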
iWasRobbed's answer is great, except that it leaves an AVCaptureSession running in the background all the time. On my iPhone 4s it took about 12% CPU according to Instruments, so my app used about 1% of the battery per minute. In other words, keeping the device prepared for AV capture is not cheap.
Using the code below, my app uses about 0.187% per minute, so battery life is more than 5x longer.
This code works just fine on any device (tested on both a 3GS (no flash) and a 4s), and also on iOS 4.3 in the simulator.
#import <AVFoundation/AVFoundation.h>

- (void)turnTorchOn:(BOOL)on {
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {
        AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if ([device hasTorch] && [device hasFlash]) {
            [device lockForConfiguration:nil];
            if (on) {
                [device setTorchMode:AVCaptureTorchModeOn];
                [device setFlashMode:AVCaptureFlashModeOn];
                torchIsOn = YES;
            } else {
                [device setTorchMode:AVCaptureTorchModeOff];
                [device setFlashMode:AVCaptureFlashModeOff];
                torchIsOn = NO;
            }
            [device unlockForConfiguration];
        }
    }
}
See a better answer below: https://stackoverflow.com/a/10054088/308315
Old answer:
First, in your AppDelegate .h file:
#import <AVFoundation/AVFoundation.h>
@interface AppDelegate : NSObject <UIApplicationDelegate> {
    AVCaptureSession *torchSession;
}

@property (nonatomic, retain) AVCaptureSession *torchSession;

@end
Then in your AppDelegate .m file:
@implementation AppDelegate

@synthesize torchSession;

- (void)dealloc {
    [torchSession release];
    [super dealloc];
}

- (id)init {
    if ((self = [super init])) {
        // initialize flashlight
        // test if this class even exists to ensure flashlight is turned on ONLY for iOS 4 and above
        Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
        if (captureDeviceClass != nil) {
            AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            if ([device hasTorch] && [device hasFlash]) {
                if (device.torchMode == AVCaptureTorchModeOff) {
                    NSLog(@"Setting up flashlight for later use...");
                    AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
                    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
                    AVCaptureSession *session = [[AVCaptureSession alloc] init];

                    [session beginConfiguration];
                    [device lockForConfiguration:nil];
                    [session addInput:flashInput];
                    [session addOutput:output];
                    [device unlockForConfiguration];
                    [output release];
                    [session commitConfiguration];
                    [session startRunning];

                    [self setTorchSession:session];
                    [session release];
                }
            }
        }
    }
    return self;
}
Then anytime you want to turn it on, just do something like this:
// test if this class even exists to ensure flashlight is turned on ONLY for iOS 4 and above
Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
if (captureDeviceClass != nil) {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOn];
    [device setFlashMode:AVCaptureFlashModeOn];
    [device unlockForConfiguration];
}
And similarly for turning it off:
// test if this class even exists to ensure flashlight is turned on ONLY for iOS 4 and above
Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
if (captureDeviceClass != nil) {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOff];
    [device setFlashMode:AVCaptureFlashModeOff];
    [device unlockForConfiguration];
}
The lockForConfiguration: call goes in your code where you declared your AVCaptureDevice as a property:
[videoCaptureDevice lockForConfiguration:nil];
From iOS 6.0 and above, to toggle the torch/flash on and off:
- (void)toggleFlash {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        [device lockForConfiguration:nil];
        [device setFlashMode:(device.flashActive) ? AVCaptureFlashModeOff : AVCaptureFlashModeOn];
        [device setTorchMode:(device.torchActive) ? AVCaptureTorchModeOff : AVCaptureTorchModeOn];
        [device unlockForConfiguration];
    }
}
P.S. This approach is only advisable if you don't need an explicit on/off function. Remember there's one more option, Auto, i.e. AVCaptureFlashModeAuto and AVCaptureTorchModeAuto. To support auto mode as well, you have to keep track of the current mode and change the flash and torch modes based on it.
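As a rough sketch of that idea (the currentTorchMode ivar and the method name are hypothetical, not from the original answer), you could cycle Off, On, Auto and apply the tracked value:
// Hypothetical example: cycle Off -> On -> Auto, tracking the last mode in an ivar.
- (void)cycleTorchMode {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch]) {
        return;
    }
    // currentTorchMode is an assumed AVCaptureTorchMode ivar holding the last value we set.
    switch (currentTorchMode) {
        case AVCaptureTorchModeOff: currentTorchMode = AVCaptureTorchModeOn;   break;
        case AVCaptureTorchModeOn:  currentTorchMode = AVCaptureTorchModeAuto; break;
        default:                    currentTorchMode = AVCaptureTorchModeOff;  break;
    }
    if ([device lockForConfiguration:nil]) {
        if ([device isTorchModeSupported:currentTorchMode]) {
            [device setTorchMode:currentTorchMode];
        }
        [device unlockForConfiguration];
    }
}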
Swift 2.0 version:
func setTorchLevel(torchLevel: Float)
{
    self.captureSession?.beginConfiguration()
    defer {
        self.captureSession?.commitConfiguration()
    }
    if let device = backCamera?.device where device.hasTorch && device.torchAvailable {
        do {
            try device.lockForConfiguration()
            defer {
                device.unlockForConfiguration()
            }
            if torchLevel <= 0.0 {
                device.torchMode = .Off
            }
            else {
                // Clamp to the maximum available level, matching the Objective-C version above
                try device.setTorchModeOnWithLevel(min(torchLevel, AVCaptureMaxAvailableTorchLevel))
            }
        }
        catch let error {
            print("Failed to set up torch level with error \(error)")
            return
        }
    }
}
// import framework in .h file
#import <AVFoundation/AVFoundation.h>

{
    AVCaptureSession *torchSession;
}

@property (nonatomic, retain) AVCaptureSession *torchSession;

- (IBAction)onoff:(id)sender;
// implement in .m file
@synthesize torchSession;

- (IBAction)onoff:(id)sender
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash])
    {
        if (device.torchMode == AVCaptureTorchModeOff)
        {
            [button setTitle:@"OFF" forState:UIControlStateNormal];

            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *session = [[AVCaptureSession alloc] init];

            [session beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [session addInput:flashInput];
            [session addOutput:output];
            [device unlockForConfiguration];
            [output release];
            [session commitConfiguration];
            [session startRunning];

            [self setTorchSession:session];
            [session release];
        }
        else
        {
            [button setTitle:@"ON" forState:UIControlStateNormal];
            [torchSession stopRunning];
        }
    }
}

- (void)dealloc
{
    [torchSession release];
    [super dealloc];
}
This works very well. Hope it helps someone!
- (IBAction)flashlight:(id)sender {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        if (device.torchMode == AVCaptureTorchModeOff) {
            [sender setTitle:@"Torch Off" forState:UIControlStateNormal];

            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *cam = [[AVCaptureSession alloc] init];

            [cam beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [cam addInput:flashInput];
            [cam addOutput:output];
            [device unlockForConfiguration];
            [cam commitConfiguration];
            [cam startRunning];

            [self setTorchSession:cam];
        }
        else {
            [sender setTitle:@"Torch On" forState:UIControlStateNormal];
            [_torchSession stopRunning];
        }
    }
}