AVCaptureDevice Low Light Boost Does Not Work - iOS

The low light boost property of AVCaptureDevice is not being enabled when it should be. I am testing this on an iPhone 5 with iOS 6. Here's the code:
// finds a device that supports the video media type
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSArray *allDevices = [AVCaptureDevice devices];
for (AVCaptureDevice *currentDevice in allDevices) {
    if (currentDevice.position == AVCaptureDevicePositionBack) {
        device = currentDevice;
    }
}

NSError *deviceError = nil;

if (device.isFlashAvailable) {
    [device lockForConfiguration:&deviceError];
    device.flashMode = AVCaptureFlashModeAuto;
    [device unlockForConfiguration];
}

if ([device respondsToSelector:@selector(isLowLightBoostSupported)]) {
    if ([device lockForConfiguration:nil]) {
        if (device.isLowLightBoostSupported)
            device.automaticallyEnablesLowLightBoostWhenAvailable = YES;
        [device unlockForConfiguration];
    }
}

if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
    [device lockForConfiguration:&deviceError];
    device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    // CODE FOR device.exposurePointOfInterest determined from wherever the face is based off of the faceScan method
    [device unlockForConfiguration];
}

AVCaptureDeviceInput *newVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&deviceError];

AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                AVVideoCodecJPEG, AVVideoCodecKey,
                                nil];
[newStillImageOutput setOutputSettings:outputSettings];

self.sess = [[AVCaptureSession alloc] init];
if ([self.sess canAddInput:newVideoInput]) {
    [self.sess addInput:newVideoInput];
}
if ([self.sess canAddOutput:newStillImageOutput]) {
    [self.sess addOutput:newStillImageOutput];
}
self.stillImageOutput = newStillImageOutput;

if (device.lowLightBoostEnabled) {
    NSLog(@"ENABLED");
}

// register as an observer of changes to lowLightBoostEnabled
// (note: the observed key path must match the one checked in the callback below)
[device addObserver:self forKeyPath:@"lowLightBoostEnabled" options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld) context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqual:@"lowLightBoostEnabled"]) {
        NSLog(@"lowLightBoostEnabled changed");
        NSNumber *boostIsActiveValue = [change objectForKey:NSKeyValueChangeNewKey];
        BOOL boostIsActive = boostIsActiveValue.boolValue;
        NSLog(@"is low light boost currently active: %d", boostIsActive);
    }
}
Can anyone give me any help? I've looked online but haven't found very conclusive results. I'd appreciate all the help I can get.

You need to call lockForConfiguration:, according to the docs (well, the header file):
if ([[self backFacingCamera] respondsToSelector:@selector(isLowLightBoostSupported)]) {
    if ([[self backFacingCamera] lockForConfiguration:nil]) {
        if ([self backFacingCamera].isLowLightBoostSupported)
            [self backFacingCamera].automaticallyEnablesLowLightBoostWhenAvailable = YES;
        [[self backFacingCamera] unlockForConfiguration];
    }
}
Also, isLowLightBoostEnabled tells you whether low light is actually being boosted at the moment, not whether it can be; the latter is the isLowLightBoostSupported property, as above (and only iOS 6 devices respond to that selector).
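As an aside, the question's code locks and unlocks the device three separate times. A minimal consolidation (same configuration calls as the question, just one lock) might look like this:

NSError *deviceError = nil;
if ([device lockForConfiguration:&deviceError]) {
    if (device.isFlashAvailable) {
        device.flashMode = AVCaptureFlashModeAuto;
    }
    if ([device respondsToSelector:@selector(isLowLightBoostSupported)] &&
        device.isLowLightBoostSupported) {
        device.automaticallyEnablesLowLightBoostWhenAvailable = YES;
    }
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", deviceError);
}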

Related

AVCaptureDevice deprecated in favor of AVCaptureDeviceDiscoverySession - use front camera

Can't find an up-to-date answer on this:
- (void)viewDidLoad {
    [super viewDidLoad];
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureDevice *device = [self frontCamera];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    [session addInput:input];

    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    newCaptureVideoPreviewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:newCaptureVideoPreviewLayer];
    [self.view bringSubviewToFront:_logo];

    [session startRunning];
}

- (AVCaptureDevice *)frontCamera {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            return device;
        }
    }
    return nil;
}
The line NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]; is deprecated, and the compiler suggests using AVCaptureDeviceDiscoverySession instead. How can I correctly adapt this code to AVCaptureDeviceDiscoverySession?
I would just like to use the front camera.
Thanks
You can access the front camera with the code below.
- (AVCaptureDevice *)frontCamera {
    NSArray *devices;
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *session =
            [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInTrueDepthCamera, AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                   mediaType:AVMediaTypeVideo
                                                                    position:AVCaptureDevicePositionFront];
        devices = session.devices;
    } else {
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    }
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            return device;
        }
    }
    return nil;
}
According to Apple's docs:
AVCaptureDeviceDiscoverySession can find the capture devices that are currently available, and can also monitor the availability of specified devices.
AVCaptureDeviceTypeBuiltInWideAngleCamera specifies the regular front camera.
AVCaptureDeviceTypeBuiltInTrueDepthCamera specifies the infrared-based depth front camera, which is only available on iPhone X.
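Note that because the discovery session is created with position:AVCaptureDevicePositionFront, its devices array is already filtered, so on iOS 10+ the loop is redundant. A minimal sketch:

AVCaptureDeviceDiscoverySession *discoverySession =
    [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                           mediaType:AVMediaTypeVideo
                                                            position:AVCaptureDevicePositionFront];
// The array is pre-filtered by type and position, so the first entry
// (or nil if there is no front camera) is the device we want.
AVCaptureDevice *frontCamera = discoverySession.devices.firstObject;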

Why doesn't the autofocus work?

I am developing a camera-related app and have succeeded in capturing the camera preview. However, when I tried to add autofocus, it didn't work. I tried both AVCaptureFocusModeContinuousAutoFocus and AVCaptureFocusModeAutoFocus; neither of them worked.
By the way, I tested on an iPhone 6s.
my ViewController.h file
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController
{
    AVCaptureSession *cameraCaptureSession;
    AVCaptureVideoPreviewLayer *cameraPreviewLayer;
}

- (void)initializeCaptureSession;

@end
my ViewController.m file
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
[self initializeCaptureSession];
}
- (void) initializeCaptureSession
{
//Attempt to initialize AVCaptureDevice with back camera
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionBack)
{
captureDevice = device;
break;
}
}
//If camera is accessible by capture session
if (captureDevice)
{
if ([captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]){
NSError *error;
if ([captureDevice lockForConfiguration:&error]){
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
[captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
[captureDevice unlockForConfiguration];
}else{
NSLog(#"Error: %#", error);
}
}
//Allocate camera capture session
cameraCaptureSession = [[AVCaptureSession alloc] init];
cameraCaptureSession.sessionPreset = AVCaptureSessionPresetMedium;
//Configure capture session input
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
[cameraCaptureSession addInput:videoIn];
//Configure capture session output
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
[videoOut setAlwaysDiscardsLateVideoFrames:YES];
[cameraCaptureSession addOutput:videoOut];
//Bind preview layer to capture session data
cameraPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:cameraCaptureSession];
CGRect layerRect = self.view.bounds;
cameraPreviewLayer.bounds = self.view.bounds;
cameraPreviewLayer.position = CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect));
cameraPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
//Add preview layer to UIView layer
[self.view.layer addSublayer:cameraPreviewLayer];
//Begin camera capture
[cameraCaptureSession startRunning];
}
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
[cameraCaptureSession stopRunning];
}
#end
Here is my Swift code; it's tested on my iPhone 5s and works, but only on the back camera. I don't know about the iPhone 6s:
if let device = captureDevice {
    try! device.lockForConfiguration()
    if device.isFocusModeSupported(.autoFocus) {
        device.focusMode = .autoFocus
    }
    device.unlockForConfiguration()
}
I think you need to remove this line:
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
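If you want point-of-interest focus later on, a hedged alternative is to guard that line instead of deleting it: focusPointOfInterest is only supported on devices that report isFocusPointOfInterestSupported, and for a live preview, continuous autofocus is usually a better fit than the one-shot AVCaptureFocusModeAutoFocus:

if ([captureDevice lockForConfiguration:&error]) {
    // Set the point of interest first, then the mode; changing the mode is
    // what actually triggers a focus operation at the new point.
    if (captureDevice.isFocusPointOfInterestSupported) {
        [captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
    }
    if ([captureDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        [captureDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
    }
    [captureDevice unlockForConfiguration];
}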
Try something like this:
if ([self.session canAddInput:videoDeviceInput]) {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentDevice];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
    [self.session addInput:videoDeviceInput];
    self.videoDeviceInput = videoDeviceInput;
}
Then:
- (void)subjectAreaDidChange:(NSNotification *)notification
{
    CGPoint devicePoint = CGPointMake(0.5, 0.5);
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
And then:
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async(self.sessionQueue, ^{
        AVCaptureDevice *device = self.videoDeviceInput.device;
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
            // Call -set(Focus/Exposure)Mode: to apply the new point of interest.
            if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
                device.focusPointOfInterest = point;
                device.focusMode = focusMode;
            }
            if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
                device.exposurePointOfInterest = point;
                device.exposureMode = exposureMode;
            }
            device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
            [device unlockForConfiguration];
        }
        else {
            NSLog(@"Could not lock device for configuration: %@", error);
        }
    });
}
This is code from a project I worked on. Feel free to ask if you have any doubts.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *err = nil;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&err];
if (err == nil) {
    if ([captureSession canAddInput:videoIn]) {
        [captureSession addInput:videoIn];
    } else {
        NSLog(@"Failed to add video input");
    }
} else {
    NSLog(@"Error: %@", err);
}

[captureSession startRunning];
previewLayer.frame = [self.view bounds];
[self.view.layer addSublayer:previewLayer];
I am not sure what went wrong in your case, but with the code above the camera gives me a sharp preview.

iOS - AVCaptureDevice - Autofocus & Exposure with camera capture

I have been implementing a custom camera using AVCaptureDevice, which requires autofocus & exposure to work nicely. I am using the following code for the camera initialization:
- (void)initializeCamera {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusAuthorized) { // authorized
        [self.captureVideoPreviewLayer removeFromSuperlayer];
        self.captureSession = [[AVCaptureSession alloc] init];
        self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

        [self removeDeviceObserverForFocus];
        self.captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        [self addDeviceObserverForFocus];

        NSError *error = nil;
        [self.captureDevice lockForConfiguration:nil]; // you must lock before configuring the device
        [self.captureDevice setSubjectAreaChangeMonitoringEnabled:YES];
        [self.captureDevice unlockForConfiguration];

        // Capture layer
        self.captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        self.captureVideoPreviewLayer.bounds = CGRectMake(0, 0, CGRectGetWidth([UIScreen mainScreen].bounds), CGRectGetHeight([UIScreen mainScreen].bounds));
        self.captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(self.captureVideoPreviewLayer.bounds), CGRectGetMidY(self.captureVideoPreviewLayer.bounds));
        [self.captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        self.captureVideoPreviewLayer.connection.enabled = YES;
        [self.viewCamera.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];

        // Capture input
        self.captureInput = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDevice error:&error];
        if (!self.captureInput) {
            [self capturePhoto];
        }
        else {
            if ([self.captureSession canAddInput:self.captureInput]) {
                [self.captureSession addInput:self.captureInput];
            }
        }

        self.captureOutput = [[AVCaptureStillImageOutput alloc] init];
        [self.captureOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
        [self.captureSession addOutput:self.captureOutput];

        // THIS LINE
        [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];

        // setup metadata capture
        AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
        CGRect visibleMetadataOutputRect = [self.captureVideoPreviewLayer metadataOutputRectOfInterestForRect:self.viewCamera.bounds];
        metadataOutput.rectOfInterest = visibleMetadataOutputRect;
        [self.captureSession addOutput:metadataOutput];

        dispatch_async(dispatch_get_main_queue(), ^{
            [self.captureSession startRunning];
        });
    }
    else if (status == AVAuthorizationStatusNotDetermined) { // not determined
        // Try for getting permission
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            [self performSelectorOnMainThread:@selector(initializeCamera) withObject:nil waitUntilDone:NO];
        }];
    }
}
- (void)removeDeviceObserverForFocus {
    @try {
        while ([self.captureDevice observationInfo] != nil) {
            [self.captureDevice removeObserver:self forKeyPath:@"adjustingFocus"];
        }
    }
    @catch (NSException *exception) {
        NSLog(@"Exception");
    }
    @finally {
    }
}

- (void)addDeviceObserverForFocus {
    [self.captureDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingFocus"]) {
        BOOL adjustingFocus = [[change objectForKey:NSKeyValueChangeNewKey] isEqualToNumber:[NSNumber numberWithInt:1]];
        if (adjustingFocus) {
            [self showFocusSquareAtPoint:self.viewCamera.center];
        }
    }
}
To monitor focus as the camera moves, I am doing the following:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(avCaptureDeviceSubjectAreaDidChangeNotification:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:nil];

#pragma mark - AVCaptureDeviceSubjectAreaDidChangeNotification

- (void)avCaptureDeviceSubjectAreaDidChangeNotification:(NSNotification *)notification {
    CGPoint devicePoint = CGPointMake(0.5, 0.5);
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
    [self showFocusSquareAtPoint:self.viewCamera.center];
}
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async(dispatch_get_main_queue(), ^{
        AVCaptureDevice *device = self.captureDevice;
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
            // Call -set(Focus/Exposure)Mode: to apply the new point of interest.
            if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
                device.focusPointOfInterest = point;
                device.focusMode = focusMode;
            }
            if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
                device.exposurePointOfInterest = point;
                device.exposureMode = exposureMode;
            }
            device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
            [device unlockForConfiguration];
        }
        else {
            NSLog(@"Could not lock device for configuration: %@", error);
        }
    });
}
Everything works as expected when I use [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];.
If I change the session preset to something else, like AVCaptureSessionPresetHigh, autofocus and exposure don't work as expected.
Has anyone come across this situation?
Thank you for the help.
Are you trying to take a picture or record video? The High preset is for video, and exposure and focus work differently there (I believe). There is info on the different presets in the docs under AVCaptureSessionPreset.
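If you do need a non-Photo preset, one thing worth trying (a sketch, not a guaranteed fix) is to reapply the focus/exposure configuration after switching presets, since a preset change can reconfigure the device. This reuses the focusWithMode:exposeWithMode:atDevicePoint:monitorSubjectAreaChange: helper from the question:

[self.captureSession beginConfiguration];
self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
[self.captureSession commitConfiguration];

// Request continuous focus/exposure at the center again after the preset change.
[self focusWithMode:AVCaptureFocusModeContinuousAutoFocus
     exposeWithMode:AVCaptureExposureModeContinuousAutoExposure
      atDevicePoint:CGPointMake(0.5, 0.5)
monitorSubjectAreaChange:YES];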

Disable auto focus during video recording using AVCam

I am using video recording from AVCam. Autofocus works fine before video recording starts, but the camera keeps refocusing after recording has begun, which is annoying. I have disabled video stabilization on the AVCaptureConnection, but it made no difference:
AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoStabilizationSupported]) {
    if (SYSTEM_VERSION_LESS_THAN(@"8.0")) {
        // For iOS 7
        [connection setEnablesVideoStabilizationWhenAvailable:NO];
    } else {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
    }
}
Please guide me: how can I disable autofocus during video recording?
I called the following method from viewDidLoad to achieve the required functionality:
- (void)disableAutoFocus {
    NSArray *devices = [AVCaptureDevice devices];
    NSError *error;
    for (AVCaptureDevice *device in devices) {
        if (([device hasMediaType:AVMediaTypeVideo]) &&
            ([device position] == AVCaptureDevicePositionBack)) {
            [device lockForConfiguration:&error];
            if ([device isFocusModeSupported:AVCaptureFocusModeLocked]) {
                device.focusMode = AVCaptureFocusModeLocked;
                NSLog(@"Focus locked");
            }
            [device unlockForConfiguration];
        }
    }
}
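One caveat with calling this from viewDidLoad: AVCaptureFocusModeLocked freezes the lens at whatever position it currently has for the whole session, so the camera never refocuses, even between recordings. A sketch of an alternative (the helper name is mine, not AVCam's): lock focus when recording starts and restore continuous autofocus when it stops.

// Hypothetical helper: apply a focus mode to the current video device.
- (void)applyFocusMode:(AVCaptureFocusMode)mode {
    AVCaptureDevice *device = [[self videoDeviceInput] device];
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if ([device isFocusModeSupported:mode]) {
            device.focusMode = mode;
        }
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device: %@", error);
    }
}

// Call [self applyFocusMode:AVCaptureFocusModeLocked] just before starting the
// movie file output, and [self applyFocusMode:AVCaptureFocusModeContinuousAutoFocus]
// after stopping it.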

Turn on torch/flash on iPhone

I know that the only way to turn on the flash and keep it on, on the iPhone 4, is by turning the video camera on. I'm not too sure of the code though. Here is what I am trying:
- (IBAction)turnTorchOn {
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
    if (videoInput) {
        [captureSession addInput:videoInput];

        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoOutput setSampleBufferDelegate:self queue:dispatch_get_current_queue()];
        [captureSession addOutput:videoOutput];

        [captureSession startRunning];
        videoCaptureDevice.torchMode = AVCaptureTorchModeOn;
    }
}
Does anybody know if this would work, or am I missing anything? (I don't have an iPhone 4 yet to test on; I'm just trying out some of the new APIs.)
Thanks
Here's a shorter version you can now use to turn the light on or off:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device hasTorch]) {
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOn]; // use AVCaptureTorchModeOff to turn off
    [device unlockForConfiguration];
}
UPDATE: (March 2015)
With iOS 6.0 and later, you can control the brightness or level of the torch using the following method:
- (void)setTorchToLevel:(float)torchLevel
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch]) {
        [device lockForConfiguration:nil];
        if (torchLevel <= 0.0) {
            [device setTorchMode:AVCaptureTorchModeOff];
        }
        else {
            if (torchLevel >= 1.0)
                torchLevel = AVCaptureMaxAvailableTorchLevel;
            BOOL success = [device setTorchModeOnWithLevel:torchLevel error:nil];
            if (!success)
                NSLog(@"Could not set torch level %f", torchLevel);
        }
        [device unlockForConfiguration];
    }
}
You may also want to monitor the return value (success) from setTorchModeOnWithLevel:error:. You may get a failure if you try to set the level too high while the torch is overheating. In that case, setting the level to AVCaptureMaxAvailableTorchLevel will use the highest level that is allowed given the current temperature of the torch.
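For example, a hedged sketch that passes an NSError and falls back (assuming the device is already locked for configuration, as above):

NSError *torchError = nil;
if (![device setTorchModeOnWithLevel:0.8 error:&torchError]) {
    // The requested level was refused (e.g. the torch is too hot);
    // fall back to the highest level the hardware currently allows.
    NSLog(@"Torch level refused: %@", torchError);
    [device setTorchModeOnWithLevel:AVCaptureMaxAvailableTorchLevel error:nil];
}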
iWasRobbed's answer is great, except that it leaves an AVCaptureSession running in the background all the time. On my iPhone 4s it takes about 12% CPU according to Instruments, so my app used about 1% of the battery per minute. In other words, keeping the device prepared for AV capture isn't cheap.
With the code below, my app uses 0.187% per minute, so battery life is more than 5x longer.
This code works fine on any device (tested on both a 3GS (no flash) and a 4s), and also on 4.3 in the simulator.
#import <AVFoundation/AVFoundation.h>

- (void)turnTorchOn:(BOOL)on {
    Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
    if (captureDeviceClass != nil) {
        AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if ([device hasTorch] && [device hasFlash]) {
            [device lockForConfiguration:nil];
            if (on) {
                [device setTorchMode:AVCaptureTorchModeOn];
                [device setFlashMode:AVCaptureFlashModeOn];
                torchIsOn = YES;
            } else {
                [device setTorchMode:AVCaptureTorchModeOff];
                [device setFlashMode:AVCaptureFlashModeOff];
                torchIsOn = NO;
            }
            [device unlockForConfiguration];
        }
    }
}
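Usage is then a single call (torchIsOn is assumed to be an instance variable tracking the current state):

[self turnTorchOn:YES];  // light on
[self turnTorchOn:NO];   // light off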
See a better answer below: https://stackoverflow.com/a/10054088/308315
Old answer:
First, in your AppDelegate .h file:
#import <AVFoundation/AVFoundation.h>

@interface AppDelegate : NSObject <UIApplicationDelegate> {
    AVCaptureSession *torchSession;
}

@property (nonatomic, retain) AVCaptureSession *torchSession;

@end
Then in your AppDelegate .m file:
@implementation AppDelegate

@synthesize torchSession;

- (void)dealloc {
    [torchSession release];
    [super dealloc];
}

- (id)init {
    if ((self = [super init])) {
        // initialize flashlight
        // test if this class even exists to ensure flashlight is turned on ONLY for iOS 4 and above
        Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
        if (captureDeviceClass != nil) {
            AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            if ([device hasTorch] && [device hasFlash]) {
                if (device.torchMode == AVCaptureTorchModeOff) {
                    NSLog(@"Setting up flashlight for later use...");
                    AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
                    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
                    AVCaptureSession *session = [[AVCaptureSession alloc] init];

                    [session beginConfiguration];
                    [device lockForConfiguration:nil];

                    [session addInput:flashInput];
                    [session addOutput:output];

                    [device unlockForConfiguration];
                    [output release];

                    [session commitConfiguration];
                    [session startRunning];

                    [self setTorchSession:session];
                    [session release];
                }
            }
        }
    }
    return self;
}
Then anytime you want to turn it on, just do something like this:
// test if this class even exists to ensure flashlight is turned on ONLY for iOS 4 and above
Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
if (captureDeviceClass != nil) {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOn];
    [device setFlashMode:AVCaptureFlashModeOn];
    [device unlockForConfiguration];
}
And similarly for turning it off:
// test if this class even exists to ensure flashlight is turned on ONLY for iOS 4 and above
Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
if (captureDeviceClass != nil) {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOff];
    [device setFlashMode:AVCaptureFlashModeOff];
    [device unlockForConfiguration];
}
You need to call lockForConfiguration: on the AVCaptureDevice you set up (in your code, the videoCaptureDevice) before changing the torch mode:
[videoCaptureDevice lockForConfiguration:nil];
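In context, that is (a minimal completion of the snippet from the question):

[videoCaptureDevice lockForConfiguration:nil];
videoCaptureDevice.torchMode = AVCaptureTorchModeOn; // must happen between lock and unlock
[videoCaptureDevice unlockForConfiguration];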
From iOS 6.0 and above, to toggle the torch/flash on and off:
- (void)toggleFlash {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        [device lockForConfiguration:nil];
        [device setFlashMode:(device.flashActive) ? AVCaptureFlashModeOff : AVCaptureFlashModeOn];
        [device setTorchMode:(device.torchActive) ? AVCaptureTorchModeOff : AVCaptureTorchModeOn];
        [device unlockForConfiguration];
    }
}
P.S. This approach is only advisable if you don't need a plain on/off function. Remember there is one more option, Auto, i.e. AVCaptureFlashModeAuto and AVCaptureTorchModeAuto. To support auto mode as well, you have to keep track of the current mode and switch the flash and torch modes based on it, as sketched below.
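A sketch of that three-way cycle (off, then on, then auto), assuming the hasTorch/hasFlash checks above have already passed:

- (void)cycleTorchMode {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureTorchMode next;
    switch (device.torchMode) {
        case AVCaptureTorchModeOff:  next = AVCaptureTorchModeOn;   break;
        case AVCaptureTorchModeOn:   next = AVCaptureTorchModeAuto; break;
        default:                     next = AVCaptureTorchModeOff;  break;
    }
    if ([device isTorchModeSupported:next] && [device lockForConfiguration:nil]) {
        device.torchMode = next;
        [device unlockForConfiguration];
    }
}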
Swift 2.0 version:
func setTorchLevel(torchLevel: Float)
{
    self.captureSession?.beginConfiguration()
    defer {
        self.captureSession?.commitConfiguration()
    }
    if let device = backCamera?.device where device.hasTorch && device.torchAvailable {
        do {
            try device.lockForConfiguration()
            defer {
                device.unlockForConfiguration()
            }
            if torchLevel <= 0.0 {
                device.torchMode = .Off
            }
            else {
                // Clamp to the maximum available level; this also covers 0 < torchLevel < 1.0,
                // which the original code silently ignored.
                try device.setTorchModeOnWithLevel(min(torchLevel, AVCaptureMaxAvailableTorchLevel))
            }
        }
        catch let error {
            print("Failed to set up torch level with error \(error)")
            return
        }
    }
}
// import framework in .h file
#import <AVFoundation/AVFoundation.h>
{
    AVCaptureSession *torchSession;
}
@property (nonatomic, retain) AVCaptureSession *torchSession;
- (IBAction)onoff:(id)sender;

// implement in .m file
@synthesize torchSession;

- (IBAction)onoff:(id)sender
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash])
    {
        if (device.torchMode == AVCaptureTorchModeOff)
        {
            [button setTitle:@"OFF" forState:UIControlStateNormal];
            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *session = [[AVCaptureSession alloc] init];

            [session beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [session addInput:flashInput];
            [session addOutput:output];
            [device unlockForConfiguration];
            [output release];
            [session commitConfiguration];
            [session startRunning];
            [self setTorchSession:session];
            [session release];
        }
        else
        {
            [button setTitle:@"ON" forState:UIControlStateNormal];
            [torchSession stopRunning];
        }
    }
}

- (void)dealloc
{
    [torchSession release];
    [super dealloc];
}
This works very well; hope it helps someone!
- (IBAction)flashlight:(id)sender {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        if (device.torchMode == AVCaptureTorchModeOff) {
            [sender setTitle:@"Torch Off" forState:UIControlStateNormal];
            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *cam = [[AVCaptureSession alloc] init];

            [cam beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [cam addInput:flashInput];
            [cam addOutput:output];
            [device unlockForConfiguration];
            [cam commitConfiguration];
            [cam startRunning];
            [self setTorchSession:cam];
        }
        else {
            [sender setTitle:@"Torch On" forState:UIControlStateNormal];
            [_torchSession stopRunning];
        }
    }
}
