How to turn off Torch when leaving (segue) from view? - ios

I have made an app that utilizes the torch. I have a button that turns the torch on and off. However, if the user turns the torch on and then navigates away from the view, the torch stays on; the user must navigate back to the view to turn it off. What I want is for the torch to turn off automatically when the user navigates away from the page.
Using Xcode 5.1.1 and iOS 7; this app is primarily for iPhone.
Here is the code I use for Torch:
#import "ViewController.h"
#interface ViewController ()
#end
#implementation ViewController
#synthesize btnFlash;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)btnFlashOnClicked:(id)sender
{
AVCaptureDevice *flashLight = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([flashLight isTorchAvailable] && [flashLight isTorchModeSupported:AVCaptureTorchModeOn])
{
BOOL success = [flashLight lockForConfiguration:nil];
if (success)
{
if ([flashLight isTorchActive])
{
[btnFlash setTitle:#"Light On" forState:UIControlStateNormal];
[flashLight setTorchMode:AVCaptureTorchModeOff];
}
else
{
[btnFlash setTitle:#"Light Off" forState:UIControlStateNormal];
[flashLight setTorchMode:AVCaptureTorchModeOn];
}
[flashLight unlockForConfiguration];
}
}
}
#end

OK, break down the functionality into methods that allow easy querying of the current light state and methods to turn it on or off:
- (AVCaptureDevice *)flashLight
{
    AVCaptureDevice *flashLight = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![flashLight isTorchAvailable] ||
        ![flashLight isTorchModeSupported:AVCaptureTorchModeOn]) {
        flashLight = nil;
    }
    return flashLight;
}

- (BOOL)isLightOn
{
    BOOL isOn = NO;
    AVCaptureDevice *flashLight = [self flashLight];
    if ([flashLight lockForConfiguration:nil]) {
        isOn = [flashLight isTorchActive];
        [flashLight unlockForConfiguration];
    }
    return isOn;
}

- (void)turnLightOn:(BOOL)on
{
    AVCaptureDevice *flashLight = [self flashLight];
    if ([flashLight lockForConfiguration:nil]) {
        [flashLight setTorchMode:on ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
        [flashLight unlockForConfiguration];
    }
}
That makes it easy to simply call turnLightOn:NO regardless of the current state, and easier to work with in your action method:
- (IBAction)btnFlashOnClicked:(id)sender
{
    BOOL newState = ![self isLightOn];
    [self turnLightOn:newState];
    [btnFlash setTitle:newState ? @"Light Off" : @"Light On"
              forState:UIControlStateNormal];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self turnLightOn:NO];
}

You could turn the torch off inside -viewDidDisappear:. Simply add the following to your view controller:
- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    // your code to turn the torch off goes here
}
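For example, a self-contained torch-off using AVCaptureDevice directly (a minimal sketch, assuming no other helper methods in the controller):
- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];

    // Grab the default video device and force the torch off.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device lockForConfiguration:nil]) {
        [device setTorchMode:AVCaptureTorchModeOff];
        [device unlockForConfiguration];
    }
}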

Related

usually crash at [ZXCapture dealloc]

I am using ZXCapture. My program often crashes at the specific point marked in the following code.
- (void)dealloc {
    if (_lastScannedImage) {
        CGImageRelease(_lastScannedImage); // crash here
    }
    if (_session && _session.inputs) {
        for (AVCaptureInput *input in _session.inputs) {
            [_session removeInput:input];
        }
    }
    if (_session && _session.outputs) {
        for (AVCaptureOutput *output in _session.outputs) {
            [_session removeOutput:output];
        }
    }
}
Calling the following before dismissing the view controller:
[self.capture.layer removeFromSuperlayer];
[self.capture stop];
[self dismissViewControllerAnimated:YES completion:nil];
solved the issue.
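In other words, the teardown belongs before the dismissal rather than in dealloc. A sketch of where this might live (doneButtonPressed: is a hypothetical name; the capture property follows the ZXingObjC sample code):
// Hypothetical dismiss handler: stop capture and detach the preview
// layer before the controller is torn down, so dealloc has less to do.
- (IBAction)doneButtonPressed:(id)sender {
    [self.capture.layer removeFromSuperlayer];
    [self.capture stop];
    [self dismissViewControllerAnimated:YES completion:nil];
}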

no known instance method for selector - Obj-C

For some reason I am getting this error in my code and cannot figure it out. I am trying to have a QR Scanner in my app for a class project. Thanks in advance.
ScannerViewController.h:
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#protocol ScannerViewControllerDelegate;
#interface ScannerViewController : ViewController <AVCaptureMetadataOutputObjectsDelegate>
#property (nonatomic, weak) id<ScannerViewControllerDelegate> delegate;
#property (assign, nonatomic) BOOL touchToFocusEnabled;
- (BOOL) isCameraAvailable;
- (void) startScanning;
- (void) stopScanning;
- (void) setTorch:(BOOL) aStatus;
#end
#protocol AMScanViewControllerDelegate <NSObject>
#optional
- (void) scanViewController:(ScannerViewController *) aCtler didTapToFocusOnPoint:(CGPoint) aPoint;
- (void) scanViewController:(ScannerViewController *) aCtler didSuccessfullyScan:(NSString *) aScannedValue;
#end
ScannerViewController.m:
#import "ScannerViewController.h"
#interface ScannerViewController ()
#property (strong, nonatomic) AVCaptureDevice* device;
#property (strong, nonatomic) AVCaptureDeviceInput* input;
#property (strong, nonatomic) AVCaptureMetadataOutput* output;
#property (strong, nonatomic) AVCaptureSession* session;
#property (strong, nonatomic) AVCaptureVideoPreviewLayer* preview;
#end
#implementation ScannerViewController
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self) {
}
return self;
}
- (void)viewWillAppear:(BOOL)animated;
{
[super viewWillAppear:animated];
if(![self isCameraAvailable]) {
[self setupNoCameraView];
}
}
- (void) viewDidAppear:(BOOL)animated;
{
[super viewDidAppear:animated];
}
- (void)viewDidLoad
{
[super viewDidLoad];
if([self isCameraAvailable]) {
[self setupScanner];
}
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (void)touchesBegan:(NSSet*)touches withEvent:(UIEvent*)evt
{
if(self.touchToFocusEnabled) {
UITouch *touch=[touches anyObject];
CGPoint pt= [touch locationInView:self.view];
[self focus:pt];
}
}
#pragma mark -
#pragma mark NoCamAvailable
- (void) setupNoCameraView;
{
UILabel *labelNoCam = [[UILabel alloc] init];
labelNoCam.text = #"No Camera available";
labelNoCam.textColor = [UIColor blackColor];
[self.view addSubview:labelNoCam];
[labelNoCam sizeToFit];
labelNoCam.center = self.view.center;
}
- (NSUInteger)supportedInterfaceOrientations;
{
return UIInterfaceOrientationMaskLandscape;
}
- (BOOL)shouldAutorotate;
{
return (UIDeviceOrientationIsLandscape([[UIDevice currentDevice] orientation]));
}
- (void)didRotateFromInterfaceOrientation: (UIInterfaceOrientation)fromInterfaceOrientation;
{
if([[UIDevice currentDevice] orientation] == UIDeviceOrientationLandscapeLeft) {
AVCaptureConnection *con = self.preview.connection;
con.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
} else {
AVCaptureConnection *con = self.preview.connection;
con.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
}
}
#pragma mark -
#pragma mark AVFoundationSetup
- (void) setupScanner;
{
self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
self.session = [[AVCaptureSession alloc] init];
self.output = [[AVCaptureMetadataOutput alloc] init];
[self.session addOutput:self.output];
[self.session addInput:self.input];
[self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
self.output.metadataObjectTypes = #[AVMetadataObjectTypeQRCode];
self.preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
self.preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.preview.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
AVCaptureConnection *con = self.preview.connection;
con.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
[self.view.layer insertSublayer:self.preview atIndex:0];
}
#pragma mark -
#pragma mark Helper Methods
- (BOOL) isCameraAvailable;
{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
return [videoDevices count] > 0;
}
- (void)startScanning;
{
[self.session startRunning];
}
- (void) stopScanning;
{
[self.session stopRunning];
}
- (void) setTorch:(BOOL) aStatus;
{
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[device lockForConfiguration:nil];
if ( [device hasTorch] ) {
if ( aStatus ) {
[device setTorchMode:AVCaptureTorchModeOn];
} else {
[device setTorchMode:AVCaptureTorchModeOff];
}
}
[device unlockForConfiguration];
}
- (void) focus:(CGPoint) aPoint;
{
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if([device isFocusPointOfInterestSupported] &&
[device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
CGRect screenRect = [[UIScreen mainScreen] bounds];
double screenWidth = screenRect.size.width;
double screenHeight = screenRect.size.height;
double focus_x = aPoint.x/screenWidth;
double focus_y = aPoint.y/screenHeight;
if([device lockForConfiguration:nil]) {
// Error here ------------------------
if([self.delegate respondsToSelector:#selector(scanViewController:didTapToFocusOnPoint:)]) {
[self.delegate scanViewController:self didTapToFocusOnPoint:aPoint];
}
// ------------------- End
[device setFocusPointOfInterest:CGPointMake(focus_x,focus_y)];
[device setFocusMode:AVCaptureFocusModeAutoFocus];
if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]){
[device setExposureMode:AVCaptureExposureModeAutoExpose];
}
[device unlockForConfiguration];
}
}
}
#pragma mark -
#pragma mark AVCaptureMetadataOutputObjectsDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
fromConnection:(AVCaptureConnection *)connection
{
for(AVMetadataObject *current in metadataObjects) {
if([current isKindOfClass:[AVMetadataMachineReadableCodeObject class]]) {
//Error in this line here ---------------
if([self.delegate respondsToSelector:#selector(scanViewController:didSuccessfullyScan:)]) {
NSString *scannedValue = [((AVMetadataMachineReadableCodeObject *) current) stringValue];
[self.delegate scanViewController:self didSuccessfullyScan:scannedValue];
// ----------------------------End
}
}
}
}
#end
If anyone has a better tutorial than this one, please feel free to share it, because they are hard to come by for some reason.
Referenced Tutorial:
http://www.ama-dev.com/iphone-qr-code-library-ios-7/
I'm missing where you tell the compiler that the delegate is an <AMScanViewControllerDelegate>.
Here:
`@protocol AMScanViewControllerDelegate <NSObject>`
should have been:
`@protocol ScannerViewControllerDelegate <NSObject>`
since everything else in the code hinted at that name.
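So the protocol block at the bottom of ScannerViewController.h becomes:
// Renamed so it matches the forward declaration and the
// id<ScannerViewControllerDelegate> type of the delegate property.
@protocol ScannerViewControllerDelegate <NSObject>
@optional
- (void)scanViewController:(ScannerViewController *)aCtler didTapToFocusOnPoint:(CGPoint)aPoint;
- (void)scanViewController:(ScannerViewController *)aCtler didSuccessfullyScan:(NSString *)aScannedValue;
@end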

Opening camera in half screen and webview in half screen on same view on ios

My requirement is to open a web view in the upper half of the iPhone screen and the camera for video recording in the lower half. Is this possible, and if it is, please describe how to achieve it. I have been struggling with this for the past 3 days. Here's how I capture the video:
#import "RecordVideoViewController.h"
#interface RecordVideoViewController ()
#end
#implementation RecordVideoViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
x=1;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)recordAndPlay:(id)sender {
[self startCameraControllerFromViewController:self usingDelegate:self];
}
-(BOOL)startCameraControllerFromViewController:(UIViewController*)controller
usingDelegate:(id )delegate {
// 1 - Validations
if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
|| (delegate == nil)
|| (controller == nil)) {
return NO;
}
// 2 - Get image picker
UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
// Displays a control that allows the user to choose movie capture
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
// Hides the controls for moving & scaling pictures, or for
// trimming movies. To instead show the controls, use YES.
cameraUI.allowsEditing = NO;
cameraUI.delegate = delegate;
//3 - Display image picker
[controller presentViewController:cameraUI animated:YES completion:nil];
return YES;
}
Solved it myself. Here's the code:
// ViewController.m
// AppleVideoCapture
// Copyright (c) 2014 NetProphets. All rights reserved.

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h> // for ALAssetsLibrary

@interface ViewController () {
    AVCaptureSession *session;
    AVCaptureMovieFileOutput *output;
}
@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    // 1. Set up an AV session
    session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
        session.sessionPreset = AVCaptureSessionPresetMedium;
    }
    // Get the front facing camera as input device
    AVCaptureDevice *device = [self frontCamera];
    // Set up the device capture input
    NSError *error;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error) {
        NSLog(@"Error with video capture...%@", [error description]);
    }
    else {
        if ([session canAddInput:videoInput])
            [session addInput:videoInput];
        else
            NSLog(@"Error adding video input to session");
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (error) {
        NSLog(@"Error with audio input...%@", [error description]);
    }
    else {
        if ([session canAddInput:audioInput])
            [session addInput:audioInput];
        else
            NSLog(@"Error adding audio to session");
    }
    // Customize and add your customized video capturing layer to the view
    CALayer *viewLayer = [self.view layer];
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.frame = CGRectMake(0.0f, 530.0f, 320.0f, -250.0f);
    [viewLayer addSublayer:previewLayer];
    // Configure the movie output
    output = [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
    [session startRunning];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (IBAction)recordVideo:(id)sender {
    NSLog(@"Record video called");
    NSString *path = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
    NSURL *outputUrl = [NSURL fileURLWithPath:path];
    NSFileManager *myManager = [NSFileManager defaultManager];
    NSError *error;
    if ([myManager fileExistsAtPath:path]) {
        if ([myManager removeItemAtPath:path error:&error] == NO) {
            NSLog(@"File removal at temporary directory failed..%@", [error description]);
        }
    }
    [output startRecordingToOutputFileURL:outputUrl recordingDelegate:self];
}

- (AVCaptureDevice *)frontCamera {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            return device;
        }
    }
    return nil;
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"Recording Finished");
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
    {
        [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                    completionBlock:^(NSURL *assetURL, NSError *error)
        {
            if (error)
            {
                NSLog(@"Error saving file to photos album");
            }
        }];
    }
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    NSLog(@"Recording Started");
}

- (IBAction)stopRecording:(id)sender {
    NSLog(@"Stop Recording called");
    [output stopRecording];
}

@end
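One gap in the stated requirement remains: the web view in the upper half. A minimal sketch of how it could be added inside viewDidLoad alongside the code above (UIWebView was current for iOS 7-era code; the URL and the even half-split frames are illustrative assumptions):
// Hypothetical upper-half web view; add next to the preview layer setup.
CGFloat halfHeight = self.view.bounds.size.height / 2.0f;
UIWebView *webView = [[UIWebView alloc] initWithFrame:
    CGRectMake(0, 0, self.view.bounds.size.width, halfHeight)];
[webView loadRequest:[NSURLRequest requestWithURL:
    [NSURL URLWithString:@"http://example.com"]]];
[self.view addSubview:webView];
// And give the preview layer the lower half instead of a hard-coded frame:
previewLayer.frame = CGRectMake(0, halfHeight, self.view.bounds.size.width, halfHeight);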

Implementing a strobe function

So I am trying to create an LED strobe light, and I have managed to make an on/off switch for the light. Here is my code:
@implementation ViewController

- (void)setTorchOn:(BOOL)isOn
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    [device setTorchMode:isOn ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
    [device unlockForConfiguration];
}

- (IBAction)changedSate:(id)sender {
    UISwitch *switchValue = (UISwitch *)sender;
    [self setTorchOn:[switchValue isOn]];
}
I was wondering if anyone could help me with the strobe part.
Just make a loop that continuously turns the torch on and off. The type of loop depends on how you want this implemented.
I think you should use the NSTimer class to repeatedly toggle the torch. There are other ways, but just do not loop with a sleep() call.
// Have an NSTimer *timer, a BOOL torchOn, and a BOOL stopStrobe property in your class...
- (void)startFlashing
{
    self.timer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                  target:self
                                                selector:@selector(toggleTorch)
                                                userInfo:nil
                                                 repeats:YES];
}

- (void)toggleTorch
{
    if (self.stopStrobe) {
        [self.timer invalidate];
    }
    self.torchOn = !self.torchOn;
    [self setTorchOn:self.torchOn];
}
// Set stopStrobe to YES elsewhere in your program when you want it to stop.
That is probably what you're looking for.
UPDATE: I know this isn't what you originally asked, but it's often best to learn by example, so here is a full example of using this (untested):
@interface ViewController ()
@property (nonatomic) BOOL torchOn;
@property (atomic) BOOL stopStrobe;
@property (strong, nonatomic) NSTimer *timer;
@end

@implementation ViewController

- (id)init
{
    self = [super init];
    if (self) {
        self.torchOn = NO;
        self.stopStrobe = NO;
    }
    return self;
}

// Custom setter for the torchOn property: keeps the backing ivar in
// sync and drives the hardware torch at the same time.
- (void)setTorchOn:(BOOL)isOn
{
    _torchOn = isOn;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    [device setTorchMode:isOn ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
    [device unlockForConfiguration];
}

- (void)toggleTorch
{
    if (self.stopStrobe) {
        [self.timer invalidate];
    }
    self.torchOn = !self.torchOn; // invokes the custom setter above
}

- (void)startFlashing
{
    self.timer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                  target:self
                                                selector:@selector(toggleTorch)
                                                userInfo:nil
                                                 repeats:YES];
}

- (IBAction)changedSate:(id)sender {
    UISwitch *switchValue = (UISwitch *)sender;
    if ([switchValue isOn]) {
        self.stopStrobe = NO;
        [self startFlashing];
    }
    else {
        self.stopStrobe = YES;
    }
}
This will start the flashing whenever you turn the switch on and stop it once you turn the switch off.
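As a design note, the stopStrobe flag only takes effect on the timer's next tick. A sketch of a simpler variant that invalidates the timer directly in the switch handler and forces the torch off at the same time (assumes the same timer property and setTorchOn: method as above):
- (IBAction)changedSate:(id)sender {
    UISwitch *switchValue = (UISwitch *)sender;
    if ([switchValue isOn]) {
        [self startFlashing];
    } else {
        // Stop the strobe immediately and leave the torch off.
        [self.timer invalidate];
        self.timer = nil;
        [self setTorchOn:NO];
    }
}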

AVCaptureDevice Low Light Boost Does Not Work After Release

I am working on a camera application that has a view controller, a manager, and a processor. The manager basically holds the AVCaptureSession, AVCaptureDeviceInput, and AVCaptureStillImageOutput to control image capture. Because the processor does heavy image-processing operations, I release the manager each time an image has been captured, to avoid crashing the app.
I also have a button to toggle automaticallyEnablesLowLightBoostWhenAvailable (the Low Light setting) as documented by Apple. Here's the toggle function:
- (IBAction)toggleLLBoost:(id)sender {
    if ([manager isLLBoostActivated]) {
        [self turnOffLLBoost];
    } else {
        [self turnOnLLBoost];
    }
}

- (void)turnOffLLBoost
{
    NSLog(@"LLBoost off");
    [boostBt setSelected:NO];
    [manager deactivateLLBoostMode];
    [[[manager videoInput] device] removeObserver:self forKeyPath:@"lowLightBoostEnabled"];
}

- (void)turnOnLLBoost
{
    NSLog(@"LLBoost on");
    [boostBt setSelected:YES];
    [manager activateLLBoostMode];
    [[[manager videoInput] device] addObserver:self forKeyPath:@"lowLightBoostEnabled" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld context:NULL];
}
It registers an observer when the Low Light setting is turned ON and removes the observer when it's turned OFF. Here's the code in the manager that activates/deactivates the setting:
- (void)activateLLBoostMode
{
    AVCaptureDevice *device = [videoInput device];
    if ([device isLowLightBoostSupported]) {
        if ([device lockForConfiguration:NULL]) {
            [device setAutomaticallyEnablesLowLightBoostWhenAvailable:YES];
            [device unlockForConfiguration];
            [self setIsLLBoostActivated:YES];
            NSLog(@"Low Light Boost is set? %hhd", [device automaticallyEnablesLowLightBoostWhenAvailable]);
        }
    }
}

- (void)deactivateLLBoostMode
{
    AVCaptureDevice *device = [videoInput device];
    if ([device isLowLightBoostSupported]) {
        if ([device lockForConfiguration:NULL]) {
            [device setAutomaticallyEnablesLowLightBoostWhenAvailable:NO];
            [device unlockForConfiguration];
            [self setIsLLBoostActivated:NO];
            NSLog(@"Low Light Boost is set? %hhd", [device automaticallyEnablesLowLightBoostWhenAvailable]);
        }
    }
}
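(The KVO callback itself isn't shown in the question; for reference, a minimal sketch of what the view controller's observer method would look like, with illustrative logging:)
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"lowLightBoostEnabled"]) {
        // Fires when the device toggles low light boost on or off.
        NSLog(@"lowLightBoostEnabled changed to %@", change[NSKeyValueChangeNewKey]);
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}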
Everything works perfectly at first: the observer gets called when lowLightBoostEnabled changes value. However, after I release the manager (session, device, output, etc.) and then set the manager up again upon finishing image processing, the observer never gets called, despite the fact that automaticallyEnablesLowLightBoostWhenAvailable has been set to YES.
Any advice or suggestion why this happens?
ADDITIONAL INFO:
Here's a snippet showing how the manager is initialized. After alloc/init, setupSession is called; on release, the dealloc method is called:
- (BOOL)setupSession
{
    [self deactivateLLBoostMode];
    [self setIsLLBoostActivated:NO];
    // Setting Session
    [self setSession:[AVCaptureSession new]];
    session.sessionPreset = AVCaptureSessionPresetPhoto;
    // Setting Input
    [self setVideoInput:[[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil]];
    if ([session canAddInput:videoInput]) {
        [session addInput:videoInput];
    }
    // Setting Output
    [self setStillImage:[AVCaptureStillImageOutput new]];
    if ([stillImage isStillImageStabilizationSupported]) {
        [stillImage setAutomaticallyEnablesStillImageStabilizationWhenAvailable:YES];
    }
    [stillImage setOutputSettings:@{AVVideoCodecKey: AVVideoCodecJPEG, AVVideoQualityKey: @0.6}];
    if ([session canAddOutput:stillImage]) {
        [session addOutput:stillImage];
    }
    return true;
}

- (void)dealloc
{
    [session stopRunning];
    self.session = nil;
    self.stillImage = nil;
    self.videoInput = nil;
    self.toSaveImage = nil;
    self.rawImages = nil;
}
