AVCaptureSession in a modal view controller on iOS 5 with ARC

I'm going insane trying to get an AVCaptureSession (in a view controller) to be presented and dismissed in my project. I'm currently on iOS 5.1 and have ARC enabled.
I can get it to work fine the first time I present the view controller and start the session, but when I dismiss and present it a second time the session will not start. I subscribed to the AVCaptureSessionRuntimeErrorNotification notification and receive the following error:
"Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo=0x1a4020 {NSLocalizedRecoverySuggestion=Try again later., NSLocalizedDescription=Cannot Complete Action}"
I'm assuming that something is not being properly released in my session, but with ARC there are no explicit releases; instead I set everything that should be released to nil.
My viewDidLoad method basically just triggers initCamera.
initCamera method:
AVCaptureSession *tmpSession = [[AVCaptureSession alloc] init];
session = tmpSession;
session.sessionPreset = AVCaptureSessionPresetMedium;

captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

rearCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
input = [AVCaptureDeviceInput deviceInputWithDevice:rearCamera error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];

videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil];
[videoDataOutput setVideoSettings:outputSettings];
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];

queue = dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

[session addOutput:videoDataOutput];

NSNotificationCenter *notify = [NSNotificationCenter defaultCenter];
[notify addObserver:self
           selector:@selector(onVideoError:)
               name:AVCaptureSessionRuntimeErrorNotification
             object:session];

[session startRunning];

[rearCamera lockForConfiguration:nil];
rearCamera.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
rearCamera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
rearCamera.focusMode = AVCaptureFocusModeContinuousAutoFocus;
[rearCamera unlockForConfiguration];
The method
captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
gets called with no problem the first time I present the modal view controller, but on the second attempt this method stops getting called (because the session does not start).
For cleanup I'm calling stopSession from my parent view controller before dismissing, and it does the following:
if ([session isRunning]) {
    [session removeInput:input];
    [session stopRunning];

    [vImagePreview removeFromSuperview];
    vImagePreview = nil;

    input = nil;
    videoDataOutput = nil;
    captureVideoPreviewLayer = nil;
    session = nil;
    queue = nil;
}
I feel like I've tried all sorts of things, such as performing a dispatch_sync(queue, ^{}) on the queue to wait for it to be flushed, but that doesn't seem to make a difference (when calling dispatch_sync I removed the dispatch_release call in my initCamera method). I've also tried the dispatch_set_finalizer_f(queue, capture_cleanup) approach suggested in another question, but I don't know what actually needs to go in the capture_cleanup function, because all the examples I can find are non-ARC code where they call release on a pointer to self. I've also combed through all of the sample code I can find from Apple (SquareCam and AVCam), but these are also non-ARC. Any help would be greatly appreciated.
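For what it's worth, here is a minimal ARC teardown sketch using the ivar names from the question (an illustration of ordering, not a verified fix): stop the session, drain the serial queue so any in-flight callback finishes, then detach the delegate and drop the last references.

- (void)stopSession {
    [session stopRunning];
    // Wait for any captureOutput: callback still running on the serial queue.
    dispatch_sync(queue, ^{});
    [videoDataOutput setSampleBufferDelegate:nil queue:NULL];

    [session removeInput:input];
    [session removeOutput:videoDataOutput];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVCaptureSessionRuntimeErrorNotification
                                                  object:session];

    [captureVideoPreviewLayer removeFromSuperlayer];
    [vImagePreview removeFromSuperview];
    vImagePreview = nil;
    captureVideoPreviewLayer = nil;
    videoDataOutput = nil;
    input = nil;
    session = nil;

    // Under the iOS 5 SDK dispatch objects are not ARC-managed: keep ownership of
    // the queue in initCamera (skip the dispatch_release there) and release it here,
    // after the drain. On later SDKs ARC manages the queue and nil-ing it suffices.
    dispatch_release(queue);
    queue = NULL;
}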

I realized that I was performing a setFocusPointOfInterest on my rear camera, and for some reason it was corrupting the session on relaunch. I don't understand why this caused the issue, but I will be looking into it.
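If it helps anyone: setFocusPointOfInterest: must only be called between lockForConfiguration:/unlockForConfiguration: and only when the device supports it. A minimal guard (rearCamera is the ivar from the question; the CGPoint is just an example value):

NSError *error = nil;
if ([rearCamera isFocusPointOfInterestSupported] &&
    [rearCamera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
    if ([rearCamera lockForConfiguration:&error]) {
        // The point is in the (0,0)-(1,1) device coordinate space; (0.5, 0.5) is the center.
        rearCamera.focusPointOfInterest = CGPointMake(0.5, 0.5);
        rearCamera.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        [rearCamera unlockForConfiguration];
    } else {
        NSLog(@"Could not lock camera for configuration: %@", error);
    }
}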

You might try converting the SquareCam project to ARC before using the source in your program. I was able to do so by using a __bridge cast in the places where the converter was complaining, and by replacing the "bail:" gotos with simple if statements.
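For example, a sketch of the two mechanical changes (not SquareCam's exact lines; sampleBuffer stands for the CMSampleBufferRef you get in the capture callback, and CoreMedia must be imported):

// Before (MRC): the CFDictionaryRef was handed straight to an NSDictionary parameter,
// with "goto bail;" on failure. After (ARC), roughly:
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
                                                            sampleBuffer,
                                                            kCMAttachmentMode_ShouldPropagate);
NSDictionary *options = (__bridge NSDictionary *)attachments; // cast only, no ownership transfer
if (options == nil) {
    // Previously: goto bail; now handle the failure inline.
    NSLog(@"Could not read attachments");
} else {
    // ... use options ...
}
if (attachments) {
    CFRelease(attachments); // CF objects are still released manually under ARC
}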

Related

What causes this type of error: "No visible @interface for xxxxxx"? The root cause, not just this one instance

I see there are 869 questions ranging over many years asking how to fix this error. What I'm trying to understand is the fundamental reason for this message.
Each one of the 869 questions is different, including this one. I'm updating some code from Apple's resources, the AVCam sample code, to work with iOS 11, in Objective-C.
The line returning this error in this case is
[stillImageOutput setOutputSettings:outputSettings];
However, what I really need is to understand the root causes of this error, not just in this case, but in most cases.
I first put this project together in 2014, and of course there have been a dozen updates since then, so there were many deprecated statements. I have it down to 8 deprecations now, but three errors popped up, and this has been one of them: two "No visible @interface" and one "No known class".
//AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
if ([session canAddOutput:stillImageOutput])
{
    // [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
    NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecTypeJPEG };
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];
    [self setStillImageOutput:stillImageOutput];
}
});
As Sergiy's comment says, the methods reported as "No visible @interface" and the class reported as "No known class" were removed from the iOS 11 SDK.
Apple sometimes makes these drastic changes, but generally such methods and classes are first marked as deprecated and then, after a few major iOS updates, removed.
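In general, "No visible @interface for 'X' declares the selector 'y'" means exactly what it says: the compiler cannot find a declaration of that method in any @interface for the class that is visible from the call site (the imported header, a class extension, or an adopted protocol). A contrived repro (class and method names are hypothetical):

#import <Foundation/Foundation.h>

@interface Widget : NSObject
- (void)spin; // declared, therefore visible to callers that import this interface
@end

@implementation Widget
- (void)spin {}
@end

void demo(Widget *w) {
    [w spin];    // OK: the selector is declared in a visible @interface
    [w explode]; // error: No visible @interface for 'Widget' declares the selector 'explode'
}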
OK, I understand the circumstances behind Apple's decision. As for the code above, I solved it simply by changing it to the version below; the redundant code is commented out and the new code inserted.
However, the fundamental question of "No visible @interface" remained a mystery to me. What exactly does it mean?
//AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
if ([self.session canAddOutput:stillImageOutput])
{
    [self.session addOutput:stillImageOutput];
    self.stillImageOutput = stillImageOutput;
    // [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
    // NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecTypeJPEG };
    // [stillImageOutput setOutputSettings:outputSettings];
    //[session addOutput:stillImageOutput];
    //[self setStillImageOutput:stillImageOutput];
}
});
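Note that with AVCapturePhotoOutput the output settings moved off the output object: the codec is specified per capture via AVCapturePhotoSettings. A sketch, assuming self.stillImageOutput is the output added above and self adopts AVCapturePhotoCaptureDelegate:

AVCapturePhotoSettings *settings =
    [AVCapturePhotoSettings photoSettingsWithFormat:@{ AVVideoCodecKey : AVVideoCodecTypeJPEG }];
// The result is delivered to the delegate, e.g. in
// -captureOutput:didFinishProcessingPhoto:error: on iOS 11.
[self.stillImageOutput capturePhotoWithSettings:settings delegate:self];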

Method captureOutput:didOutputSampleBuffer:fromConnection: only called a few times

I'm capturing audio from an external Bluetooth microphone, but I can't record anything.
The following method is only called once, at the beginning of the current AVCaptureSession:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
After that, the method is never called again to process the audio.
To set up the capture session I do this:
self.captureSession.usesApplicationAudioSession = true;
self.captureSession.automaticallyConfiguresApplicationAudioSession = true;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                 withOptions:AVAudioSessionCategoryOptionAllowBluetooth
                                       error:nil];

/* Audio */
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
audioIn = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];
if ( [_captureSession canAddInput:audioIn] ) {
    [_captureSession addInput:audioIn];
}
[audioIn release];

audioOut = [[AVCaptureAudioDataOutput alloc] init];
// Put audio on its own queue to ensure that our video processing doesn't cause us to drop audio
dispatch_queue_t audioCaptureQueue = dispatch_queue_create( "com.apple.sample.capturepipeline.audio", DISPATCH_QUEUE_SERIAL );
[audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];
[audioCaptureQueue release];

if ( [self.captureSession canAddOutput:audioOut] ) {
    [self.captureSession addOutput:audioOut];
}
_audioConnection = [audioOut connectionWithMediaType:AVMediaTypeAudio];
[audioOut release];
With other Bluetooth devices it always works, just not with this one.
I thought the device might be faulty, but it actually works for recording audio in other apps.
The problem is really strange. Does anyone know what could be happening?
Thanks!
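One thing worth checking (an educated guess, not a confirmed fix): with automaticallyConfiguresApplicationAudioSession set to true, AVCaptureSession reconfigures the shared AVAudioSession itself and can override the PlayAndRecord/AllowBluetooth category you set just before, which may leave some Bluetooth inputs unrouted. A sketch that keeps the manual configuration instead:

self.captureSession.usesApplicationAudioSession = YES;
self.captureSession.automaticallyConfiguresApplicationAudioSession = NO; // keep our own config

NSError *error = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
              withOptions:AVAudioSessionCategoryOptionAllowBluetooth
                    error:&error];
[audioSession setActive:YES error:&error];

// Optionally route explicitly to the Bluetooth HFP input if it is present.
for (AVAudioSessionPortDescription *port in audioSession.availableInputs) {
    if ([port.portType isEqualToString:AVAudioSessionPortBluetoothHFP]) {
        [audioSession setPreferredInput:port error:&error];
        break;
    }
}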

AVCaptureVideoDataOutput on iOS 8 does not post sample buffers on the specified dispatch queue

When using AVCaptureVideoDataOutput and setting a sample buffer delegate with a dispatch queue (setSampleBufferDelegate:queue:), we are seeing on iOS 8 that AVFoundation does not post the sample buffers on the specified dispatch queue but rather always uses "com.apple.avfoundation.videodataoutput.bufferqueue".
This works as expected on iOS 7.
Has anyone else experienced this?
An obvious workaround is to manually call dispatch_sync in the callback to sync processing to the custom dispatch queue, but this, strangely, causes a deadlock...
Sample code that produces this issue:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:captureVideoPreviewLayer];

    [session addInput:[AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil]];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    queue = dispatch_queue_create("our.dispatch.queue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    [session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"Running on queue %@, queue that was set is %@, this is %s", dispatch_get_current_queue(),
          [captureOutput performSelector:@selector(sampleBufferCallbackQueue)],
          queue == dispatch_get_current_queue() ? "our queue" : "not our queue!!!");
}
What's probably happening here is that their queue, com.apple.avfoundation.videodataoutput.bufferqueue, has been set to target yours using dispatch_set_target_queue. This is functionally equivalent to dispatching to your queue, but would explain the name, and would also explain the deadlock when you tried to dispatch back to your queue.
In other words, just because the queue name isn't equal to your queue's name doesn't mean the block isn't executing on your queue.
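If you want to verify that your queue really is in the target chain (dispatch_get_current_queue is deprecated anyway), a sketch using queue-specific data; dispatch_get_specific walks the target-queue chain, so it finds the key even when Apple's queue merely targets yours:

// At setup time, tag the queue with a unique key.
static void *kOurQueueKey = &kOurQueueKey;
queue = dispatch_queue_create("our.dispatch.queue", DISPATCH_QUEUE_SERIAL);
dispatch_queue_set_specific(queue, kOurQueueKey, kOurQueueKey, NULL);

// In captureOutput:didOutputSampleBuffer:fromConnection::
BOOL onOurQueue = (dispatch_get_specific(kOurQueueKey) != NULL);
NSLog(@"on our queue (directly or via target): %d", onOurQueue);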
To get around this issue, I had to modify my -captureOutput: method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    dispatch_queue_t queue = ((MyAppDelegate *)UIApplication.sharedApplication.delegate).videoDataOutputQueue;
    CFRetain(sampleBuffer);
    dispatch_async(queue, ^{
        for (id<AVCaptureVideoDataOutputSampleBufferDelegate> target in captureTargets.copy)
            [target captureOutput:captureOutput didOutputSampleBuffer:sampleBuffer fromConnection:connection];
        CFRelease(sampleBuffer);
    });
}

iOS Custom Keyboard - camera not working

I want to create a custom keyboard that acts as a barcode scanner.
I've already done all the coding, but the output is not as expected: I am asked for camera permission (the first time), but the camera sends no video to the view.
I think there might be some restrictions on keyboards, for security reasons.
1.) Turn on the torch
-(void) turnFlashOn
{
    AVCaptureDevice *flashLight = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([flashLight isTorchAvailable] && [flashLight isTorchModeSupported:AVCaptureTorchModeOn])
    {
        BOOL success = [flashLight lockForConfiguration:nil];
        if (success) {
            NSError *error;
            [flashLight setTorchMode:AVCaptureTorchModeOn];
            [flashLight setTorchModeOnWithLevel:1.0 error:&error];
            NSLog(@"Error: %@", error);
            [flashLight unlockForConfiguration];
            NSLog(@"flash turned on -> OK");
        }
        else
        {
            NSLog(@"flash turn on -> ERROR");
        }
    }
}
This gives me this log output, but nothing happens with the flash:
Error: (null)
flash turned on -> OK
2.) Scan the barcode (part of viewDidLoad)
// SCANNER PART
self.captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
if (videoInput)
    [self.captureSession addInput:videoInput];
else
    NSLog(@"Error: %@", error);

AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
[self.captureSession addOutput:metadataOutput];
[metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
camView = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
previewLayer.frame = camView.layer.bounds;
[camView.layer addSublayer:previewLayer];

self.keyboard.barcodeView.clipsToBounds = YES;
camView.center = CGPointMake(self.keyboard.barcodeView.frame.size.width/2, self.keyboard.barcodeView.frame.size.height/2);
[self.keyboard.barcodeView addSubview:camView];
And when I press a special key on my keyboard, this method is called:
-(void)scanBarcodeNow {
    AudioServicesPlaySystemSound(systemSoundTock);
    NSLog(@"Start scanning...");
    self.keyboard.barcodeView.hidden = false;
    [self.keyboard.barcodeView addSubview:camView];
    [self.keyboard.barcodeView setBackgroundColor:[UIColor redColor]];
    [self.captureSession startRunning];
}
The only thing that happens is that the keyboard.barcodeView changes its background color to red. I added this to confirm that all the wiring I've done is okay. But no video from the camera is shown...
Can anyone help me out?
The reason you're getting nothing back is that you don't have access to the camera; it's actually not a bug. According to Apple's guidelines, certain APIs are not available to iOS 8 extensions, and the App Extension Programming Guide's list of unavailable capabilities includes accessing the camera or microphone on an iOS device.
It sucks, but I always encourage people to read up on new features and check whether what they want to do is possible before diving into an idea (it saves a lot of time). Definitely check out the App Extension Programming Guide for more information.

How do I properly cleanup an AVCaptureSession and AVCaptureVideoPreviewLayer

I'm using the AVFoundation API to create a camera preview view, and I'm having trouble cleaning up after I'm done.
The best answer I've found to this problem is in this SO thread; thanks, Codo.
However, he doesn't address the deallocation of the AVCaptureVideoPreviewLayer, and that's where I'm having trouble.
In my view controller class I have some initialization code in a startCameraCapture method. Following Codo's answer, I'm using dispatch_set_finalizer_f(_captureQueue, capture_cleanup); to register a callback to be called when the queue is truly closed.
I'm also retaining self to make sure my object doesn't go away before the queue is done calling into it. I then use the capture_cleanup callback to release self.
-(void) startCameraCapture {
    _camSession = [[AVCaptureSession alloc] init];
    if (_previewLayer == nil) {
        _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_camSession];
    }
    _previewLayer.frame = self.compView.bgView.frame;
    [self.compView.bgView.layer addSublayer:_previewLayer];

    // Get the default camera device
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create an AVCaptureInput with the camera device
    NSError *error = nil;
    AVCaptureInput *cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    if (cameraInput == nil) {
        NSLog(@"Error creating camera capture: %@", error);
    }

    AVCaptureVideoDataOutput *videoOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];

    // create a queue to run the capture on
    _captureQueue = dispatch_queue_create("captureQueue", NULL);
    dispatch_set_context(_captureQueue, self);
    dispatch_set_finalizer_f(_captureQueue, capture_cleanup);

    // setup our delegate
    [videoOutput setSampleBufferDelegate:self queue:_captureQueue];
    dispatch_release(_captureQueue);

    // retain self as a workaround for a queue finalization bug in Apple's SDK,
    // per Stack Overflow answer https://stackoverflow.com/questions/3741121/how-to-properly-release-an-avcapturesession
    [self retain];

    // configure the pixel format
    videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
        nil];

    // and the size of the frames we want
    [_camSession setSessionPreset:AVCaptureSessionPresetMedium];

    // Add the input and output
    [_camSession addInput:cameraInput];
    [_camSession addOutput:videoOutput];
    [cameraInput release];

    // Start the session
    [_camSession startRunning];
}
Here is the capture_cleanup callback:
static void capture_cleanup(void *p)
{
    LiveCompViewController *ar = (LiveCompViewController *)p;
    [ar release]; // releases capture session if dealloc is called
}
Then my cleanup code looks like this:
-(void) stopCameraCapture {
    [_camSession stopRunning];
    [_camSession release];
    _camSession = nil;

    // Remove the layer in order to release the camSession
    [_previewLayer removeFromSuperlayer];
    _previewLayer = nil;
}
The problem I'm having is that removing the _previewLayer from the superlayer in stopCameraCapture is causing the following console error:
"...modifying layer that is being finalized..."
But I need to remove the layer so that it gets released and deallocated, so that it releases the _camSession, which in turn releases the dispatch queue, which finally calls my capture_cleanup callback, which finally releases self.
I don't understand why I'm getting the console error or how to fix it. Why is the layer being finalized at the time I'm calling [_previewLayer removeFromSuperlayer], if self's dealloc hasn't been called?
Note: self is a view controller, and I haven't popped it yet, so it is retained by the UINavigationController.
Try stopping the session before releasing:
[captureSession stopRunning];
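If that alone doesn't help, it may also be worth detaching the preview layer from the session before dropping the last references (an assumption on my part, not part of the original answer; AVCaptureVideoPreviewLayer's session property is settable):

-(void) stopCameraCapture {
    [_camSession stopRunning];

    // Detach the preview layer from the session before releasing our references,
    // so the layer is no longer tied to a session that is being finalized.
    _previewLayer.session = nil;
    [_previewLayer removeFromSuperlayer];
    _previewLayer = nil;

    [_camSession release];
    _camSession = nil;
}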
