AVCaptureSession output is not properly displayed and AVCaptureMetadataOutputObjectsDelegate method is not called - ios

I have a single view application in which I am trying to test iOS7's AVCaptureMetadataOutput based on this explanation. My ViewController conforms to AVCaptureMetadataOutputObjectsDelegate and the code looks like this (almost exactly the same as Mattt's):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    // Testing the VIN Scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];

    [session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    NSString *code = nil;
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeCode39Code]) {
            code = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
            break;
        }
    }

    NSLog(@"code: %@", code);
}
When I run this on an iOS 7 device (I've tried an iPhone 4 and an iPhone 4s), Xcode logs "Setting up the vin scanner" but the camera (i.e. the AVCaptureSession) never opens.
Edit 1:
I added the following code to show the camera output on screen:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = self.view.frame;
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
But the display is very odd: it does not conform to the screen, and the way it rotates does not make sense. The other issue is that when I point the camera at a barcode, the metadata delegate method is never called. Please see the pictures below:

The camera will not open the way it does for the UIImagePickerController. The problem is that your code does nothing with the output. You'll need to add a preview layer to display the output of the camera as it streams in.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = CGRectMake(0.0, 0.0, self.view.frame.size.width, self.view.frame.size.height);
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
[session startRunning];
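If the preview looks stretched or rotates strangely (as in your Edit 1), it usually helps to give the layer a video gravity and pin the preview connection's orientation. A minimal sketch; locking to portrait is an assumption, match it to your UI:
// Fill the layer while preserving the video's aspect ratio
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

// Lock the preview to portrait (assumption: a portrait-only UI)
if (previewLayer.connection.isVideoOrientationSupported) {
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}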
Edit:
After taking a deeper look at your code I noticed a few more issues.
First, you also need to set the metadataObjectTypes you want to search for; right now you're not looking for any valid object types. This must be set after you add the output to the session (before that, the output's availableMetadataObjectTypes is empty). You can view the full list of available types in the documentation.
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
Second, your AVCaptureSession *session is a local variable in viewDidLoad, so it is released as soon as the method returns; make it a property declared just after your @interface ViewController (), as shown below.
@interface ViewController ()

@property (nonatomic, strong) AVCaptureSession *session;

@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Testing the VIN Scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [self.session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];
    [output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];

    [self.session startRunning];
}
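One more note for your particular use case: the list above scans for QR and EAN-13, but a VIN is a Code 39 barcode, so the Code 39 check in your delegate will never match unless you also request that type:
[output setMetadataObjectTypes:@[AVMetadataObjectTypeCode39Code,
                                 AVMetadataObjectTypeQRCode,
                                 AVMetadataObjectTypeEAN13Code]];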

Related

I need to use both AVCaptureVideoDataOutput and AVCaptureMetadataOutput

I'm writing an app that needs to look at the raw video (custom edge detection etc.) and use the metadata barcode reader.
Even though AVCaptureSession has an addOutput: method rather than a setOutput: method, it behaves as if only one of the two can deliver: whichever output is added first wins.
If I add the AVCaptureVideoDataOutput first, its delegate gets called (and the metadata delegate does not).
If I add the AVCaptureMetadataOutput first, its delegate gets called (and the video data delegate does not).
Has anyone figured out a way around this, short of removing the other output every other frame?
I was able to add both AVCaptureVideoDataOutput and AVCaptureMetadataOutput.
NSError *error = nil;

self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];

// Select a video device, make an input
AVCaptureDevice *captureDevice;
AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;

// Find the front-facing camera
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if ([device position] == desiredPosition) {
        captureDevice = device;
        break;
    }
}

AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];

if (!error) {
    [self.captureSession beginConfiguration];

    // Add the input to the session
    if ([self.captureSession canAddInput:deviceInput]) {
        [self.captureSession addInput:deviceInput];
    }

    AVCaptureMetadataOutput *metadataOutput = [AVCaptureMetadataOutput new];
    if ([self.captureSession canAddOutput:metadataOutput]) {
        [self.captureSession addOutput:metadataOutput];
        self.metaDataOutputQueue = dispatch_queue_create("MetaDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [metadataOutput setMetadataObjectsDelegate:self queue:self.metaDataOutputQueue];
        [metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    }

    self.videoDataOutput = [AVCaptureVideoDataOutput new];
    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCMPixelFormat_32BGRA]
                                                                      forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [self.videoDataOutput setVideoSettings:rgbOutputSettings];
        [self.videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [self.videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
        [[self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
    }

    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}
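With both outputs attached, the class simply implements both delegate protocols side by side. A minimal sketch of the two callbacks; the bodies are illustrative:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    // Runs on metaDataOutputQueue
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            NSLog(@"QR payload: %@", [(AVMetadataMachineReadableCodeObject *)metadata stringValue]);
        }
    }
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Runs on videoDataOutputQueue; raw BGRA frames for the custom edge detection
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... process pixelBuffer ...
}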

Set Background Camera behind UILabels

I have been trying for a long time and could not get it to work, but basically I would like to display the live camera feed in the background, behind my labels and buttons. Here is the code I am working with to make the camera appear:
- (void)viewDidLoad {
    [super viewDidLoad];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    [session addInput:input];

    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    newCaptureVideoPreviewLayer.frame = self.view.bounds;

    [self.view addSublayer:newCaptureVideoPreviewLayer.view];
    [self.view sendSubviewToBack:newCaptureVideoPreviewLayer.view];

    [session startRunning];
}
I do not know how to place it behind the labels in viewDidLoad. Any help would be much appreciated!
You just need to place the preview behind its siblings. Note that AVCaptureVideoPreviewLayer is a CALayer, not a UIView, so it has no .view property; you add it to the view's layer rather than calling addSubview:.
For more detail you can check Apple's AVCam example.
sendSubviewToBack:
Moves the specified subview so that it appears behind its siblings.
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.bounds;

[self.view.layer addSublayer:newCaptureVideoPreviewLayer];
[session startRunning];
OR insert the layer at the back of the view's layer hierarchy, so it sits behind the labels' and buttons' layers:
[self.view.layer insertSublayer:newCaptureVideoPreviewLayer atIndex:0];
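If you would rather stay with the UIView ordering APIs that sendSubviewToBack: belongs to, one sketch is to host the layer in a plain container view (cameraView is a hypothetical name):
// Host the preview layer inside an ordinary UIView...
UIView *cameraView = [[UIView alloc] initWithFrame:self.view.bounds];
newCaptureVideoPreviewLayer.frame = cameraView.bounds;
[cameraView.layer addSublayer:newCaptureVideoPreviewLayer];

// ...then the UIView ordering methods apply as usual
[self.view addSubview:cameraView];
[self.view sendSubviewToBack:cameraView];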

AVCam: Zoom further out (as is possible in photo mode)?

Using Xcode 6.4, iOS 8.4.1:
With AVCaptureSession, I would like to zoom further out (as much as the iPhone camera possibly can manage!)
I already use the "setVideoZoomFactor" method set equal to 1 (= its smallest value allowed). This works quite good (see code-example at the very bottom...). But then I did the following observation (recognising that the camera in photo-mode possibly manages to zoom even further out than being in video-mode):
The iPhone-camera in photo-mode shows a completely different zoom than the camera being in video-mode (at least for my iPhone 5S). You can test yourself using the native "Camera App" on your iPhone. Switch between PHOTO and VIDEO and you will see that the Photo-mode can possibly zoom further out than video-zoomfactor=1). How is that possible ???
And moreover, is there any way in achieving the same minimal zoomfactor the photo-mode achieves also in video-mode using AVCam under iOS ????
Here is an illustration of what the zoom-difference is between photo-mode and video-mode of my 5S iPhone (see picture):
Here is the code of the AVCamViewController:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    // http://stackoverflow.com/questions/25110055/ios-captureoutputdidoutputsamplebufferfromconnection-is-not-called
    dispatch_async(sessionQueue, ^{
        self.session = [AVCaptureSession new];
        self.session.sessionPreset = AVCaptureSessionPresetMedium;

        NSArray *devices = [AVCaptureDevice devices];
        AVCaptureDevice *backCamera;
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo]) {
                if ([device position] == AVCaptureDevicePositionBack) {
                    backCamera = device;
                }
            }
        }

        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
        if (error) {
            NSLog(@"%@", error);
        }
        if ([self.session canAddInput:input]) {
            [self.session addInput:input];
        }

        AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
        [output setSampleBufferDelegate:self queue:sessionQueue];
        output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
        if ([self.session canAddOutput:output]) {
            [self.session addOutput:output];
        }

        // Apply initial VideoZoomFactor to the device
        NSNumber *DefaultZoomFactor = [NSNumber numberWithFloat:1.0];
        if ([backCamera lockForConfiguration:&error])
        {
            // HERE IS THE ZOOMING DONE !!!!!!
            [backCamera setVideoZoomFactor:[DefaultZoomFactor floatValue]];
            [backCamera unlockForConfiguration];
        }
        else
        {
            NSLog(@"%@", error);
        }

        [self.session startRunning];
    });
}
If your problem is that your app appears zoomed in compared to the native iOS camera app, then this setup will probably help you.
I had this issue and the following solution fixed it.
The solution:
Configure your session
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];
--> self.session.sessionPreset = AVCaptureSessionPresetPhoto; <---
    ...
}
Be aware that this setup will only work for photos, not for videos. If you try it with video recording, your app will crash.
You can configure your session based upon your needs (photo or video).
For video you can use this value: AVCaptureSessionPresetHigh
BR.
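A small sketch of switching presets defensively, checking support before assigning (assumes the session property from the code above):
// Prefer the photo preset (widest field of view for stills),
// fall back to the video-friendly preset otherwise
if ([self.session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;
} else if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    self.session.sessionPreset = AVCaptureSessionPresetHigh;
}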

Quickblox video chat saving

I am using the QuickBlox iOS SDK for video chatting in my app. It works fine. Now I want to record the chat video and save it to the camera roll. How can I do that?
I have gone through their documentation and implemented this:
- (IBAction)record:(id)sender {
    // Create video chat
    videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
    [videoChat setIsUseCustomVideoChatCaptureSession:YES];

    // Create capture session
    captureSession = [[AVCaptureSession alloc] init];
    // ... setup capture session here

    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];

    /* We start the capture */
    [captureSession startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Do something with samples
    // ...

    // Forward video samples to the SDK
    [videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
But I am not sure where to go from here.
How should I get the video data?
From the QuickBlox docs:
To set up a custom video capture session you simply follow these steps:
create an instance of AVCaptureSession
set up the input and output
implement the frames callback and forward all frames to the QuickBlox iOS SDK
tell the QuickBlox SDK that you will use your own capture session
To set up the custom video capture session, set up the input and output:
- (void)setupVideoCapture {
    self.captureSession = [[AVCaptureSession alloc] init];

    __block NSError *error = nil;

    // Set preset
    [self.captureSession setSessionPreset:AVCaptureSessionPresetLow];

    // Setup the video input
    AVCaptureDevice *videoDevice = [self frontFacingCamera];
    //
    AVCaptureDeviceInput *captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error) {
        QBDLogEx(@"deviceInputWithDevice Video error: %@", error);
    } else {
        if ([self.captureSession canAddInput:captureVideoInput]) {
            [self.captureSession addInput:captureVideoInput];
        } else {
            QBDLogEx(@"cantAddInput Video");
        }
    }

    // Setup video output
    AVCaptureVideoDataOutput *videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoCaptureOutput.alwaysDiscardsLateVideoFrames = YES;

    // Set the video output to store frames in BGRA (it is supposed to be faster)
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [videoCaptureOutput setVideoSettings:videoSettings];

    /* And we add the output to the capture session */
    if ([self.captureSession canAddOutput:videoCaptureOutput]) {
        [self.captureSession addOutput:videoCaptureOutput];
    } else {
        QBDLogEx(@"cantAddOutput");
    }
    [videoCaptureOutput release];

    // Set FPS
    int framesPerSecond = 3;
    AVCaptureConnection *conn = [videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
    if (conn.isVideoMinFrameDurationSupported) {
        conn.videoMinFrameDuration = CMTimeMake(1, framesPerSecond);
    }
    if (conn.isVideoMaxFrameDurationSupported) {
        conn.videoMaxFrameDuration = CMTimeMake(1, framesPerSecond);
    }

    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];
    dispatch_release(callbackQueue);

    // Add preview layer
    AVCaptureVideoPreviewLayer *prewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession] autorelease];
    [prewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CGRect layerRect = [[myVideoView layer] bounds];
    [prewLayer setBounds:layerRect];
    [prewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    myVideoView.hidden = NO;
    [myVideoView.layer addSublayer:prewLayer];

    /* We start the capture */
    [self.captureSession startRunning];
}

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}

- (AVCaptureDevice *)backFacingCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

- (AVCaptureDevice *)frontFacingCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}
Implement the frames callback:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Usually we just forward camera frames to the QuickBlox SDK,
    // but we can also do something with them first, e.g. apply video filters
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
Tell the QuickBlox iOS SDK that we use our own video capture session:
self.videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
self.videoChat.viewToRenderOpponentVideoStream = opponentVideoView;

// We use our own video capture session
self.videoChat.isUseCustomVideoChatCaptureSession = YES;
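Note that the setup above only wires up the capture session; nothing is recorded yet. One way to actually save the chat video is to tee each sample buffer into an AVAssetWriter from the same callback that forwards frames to QuickBlox. This is a minimal sketch assuming ARC, that self.writer / self.writerInput are properties you add yourself, and that outputURL is a hypothetical file URL in your temp directory:
// One-time setup, e.g. when recording starts
NSError *error = nil;
self.writer = [AVAssetWriter assetWriterWithURL:outputURL
                                       fileType:AVFileTypeQuickTimeMovie
                                          error:&error];
NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                            AVVideoWidthKey  : @480,
                            AVVideoHeightKey : @640 };
self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:settings];
self.writerInput.expectsMediaDataInRealTime = YES;
[self.writer addInput:self.writerInput];

// In captureOutput:didOutputSampleBuffer:fromConnection:, after forwarding to QuickBlox
if (self.writer.status == AVAssetWriterStatusUnknown) {
    [self.writer startWriting];
    [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
}
if (self.writerInput.isReadyForMoreMediaData) {
    [self.writerInput appendSampleBuffer:sampleBuffer];
}

// When the chat ends, finish the file and copy it to the camera roll
[self.writerInput markAsFinished];
[self.writer finishWritingWithCompletionHandler:^{
    UISaveVideoAtPathToSavedPhotosAlbum(outputURL.path, nil, NULL, NULL);
}];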

Playing audio ruins my AVCaptureSession - ios

I am using AVCaptureSession to record audio and video and pass this data to
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!_isPaused) {
        // pass sample buffer to encoder
    }
}
Everything works nicely (the delegate is called and I am processing frames) until I open Control Center and play music (I pause video processing from the app delegate). After stopping the music and trying to continue recording, the delegate method is never called again. I think that playing audio somehow "ruins" my AVCaptureSession or its delegate, and I need to fix this issue...
Here is the method where I initialize the AVCaptureSession:
- (void)startup
{
    if (_session == nil)
    {
        NSLog(@"Starting camera");

        // Create capture device with video input
        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
        [_session addInput:input];

        // Audio input from default mic
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        [_session addInput:micinput];

        _captureQueue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL);

        AVCaptureVideoDataOutput *videoout = [[AVCaptureVideoDataOutput alloc] init];
        [videoout setSampleBufferDelegate:self queue:_captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        videoout.videoSettings = setcapSettings;
        [_session addOutput:videoout];
        _videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
        NSDictionary *actual = videoout.videoSettings;

        AVCaptureAudioDataOutput *audioout = [[AVCaptureAudioDataOutput alloc] init];
        [audioout setSampleBufferDelegate:self queue:_captureQueue];
        [_session addOutput:audioout];
        _audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];

        [_session startRunning];
    }
}
And here is my interface:
@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    AVCaptureSession *_session;
    AVCaptureVideoPreviewLayer *_preview;
    dispatch_queue_t _captureQueue;
    AVCaptureConnection *_audioConnection;
    AVCaptureConnection *_videoConnection;
}
And in the app delegate I pause and resume recording like this:
- (void)applicationDidBecomeActive:(UIApplication *)application {
    [[CameraController controller] resumeCapture]; // this only sets _isPaused = NO
}

- (void)applicationWillResignActive:(UIApplication *)application {
    [[CameraController controller] pauseCapture]; // this only sets _isPaused = YES
}
I am at a dead end; thank you for any help.
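One avenue worth checking: when another app's music takes over the audio session, AVFoundation can interrupt the capture session, and it posts notifications when that happens. A sketch of observing them, under the assumption that simply restarting the session in the handler is acceptable for your app:
// Register once, e.g. at the end of -startup
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(sessionWasInterrupted:)
                                             name:AVCaptureSessionWasInterruptedNotification
                                           object:_session];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(sessionInterruptionEnded:)
                                             name:AVCaptureSessionInterruptionEndedNotification
                                           object:_session];

// Handlers (illustrative)
- (void)sessionWasInterrupted:(NSNotification *)notification {
    NSLog(@"Capture session interrupted");
}

- (void)sessionInterruptionEnded:(NSNotification *)notification {
    if (![_session isRunning]) {
        [_session startRunning];
    }
}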
