I've written an application that takes advantage of the new AVCaptureMetadataOutput APIs in iOS 7 for barcode scanning.
I have the following code in one of my view controllers:
- (void)viewDidLoad
{
    [super viewDidLoad];

    highlightView = [[UIView alloc] init];
    [highlightView setAutoresizingMask:UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin];
    [[highlightView layer] setBorderColor:[[UIColor greenColor] CGColor]];
    [[highlightView layer] setBorderWidth:3.0];
    [[self view] addSubview:highlightView];

    session = [[AVCaptureSession alloc] init];
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    [session addInput:input];

    output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];
    [output setMetadataObjectTypes:[output availableMetadataObjectTypes]];
    [output setRectOfInterest:[[self view] bounds]];

    previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    [previewLayer setFrame:[[self view] bounds]];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    if ([[previewLayer connection] isVideoOrientationSupported]) {
        [[previewLayer connection] setVideoOrientation:(AVCaptureVideoOrientation)[[UIApplication sharedApplication] statusBarOrientation]];
    }
    [[[self view] layer] insertSublayer:previewLayer above:[[self view] layer]];

    [session startRunning];
    [[self view] bringSubviewToFront:highlightView];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    CGRect highlightViewRect = CGRectZero;
    AVMetadataMachineReadableCodeObject *barcode;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode39Mod43Code,
                              AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode93Code,
                              AVMetadataObjectTypeCode128Code, AVMetadataObjectTypePDF417Code, AVMetadataObjectTypeQRCode,
                              AVMetadataObjectTypeAztecCode];

    for (AVMetadataObject *metadata in metadataObjects) {
        if ([barCodeTypes containsObject:[metadata type]]) {
            barcode = (AVMetadataMachineReadableCodeObject *)[previewLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
            highlightViewRect = [barcode bounds];
            break;
        }
    }

    [highlightView setFrame:highlightViewRect];
    [delegate barcodeScannerController:self didFinishScanningWithBarcode:[barcode stringValue]];
}
This code works, in that it can detect various barcode types and convert barcodes into their string values. What I'm wondering is why, on iPhones, only barcodes near the center of the view are detected, while on iPads, only barcodes near the bottom of the view are detected. It's very peculiar behaviour, and in the case of the iPad, not intuitive at all.
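One detail that may explain this (my reading, not something confirmed in the post): rectOfInterest does not take view coordinates. It defaults to {0, 0, 1, 1} and expects a normalized rectangle in the metadata output's coordinate space, so passing the view's bounds, as the setRectOfInterest: call above does, would restrict scanning to an unintended region that lands in different places on different devices. A minimal sketch of the conversion, using the preview layer's own helper (available since iOS 7):

    // Sketch: rectOfInterest wants normalized coordinates in the output's
    // (rotated) space, so convert from layer coordinates first.
    // (Best computed after the session has started, so the conversion
    // knows the video geometry.)
    CGRect roi = [previewLayer metadataOutputRectOfInterestForRect:[previewLayer bounds]];
    [output setRectOfInterest:roi];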
Related
We have an AVCaptureVideoPreviewLayer that should display the camera image.
On the iPhone 10 (and iPhone 7), the image is displayed correctly.
However, on the iPhone XR it is displayed in pink.
Here is my code showing how I set up the camera session:
UIView *scanView = [[UIView alloc] initWithFrame:[self view].frame];
[self setCameraScanView:scanView];
[[self cameraScanView] setContentMode:UIViewContentModeScaleAspectFit];

[self setScanCaptureSession:[[AVCaptureSession alloc] init]];
[[self scanCaptureSession] setSessionPreset:AVCaptureSessionPresetLow];

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[[videoOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[[self scanCaptureSession] addOutput:videoOutput];

NSError *error;
AVCaptureDeviceInput *input = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
[[self scanCaptureSession] addInput:input];
if (error) {
    NSLog(@"There was an error setting up Device Input!");
}

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self scanCaptureSession]];
[self setCameraLayer:previewLayer];
[[self cameraLayer] setContentsFormat:kCAContentsFormatRGBA16Float];
[[self cameraLayer] setOpaque:YES];

CALayer *rootLayer = [[self cameraScanView] layer];
[[self cameraLayer] setFrame:[self cameraScanView].bounds];
[[self cameraLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[rootLayer addSublayer:[self cameraLayer]];

[[self scanCaptureSession] commitConfiguration];
[[self view] addSubview:[self cameraScanView]];
Here is an image of how it looks on the iPhone XR.
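A guess at the culprit, not something verified against the original setup: the preview layer is forced to kCAContentsFormatRGBA16Float, and an extended-range contents format could plausibly be what tints the output on newer displays. A minimal sketch that simply lets the layer pick its own format:

    // Sketch: let Core Animation choose the contents format rather than
    // forcing 16-bit float; contentsFormat is optional and has a sensible default.
    AVCaptureVideoPreviewLayer *previewLayer =
        [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self scanCaptureSession]];
    [self setCameraLayer:previewLayer];
    // (no setContentsFormat: call)
    [[self cameraLayer] setOpaque:YES];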
I am making an app that scans a barcode with inverted colors (black background & white bars). I have to use AVFoundation, and I am currently using AVCaptureMetadataOutput. I can get it to work perfectly with a normal barcode, but I need the colors inverted (white -> black & black -> white). Can I add a CIColorInvert filter to the input in the AVCaptureSession?
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.
    mCaptureSession = [[AVCaptureSession alloc] init];

    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
    if ([mCaptureSession canAddInput:videoInput]) {
        [mCaptureSession addInput:videoInput];
    } else {
        NSLog(@"Could not add video input: %@", [error localizedDescription]);
    }

    // set up metadata output and this class as its delegate so that if metadata (barcode 39) is detected it will send the data to this class
    AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    if ([mCaptureSession canAddOutput:metadataOutput]) {
        [mCaptureSession addOutput:metadataOutput];
        [metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        [metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeCode39Code]];
    } else {
        NSLog(@"Could not add metadata output");
    }

    // sets up what the camera sees as a layer of the view
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:mCaptureSession];
    //CGRect frame = CGRectMake(0.0 - 50, 0.0, 1024.0, 1024.0 + 720.0);
    CGRect bounds = self.view.layer.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.bounds = bounds;
    previewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));

    NSArray *filters = [[NSArray alloc] initWithObjects:[CIFilter filterWithName:@"CIColorInvert"], nil];
    [previewLayer setFilters:filters];
    //[previewLayer setFrame:self.view.bounds];
    [self.view.layer addSublayer:previewLayer];

    //starts the camera session
    [mCaptureSession startRunning];
}
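Two points that may be relevant here (my notes, not from the original question): CALayer's filters property is documented as unsupported on iOS, and AVCaptureMetadataOutput analyzes the raw frames coming from the input, so a filter on the preview layer would at best change what is displayed, never what is detected. As far as I know, AVCaptureSession offers no way to attach a CIFilter to its input. If inverted frames are needed, a sketch using AVCaptureVideoDataOutput and Core Image, under the assumption that a separate software decoder consumes the result:

    // Sketch: invert each captured frame with Core Image. The inverted CIImage
    // still needs to be handed to a separate barcode decoder;
    // AVCaptureMetadataOutput cannot consume it directly.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (pixelBuffer == NULL) { return; }

        CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIFilter *invert = [CIFilter filterWithName:@"CIColorInvert"];
        [invert setValue:frame forKey:kCIInputImageKey];
        CIImage *inverted = invert.outputImage;

        // ... hand `inverted` to a software barcode decoder here ...
    }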
I am using the iOS 7 AVFoundation framework for barcode scanning. It works fine for me, but on some barcodes it gives me the wrong result.
I am adding the code and the barcode as well so you can reproduce the issue. Please take a look and help me find the problem.
Code Sample:
ViewController.h Class -
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVCaptureMetadataOutputObjectsDelegate>
{
    AVCaptureSession *_session;
    AVCaptureDevice *_device;
    AVCaptureDeviceInput *_input;
    AVCaptureMetadataOutput *_output;
    AVCaptureVideoPreviewLayer *_prevLayer;

    UIView *_highlightView;
    IBOutlet UIView *_viewBg;
    IBOutlet UIButton *_btnScanning;
}
@end
ViewController.m Class Methods -
- (void)setScanningVideoOrientation
{
    if ([UIDevice currentDevice].orientation == UIInterfaceOrientationPortrait)
    {
        _prevLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    else if ([UIDevice currentDevice].orientation == UIInterfaceOrientationPortraitUpsideDown)
    {
        _prevLayer.connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
    }
    else if ([UIDevice currentDevice].orientation == UIInterfaceOrientationLandscapeLeft)
    {
        _prevLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    }
    else if ([UIDevice currentDevice].orientation == UIInterfaceOrientationLandscapeRight)
    {
        _prevLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    }
}
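One caveat worth flagging in this helper (my observation, not part of the original question): [UIDevice currentDevice].orientation is a UIDeviceOrientation, while the constants being compared against are UIInterfaceOrientations, and those two enums have their landscape cases swapped. A sketch of a variant that keys off the interface orientation instead, mirroring the cast already used in the first question above:

    // Sketch: derive the video orientation from the interface orientation;
    // UIInterfaceOrientation's cases map directly onto AVCaptureVideoOrientation.
    UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];
    if ([_prevLayer.connection isVideoOrientationSupported]) {
        _prevLayer.connection.videoOrientation = (AVCaptureVideoOrientation)orientation;
    }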
- (void)startScanning
{
    // Create Session.------------------------------------------------------
    _session = nil;
    _session = [[AVCaptureSession alloc] init];

    _device = nil;
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    _input = nil;
    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input)
    {
        [_session addInput:_input];

        _output = nil;
        _output = [[AVCaptureMetadataOutput alloc] init];
        [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        [_session addOutput:_output];
        _output.metadataObjectTypes = [_output availableMetadataObjectTypes];

        _prevLayer = nil;
        _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
        _prevLayer.frame = _viewBg.frame;
        _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self setScanningVideoOrientation];
        [self.view.layer addSublayer:_prevLayer];

        [_session startRunning];
    } else
    {
        NSLog(@"Error: %@", error);
    }
    //----------------------------------------------------------------------
}
In this delegate method, I am getting the wrong barcode string. Please take a look at 'detectionString'.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    CGRect highlightViewRect = CGRectZero;
    AVMetadataMachineReadableCodeObject *barCodeObject;
    NSString *detectionString = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode39Mod43Code,
                              AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode93Code, AVMetadataObjectTypeCode128Code,
                              AVMetadataObjectTypePDF417Code, AVMetadataObjectTypeQRCode, AVMetadataObjectTypeAztecCode];

    for (AVMetadataObject *metadata in metadataObjects) {
        for (NSString *type in barCodeTypes) {
            if ([metadata.type isEqualToString:type])
            {
                barCodeObject = (AVMetadataMachineReadableCodeObject *)[_prevLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
                highlightViewRect = barCodeObject.bounds;
                detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }

        if (detectionString != nil)
        {
            NSLog(@"ViewController-->captureOutput-->detectionString = %@", detectionString);
            [self stopScanning];
            break;
        }
    }

    barCodeObject = nil;
    detectionString = nil;
    barCodeTypes = nil;
}
Barcode Image -
I am getting result - 0649954006029
But it should be - 649954006029.
For some other barcodes, I am seeing an 'l' before the actual scanned string.
Hope it will help you to identify the problem.
Thanks.
The reason you're getting a 13-digit code is that you're scanning a UPC-A code (649954..) and iOS doesn't have a separate category for UPC-A codes.
UPC-A codes come in scanned as EAN-13 codes, and those are 13 digits long, so they get a prefix of 0.
From AVFoundation/AVMetadata.h
/*! @constant AVMetadataObjectTypeEAN13Code
    @abstract An identifier for an instance of
        AVMetadataMachineReadableCodeObject having a type
        AVMetadataObjectTypeEAN13Code.
    @discussion
        AVMetadataMachineReadableCodeObject objects generated from EAN-13 (including UPC-A) codes return this constant as their type. */
AVF_EXPORT NSString *const AVMetadataObjectTypeEAN13Code NS_AVAILABLE(NA, 7_0);
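If you need the 12-digit UPC-A form, one option (a sketch of mine, not from the answer itself) is to strip the padding digit in the delegate when an EAN-13 result starts with 0:

    // Sketch: normalize an EAN-13 scan of a UPC-A symbol back to 12 digits.
    if ([metadata.type isEqualToString:AVMetadataObjectTypeEAN13Code] &&
        [detectionString hasPrefix:@"0"]) {
        detectionString = [detectionString substringFromIndex:1];
    }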
I'm making an app that shows a camera preview in a view. Inside this view I want to draw the camera data, plus another view that shows a little rectangle when data is captured. For example, in a barcode scanning scenario, the camera is shown in a view, and when a barcode is found a rectangle is drawn showing that a barcode has been scanned.
My current view hierarchy is the following:
View
{
    -UIView cameraHolder
    {
        -UIView highlightView
    }
}
I've managed to get the camera showing and scanning things, but the highlight view is not being shown. Why is this happening?
This is the code for initializing the highlight View:
-(void)setUpHiglightView{
    self.highlightView = [[UIView alloc] init];
    self.highlightView.autoresizingMask = UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin;
    self.highlightView.layer.borderColor = [UIColor greenColor].CGColor;
    self.highlightView.layer.borderWidth = 3;
}
and this is the code for when data is captured:
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
    CGRect highlightViewRect = CGRectZero;
    AVMetadataMachineReadableCodeObject *barCodeObject;
    NSString *detectionString = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode39Mod43Code,
                              AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode93Code, AVMetadataObjectTypeCode128Code,
                              AVMetadataObjectTypePDF417Code, AVMetadataObjectTypeQRCode, AVMetadataObjectTypeAztecCode];

    for(AVMetadataObject *metadata in metadataObjects){
        for(NSString *type in barCodeTypes){
            if([metadata.type isEqualToString:type]){
                barCodeObject = (AVMetadataMachineReadableCodeObject *)[prevLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
                highlightViewRect = barCodeObject.bounds;
                detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }
    }

    if(detectionString != nil){
        [self.itemIdTextField setText:detectionString];
    }else{
        //NSLog(@"Got Nothing");
    }

    NSLog(@"Position: [%f,%f][%f,%f]", highlightViewRect.origin.x, highlightViewRect.origin.y, highlightViewRect.size.height, highlightViewRect.size.width);
    self.highlightView.frame = highlightViewRect;
}
Also the code that initializes the camera:
-(void)setupBarCodeScanner{
    [self setUpHiglightView];

    session = [[AVCaptureSession alloc] init];
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if(input){
        [session addInput:input];
    }else{
        [self showAlertDialogWithTitle:@"Error" andMessage:@"There was an error while accessing your camera"];
        NSLog(@"Error: %@", error);
    }

    output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];
    output.metadataObjectTypes = [output availableMetadataObjectTypes];

    prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    prevLayer.frame = self.cameraHolder.bounds;
    prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.cameraHolder.layer addSublayer:prevLayer];
}
Thank you very much!
Add self.highlightView to self.view at the end of captureOutput:
[self.view addSubview:self.highlightView];
It doesn't look like you're adding that view anywhere. You create it and set its frame, but I don't see where you're adding it to the view hierarchy.
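For instance (a sketch of one possible placement, using the cameraHolder view from the question): attach the highlight view once during setup, and keep it in front after the preview layer goes in, the same way the first question above uses bringSubviewToFront:.

    // Sketch: add the highlight view once, at the end of setUpHiglightView ...
    [self.cameraHolder addSubview:self.highlightView];

    // ... and after adding the preview layer in setupBarCodeScanner, raise it:
    [self.cameraHolder.layer addSublayer:prevLayer];
    [self.cameraHolder bringSubviewToFront:self.highlightView];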
I have a problem with AVCaptureSession's startRunning. It's a problem only on the iPhone 5 with iOS 7. My app should record video and show everything on an AVCaptureVideoPreviewLayer, but when I test on an iPhone 5, the first call fails with error = AVErrorMediaServicesWereReset.
This is my code, where I create the capture manager and call startRunning:
-(void)startVideoTranslation{
    CGRect r = videoBackView.bounds;
    videoPreviewView = [[UIView alloc] initWithFrame:r];
    [videoBackView addSubview:videoPreviewView];

    if (currentGameType == MPGameTypeCrocodile){
        if (captureManager == nil) {
            captureManager = [[AVCamCaptureManager alloc] init];
            [captureManager setDelegate:self];

            if ([captureManager setupSession]) {
                // Create video preview layer and add it to the UI
                AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[captureManager session]];
                CALayer *viewLayer = [videoPreviewView layer];
                [viewLayer setMasksToBounds:YES];

                CGRect bounds = [videoPreviewView bounds];
                bounds.origin.y = bounds.origin.y + 1;
                [newCaptureVideoPreviewLayer setFrame:bounds];

                if ([newCaptureVideoPreviewLayer.connection isVideoOrientationSupported]) {
                    [newCaptureVideoPreviewLayer.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
                }
                [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
                [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
                captureVideoPreviewLayer = newCaptureVideoPreviewLayer;

                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(isErrorSession:) name:AVCaptureSessionRuntimeErrorNotification object:nil];

                dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                    [[captureManager session] startRunning];
                });
            }
        }
    }
}
And I call this method here:
-(void)showPrepareView{
    [self playSound:@"start"];
    BGView.frame = videoBackView.bounds;
    [prepareView addSubview:videoBackView];
    [self startVideoTranslation];
    [videoBackView addSubview:BGView];
}
In this code I use AVAudioPlayer in [self playSound:], and I add my preview layer to videoBackView.
Does anyone know the reason for this problem?
Now I have solved this problem by moving the following into -(id)initWithFrame:
captureManager = [[AVCamCaptureManager alloc] init];
[captureManager setDelegate:self];
[captureManager setupSession];
[[captureManager session] startRunning];
But I don't understand why this works.
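For what it's worth, a common pattern for this error class (a sketch, assuming the isErrorSession: observer registered above): media services can be reset by the system at any time, and the handler can simply restart the session when that particular error arrives.

    // Sketch: recover from a media-services reset by restarting the session.
    - (void)isErrorSession:(NSNotification *)notification
    {
        NSError *error = [notification userInfo][AVCaptureSessionErrorKey];
        if ([error code] == AVErrorMediaServicesWereReset) {
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                [[captureManager session] startRunning];
            });
        }
    }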