Memory related issue of ZBarReaderViewController in iOS 7 - ios

I use ZBarReaderViewController to scan QR codes, and it worked perfectly on iOS 6. But when I run my project on iOS 7, it no longer works properly with ZBarReaderViewController: the issue is memory related, usage climbs past 100 MB, and my device hangs.
In my project, the user scans a QR code image, and I have a function that checks whether the decoded value matches a string I fetched from the server. If it matches, I go to the next view controller; otherwise I stay on the current (QR scan) screen.
When the code matches, the next screen has a "Cancel" button that lets the user scan another code (i.e. it returns to the previous QR scan view controller). Each time I go to the next view controller and come back to the previous (QR scan) screen, a new ZBarReaderViewController is allocated, so (maybe) that is where the memory issue comes from.
But I wrote this cleanup code:
if (self.ZBarReaderVC)
{
    for (UIView *subView in self.ZBarReaderVC.cameraOverlayView.subviews)
        [subView removeFromSuperview];
    for (UIView *subView in self.ZBarReaderVC.view.subviews)
        [subView removeFromSuperview];
    [self.ZBarReaderVC removeFromParentViewController];
    self.ZBarReaderVC = nil;
}
I run this right after [self.ZBarReaderVC dismissModalViewControllerAnimated:YES], so the reader is torn down at the end. Why do I still get a newly allocated ZBarReaderViewController each time?
I also put [self.ZBarReaderVC.readerView stop]; before dismissing the ZBarReaderViewController, to stop the reader's scanning stream, but that did not work for me either.
I have spent hours trying to solve this, without success. Please help me.
I also found a similar problem reported here:
ZBar SDK and iOS 7 / Xcode 5 - App is reaching 100% CPU use and memory more than 100 MB
http://sourceforge.net/p/zbar/discussion/1072195/thread/df4c215a/
but no one has been able to help me.

I found that on iOS 7 the issue occurs at
self.ZBarReaderVC.view.frame = self.view.bounds;
I put a breakpoint here, and every time I come back from the next view controller, this line takes more time and more memory (the issue). So I first need to remove the view of self.ZBarReaderVC together with all of its subviews:
if (self.ZBarReaderVC) // first check whether `self.ZBarReaderVC` has been created
{
    [self.ZBarReaderVC.readerView stop]; // then stop the continuous scanning stream of `self.ZBarReaderVC`
    for (UIView *subView in self.ZBarReaderVC.view.subviews) // remove all subviews
        [subView removeFromSuperview];
    [self.ZBarReaderVC.view removeFromSuperview];
    self.ZBarReaderVC.view = nil;
}
I also found that on iOS 7, self.ZBarReaderVC keeps its scanning stream running, so you need to stop it each time a scan completes: whenever you are about to dismiss self.ZBarReaderVC, first stop scanning with [self.ZBarReaderVC.readerView stop];
Also, some users call
[self.ZBarReaderVC viewDidLoad];
[self.ZBarReaderVC viewWillAppear:NO];
[self.ZBarReaderVC viewDidAppear:NO];
on self.ZBarReaderVC (to implement some extra features). These calls are not needed on iOS 7, so if you call these methods, please comment them out.
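Putting the pieces together, here is a minimal sketch of the teardown order described above (it assumes self.ZBarReaderVC is the ZBarReaderViewController property used throughout this answer; the completion-block placement is my suggestion, not code from the original post):
- (void)closeReader {
    // Stop the capture stream before dismissing; on iOS 7 it keeps running otherwise.
    [self.ZBarReaderVC.readerView stop];
    [self.ZBarReaderVC dismissViewControllerAnimated:YES completion:^{
        // Drop the reference so the reader can actually deallocate.
        self.ZBarReaderVC = nil;
    }];
}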
I hope this suggestion is helpful for others.
Thanks :)

If you are targeting your app at iOS 7 only: I ditched the ZBar component and used the native AVFoundation approach, making the view controller an AVCaptureMetadataOutputObjectsDelegate. It works perfectly with 3% CPU usage:
viewcontroller.h:
@interface viewcontroller : UIViewController <AVCaptureMetadataOutputObjectsDelegate> {
    AVCaptureSession *_session;
    AVCaptureDevice *_device;
    AVCaptureDeviceInput *_input;
    AVCaptureMetadataOutput *_output;
    AVCaptureVideoPreviewLayer *_prevLayer;
    UIView *_highlightView;
}
viewcontroller.m:
- (IBAction)btnScan:(id)sender {
    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input) {
        [_session addInput:_input];
    } else {
        NSLog(@"Error: %@", error);
    }
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output];
    _output.metadataObjectTypes = [_output availableMetadataObjectTypes];
    _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _prevLayer.frame = self.view.bounds;
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:_prevLayer];
    [_session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    AVMetadataMachineReadableCodeObject *barCodeObject;
    NSString *detectionString = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode128Code,
                              AVMetadataObjectTypeQRCode];
    for (AVMetadataObject *metadata in metadataObjects) {
        for (NSString *type in barCodeTypes) {
            if ([metadata.type isEqualToString:type]) {
                barCodeObject = (AVMetadataMachineReadableCodeObject *)
                    [_prevLayer transformedMetadataObjectForMetadataObject:
                        (AVMetadataMachineReadableCodeObject *)metadata];
                detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }
        if (detectionString != nil) {
            NSLog(@"%@", detectionString);
            [self buscarCarga:detectionString]; // Do whatever you want with the data
            [_session stopRunning];
            AVCaptureInput *input = [_session.inputs objectAtIndex:0];
            [_session removeInput:input];
            // Note: this is the metadata output added above, so the generic type is used here.
            AVCaptureOutput *output = [_session.outputs objectAtIndex:0];
            [_session removeOutput:output];
            [_prevLayer removeFromSuperlayer];
        }
        else
            NSLog(@"No data");
    }
}
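A small aside: if you only care about QR codes, it is slightly cheaper to restrict the output to that one symbology rather than subscribing to every available type. This line is a suggested variation, not part of the original answer:
    _output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];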

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.ZBarReaderVC = [ZBarReaderViewController new];
    self.ZBarReaderVC.readerDelegate = self;
    self.ZBarReaderVC.supportedOrientationsMask = ZBarOrientationMaskAll;
    ZBarImageScanner *scanner = self.ZBarReaderVC.scanner;
    [scanner setSymbology:ZBAR_I25 config:ZBAR_CFG_ENABLE to:0];
}
#pragma mark - Button click method
- (IBAction)startScanning:(id)sender {
    NSLog(@"Scanning..");
    resultTextView.text = @"Scanning..";
    [self presentViewController:self.ZBarReaderVC animated:YES completion:nil];
}
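For completeness, a sketch of the matching ZBarReaderDelegate callback (the ZBarReaderControllerResults key and the symbol's data property come from the ZBar SDK; the body itself is an assumed illustration, not part of the original answer):
- (void)imagePickerController:(UIImagePickerController *)reader
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // ZBar delivers the decoded symbols under the ZBarReaderControllerResults key.
    id<NSFastEnumeration> results = info[ZBarReaderControllerResults];
    for (ZBarSymbol *symbol in results) {
        NSLog(@"Decoded: %@", symbol.data); // the decoded string
        break; // take the first symbol only
    }
    [reader dismissViewControllerAnimated:YES completion:nil];
}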

Related

iOS AVFoundation read QRcode when vuforia using the AVCaptureDeviceInput

This is the code that reads the QR code:
- (instancetype)init {
    if (self = [super init]) {
        if (self.session == nil)
            self.session = [[AVCaptureSession alloc] init];
        // device
        if (self.device == nil)
            self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        // output
        if (self.output == nil)
            self.output = [[AVCaptureMetadataOutput alloc] init];
    }
    return self;
}
- (void)creatScanQR {
    NSError *error = nil;
    // input
    if (self.input == nil)
        self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:&error];
    if (self.input) {
        [self.session addInput:self.input];
    } else {
        NSLog(@"%@", error);
        return;
    }
    [self.session addOutput:self.output];
    [self.output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    for (AVMetadataMachineReadableCodeObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            NSLog(@"======%@=======", metadata.stringValue);
        }
    }
}
It works in a native app. But my app is built with Unity and uses Vuforia, and when I use AVCapture to read the QR code, Vuforia shows a black screen: there is only one camera, and Vuforia is already using it. How can I use AVCaptureInput to read QR codes while Vuforia keeps working?
My plan B is to take the Vuforia view, render it into an image, and read the QR code with iOS's CIDetector, but I get a nil image. Why?
UIView *view = UnityGetGLView();
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image);
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode context:[CIContext contextWithOptions:nil] options:@{CIDetectorAccuracy: CIDetectorAccuracyLow}];
NSArray *features = [detector featuresInImage:[CIImage imageWithData:imageData]];
for (CIFeature *feature in features) {
    NSLog(@"%@", feature.type);
    if ([feature isKindOfClass:[CIQRCodeFeature class]]) {
        NSLog(@"?????? %@ ?????", ((CIQRCodeFeature *)feature).messageString);
        dispatch_sync(queue, ^{ dispatch_suspend(timer); });
    }
}
It sounds like your problem is caused by competition for the camera. The only solution may be to use exactly the same view for Vuforia and for QR code reading, i.e. work around the competition and share the camera images between Vuforia and the QR reader. (As for the nil image: renderInContext: does not capture OpenGL-rendered content such as the Unity view, which is likely why Plan B comes back empty; drawViewHierarchyInRect:afterScreenUpdates: can capture it, as your update below shows.)
I made some changes to Plan B. I use a GCD timer to grab an image of the root view every second; the snapshot method is
- (BOOL)drawViewHierarchyInRect:(CGRect)rect afterScreenUpdates:(BOOL)afterUpdates NS_AVAILABLE_IOS(7_0);
Then CIDetector reads the QR code from the image. It works, and it looks fine.
I don't know how many bugs are in it, but there was no other way; my boss pushed me to use Plan B. He wants Vuforia to read QR codes at the same time, and quickly, not slowly. So the problem is temporarily solved. If you have a better idea, I'd love to hear it.
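For reference, a minimal sketch of that snapshot-plus-CIDetector approach (UnityGetGLView() and the CIDetector setup come from the question above; the rest is an assumed illustration):
// Sketch: snapshot the Unity/Vuforia view, then scan the snapshot for a QR code.
UIView *view = UnityGetGLView();
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
// Unlike renderInContext:, drawViewHierarchyInRect: captures GPU-rendered content.
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode
                                          context:nil
                                          options:@{CIDetectorAccuracy: CIDetectorAccuracyLow}];
CIImage *ciImage = [CIImage imageWithCGImage:snapshot.CGImage];
for (CIFeature *feature in [detector featuresInImage:ciImage]) {
    if ([feature isKindOfClass:[CIQRCodeFeature class]]) {
        NSLog(@"QR payload: %@", ((CIQRCodeFeature *)feature).messageString);
    }
}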

objective c delay in open page after scanning QR code

I'm quite new to iOS development (even though I have a lot of experience with other platforms), and I am facing an issue while trying to implement a QR code reader.
This is the piece of code I wrote:
- (BOOL)startReading {
    NSError *error;
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }
    _qrLB.text = @"Loading....";
    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession addInput:input];
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];
    dispatch_queue_t dispatchQueue;
    dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
    [_viewPreview.layer addSublayer:_videoPreviewLayer];
    [_captureSession startRunning];
    return YES;
}
#pragma mark
#pragma mark Stop reading QR Code
- (void)stopReading {
    [_captureSession stopRunning];
    _captureSession = nil;
    [_videoPreviewLayer removeFromSuperlayer];
}
#pragma mark
#pragma mark Capture QR Code
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"qrcode1");
    if (metadataObjects != nil && [metadataObjects count] > 0) {
        AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
        if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
            [_lblStatus performSelectorOnMainThread:@selector(setText:) withObject:[metadataObj stringValue] waitUntilDone:NO];
            [self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];
            _isReading = NO;
        }
    }
    NSLog(@"Value of qrcodetext = %@", _lblStatus.text);
    ReceiveVC *view = [self.storyboard instantiateViewControllerWithIdentifier:@"ReceiveVC"];
    view.imageURL = _lblStatus.text;
    view.title = @"";
    [self.navigationController pushViewController:view animated:YES];
    NSLog(@"qrcode2");
}
When I launch the app on the device (iPhone 5s) and scan a QR code, this is the output I get:
screenshot
It works fine! It reads the value and redirects the user to the "ReceiveVC" page ("ReceiveVC1" is a debug line I print when the page loads).
The problem is that it takes about 20 seconds (sometimes even more!) from the moment the value is correctly read to the moment the page opens (check the times of lines 2 and 3). It seems that the navigationController is waiting for some thread to finish, but I cannot figure out which one, since all the values I need are already there!
Can you help me?
Thank you
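(A likely cause, consistent with the stopRunning answer further down this page: the delegate here is registered on the private "myQueue" dispatch queue, so the pushViewController: call runs off the main thread, and UIKit only behaves predictably on the main thread. A sketch of the fix, reusing the names from the question:)
// Sketch: do the UIKit work on the main queue from inside the delegate callback.
dispatch_async(dispatch_get_main_queue(), ^{
    ReceiveVC *view = [self.storyboard instantiateViewControllerWithIdentifier:@"ReceiveVC"];
    view.imageURL = [metadataObj stringValue]; // use the decoded value directly
    view.title = @"";
    [self.navigationController pushViewController:view animated:YES];
});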

AVCam: Zoom further out (as it is possible for photo-mode)?

Using Xcode 6.4, iOS 8.4.1:
With AVCaptureSession, I would like to zoom out as far as the iPhone camera can possibly manage.
I already use the setVideoZoomFactor method with the value 1 (its smallest allowed value). This works quite well (see the code example at the very bottom). But then I made the following observation, recognizing that the camera in photo mode apparently manages to zoom further out than in video mode:
The iPhone camera in photo mode shows a completely different zoom than in video mode (at least on my iPhone 5S). You can test this yourself using the native Camera app on your iPhone: switch between PHOTO and VIDEO and you will see that photo mode can zoom further out than video mode at zoom factor 1. How is that possible?
Moreover, is there any way to achieve, in video mode using AVCam under iOS, the same minimal zoom factor that photo mode achieves?
Here is an illustration of the zoom difference between photo mode and video mode on my iPhone 5S (see picture):
Here is the code of the AVCamViewController:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];
    // Setup the preview view
    [[self previewView] setSession:session];
    // Check for device authorization
    [self checkDeviceAuthorizationStatus];
    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];
    // http://stackoverflow.com/questions/25110055/ios-captureoutputdidoutputsamplebufferfromconnection-is-not-called
    dispatch_async(sessionQueue, ^{
        self.session = [AVCaptureSession new];
        self.session.sessionPreset = AVCaptureSessionPresetMedium;
        NSArray *devices = [AVCaptureDevice devices];
        AVCaptureDevice *backCamera;
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo]) {
                if ([device position] == AVCaptureDevicePositionBack) {
                    backCamera = device;
                }
            }
        }
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
        if (error) {
            NSLog(@"%@", error);
        }
        if ([self.session canAddInput:input]) {
            [self.session addInput:input];
        }
        AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
        [output setSampleBufferDelegate:self queue:sessionQueue];
        output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
        if ([self.session canAddOutput:output]) {
            [self.session addOutput:output];
        }
        // Apply initial VideoZoomFactor to the device
        NSNumber *DefaultZoomFactor = [NSNumber numberWithFloat:1.0];
        if ([backCamera lockForConfiguration:&error])
        {
            // HERE IS THE ZOOMING DONE !!!!!!
            [backCamera setVideoZoomFactor:[DefaultZoomFactor floatValue]];
            [backCamera unlockForConfiguration];
        }
        else
        {
            NSLog(@"%@", error);
        }
        [self.session startRunning];
    });
}
If your problem is that your app is zoomed in compared to the native iOS camera app, then this setup will probably help you. I had this issue, and the following solution fixed it.
The solution: configure your session
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [self setSession:session];
    self.session.sessionPreset = AVCaptureSessionPresetPhoto; // <-- the key line
    ...
}
Be aware that this preset only works for photos, not for videos; if you try it with video, your app will crash.
You can configure your session based on your needs (photo or video); for video you can use AVCaptureSessionPresetHigh.
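For instance, a small sketch of picking the preset per capture mode (wantsPhoto is a hypothetical flag for illustration; canSetSessionPreset: guards against unsupported presets):
if (wantsPhoto && [self.session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    self.session.sessionPreset = AVCaptureSessionPresetPhoto; // photo-style field of view
} else if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    self.session.sessionPreset = AVCaptureSessionPresetHigh; // suitable for video
}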
BR.

AVCaptureSession stopRunning method creates terrible hang

Using Ray Wenderlich's QR code reader from Chapter 22 of iOS 7 Tutorials, I am successfully reading QR codes in my current app. I am now extending it so that, upon successfully reading a QR code, I store the stringValue of the AVMetadataMachineReadableCodeObject that was read, segue to a new view, and use that data there, more or less exactly how most QR code reader apps (like RedLaser, etc.) process barcodes and QR codes.
However, I call [captureSession stopRunning] (so that it does not read any more QR codes and trigger additional segues) and there is a 10+ second hang. I have tried to implement an async call per this SO question, to no avail. I have also looked at these SO questions, and they do not seem appropriate for this purpose.
Does anyone have an idea how to remove this hang?
Here is the code:
#import "BMQRCodeReaderViewController.h"
#import "NSString+containsString.h"
#import "BMManualExperimentDataEntryViewController.h"
#import AVFoundation;
#interface BMQRCodeReaderViewController ()
<AVCaptureMetadataOutputObjectsDelegate>
#end
#implementation BMQRCodeReaderViewController {
AVCaptureSession *_captureSession;
AVCaptureDevice *_videoDevice;
AVCaptureDeviceInput *_videoInput;
AVCaptureVideoPreviewLayer *_previewLayer;
BOOL _running;
AVCaptureMetadataOutput *_metadataOutput;
}
- (void)setupCaptureSession { // 1
    if (_captureSession) return;
    // 2
    _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (!_videoDevice) {
        NSLog(@"No video camera on this device!"); return;
    }
    // 3
    _captureSession = [[AVCaptureSession alloc] init];
    // 4
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:nil];
    // 5
    if ([_captureSession canAddInput:_videoInput]) {
        [_captureSession addInput:_videoInput];
    }
    // 6
    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    dispatch_queue_t metadataQueue = dispatch_queue_create("com.razeware.ColloQR.metadata", 0);
    [_metadataOutput setMetadataObjectsDelegate:self queue:metadataQueue];
    if ([_captureSession canAddOutput:_metadataOutput]) {
        [_captureSession addOutput:_metadataOutput];
    }
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    // This fancy BOOL is just helping me fire the segue when the correct string is found
    __block NSNumber *didFind = [NSNumber numberWithBool:NO];
    [metadataObjects enumerateObjectsUsingBlock:^(AVMetadataObject *obj, NSUInteger idx, BOOL *stop) {
        AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)obj;
        NSLog(@"Metadata: %@", readableObject);
        // containsString is a category I extended for NSString, just FYI
        if ([readableObject.stringValue containsString:@"CorrectString"]) {
            didFind = [NSNumber numberWithBool:YES];
            NSLog(@"Found it");
            _testName = @"NameOfTest";
            *stop = YES;
            return;
        }
    }];
    if ([didFind boolValue]) {
        NSLog(@"Confirming we found it");
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self stopRunning];
        });
        _labelTestName.text = _testName;
        [self performSegueWithIdentifier:@"segueFromFoundQRCode" sender:self];
    }
    else {
        NSLog(@"Did not find it");
    }
}
- (void)startRunning {
    if (_running)
        return;
    [_captureSession startRunning];
    _metadataOutput.metadataObjectTypes = _metadataOutput.availableMetadataObjectTypes;
    _running = YES;
}
- (void)stopRunning {
    if (!_running) return;
    [_captureSession stopRunning];
    _running = NO;
}
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self setupCaptureSession];
    [self setupNavBar];
    [self startRunning];
    _previewLayer.frame = _previewView.bounds;
    [_previewView.layer addSublayer:_previewLayer];
}
I have successfully solved the issue. The problem was that the delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
runs in the background. This was determined with an [NSThread isMainThread] check, which fails.
The solution is to find the proper stringValue from the QR code, stop your captureSession in the background, THEN call your segue on the main thread. The solution looks like this:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    // This fancy BOOL is just helping me fire the segue when the correct string is found
    __block NSNumber *didFind = [NSNumber numberWithBool:NO];
    [metadataObjects enumerateObjectsUsingBlock:^(AVMetadataObject *obj, NSUInteger idx, BOOL *stop) {
        AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)obj;
        NSLog(@"Metadata: %@", readableObject);
        if ([NSThread isMainThread]) {
            NSLog(@"Yes Main Thread");
        }
        else {
            NSLog(@"Not main thread");
        }
        // containsString is a category I extended for NSString, just FYI
        if ([readableObject.stringValue containsString:@"Biomeme"]) {
            //NSLog(@"this is a test: %@", getTestName);
            didFind = [NSNumber numberWithBool:YES];
            NSLog(@"Found it");
            _testName = readableObject.stringValue;
            *stop = YES;
            return;
        }
    }];
    if ([didFind boolValue]) {
        NSLog(@"Confirming we found it");
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSDate *start = [NSDate date];
            [self stopRunning];
            NSLog(@"time took: %f", -[start timeIntervalSinceNow]);
            // *** Here is the key: make your segue on the main thread
            dispatch_async(dispatch_get_main_queue(), ^{
                [self performSegueWithIdentifier:@"segueFromFoundQRCode" sender:self];
                _labelTestName.text = _testName;
            });
        });
    }
    else {
        NSLog(@"Did not find it");
    }
}

AVCaptureSession output is not properly displayed and AVCaptureMetadataOutputObjectsDelegate method is not called

I have a single-view application in which I am trying to test iOS 7's AVCaptureMetadataOutput based on this explanation. My ViewController conforms to AVCaptureMetadataOutputObjectsDelegate, and the code looks like this (almost exactly the same as Mattt's):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    // Testing the VIN Scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];
    [session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    NSString *code = nil;
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeCode39Code]) {
            code = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
            break;
        }
    }
    NSLog(@"code: %@", code);
}
When I run this on an iOS 7 device (I've tried an iPhone 4 and an iPhone 4s), Xcode logs "Setting up the vin scanner", but the camera (i.e. the AVCaptureSession) never opens.
Edit 1:
I added the following code to show the camera output on screen:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = self.view.frame;
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
But the display is very odd, does not conform to the screen, and rotates in a way that makes no sense. The other issue is that when I focus the camera on a barcode, the metadata delegate method is never called. Please see the pictures below:
The camera will not open by itself the way it does for UIImagePickerController. The problem is that your code does nothing with the output: you need to add a preview layer to display the camera output as it streams in.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = CGRectMake(0.0, 0.0, self.view.frame.size.width, self.view.frame.size.height);
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
[session startRunning];
Edit:
After taking a deeper look at your code, I noticed a few more issues.
First, you also need to set the metadataObjectTypes you want to search for; right now you're not looking for any valid object types. This should be added after you add the output to the session. You can view the full list of available types in the documentation.
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
Second, your AVCaptureSession *session is a local variable in your viewDidLoad; take it and place it just after your @interface ViewController (), as shown below.
@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *session;
@end
@implementation ViewController
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Testing the VIN Scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [self.session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];
    [output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
    [self.session startRunning];
}
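One more note, not in the original answer: this final viewDidLoad omits the preview layer, so the session runs but nothing is drawn on screen. The preview-layer snippet from earlier in this answer still applies (a sketch, reusing the names above):
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    previewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:previewLayer];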
