I'm quite new to iOS development (though I have a lot of experience with other platforms), and I'm facing an issue while trying to implement a QR code reader.
This is the code I wrote:
- (BOOL)startReading {
    NSError *error;
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }
    _qrLB.text = @"Loading....";
    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession addInput:input];
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];
    dispatch_queue_t dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
    [_viewPreview.layer addSublayer:_videoPreviewLayer];
    [_captureSession startRunning];
    return YES;
}
#pragma mark
#pragma mark Stop reading QR Code
- (void)stopReading {
    [_captureSession stopRunning];
    _captureSession = nil;
    [_videoPreviewLayer removeFromSuperlayer];
}
#pragma mark
#pragma mark Capture QR Code
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"qrcode1");
    if (metadataObjects != nil && [metadataObjects count] > 0) {
        AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
        if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
            [_lblStatus performSelectorOnMainThread:@selector(setText:) withObject:[metadataObj stringValue] waitUntilDone:NO];
            [self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];
            _isReading = NO;
        }
    }
    NSLog(@"Value of qrcodetext = %@", _lblStatus.text);
    ReceiveVC *view = [self.storyboard instantiateViewControllerWithIdentifier:@"ReceiveVC"];
    view.imageURL = _lblStatus.text;
    view.title = @"";
    [self.navigationController pushViewController:view animated:YES];
    NSLog(@"qrcode2");
}
When I launch the app on the device (an iPhone 5s) and scan a QR code, this is the output I get:
screenshot
It works fine! It reads the value and redirects the user to the "ReceiveVC" page ("ReceiveVC1" is a debug message I print when the page loads).
The problem: it takes around 20 seconds (sometimes even more!) from the moment the value is correctly read to the moment the page is opened (check the timestamps of log lines 2 and 3). It seems that the navigationController is waiting for some thread to finish, but I cannot figure out which one, since all the values I need are already there!
Can you help me?
Thank you
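A likely cause (a later answer in this thread reaches the same conclusion for an almost identical hang): the metadata delegate fires on the background "myQueue", so the pushViewController: call runs off the main thread. A minimal sketch of the usual fix, assuming the same _lblStatus, ReceiveVC, and storyboard identifier as above, is to hop back to the main queue before touching UIKit:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    if ([metadataObjects count] == 0) return;
    AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
    if (![[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) return;
    NSString *value = [metadataObj stringValue];
    _isReading = NO;
    // All UIKit work happens on the main queue; the capture callback returns immediately.
    dispatch_async(dispatch_get_main_queue(), ^{
        _lblStatus.text = value;
        [self stopReading];
        ReceiveVC *view = [self.storyboard instantiateViewControllerWithIdentifier:@"ReceiveVC"];
        view.imageURL = value;
        [self.navigationController pushViewController:view animated:YES];
    });
}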
My scanner can scan this barcode:
But not this barcode:
This is the code I'm using:
#pragma mark - IBAction method implementation
- (void)startStopReading {
    if (!_isReading) {
        // This is the case where the app should read a QR code when the start button is tapped.
        if ([self startReading]) {
            // If the startReading method returns YES and the capture session is successfully
            // running, then change the start button title and the status message.
            [_decodedLabel setText:@"Scanning for QR Code..."];
        }
    }
    else {
        // In this case the app is currently reading a QR code and it should stop doing so.
        [self stopReading];
        // The bar button item's title should change again.
    }
    // Set the flag to the exact opposite of its current value.
    _isReading = !_isReading;
}
#pragma mark - Private method implementation
- (BOOL)startReading {
    NSError *error;
    // Get an instance of the AVCaptureDevice class to initialize a device object and provide
    // video as the media type parameter.
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Get an instance of the AVCaptureDeviceInput class using the previous device object.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        // If any error occurs, simply log its description and don't continue.
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }
    // Initialize the captureSession object.
    _captureSession = [[AVCaptureSession alloc] init];
    // Set the input device on the capture session.
    [_captureSession addInput:input];
    // Initialize an AVCaptureMetadataOutput object and set it as the output device of the capture session.
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];
    // Create a new serial dispatch queue.
    dispatch_queue_t dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeEAN13Code]];
    // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
    [_viewPreview.layer addSublayer:_videoPreviewLayer];
    // Start video capture.
    [_captureSession startRunning];
    return YES;
}
- (void)stopReading {
    // Stop video capture and make the capture session object nil.
    [_captureSession stopRunning];
    _captureSession = nil;
    // Remove the video preview layer from the viewPreview view's layer.
    [_videoPreviewLayer removeFromSuperlayer];
}
#pragma mark - AVCaptureMetadataOutputObjectsDelegate method implementation
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    // Check that the metadataObjects array is not nil and contains at least one object.
    if (metadataObjects != nil && [metadataObjects count] > 0) {
        // Get the metadata object.
        AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
        if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeEAN13Code]) {
            // If the found metadata matches the EAN-13 type, update the status label's text,
            // stop reading and change the bar button item's title and the flag's value.
            // Everything is done on the main thread.
            [_decodedLabel performSelectorOnMainThread:@selector(setText:) withObject:[metadataObj stringValue] waitUntilDone:NO];
            [self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];
            _isReading = NO;
            [self barCodeReadGoToNextPage:[metadataObj stringValue]];
            // If the audio player is not nil, play the sound effect.
            if (_audioPlayer) {
                [_audioPlayer play];
            }
        }
    }
}
How can I get this to work for all barcodes of type AVMetadataObjectTypeEAN13Code?
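One common approach (a later answer in this thread does the same via availableMetadataObjectTypes) is to register for every symbology the output supports instead of a single one, then branch on the reported type in the delegate. Also worth knowing: AVFoundation reports UPC-A barcodes as AVMetadataObjectTypeEAN13Code with a leading zero, so two labels that look alike can decode differently. A minimal sketch of the broader registration, reusing the session and queue from the code above:
// Sketch: accept every barcode type the metadata output can decode.
// Note: setMetadataObjectTypes: must be called after the output is added to the session.
AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
[_captureSession addOutput:captureMetadataOutput];
[captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
[captureMetadataOutput setMetadataObjectTypes:[captureMetadataOutput availableMetadataObjectTypes]];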
I am using the QuickBlox iOS SDK for video chatting in my app. It works fine. Now I want to record the chat video and save it to the camera roll. How can I do that?
I have gone through their documentation and implemented this:
- (IBAction)record:(id)sender {
    // Create video chat
    videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
    [videoChat setIsUseCustomVideoChatCaptureSession:YES];
    // Create capture session
    captureSession = [[AVCaptureSession alloc] init];
    // ... setup capture session here
    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];
    /* We start the capture */
    [captureSession startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Do something with samples
    // ...
    // Forward video samples to the SDK
    [videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
But I am not sure what to do from here.
How should I get the video data?
From the QuickBlox docs:
To set up a custom video capture session you simply follow these steps:
create an instance of AVCaptureSession
setup the input and output
implement frames callback and forward all frames to the QuickBlox iOS SDK
tell the QuickBlox SDK that you will use your own capture session
To set up a custom video capture session, configure the input and output:
- (void)setupVideoCapture {
    self.captureSession = [[AVCaptureSession alloc] init];
    __block NSError *error = nil;
    // Set preset
    [self.captureSession setSessionPreset:AVCaptureSessionPresetLow];
    // Setup the video input
    AVCaptureDevice *videoDevice = [self frontFacingCamera];
    AVCaptureDeviceInput *captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error) {
        QBDLogEx(@"deviceInputWithDevice Video error: %@", error);
    } else {
        if ([self.captureSession canAddInput:captureVideoInput]) {
            [self.captureSession addInput:captureVideoInput];
        } else {
            QBDLogEx(@"cantAddInput Video");
        }
    }
    // Setup the video output
    AVCaptureVideoDataOutput *videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoCaptureOutput.alwaysDiscardsLateVideoFrames = YES;
    // Set the video output to store frames in BGRA (it is supposed to be faster)
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [videoCaptureOutput setVideoSettings:videoSettings];
    // Add the output to the capture session
    if ([self.captureSession canAddOutput:videoCaptureOutput]) {
        [self.captureSession addOutput:videoCaptureOutput];
    } else {
        QBDLogEx(@"cantAddOutput");
    }
    [videoCaptureOutput release];
    // Set FPS
    int framesPerSecond = 3;
    AVCaptureConnection *conn = [videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
    if (conn.isVideoMinFrameDurationSupported) {
        conn.videoMinFrameDuration = CMTimeMake(1, framesPerSecond);
    }
    if (conn.isVideoMaxFrameDurationSupported) {
        conn.videoMaxFrameDuration = CMTimeMake(1, framesPerSecond);
    }
    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];
    dispatch_release(callbackQueue);
    // Add the preview layer
    AVCaptureVideoPreviewLayer *prewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession] autorelease];
    [prewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CGRect layerRect = [[myVideoView layer] bounds];
    [prewLayer setBounds:layerRect];
    [prewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    myVideoView.hidden = NO;
    [myVideoView.layer addSublayer:prewLayer];
    /* We start the capture */
    [self.captureSession startRunning];
}
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}
- (AVCaptureDevice *)backFacingCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}
- (AVCaptureDevice *)frontFacingCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}
Implement the frames callback:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Usually we just forward camera frames to the QuickBlox SDK,
    // but we can also do something with them first, for example apply video filters.
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
Tell the QuickBlox iOS SDK that we use our own video capture session:
self.videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
self.videoChat.viewToRenderOpponentVideoStream = opponentVideoView;
// We use our own video capture session
self.videoChat.isUseCustomVideoChatCaptureSession = YES;
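The QuickBlox snippets above cover owning the capture session, but not persisting the frames. The usual AVFoundation route for the recording half of the question is to tee each sample buffer into an AVAssetWriter while still forwarding it to QuickBlox, then save the finished file to the camera roll. A minimal sketch, assuming ARC; _videoWriter, _writerInput, and _writerSessionStarted are illustrative ivar names (not QuickBlox API), and the dimensions and file name are placeholders:
// Sketch: set up an AVAssetWriter before recording starts.
- (void)startRecordingToPath:(NSString *)path {
    NSError *error = nil;
    _videoWriter = [AVAssetWriter assetWriterWithURL:[NSURL fileURLWithPath:path]
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];
    NSDictionary *settings = @{AVVideoCodecKey  : AVVideoCodecH264,
                               AVVideoWidthKey  : @640,   // placeholder dimensions
                               AVVideoHeightKey : @480};
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:settings];
    _writerInput.expectsMediaDataInRealTime = YES;
    [_videoWriter addInput:_writerInput];
    [_videoWriter startWriting];
}
// Sketch: in the frames callback, append each buffer in addition to forwarding it.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (_videoWriter.status == AVAssetWriterStatusWriting) {
        if (!_writerSessionStarted) {
            // Anchor the writer's timeline to the first frame.
            [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            _writerSessionStarted = YES;
        }
        if (_writerInput.readyForMoreMediaData) {
            [_writerInput appendSampleBuffer:sampleBuffer];
        }
    }
    // Keep forwarding frames to QuickBlox as before.
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
// Sketch: finish the file and save it to the camera roll.
- (void)stopRecordingAndSave:(NSString *)path {
    [_writerInput markAsFinished];
    [_videoWriter finishWritingWithCompletionHandler:^{
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
            UISaveVideoAtPathToSavedPhotosAlbum(path, nil, NULL, NULL);
        }
    }];
}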
I use ZBarReaderViewController to scan QR codes, and it worked perfectly on iOS 6.
But when I use iOS 7 with my project, it does not work properly with ZBarReaderViewController.
The issue is memory related: it takes more than 100 MB, and my device hangs at that point.
In my project, the user can scan a QR code image, and I have a function that checks whether the decoded QR code matches a string I got from the server. If it does, I go to the next view controller; otherwise I remain on the current (continue QR scan) screen.
If the QR code matches my string, the next screen has a "cancel" button that lets me scan another code (meaning I go back to the previous view controller, the QR scan screen).
Each time I go to the next view controller and back to the previous QR scan screen, I see that a new ZBarReaderViewController is allocated, so (maybe) that is where the memory issue comes from.
But I wrote this code:
if (self.ZBarReaderVC)
{
    for (UIView *subView in self.ZBarReaderVC.cameraOverlayView.subviews)
        [subView removeFromSuperview];
    for (UIView *subView in self.ZBarReaderVC.view.subviews)
        [subView removeFromSuperview];
    [self.ZBarReaderVC removeFromParentViewController];
    self.ZBarReaderVC = nil;
}
after [self.ZBarReaderVC dismissModalViewControllerAnimated:YES]; so I do remove the ZBarReaderViewController at the end. Why then do I get a newly allocated ZBarReaderViewController each time?
I also put [self.ZBarReaderVC.readerView stop]; before dismissing the ZBarReaderViewController, to stop the reader's scanning stream,
but that did not work for me either.
I have spent hours trying to solve this, but I am not able to fix it.
Please help me.
I also found a similar problem:
Zbar SDK and ios7/xcode 5 - App is reaching 100% cpu use and memory more than 100MB
http://sourceforge.net/p/zbar/discussion/1072195/thread/df4c215a/
But no one has been able to help me.
I found that in iOS 7 the issue occurs at
self.ZBarReaderVC.view.frame = self.view.bounds;
I put a breakpoint here and found that whenever I come back from the previous view controller, this line takes more time and more memory (the issue).
So first I need to remove the view of self.ZBarReaderVC along with all its subviews, which means I first need to write:
if (self.ZBarReaderVC) // First check whether self.ZBarReaderVC has been created
{
    [self.ZBarReaderVC.readerView stop]; // Then stop the continuous scanning stream of self.ZBarReaderVC
    for (UIView *subView in self.ZBarReaderVC.view.subviews) // Remove all subviews
        [subView removeFromSuperview];
    [self.ZBarReaderVC.view removeFromSuperview];
    self.ZBarReaderVC.view = nil;
}
I also found that in iOS 7 self.ZBarReaderVC keeps its scanning stream running, so you need to stop it each time: whenever your QR code scanning is done and you need to dismiss self.ZBarReaderVC, first stop scanning with [self.ZBarReaderVC.readerView stop];
Also, some users write/call (to implement some extra features):
[self.ZBarReaderVC viewDidLoad];
[self.ZBarReaderVC viewWillAppear:NO];
[self.ZBarReaderVC viewDidAppear:NO];
These methods of self.ZBarReaderVC do not need to be called in iOS 7, so if you call them, please comment them out.
I hope this suggestion is helpful for others.
Thanks :)
If you are targeting iOS 7 only: I ditched the ZBar component and used the native AVFoundation approach, making the view controller an AVCaptureMetadataOutputObjectsDelegate. It works perfectly with 3% CPU usage:
viewcontroller.h:
@interface viewcontroller : UIViewController <AVCaptureMetadataOutputObjectsDelegate> {
    AVCaptureSession *_session;
    AVCaptureDevice *_device;
    AVCaptureDeviceInput *_input;
    AVCaptureMetadataOutput *_output;
    AVCaptureVideoPreviewLayer *_prevLayer;
    UIView *_highlightView;
}
@end
viewcontroller.m:
- (IBAction)btnScan:(id)sender {
    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    _input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    if (_input) {
        [_session addInput:_input];
    } else {
        NSLog(@"Error: %@", error);
    }
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output];
    _output.metadataObjectTypes = [_output availableMetadataObjectTypes];
    _prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _prevLayer.frame = self.view.bounds;
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:_prevLayer];
    [_session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    AVMetadataMachineReadableCodeObject *barCodeObject;
    NSString *detectionString = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode128Code,
                              AVMetadataObjectTypeQRCode];
    for (AVMetadataObject *metadata in metadataObjects) {
        for (NSString *type in barCodeTypes) {
            if ([metadata.type isEqualToString:type]) {
                barCodeObject = (AVMetadataMachineReadableCodeObject *)
                    [_prevLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
                detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }
        if (detectionString != nil) {
            NSLog(@"%@", detectionString);
            [self buscarCarga:detectionString]; // Do whatever you want with the data
            [_session stopRunning];
            AVCaptureInput *input = [_session.inputs objectAtIndex:0];
            [_session removeInput:input];
            AVCaptureOutput *output = [_session.outputs objectAtIndex:0];
            [_session removeOutput:output];
            [_prevLayer removeFromSuperlayer];
        }
        else {
            NSLog(@"No data");
        }
    }
}
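The header above declares a _highlightView ivar, and the loop computes barCodeObject via transformedMetadataObjectForMetadataObject:, which maps the code's metadata coordinates into the preview layer's coordinate space; presumably these were meant to outline the detected code. A hedged sketch of that step, assuming _highlightView was created elsewhere with a visible border and added above the preview layer:
// Sketch: outline the detected code using its layer-space bounds.
if (barCodeObject) {
    _highlightView.frame = barCodeObject.bounds;
    [self.view bringSubviewToFront:_highlightView];
}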
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.ZBarReaderVC = [ZBarReaderViewController new];
    self.ZBarReaderVC.readerDelegate = self;
    self.ZBarReaderVC.supportedOrientationsMask = ZBarOrientationMaskAll;
    ZBarImageScanner *scanner = self.ZBarReaderVC.scanner;
    [scanner setSymbology:ZBAR_I25 config:ZBAR_CFG_ENABLE to:0];
}
#pragma mark - Button click method
- (IBAction)startScanning:(id)sender {
    NSLog(@"Scanning..");
    resultTextView.text = @"Scanning..";
    [self presentViewController:self.ZBarReaderVC animated:YES completion:nil];
}
Using Ray Wenderlich's QRCode reader from Chapter 22 of iOS 7 Tutorials, I am successfully reading QRCodes in my current app. I am now extending it so that, upon successfully reading a QRCode, I store the stringValue of the AVMetadataMachineReadableCodeObject that was read, segue to a new view, and use that data on the new view, more or less exactly how most QRCode reader apps (like RedLaser, etc.) process barcodes and QRCodes.
However, I call [captureSession stopRunning] (so that it does not read any more QRCodes and trigger additional segues) and there is a 10+ second hang. I have tried to implement an async call per this SO question, but to no avail. I have also looked at these SO questions, and they do not seem appropriate for this purpose.
Does anyone have an idea how to remove this hanging?
Here is the code:
#import "BMQRCodeReaderViewController.h"
#import "NSString+containsString.h"
#import "BMManualExperimentDataEntryViewController.h"
#import AVFoundation;
#interface BMQRCodeReaderViewController ()
<AVCaptureMetadataOutputObjectsDelegate>
#end
#implementation BMQRCodeReaderViewController {
AVCaptureSession *_captureSession;
AVCaptureDevice *_videoDevice;
AVCaptureDeviceInput *_videoInput;
AVCaptureVideoPreviewLayer *_previewLayer;
BOOL _running;
AVCaptureMetadataOutput *_metadataOutput;
}
- (void)setupCaptureSession { // 1
    if (_captureSession) return;
    // 2
    _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (!_videoDevice) {
        NSLog(@"No video camera on this device!");
        return;
    }
    // 3
    _captureSession = [[AVCaptureSession alloc] init];
    // 4
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:nil];
    // 5
    if ([_captureSession canAddInput:_videoInput]) {
        [_captureSession addInput:_videoInput];
    }
    // 6
    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    dispatch_queue_t metadataQueue = dispatch_queue_create("com.razeware.ColloQR.metadata", 0);
    [_metadataOutput setMetadataObjectsDelegate:self queue:metadataQueue];
    if ([_captureSession canAddOutput:_metadataOutput]) {
        [_captureSession addOutput:_metadataOutput];
    }
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    // This fancy BOOL is just helping me fire the segue when the correct string is found
    __block NSNumber *didFind = [NSNumber numberWithBool:NO];
    [metadataObjects enumerateObjectsUsingBlock:^(AVMetadataObject *obj, NSUInteger idx, BOOL *stop) {
        AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)obj;
        NSLog(@"Metadata: %@", readableObject);
        // containsString is a category I extended for NSString, just FYI
        if ([readableObject.stringValue containsString:@"CorrectString"]) {
            didFind = [NSNumber numberWithBool:YES];
            NSLog(@"Found it");
            _testName = @"NameOfTest";
            *stop = YES;
            return;
        }
    }];
    if ([didFind boolValue]) {
        NSLog(@"Confirming we found it");
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self stopRunning];
        });
        _labelTestName.text = _testName;
        [self performSegueWithIdentifier:@"segueFromFoundQRCode" sender:self];
    }
    else {
        NSLog(@"Did not find it");
    }
}
- (void)startRunning {
    if (_running) return;
    [_captureSession startRunning];
    _metadataOutput.metadataObjectTypes = _metadataOutput.availableMetadataObjectTypes;
    _running = YES;
}
- (void)stopRunning {
    if (!_running) return;
    [_captureSession stopRunning];
    _running = NO;
}
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self setupCaptureSession];
    [self setupNavBar];
    [self startRunning];
    _previewLayer.frame = _previewView.bounds;
    [_previewView.layer addSublayer:_previewLayer];
}
I have successfully solved the issue. The problem was that the delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
runs in the background. This was determined with an [NSThread isMainThread] check, which fails.
The solution is to find the proper stringValue from the QRCode, stop your captureSession in the background, THEN call your segue on the main thread. The solution looks like this:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    // This fancy BOOL is just helping me fire the segue when the correct string is found
    __block NSNumber *didFind = [NSNumber numberWithBool:NO];
    [metadataObjects enumerateObjectsUsingBlock:^(AVMetadataObject *obj, NSUInteger idx, BOOL *stop) {
        AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)obj;
        NSLog(@"Metadata: %@", readableObject);
        if ([NSThread isMainThread]) {
            NSLog(@"Yes Main Thread");
        }
        else {
            NSLog(@"Not main thread");
        }
        // containsString is a category I extended for NSString, just FYI
        if ([readableObject.stringValue containsString:@"Biomeme"]) {
            //NSLog(@"this is a test: %@", getTestName);
            didFind = [NSNumber numberWithBool:YES];
            NSLog(@"Found it");
            _testName = readableObject.stringValue;
            *stop = YES;
            return;
        }
    }];
    if ([didFind boolValue]) {
        NSLog(@"Confirming we found it");
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSDate *start = [NSDate date];
            [self stopRunning];
            NSLog(@"time took: %f", -[start timeIntervalSinceNow]);
            // *** Here is the key: make your segue on the main thread
            dispatch_async(dispatch_get_main_queue(), ^{
                [self performSegueWithIdentifier:@"segueFromFoundQRCode" sender:self];
                _labelTestName.text = _testName;
            });
        });
    }
    else {
        NSLog(@"Did not find it");
    }
}
I have a single view application in which I am trying to test iOS 7's AVCaptureMetadataOutput based on this explanation. My ViewController conforms to AVCaptureMetadataOutputObjectsDelegate, and the code looks like this (almost exactly the same as Mattt's):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    // Testing the VIN scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];
    [session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    NSString *code = nil;
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeCode39Code]) {
            code = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
            break;
        }
    }
    NSLog(@"code: %@", code);
}
When I run this on an iOS 7 device (I've tried an iPhone 4 and an iPhone 4s), Xcode logs "Setting up the vin scanner" but the camera (i.e. the AVCaptureSession) never opens.
Edit 1:
I added the following code to show the camera output on screen:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = self.view.frame;
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
But the display is very odd: it does not conform to the screen, and the way it rotates does not make sense. The other issue is that when I focus the camera on a barcode, the metadata delegate method is never called. Please see the pictures below:
The camera will not open the way it does for the UIImagePickerController. The problem is that your code does nothing with the output. You'll need to add a preview layer to display the output of the camera as it streams in.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = CGRectMake(0.0, 0.0, self.view.frame.size.width, self.view.frame.size.height);
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
[session startRunning];
Edit:
After taking a deeper look at your code I noticed a few more issues.
First, you also need to set the metadataObjectTypes you want to search for; right now you're not looking for any valid object types. This should be added after you add the output to the session. You can view the full list of available types in the documentation.
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
Second, your AVCaptureSession *session is a local variable in viewDidLoad; take it and place it just after your @interface ViewController (), as shown below.
@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *session;
@end
@implementation ViewController
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Testing the VIN scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [self.session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];
    [output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
    [self.session startRunning];
}
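One more hedged note, since the question's Edit 1 mentions the preview rotating oddly and not fitting the screen: the preview layer's frame is usually refreshed in viewDidLayoutSubviews, and the layer's connection can be pinned to an explicit orientation. A minimal sketch, assuming the preview layer is kept in a previewLayer property:
// Sketch: keep the preview layer sized to the view and pinned to portrait.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    self.previewLayer.frame = self.view.bounds;
    if (self.previewLayer.connection.isVideoOrientationSupported) {
        self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
}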