AVCaptureSession stopRunning method causes a terrible hang - iOS

Using Ray Wenderlich's QRCode reader from Chapter 22 of iOS 7 Tutorials, I am successfully reading QRCodes in my current app. I am now extending it so that, upon successfully reading a QRCode, I store the stringValue of the AVMetadataMachineReadableCodeObject that was read, segue to a new view, and use that data on the new view, more or less exactly how most QRCode reader apps (like RedLaser, etc.) process barcodes and QRCodes.
However, when I call [captureSession stopRunning] (so that it does not read any more QRCodes and trigger additional segues), there is a 10+ second hang. I have tried to implement an async call per this SO question, to no avail. I have also looked at these SO questions, and they do not seem appropriate for this purpose.
Does anyone have an idea how to remove this hang?
Here is the code:
#import "BMQRCodeReaderViewController.h"
#import "NSString+containsString.h"
#import "BMManualExperimentDataEntryViewController.h"
@import AVFoundation;
@interface BMQRCodeReaderViewController ()
<AVCaptureMetadataOutputObjectsDelegate>
@end
@implementation BMQRCodeReaderViewController {
AVCaptureSession *_captureSession;
AVCaptureDevice *_videoDevice;
AVCaptureDeviceInput *_videoInput;
AVCaptureVideoPreviewLayer *_previewLayer;
BOOL _running;
AVCaptureMetadataOutput *_metadataOutput;
}
- (void)setupCaptureSession { // 1
if (_captureSession) return;
// 2
_videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (!_videoDevice) {
NSLog(#"No video camera on this device!"); return;
}
// 3
_captureSession = [[AVCaptureSession alloc] init];
// 4
_videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:nil];
// 5
if ([_captureSession canAddInput:_videoInput]) { [_captureSession addInput:_videoInput];
}
// 6
_previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
_metadataOutput = [[AVCaptureMetadataOutput alloc] init];
dispatch_queue_t metadataQueue = dispatch_queue_create("com.razeware.ColloQR.metadata", 0);
[_metadataOutput setMetadataObjectsDelegate:self queue:metadataQueue];
if ([_captureSession canAddOutput:_metadataOutput]) { [_captureSession addOutput:_metadataOutput];
}
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
fromConnection:(AVCaptureConnection *)connection {
// This fancy BOOL is just helping me fire the segue when the correct string is found
__block NSNumber *didFind = [NSNumber numberWithBool:NO];
[metadataObjects enumerateObjectsUsingBlock:^(AVMetadataObject *obj, NSUInteger idx, BOOL *stop) {
AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)obj;
NSLog(#"Metadata: %#", readableObject);
// [ containsString is a category I extended for NSString, just FYI
if ([readableObject.stringValue containsString:#"CorrectString"]) {
didFind = [NSNumber numberWithBool:YES];
NSLog(#"Found it");
_testName = #"NameOfTest";
*stop = YES;
return;
}
}];
if ([didFind boolValue]) {
NSLog(#"Confirming we found it");
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[self stopRunning];
});
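// NOTE: we are still on the background metadata queue at this point; these UI calls off the main thread are exactly what the fix below moves onto the main queue.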
_labelTestName.text = _testName;
[self performSegueWithIdentifier:@"segueFromFoundQRCode" sender:self];
}
else {
NSLog(#"Did not find it");
}
}
- (void)startRunning {
if (_running)
return;
[_captureSession startRunning];
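// availableMetadataObjectTypes is empty until the output is attached to a session, which is why the types are set here rather than in setupCaptureSession.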
_metadataOutput.metadataObjectTypes = _metadataOutput.availableMetadataObjectTypes;
_running = YES;
}
- (void)stopRunning {
if (!_running) return;
[_captureSession stopRunning];
_running = NO;
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view.
[self setupCaptureSession];
[self setupNavBar];
[self startRunning];
_previewLayer.frame = _previewView.bounds;
[_previewView.layer addSublayer:_previewLayer];
}

I have successfully solved the issue. The problem was that the delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
fromConnection:(AVCaptureConnection *)connection
runs in the background. This was determined with an [NSThread isMainThread] check, which returns NO.
The solution is to find the proper stringValue from the QRCode, stop your captureSession in the background, THEN perform your segue on the main thread. The solution looks like this:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects
fromConnection:(AVCaptureConnection *)connection {
// This fancy BOOL is just helping me fire the segue when the correct string is found
__block NSNumber *didFind = [NSNumber numberWithBool:NO];
[metadataObjects enumerateObjectsUsingBlock:^(AVMetadataObject *obj, NSUInteger idx, BOOL *stop) {
AVMetadataMachineReadableCodeObject *readableObject = (AVMetadataMachineReadableCodeObject *)obj;
NSLog(#"Metadata: %#", readableObject);
if ([NSThread isMainThread]) {
NSLog(#"Yes Main Thread");
}
else {
NSLog(#"Not main thread");
}
// [ containsString is a category I extended for NSString, just FYI
if ([readableObject.stringValue containsString:#"Biomeme"]) {
//NSLog(#"this is a test: %#", getTestName);
didFind = [NSNumber numberWithBool:YES];
NSLog(#"Found it");
_testName = readableObject.stringValue;
*stop = YES;
return;
}
}];
if ([didFind boolValue]) {
NSLog(@"Confirming we found it");
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
NSDate *start = [NSDate date];
[self stopRunning];
NSLog(@"time took: %f", -[start timeIntervalSinceNow]);
// *** Here is the key: make your segue on the main thread
dispatch_async(dispatch_get_main_queue(), ^{
[self performSegueWithIdentifier:@"segueFromFoundQRCode" sender:self];
_labelTestName.text = _testName;
});
});
}
else {
NSLog(@"Did not find it");
}
}
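For reference, Apple's AVCam sample (quoted in a later question on this page) takes a slightly different approach to the same problem: the controller owns a serial "session queue" and performs every startRunning/stopRunning call there, so neither blocking call can ever stall the main thread. A minimal sketch of that pattern, with an arbitrary queue label:

// Serial queue owned by the controller; created once, e.g. in viewDidLoad.
dispatch_queue_t sessionQueue = dispatch_queue_create("com.example.sessionQueue", DISPATCH_QUEUE_SERIAL);

// startRunning and stopRunning are both potentially slow, blocking calls,
// so always hop onto the session queue for them:
dispatch_async(sessionQueue, ^{
    [_captureSession startRunning];
});
// ...and later:
dispatch_async(sessionQueue, ^{
    [_captureSession stopRunning];
});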

Related

How to mirror iOS screen via USB?

I'm trying to mirror an iOS device's screen via a USB connection to OS X. QuickTime does this fine, and I read this article with a code example: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/
However, the callback of CMIOStreamCopyBufferQueue is never called, and I'm wondering what I am doing wrong.
Has anyone faced this issue and can provide a working example?
Thanks.
Well... eventually I did what Nadav suggests in his blog: discover DAL devices and capture their output using AVCaptureSession, like this:
-(id) init {
// Allow iOS Devices Discovery
CMIOObjectPropertyAddress prop =
{ kCMIOHardwarePropertyAllowScreenCaptureDevices,
kCMIOObjectPropertyScopeGlobal,
kCMIOObjectPropertyElementMaster };
UInt32 allow = 1;
CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
&prop, 0, NULL,
sizeof(allow), &allow );
// Get devices
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
BOOL deviceAttached = false;
for (int i = 0; i < [devices count]; i++) {
AVCaptureDevice *device = devices[i];
if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
deviceAttached = true;
[self startSession:device];
break;
}
}
NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
// Device not attached - subscribe to onConnect notifications
if (!deviceAttached) {
id deviceWasConnectedObserver = [notificationCenter addObserverForName:AVCaptureDeviceWasConnectedNotification
object:nil
queue:[NSOperationQueue mainQueue]
usingBlock:^(NSNotification *note) {
AVCaptureDevice *device = note.object;
[self deviceConnected:device];
}];
observers = [[NSArray alloc] initWithObjects:deviceWasConnectedObserver, nil];
}
return self;
}
- (void) deviceConnected:(AVCaptureDevice *)device {
if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
[self startSession:device];
}
}
- (void) startSession:(AVCaptureDevice *)device {
// Init capturing session
session = [[AVCaptureSession alloc] init];
// Start session configuration
[session beginConfiguration];
// Add session input
NSError *error;
newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (newVideoDeviceInput == nil) {
dispatch_async(dispatch_get_main_queue(), ^(void) {
NSLog(#"%#", error);
});
} else {
[session addInput:newVideoDeviceInput];
}
// Add session output
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
[session addOutput:videoDataOutput];
// Finish session configuration
[session commitConfiguration];
// Start the session
[session startRunning];
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];
/*
* Here you can do whatever you need with the frame (e.g. convert to JPG)
*/
}
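Note that imageFromSampleBuffer: is referenced above but never shown. A minimal OS X sketch of what it might look like, adapted from the UIImage version quoted in a later question on this page; it assumes the kCVPixelFormatType_32BGRA format configured in startSession: above:

- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // BGRA matches kCGBitmapByteOrder32Little + premultiplied-first alpha
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    NSImage *image = [[NSImage alloc] initWithCGImage:quartzImage size:NSMakeSize(width, height)];
    CGImageRelease(quartzImage);
    return image;
}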

Objective-C: delay in opening page after scanning QR code

I'm quite new to iOS development (even though I have a lot of experience with other platforms), and I am facing an issue while trying to implement a QR code reader.
This is the piece of code I wrote:
- (BOOL)startReading {
NSError *error;
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (!input) {
NSLog(@"%@", [error localizedDescription]);
return NO;
}
_qrLB.text=@"Loading....";
_captureSession = [[AVCaptureSession alloc] init];
[_captureSession addInput:input];
AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
[_captureSession addOutput:captureMetadataOutput];
dispatch_queue_t dispatchQueue;
dispatchQueue = dispatch_queue_create("myQueue", NULL);
[captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
[captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];
_videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
[_viewPreview.layer addSublayer:_videoPreviewLayer];
[_captureSession startRunning];
return YES;
}
#pragma mark
#pragma mark Stop reading QR Code
-(void)stopReading{
[_captureSession stopRunning];
_captureSession = nil;
[_videoPreviewLayer removeFromSuperlayer];
}
#pragma mark
#pragma mark Capture QR Code
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
NSLog(#"qrcode1");
if (metadataObjects != nil && [metadataObjects count] > 0) {
AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
[_lblStatus performSelectorOnMainThread:#selector(setText:) withObject:[metadataObj stringValue] waitUntilDone:NO];
[self performSelectorOnMainThread:#selector(stopReading) withObject:nil waitUntilDone:NO];
_isReading = NO;
}
}
NSLog(#"Value of qrcodetext = %#", _lblStatus.text);
ReceiveVC *view = [self.storyboard instantiateViewControllerWithIdentifier:#"ReceiveVC"];
view.imageURL=_lblStatus.text;
view.title = #"";
[self.navigationController pushViewController:view animated:YES];
NSLog(#"qrcode2");
}
When I launch the app on the device (an iPhone 5s) and scan a QR code, this is the output I get:
[screenshot of console output]
It works fine! It reads the value and redirects the user to the page "ReceiveVC" ("ReceiveVC1" is a debug message I print when the page is loading).
The problem is: it takes around 20 seconds (sometimes even more!) from the moment the value is correctly read to the moment the page is opened (check the timestamps of lines 2 and 3 in the log). It seems that the navigationController is waiting for some thread to finish, but I cannot figure out which one, since all the values I need are already there!
Can you help me?
Thank you
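No answer was posted here, but notice that captureOutput:didOutputMetadataObjects:fromConnection: runs on the "myQueue" dispatch queue set up in startReading, so the pushViewController: call executes off the main thread, the same problem the surrounding questions hit. A minimal sketch of the usual fix, moving all the UI work onto the main queue:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
    if (metadataObjects != nil && [metadataObjects count] > 0) {
        AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
        if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
            NSString *value = [metadataObj stringValue];
            _isReading = NO;
            dispatch_async(dispatch_get_main_queue(), ^{
                // All UIKit and navigation work belongs on the main thread
                _lblStatus.text = value;
                [self stopReading];
                ReceiveVC *view = [self.storyboard instantiateViewControllerWithIdentifier:@"ReceiveVC"];
                view.imageURL = value;
                view.title = @"";
                [self.navigationController pushViewController:view animated:YES];
            });
        }
    }
}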

Obj-C: QR reading application runs too slow

I made a QR code reading application with AVFoundation, following a tutorial on this site (an AppCoda tutorial). After reading a QR code, the app shows a UIAlertView, but it takes nearly 2 minutes (sometimes more than 3) to appear. I'll paste the whole ViewController.m file here; I hope it is enough. (The UIAlertView is shown in the captureOutput method.)
//
// ViewController.m
// Yuvio
//
// Created by İhsan Batuğhan YILMAZ on 29/08/15.
// Copyright © 2015 Farabius. All rights reserved.
//
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
@interface ViewController ()
@property (nonatomic) BOOL isReading;
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoPreviewLayer;
-(BOOL)startReading;
-(void) stopReading;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
_isReading=NO;
_captureSession = nil;
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
-(IBAction)startStopReading:(id)sender {
if (!_isReading) {
if ([self startReading]) {
[_btnStart setTitle:@"Stop" forState:UIControlStateNormal];
}
}
else{
[self stopReading];
[_btnStart setTitle:@"Start!" forState:UIControlStateNormal];
}
_isReading = !_isReading;
}
- (BOOL)startReading {
NSError *error;
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (!input) {
NSLog(#"%#", [error localizedDescription]);
return NO;
}
_captureSession = [[AVCaptureSession alloc] init];
[_captureSession addInput:input];
AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
[_captureSession addOutput:captureMetadataOutput];
dispatch_queue_t dispatchQueue;
dispatchQueue = dispatch_queue_create("myQueue", NULL);
[captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
[captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];
_videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[_videoPreviewLayer setFrame:_cameraView.layer.bounds];
[_cameraView.layer addSublayer:_videoPreviewLayer];
[_captureSession startRunning];
return YES;
}
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
if (metadataObjects != nil && [metadataObjects count] > 0) {
AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
UIAlertView *alert = [[UIAlertView alloc]
initWithTitle:@"QR Detected"
message:[metadataObj stringValue]
delegate:self
cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
[alert show];
[self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];
[_btnStart performSelectorOnMainThread:@selector(setTitle:) withObject:@"Start!" waitUntilDone:NO];
_isReading = NO;
}
}
}
-(void)stopReading{
[_captureSession stopRunning];
_captureSession = nil;
[_videoPreviewLayer removeFromSuperlayer];
return;
}
@end
I think the problem is that you're calling UI functions outside the main thread. Try this code:
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
if (metadataObjects != nil && [metadataObjects count] > 0) {
AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
__weak ViewController *weakSelf = self;
dispatch_async(dispatch_get_main_queue(), ^{
[weakSelf processQRCode:metadataObj];
});
}
}
}
-(void)processQRCode:(AVMetadataMachineReadableCodeObject *)codeObject{
UIAlertView *alert = [[UIAlertView alloc]
initWithTitle:@"QR Detected"
message:[codeObject stringValue]
delegate:self
cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
[alert show];
[self stopReading];
[_btnStart setTitle:#"Start!" forState:UIControlStateNormal];
_isReading = NO;
}
I checked your view controller with this fix and it works fast.

Why is the AVCaptureVideoDataOutputSampleBufferDelegate method not called?

I use AVCaptureVideoDataOutput to display the camera image in a preview. My intention is to use two dispatch queues: one for the display session, and one for image processing, so that I can display images on the preview and process them in the background. That is why I use AVCaptureVideoDataOutput and AVCaptureVideoDataOutputSampleBufferDelegate.
AVCaptureVideoDataOutput is properly set up and linked to the AVCaptureSession for display in the preview. But the delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
is never called to retrieve the image and proceed with image processing. What could be wrong with my setup? The code is shown below.
#import "AVCamViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import "AVCamPreviewView.h"
#import "ImageProcessing.h"
static void * CapturingStillImageContext = &CapturingStillImageContext;
static void * RecordingContext = &RecordingContext;
static void * SessionRunningAndDeviceAuthorizedContext = &SessionRunningAndDeviceAuthorizedContext;
@interface AVCamViewController() <AVCaptureVideoDataOutputSampleBufferDelegate>
// For use in the storyboards.
@property (weak, nonatomic) IBOutlet UIButton *cpuButton;
@property (weak, nonatomic) IBOutlet UIButton *gpuButton;
@property (weak, nonatomic) IBOutlet AVCamPreviewView *previewView;
@property (nonatomic, weak) IBOutlet UIButton *cameraButton;
- (IBAction)changeCamera:(id)sender;
- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer;
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer;
// Session management.
@property (nonatomic) dispatch_queue_t sessionQueue; // Communicate with the session and other session objects on this queue.
@property (nonatomic) AVCaptureSession *session;
@property (nonatomic) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic) AVCaptureVideoDataOutput *vid_Output;
// Utilities.
@property (nonatomic) UIBackgroundTaskIdentifier backgroundRecordingID;
@property (nonatomic, getter = isDeviceAuthorized) BOOL deviceAuthorized;
@property (nonatomic, readonly, getter = isSessionRunningAndDeviceAuthorized) BOOL sessionRunningAndDeviceAuthorized;
@property (nonatomic) BOOL lockInterfaceRotation;
@property (nonatomic) id runtimeErrorHandlingObserver;
//fps
//@property int fps;
//Image processing management
@property (nonatomic) dispatch_queue_t im_processingQueue;
@property bool cpu_processing;
@property bool gpu_processing;
@end
@implementation AVCamViewController
- (BOOL)isSessionRunningAndDeviceAuthorized
{
return [[self session] isRunning] && [self isDeviceAuthorized];
}
+ (NSSet *)keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized
{
return [NSSet setWithObjects:@"session.running", @"deviceAuthorized", nil];
}
- (void)viewDidLoad
{
[super viewDidLoad];
// self.fps = 30;
// Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// Setup the preview view
[[self previewView] setSession:session];
// Check for device authorization
[self checkDeviceAuthorizationStatus];
// In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
// Why not do all of this on the main queue?
// -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
//we will use a separate dispatch session not to block the main queue in processing
dispatch_queue_t im_processingQueue = dispatch_queue_create("im_processing queue", DISPATCH_QUEUE_SERIAL);
[self setIm_processingQueue:im_processingQueue];
dispatch_async(sessionQueue, ^{
[self setBackgroundRecordingID:UIBackgroundTaskInvalid];
NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error)
{
NSLog(#"%#", error);
}
if ([session canAddInput:videoDeviceInput])
{
[session addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
dispatch_async(dispatch_get_main_queue(), ^{
// Why are we dispatching this to the main queue?
// Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
// Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
});
}
AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (error)
{
NSLog(#"%#", error);
}
if ([session canAddInput:audioDeviceInput])
{
[session addInput:audioDeviceInput];
}
AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
vid_Output.alwaysDiscardsLateVideoFrames = NO;
if ([session canAddOutput:vid_Output])
{
[session addOutput:vid_Output];
AVCaptureConnection *connection = [vid_Output connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoStabilizationSupported])
[connection setEnablesVideoStabilizationWhenAvailable:YES];
[self setVid_Output:vid_Output];
}
});
}
- (void)viewWillAppear:(BOOL)animated
{
dispatch_async([self sessionQueue], ^{
[self addObserver:self forKeyPath:#"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
[self addObserver:self forKeyPath:#"vid_Output.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
[[NSNotificationCenter defaultCenter] addObserver:self selector:#selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
__weak AVCamViewController *weakSelf = self;
[self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
AVCamViewController *strongSelf = weakSelf;
dispatch_async([strongSelf sessionQueue], ^{
// Manually restarting the session since it must have been stopped due to an error.
[[strongSelf session] startRunning];
});
}]];
[[self session] startRunning];
});
}
- (void)viewDidDisappear:(BOOL)animated
{
dispatch_async([self sessionQueue], ^{
[[self session] stopRunning];
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
[[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];
[self removeObserver:self forKeyPath:#"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
[self removeObserver:self forKeyPath:#"vid_Output.recording" context:RecordingContext];
});
}
- (BOOL)prefersStatusBarHidden
{
return YES;
}
- (BOOL)shouldAutorotate
{
// Disable autorotation of the interface when recording is in progress.
return ![self lockInterfaceRotation];
}
- (NSUInteger)supportedInterfaceOrientations
{
return UIInterfaceOrientationMaskAll;
}
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)toInterfaceOrientation];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if (context == SessionRunningAndDeviceAuthorizedContext)
{
BOOL isRunning = [change[NSKeyValueChangeNewKey] boolValue];
dispatch_async(dispatch_get_main_queue(), ^{
if (isRunning)
{
[[self cameraButton] setEnabled:YES];
}
else
{
[[self cameraButton] setEnabled:NO];
}
});
}
else
{
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
}
#pragma mark Actions
- (IBAction)changeCamera:(id)sender {
[[self cameraButton] setEnabled:NO];
dispatch_async([self sessionQueue], ^{
AVCaptureDevice *currentVideoDevice = [[self videoDeviceInput] device];
AVCaptureDevicePosition preferredPosition = AVCaptureDevicePositionUnspecified;
AVCaptureDevicePosition currentPosition = [currentVideoDevice position];
switch (currentPosition)
{
case AVCaptureDevicePositionUnspecified:
preferredPosition = AVCaptureDevicePositionBack;
break;
case AVCaptureDevicePositionBack:
preferredPosition = AVCaptureDevicePositionFront;
break;
case AVCaptureDevicePositionFront:
preferredPosition = AVCaptureDevicePositionBack;
break;
}
AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:preferredPosition];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
[[self session] beginConfiguration];
[[self session] removeInput:[self videoDeviceInput]];
if ([[self session] canAddInput:videoDeviceInput])
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentVideoDevice];
[AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:videoDevice];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
[[self session] addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
}
else
{
[[self session] addInput:[self videoDeviceInput]];
}
[[self session] commitConfiguration];
dispatch_async(dispatch_get_main_queue(), ^{
[[self cameraButton] setEnabled:YES];
});
});
}
- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer
{
CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]];
[self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
}
- (void)subjectAreaDidChange:(NSNotification *)notification
{
CGPoint devicePoint = CGPointMake(.5, .5);
[self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
#pragma mark Device Configuration
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
dispatch_async([self sessionQueue], ^{
AVCaptureDevice *device = [[self videoDeviceInput] device];
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode])
{
[device setFocusMode:focusMode];
[device setFocusPointOfInterest:point];
}
if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode])
{
[device setExposureMode:exposureMode];
[device setExposurePointOfInterest:point];
}
[device setSubjectAreaChangeMonitoringEnabled:monitorSubjectAreaChange];
[device unlockForConfiguration];
}
else
{
NSLog(#"%#", error);
}
});
}
+ (void)setFlashMode:(AVCaptureFlashMode)flashMode forDevice:(AVCaptureDevice *)device
{
if ([device hasFlash] && [device isFlashModeSupported:flashMode])
{
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
[device setFlashMode:flashMode];
[device unlockForConfiguration];
}
else
{
NSLog(#"%#", error);
}
}
}
+ (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType];
AVCaptureDevice *captureDevice = [devices firstObject];
for (AVCaptureDevice *device in devices)
{
if ([device position] == position)
{
captureDevice = device;
break;
}
}
return captureDevice;
}
#pragma mark UI
- (void)checkDeviceAuthorizationStatus
{
NSString *mediaType = AVMediaTypeVideo;
[AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) {
if (granted)
{
//Granted access to mediaType
[self setDeviceAuthorized:YES];
}
else
{
//Not granted access to mediaType
dispatch_async(dispatch_get_main_queue(), ^{
[[[UIAlertView alloc] initWithTitle:#"AVCam!"
message:#"AVCam doesn't have permission to use Camera, please change privacy settings"
delegate:self
cancelButtonTitle:#"OK"
otherButtonTitles:nil] show];
[self setDeviceAuthorized:NO];
});
}
}];
}
- (IBAction)gpuButtonPress:(id)sender {
self.cpu_processing = false;
self.gpu_processing = true;
}
- (IBAction)cpuButtonPress:(id)sender {
self.cpu_processing = true;
self.gpu_processing = false;
}
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
int frontCameraImageOrientation = UIImageOrientationLeftMirrored;
//int backCameraImageOrientation = UIImageOrientationRight;
UIImage *image = [UIImage imageWithCGImage:quartzImage scale:(CGFloat)1.0 orientation:frontCameraImageOrientation];
//UIImage *image = [UIImage imageWithCGImage:quartzImage];
CGImageRelease(quartzImage);
return (image);
}
#pragma mark Video Data Output Delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
@autoreleasepool {
dispatch_async([self im_processingQueue], ^{
UIImage *img = [self imageFromSampleBuffer:sampleBuffer];
});
//To get back main thread for UI display
dispatch_async(dispatch_get_main_queue(), ^{
});
}
}
@end
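A hedged diagnosis rather than a confirmed answer: viewDidLoad creates vid_Output and adds it to the session, but never registers the view controller as the output's sample-buffer delegate, so AVFoundation has nothing to call. Compare the screen-mirroring answer above, which calls setSampleBufferDelegate:queue: before expecting callbacks. The missing registration, sketched inside the sessionQueue block:

AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
vid_Output.alwaysDiscardsLateVideoFrames = NO;
// Without this call, captureOutput:didOutputSampleBuffer:fromConnection: is never invoked.
[vid_Output setSampleBufferDelegate:self queue:[self im_processingQueue]];
if ([session canAddOutput:vid_Output])
{
    [session addOutput:vid_Output];
    [self setVid_Output:vid_Output];
}

Note also that a CMSampleBufferRef is only guaranteed to be valid for the duration of the delegate callback, so if you keep dispatching it asynchronously to im_processingQueue as in the code above, you would need to CFRetain it before the dispatch and CFRelease it when done.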

Opening camera in half screen and webview in half screen on the same view on iOS

My requirement is that I want to open a webview in the upper half of the iPhone screen and the camera for video recording in the lower half. Is it possible, and if it is, please describe how to achieve this. I have been struggling with it for the past 3 days. Here's how I capture the video:
#import "RecordVideoViewController.h"
@interface RecordVideoViewController ()
@end
@implementation RecordVideoViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
x=1;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)recordAndPlay:(id)sender {
[self startCameraControllerFromViewController:self usingDelegate:self];
}
-(BOOL)startCameraControllerFromViewController:(UIViewController*)controller
usingDelegate:(id<UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate {
// 1 - Validations
if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
|| (delegate == nil)
|| (controller == nil)) {
return NO;
}
// 2 - Get image picker
UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
// Displays a control that allows the user to choose movie capture
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
// Hides the controls for moving & scaling pictures, or for
// trimming movies. To instead show the controls, use YES.
cameraUI.allowsEditing = NO;
cameraUI.delegate = delegate;
//3 - Display image picker
[controller presentViewController:cameraUI animated:YES completion:nil];
return YES;
}
Solved it myself. Here's the code:
// ViewController.m
// AppleVideoCapture
// Copyright (c) 2014 NetProphets. All rights reserved.
#import "ViewController.h"
@interface ViewController (){
AVCaptureSession * session;
AVCaptureMovieFileOutput * output;
}
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
//1. SetUp an AV session
session= [[AVCaptureSession alloc] init];
if ([session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
session.sessionPreset= AVCaptureSessionPresetMedium;
}
//Get the front facing camera as input device
AVCaptureDevice * device= [self frontCamera ];
//Setup the device capture input
NSError * error;
AVCaptureDeviceInput * videoInput= [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (error) {
NSLog(@"Error with video capture...%@",[error description]);
}
else{
if ([session canAddInput:videoInput])
[session addInput:videoInput];
else
NSLog(@"Error adding video input to session");
}
AVCaptureDevice * audioDevice= [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput * audioInput= [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (error) {
NSLog(@"Error with audio input...%@",[error description]);
}
else{
if ([session canAddInput:audioInput])
[session addInput:audioInput];
else
NSLog(@"Error adding audio to session");
}
//Customize and add your customized video capturing layer to the view
CALayer * viewLayer= [self.view layer];
AVCaptureVideoPreviewLayer * previewLayer= [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
previewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;
previewLayer.frame= CGRectMake(0.0f,530.0f,320.0f,-250.0f);
[viewLayer addSublayer:previewLayer];
//Configure the movie output
output= [[AVCaptureMovieFileOutput alloc]init];
if ([session canAddOutput:output]) {
[session addOutput:output];
}
[session startRunning];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)recordVideo:(id)sender {
NSLog(#"Record video called");
NSString * path= [NSString stringWithFormat:#"%#%#",NSTemporaryDirectory(),#"output.mov"];
NSURL * outputUrl= [NSURL fileURLWithPath:path];
NSFileManager * myManager= [NSFileManager defaultManager];
NSError * error;
if ([myManager fileExistsAtPath:path]) {
if ([myManager removeItemAtPath:path error:&error]==NO) {
NSLog(#"File removal at temporary directory failed..%#",[error description]);
}
}
[output startRecordingToOutputFileURL:outputUrl recordingDelegate:self];
}
-(AVCaptureDevice *)frontCamera{
NSArray * devices= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice * device in devices) {
if ([device position]==AVCaptureDevicePositionFront) {
return device;
}
}
return nil;
}
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
NSLog(#"Recording Finished");
ALAssetsLibrary * library=[[ALAssetsLibrary alloc]init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
NSLog(#"Error saving file to photos album");
}
}];
}
}
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
NSLog(@"Recording Started");
}
- (IBAction)stopRecording:(id)sender {
NSLog(@"Stop Recording called");
[output stopRecording];
}
@end
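As for the other half of the original question, nothing special is needed for the webview: since the preview layer above is positioned with an explicit frame, a web view can simply be added with a frame covering the top half of the screen. A hedged sketch, assuming a 320x568-point screen like the frames in the code above (the URL is a placeholder):

// In viewDidLoad, alongside the capture setup above:
UIWebView *webView = [[UIWebView alloc] initWithFrame:CGRectMake(0.0f, 0.0f, 320.0f, 284.0f)];
[webView loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com"]]];
[self.view addSubview:webView];

// And give the preview layer the lower half instead of the negative-height frame used above:
previewLayer.frame = CGRectMake(0.0f, 284.0f, 320.0f, 284.0f);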
