iOS 13 error: Inconsistent state from AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection

I'm developing a command line tool that will take a photo from the camera (programmatically, without modal views or controls) and save it locally on the device. So I'm trying to implement AVCaptureStillImageOutput.
(Yes, I'm aware this has been deprecated since iOS 10, but my goal is to also target earlier system versions, starting with iOS 9.)
So the problem I'm facing is a fatal exception in captureStillImageAsynchronouslyFromConnection that states Inconsistent state:
*** -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:completionHandler:] Inconsistent state
I've seen other SO threads and I think I've implemented all the possible solutions to bypass this error, but none worked, which is why I'm posting this question with my code.
I've tested this code on iOS 9 and it works fine, but the error arises on iOS 13. Could the deprecation be causing the error?
In any case, the program compiles and runs fine up until this specific function. Everything seems fine: the front camera is found, the session is started, videoConnection is active and enabled... Still, captureStillImageAsynchronouslyFromConnection throws the exception.
Please find my code attached below. Any help much appreciated!
Header:
#import <AVFoundation/AVFoundation.h>
#import <Foundation/Foundation.h>
@interface CameraManager : NSObject
@property (retain) AVCaptureStillImageOutput *stillImageOutput;
@property (retain) AVCaptureConnection *videoConnection;
@property (retain) AVCaptureSession *session;
+ (instancetype)sharedInstance;
- (void) takePhoto;
@end
Class implementation:
#import "CameraManager.h"
@interface CameraManager()
@end
@implementation CameraManager
+ (instancetype)sharedInstance {
static CameraManager *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
sharedInstance = [[CameraManager alloc] init];
});
return sharedInstance;
}
-(AVCaptureDevice *) frontFacingCameraIfAvailable{
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionFront){
captureDevice = device;
break;
}
}
if (!captureDevice){
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
-(void) setupCaptureSession {
self.session = [[AVCaptureSession alloc] init];
self.session.sessionPreset = AVCaptureSessionPresetMedium;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
NSError *error = nil;
AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
NSLog(#"ERROR: trying to open camera: %#", error);
}
[self.session addInput:input];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
[self.session addOutput:self.stillImageOutput];
[self.session startRunning];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(sessionWasInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:self.session];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(sessionInterruptionEnded:) name:AVCaptureSessionInterruptionEndedNotification object:self.session];
}
- (void) sessionWasInterrupted:(NSNotification*)notification
{
AVCaptureSessionInterruptionReason reason = [notification.userInfo[AVCaptureSessionInterruptionReasonKey] integerValue];
NSLog(#"Capture session was interrupted with reason %ld", (long)reason);
}
- (void) sessionInterruptionEnded:(NSNotification*)notification
{
NSLog(#"sessionInterruptionEnded");
}
-(void) takePhoto {
[self setupCaptureSession];
self.videoConnection = nil;
for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
self.videoConnection = connection;
break;
}
}
if (self.videoConnection) { break; }
}
NSLog(#"about to request a capture from: %#", self.stillImageOutput);
[self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
if (!self.videoConnection || !self.videoConnection.enabled || !self.videoConnection.active) {
NSLog(#"videoConnection not enabled or not active");
return;
} else {
NSLog(#"videoConnection enabled: %hhd, active: %hhd", (char)self.videoConnection.enabled, (char)self.videoConnection.active);
}
// NSLog(#"videoConnection: %#", self.videoConnection);
NSError *error;
if (error) {
NSLog(#"Error %#", error);
}
#try {
// here captureStillImageAsynchronouslyFromConnection crash with:
// *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVCaptureStillImageOutput
captureStillImageAsynchronouslyFromConnection:completionHandler:] Inconsistent state'
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:self.videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
NSLog(#"Here should have imageData %#", imageData);
}];
}
#catch (NSException *exception) {
NSLog(#"EXCEPTION %#", exception.reason);
}
#finally {
NSLog(#"Continue..");
}
}
#end
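Side note for anyone who only needs iOS 9 as a fallback: on iOS 10 and later the non-deprecated replacement for AVCaptureStillImageOutput is AVCapturePhotoOutput, and it can live behind an @available check so the old path is only used on iOS 9. A rough sketch follows (the photoOutput property and the method names are my own additions, not part of the class above):

@interface CameraManager () <AVCapturePhotoCaptureDelegate>
// Hypothetical extra property; not in the original header.
@property (strong) AVCapturePhotoOutput *photoOutput;
@end

// In setupCaptureSession, alongside (or instead of) stillImageOutput:
if (@available(iOS 10.0, *)) {
    self.photoOutput = [[AVCapturePhotoOutput alloc] init];
    if ([self.session canAddOutput:self.photoOutput]) {
        [self.session addOutput:self.photoOutput];
    }
}

// Capturing with the new API:
- (void)takePhotoWithPhotoOutput {
    if (@available(iOS 10.0, *)) {
        AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings]; // JPEG by default
        [self.photoOutput capturePhotoWithSettings:settings delegate:self];
    }
}

// Delegate callback (iOS 11+ signature; iOS 10 uses the sample-buffer variant).
// If the deployment target is below iOS 11, mark this method API_AVAILABLE(ios(11.0)) in the declaration.
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    NSData *imageData = [photo fileDataRepresentation];
    NSLog(@"Captured %lu bytes of JPEG data", (unsigned long)imageData.length);
}

This is only a sketch of the API shape, not a drop-in fix for the Inconsistent state exception itself.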

Related

Why didn't the autofocus work?

I am developing a camera-related app. I have succeeded in capturing the camera preview. However, when I try to enable autofocus in my app, it doesn't work. I tried both AVCaptureFocusModeContinuousAutoFocus and AVCaptureFocusModeAutoFocus, and neither of them worked.
By the way, I tested on an iPhone 6s.
my ViewController.h file
#import <UIKit/UIKit.h>
#include <AVFoundation/AVFoundation.h>
@interface ViewController : UIViewController
{
AVCaptureSession *cameraCaptureSession;
AVCaptureVideoPreviewLayer *cameraPreviewLayer;
}
- (void) initializeCaptureSession;
@end
my ViewController.m file
#import "ViewController.h"
@interface ViewController ()
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
[self initializeCaptureSession];
}
- (void) initializeCaptureSession
{
//Attempt to initialize AVCaptureDevice with back camera
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices){
if (device.position == AVCaptureDevicePositionBack)
{
captureDevice = device;
break;
}
}
//If camera is accessible by capture session
if (captureDevice)
{
if ([captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]){
NSError *error;
if ([captureDevice lockForConfiguration:&error]){
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
[captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
[captureDevice unlockForConfiguration];
}else{
NSLog(#"Error: %#", error);
}
}
//Allocate camera capture session
cameraCaptureSession = [[AVCaptureSession alloc] init];
cameraCaptureSession.sessionPreset = AVCaptureSessionPresetMedium;
//Configure capture session input
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
[cameraCaptureSession addInput:videoIn];
//Configure capture session output
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
[videoOut setAlwaysDiscardsLateVideoFrames:YES];
[cameraCaptureSession addOutput:videoOut];
//Bind preview layer to capture session data
cameraPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:cameraCaptureSession];
CGRect layerRect = self.view.bounds;
cameraPreviewLayer.bounds = self.view.bounds;
cameraPreviewLayer.position = CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect));
cameraPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
//Add preview layer to UIView layer
[self.view.layer addSublayer:cameraPreviewLayer];
//Begin camera capture
[cameraCaptureSession startRunning];
}
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
[cameraCaptureSession stopRunning];
}
@end
Here is my code in Swift. It's tested on my iPhone 5s and works only with the back camera; I don't know about the iPhone 6s.
if let device = captureDevice {
try! device.lockForConfiguration()
if device.isFocusModeSupported(.autoFocus) {
device.focusMode = .autoFocus
}
device.unlockForConfiguration()
}
I think you need to remove
[captureDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
Try something like this:
if ([self.session canAddInput:videoDeviceInput]) {
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentDevice];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
[self.session addInput:videoDeviceInput];
self.videoDeviceInput = videoDeviceInput;
}
Then:
- (void)subjectAreaDidChange:(NSNotification *)notification
{
CGPoint devicePoint = CGPointMake( 0.5, 0.5 );
[self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
And Then:
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
dispatch_async( self.sessionQueue, ^{
AVCaptureDevice *device = self.videoDeviceInput.device;
NSError *error = nil;
if ( [device lockForConfiguration:&error] ) {
// Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
// Call -set(Focus/Exposure)Mode: to apply the new point of interest.
if ( device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode] ) {
device.focusPointOfInterest = point;
device.focusMode = focusMode;
}
if ( device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode] ) {
device.exposurePointOfInterest = point;
device.exposureMode = exposureMode;
}
device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
[device unlockForConfiguration];
}
else {
NSLog( #"Could not lock device for configuration: %#", error );
}
} );
}
This is code from a project I worked on. Feel free to ask if you have any doubts.
AVCaptureSession *captureSession = [[AVCaptureSession alloc]init];
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *err = nil;
AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error: &err];
if (err == nil){
if ([captureSession canAddInput:videoIn]){
[captureSession addInput:videoIn];
}else{
NSLog(#"Failed to add video input");
}
}else{
NSLog(#"Error: %#",err);
}
[captureSession startRunning];
previewLayer.frame = [self.view bounds];
[self.view.layer addSublayer:previewLayer];
I am not sure what happened, but when I use the code above the camera gives me a clear preview.

How to mirror iOS screen via USB?

I'm trying to mirror iOS device screen via USB connection to OSX. QuickTime does this fine, and I read this article with a code example: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/
However, the callback of CMIOStreamCopyBufferQueue is never called, and I'm wondering what I'm doing wrong.
Has anyone faced this issue and can provide a working example?
Thanks.
Well... eventually I did what Nadav suggests in his blog: discover DAL devices and capture their output using AVCaptureSession, like this:
-(id) init {
// Allow iOS Devices Discovery
CMIOObjectPropertyAddress prop =
{ kCMIOHardwarePropertyAllowScreenCaptureDevices,
kCMIOObjectPropertyScopeGlobal,
kCMIOObjectPropertyElementMaster };
UInt32 allow = 1;
CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
&prop, 0, NULL,
sizeof(allow), &allow );
// Get devices
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
BOOL deviceAttached = false;
for (int i = 0; i < [devices count]; i++) {
AVCaptureDevice *device = devices[i];
if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
deviceAttached = true;
[self startSession:device];
break;
}
}
NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
// Device not attached - subscribe to onConnect notifications
if (!deviceAttached) {
id deviceWasConnectedObserver = [notificationCenter addObserverForName:AVCaptureDeviceWasConnectedNotification
object:nil
queue:[NSOperationQueue mainQueue]
usingBlock:^(NSNotification *note) {
AVCaptureDevice *device = note.object;
[self deviceConnected:device];
}];
observers = [[NSArray alloc] initWithObjects:deviceWasConnectedObserver, nil];
}
return self;
}
- (void) deviceConnected:(AVCaptureDevice *)device {
if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
[self startSession:device];
}
}
- (void) startSession:(AVCaptureDevice *)device {
// Init capturing session
session = [[AVCaptureSession alloc] init];
// Start session configuration
[session beginConfiguration];
// Add session input
NSError *error;
newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (newVideoDeviceInput == nil) {
dispatch_async(dispatch_get_main_queue(), ^(void) {
NSLog(#"%#", error);
});
} else {
[session addInput:newVideoDeviceInput];
}
// Add session output
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
[session addOutput:videoDataOutput];
// Finish session configuration
[session commitConfiguration];
// Start the session
[session startRunning];
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];
/*
* Here you can do whatever you need with the frame (e.g. convert to JPG)
*/
}
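The imageFromSampleBuffer: helper isn't shown in this answer; a possible macOS implementation (my own assumption, using Core Image to wrap the BGRA pixel buffer configured in startSession) might look like this:

#import <Cocoa/Cocoa.h>
#import <CoreImage/CoreImage.h>
#import <CoreMedia/CoreMedia.h>

- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get the pixel buffer that backs this video frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return nil;
    }
    // Wrap it in a CIImage and bridge to NSImage via an image rep.
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:pixelBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];
    return image;
}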

Why AVCaptureVideoDataOutputSampleBufferDelegate method is not called

I use AVCaptureVideoDataOutput to display the camera image in a preview. My intention is to use two dispatch queues: one for the display session and another for image processing. That way I can display images in the preview and do the image processing in the background. That is why I use AVCaptureVideoDataOutput and AVCaptureVideoDataOutputSampleBufferDelegate.
AVCaptureVideoDataOutput is properly set up and linked to the AVCaptureSession for the preview display. But the delegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
never gets called to retrieve the image and proceed with image processing. What could be wrong with my setup? The code is shown below.
#import "AVCamViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import "AVCamPreviewView.h"
#import "ImageProcessing.h"
static void * CapturingStillImageContext = &CapturingStillImageContext;
static void * RecordingContext = &RecordingContext;
static void * SessionRunningAndDeviceAuthorizedContext = &SessionRunningAndDeviceAuthorizedContext;
@interface AVCamViewController() <AVCaptureVideoDataOutputSampleBufferDelegate>
// For use in the storyboards.
@property (weak, nonatomic) IBOutlet UIButton *cpuButton;
@property (weak, nonatomic) IBOutlet UIButton *gpuButton;
@property (weak, nonatomic) IBOutlet AVCamPreviewView *previewView;
@property (nonatomic, weak) IBOutlet UIButton *cameraButton;
- (IBAction)changeCamera:(id)sender;
- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer;
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer;
// Session management.
@property (nonatomic) dispatch_queue_t sessionQueue; // Communicate with the session and other session objects on this queue.
@property (nonatomic) AVCaptureSession *session;
@property (nonatomic) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic) AVCaptureVideoDataOutput *vid_Output;
// Utilities.
@property (nonatomic) UIBackgroundTaskIdentifier backgroundRecordingID;
@property (nonatomic, getter = isDeviceAuthorized) BOOL deviceAuthorized;
@property (nonatomic, readonly, getter = isSessionRunningAndDeviceAuthorized) BOOL sessionRunningAndDeviceAuthorized;
@property (nonatomic) BOOL lockInterfaceRotation;
@property (nonatomic) id runtimeErrorHandlingObserver;
//fps
//@property int fps;
//Image processing management
@property (nonatomic) dispatch_queue_t im_processingQueue;
@property bool cpu_processing;
@property bool gpu_processing;
@end
@implementation AVCamViewController
- (BOOL)isSessionRunningAndDeviceAuthorized
{
return [[self session] isRunning] && [self isDeviceAuthorized];
}
+ (NSSet *)keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized
{
return [NSSet setWithObjects:@"session.running", @"deviceAuthorized", nil];
}
- (void)viewDidLoad
{
[super viewDidLoad];
// self.fps = 30;
// Create the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[self setSession:session];
// Setup the preview view
[[self previewView] setSession:session];
// Check for device authorization
[self checkDeviceAuthorizationStatus];
// In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
// Why not do all of this on the main queue?
// -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
[self setSessionQueue:sessionQueue];
//we will use a separate dispatch session not to block the main queue in processing
dispatch_queue_t im_processingQueue = dispatch_queue_create("im_processing queue", DISPATCH_QUEUE_SERIAL);
[self setIm_processingQueue:im_processingQueue];
dispatch_async(sessionQueue, ^{
[self setBackgroundRecordingID:UIBackgroundTaskInvalid];
NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error)
{
NSLog(#"%#", error);
}
if ([session canAddInput:videoDeviceInput])
{
[session addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
dispatch_async(dispatch_get_main_queue(), ^{
// Why are we dispatching this to the main queue?
// Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
// Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
});
}
AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (error)
{
NSLog(#"%#", error);
}
if ([session canAddInput:audioDeviceInput])
{
[session addInput:audioDeviceInput];
}
AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
vid_Output.alwaysDiscardsLateVideoFrames = NO;
if ([session canAddOutput:vid_Output])
{
[session addOutput:vid_Output];
AVCaptureConnection *connection = [vid_Output connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoStabilizationSupported])
[connection setEnablesVideoStabilizationWhenAvailable:YES];
[self setVid_Output:vid_Output];
}
});
}
- (void)viewWillAppear:(BOOL)animated
{
dispatch_async([self sessionQueue], ^{
[self addObserver:self forKeyPath:#"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
[self addObserver:self forKeyPath:#"vid_Output.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
[[NSNotificationCenter defaultCenter] addObserver:self selector:#selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
__weak AVCamViewController *weakSelf = self;
[self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
AVCamViewController *strongSelf = weakSelf;
dispatch_async([strongSelf sessionQueue], ^{
// Manually restarting the session since it must have been stopped due to an error.
[[strongSelf session] startRunning];
});
}]];
[[self session] startRunning];
});
}
- (void)viewDidDisappear:(BOOL)animated
{
dispatch_async([self sessionQueue], ^{
[[self session] stopRunning];
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
[[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];
[self removeObserver:self forKeyPath:#"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
[self removeObserver:self forKeyPath:#"vid_Output.recording" context:RecordingContext];
});
}
- (BOOL)prefersStatusBarHidden
{
return YES;
}
- (BOOL)shouldAutorotate
{
// Disable autorotation of the interface when recording is in progress.
return ![self lockInterfaceRotation];
}
- (NSUInteger)supportedInterfaceOrientations
{
return UIInterfaceOrientationMaskAll;
}
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)toInterfaceOrientation];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if (context == SessionRunningAndDeviceAuthorizedContext)
{
BOOL isRunning = [change[NSKeyValueChangeNewKey] boolValue];
dispatch_async(dispatch_get_main_queue(), ^{
if (isRunning)
{
[[self cameraButton] setEnabled:YES];
}
else
{
[[self cameraButton] setEnabled:NO];
}
});
}
else
{
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
}
#pragma mark Actions
- (IBAction)changeCamera:(id)sender {
[[self cameraButton] setEnabled:NO];
dispatch_async([self sessionQueue], ^{
AVCaptureDevice *currentVideoDevice = [[self videoDeviceInput] device];
AVCaptureDevicePosition preferredPosition = AVCaptureDevicePositionUnspecified;
AVCaptureDevicePosition currentPosition = [currentVideoDevice position];
switch (currentPosition)
{
case AVCaptureDevicePositionUnspecified:
preferredPosition = AVCaptureDevicePositionBack;
break;
case AVCaptureDevicePositionBack:
preferredPosition = AVCaptureDevicePositionFront;
break;
case AVCaptureDevicePositionFront:
preferredPosition = AVCaptureDevicePositionBack;
break;
}
AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:preferredPosition];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
[[self session] beginConfiguration];
[[self session] removeInput:[self videoDeviceInput]];
if ([[self session] canAddInput:videoDeviceInput])
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentVideoDevice];
[AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:videoDevice];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];
[[self session] addInput:videoDeviceInput];
[self setVideoDeviceInput:videoDeviceInput];
}
else
{
[[self session] addInput:[self videoDeviceInput]];
}
[[self session] commitConfiguration];
dispatch_async(dispatch_get_main_queue(), ^{
[[self cameraButton] setEnabled:YES];
});
});
}
- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer
{
CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]];
[self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
}
- (void)subjectAreaDidChange:(NSNotification *)notification
{
CGPoint devicePoint = CGPointMake(.5, .5);
[self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
}
#pragma mark Device Configuration
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
dispatch_async([self sessionQueue], ^{
AVCaptureDevice *device = [[self videoDeviceInput] device];
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode])
{
[device setFocusMode:focusMode];
[device setFocusPointOfInterest:point];
}
if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode])
{
[device setExposureMode:exposureMode];
[device setExposurePointOfInterest:point];
}
[device setSubjectAreaChangeMonitoringEnabled:monitorSubjectAreaChange];
[device unlockForConfiguration];
}
else
{
NSLog(#"%#", error);
}
});
}
+ (void)setFlashMode:(AVCaptureFlashMode)flashMode forDevice:(AVCaptureDevice *)device
{
if ([device hasFlash] && [device isFlashModeSupported:flashMode])
{
NSError *error = nil;
if ([device lockForConfiguration:&error])
{
[device setFlashMode:flashMode];
[device unlockForConfiguration];
}
else
{
NSLog(#"%#", error);
}
}
}
+ (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType];
AVCaptureDevice *captureDevice = [devices firstObject];
for (AVCaptureDevice *device in devices)
{
if ([device position] == position)
{
captureDevice = device;
break;
}
}
return captureDevice;
}
#pragma mark UI
- (void)checkDeviceAuthorizationStatus
{
NSString *mediaType = AVMediaTypeVideo;
[AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) {
if (granted)
{
//Granted access to mediaType
[self setDeviceAuthorized:YES];
}
else
{
//Not granted access to mediaType
dispatch_async(dispatch_get_main_queue(), ^{
[[[UIAlertView alloc] initWithTitle:@"AVCam!"
message:@"AVCam doesn't have permission to use Camera, please change privacy settings"
delegate:self
cancelButtonTitle:@"OK"
otherButtonTitles:nil] show];
[self setDeviceAuthorized:NO];
});
}
}];
}
- (IBAction)gpuButtonPress:(id)sender {
self.cpu_processing = false;
self.gpu_processing = true;
}
- (IBAction)cpuButtonPress:(id)sender {
self.cpu_processing = true;
self.gpu_processing = false;
}
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
int frontCameraImageOrientation = UIImageOrientationLeftMirrored;
//int backCameraImageOrientation = UIImageOrientationRight;
UIImage *image = [UIImage imageWithCGImage:quartzImage scale:(CGFloat)1.0 orientation:frontCameraImageOrientation];
//UIImage *image = [UIImage imageWithCGImage:quartzImage];
CGImageRelease(quartzImage);
return (image);
}
#pragma mark File Output Delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
@autoreleasepool {
dispatch_async([self im_processingQueue], ^{
UIImage *img = [self imageFromSampleBuffer:sampleBuffer];
});
//To get back main thread for UI display
dispatch_async(dispatch_get_main_queue(), ^{
});
}
}
@end
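One detail that stands out in the code above: vid_Output is created and added to the session, but setSampleBufferDelegate:queue: never appears to be called on it, and without a delegate and queue the captureOutput:didOutputSampleBuffer:fromConnection: callback is never delivered. A sketch of the missing call, reusing the im_processingQueue already created in viewDidLoad (it may also be worth setting videoSettings to kCVPixelFormatType_32BGRA, since imageFromSampleBuffer: assumes a BGRA buffer):

AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
vid_Output.alwaysDiscardsLateVideoFrames = NO;
// The delegate must be set for didOutputSampleBuffer to fire;
// the queue must be serial (im_processingQueue was created as DISPATCH_QUEUE_SERIAL).
[vid_Output setSampleBufferDelegate:self queue:[self im_processingQueue]];
if ([session canAddOutput:vid_Output])
{
    [session addOutput:vid_Output];
    [self setVid_Output:vid_Output];
}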

Opening camera in half screen and webview in half screen on same view on ios

My requirement is to open a web view in the upper half of the iPhone screen and the camera for video recording in the lower half. Is this possible, and if it is, please describe how to achieve it. I have been struggling with this for the past 3 days. Here's how I capture the video:
#import "RecordVideoViewController.h"
@interface RecordVideoViewController ()
@end
@implementation RecordVideoViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
x=1;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)recordAndPlay:(id)sender {
[self startCameraControllerFromViewController:self usingDelegate:self];
}
-(BOOL)startCameraControllerFromViewController:(UIViewController*)controller
usingDelegate:(id <UIImagePickerControllerDelegate, UINavigationControllerDelegate>)delegate {
// 1 - Validations
if (([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
|| (delegate == nil)
|| (controller == nil)) {
return NO;
}
// 2 - Get image picker
UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
// Displays a control that allows the user to choose movie capture
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
// Hides the controls for moving & scaling pictures, or for
// trimming movies. To instead show the controls, use YES.
cameraUI.allowsEditing = NO;
cameraUI.delegate = delegate;
//3 - Display image picker
[controller presentViewController:cameraUI animated:YES completion:nil];
return YES;
}
Solved it myself. Here's the code:
// ViewController.m
// AppleVideoCapture
// Copyright (c) 2014 NetProphets. All rights reserved.
#import "ViewController.h"
@interface ViewController (){
AVCaptureSession * session;
AVCaptureMovieFileOutput * output;
}
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
//1. SetUp an AV session
session= [[AVCaptureSession alloc] init];
if ([session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
session.sessionPreset= AVCaptureSessionPresetMedium;
}
//Get the front facing camera as input device
AVCaptureDevice * device= [self frontCamera ];
//Setup the device capture input
NSError * error;
AVCaptureDeviceInput * videoInput= [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (error) {
NSLog(#"Error with video capture...%#",[error description]);
}
else{
if ([session canAddInput:videoInput])
[session addInput:videoInput];
else
NSLog(#"Error adding video input to session");
}
AVCaptureDevice * audioDevice= [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput * audioInput= [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (error) {
NSLog(#"Error with audio input...%#",[error description]);
}
else{
if ([session canAddInput:audioInput])
[session addInput:audioInput];
else
NSLog(#"Error adding audio to session");
}
//Customize and add your customized video capturing layer to the view
CALayer * viewLayer= [self.view layer];
AVCaptureVideoPreviewLayer * previewLayer= [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
previewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;
previewLayer.frame= CGRectMake(0.0f,530.0f,320.0f,-250.0f);
[viewLayer addSublayer:previewLayer];
//Configure the movie output
output= [[AVCaptureMovieFileOutput alloc]init];
if ([session canAddOutput:output]) {
[session addOutput:output];
}
[session startRunning];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)recordVideo:(id)sender {
NSLog(#"Record video called");
NSString * path= [NSString stringWithFormat:#"%#%#",NSTemporaryDirectory(),#"output.mov"];
NSURL * outputUrl= [NSURL fileURLWithPath:path];
NSFileManager * myManager= [NSFileManager defaultManager];
NSError * error;
if ([myManager fileExistsAtPath:path]) {
if ([myManager removeItemAtPath:path error:&error]==NO) {
NSLog(#"File removal at temporary directory failed..%#",[error description]);
}
}
[output startRecordingToOutputFileURL:outputUrl recordingDelegate:self];
}
-(AVCaptureDevice *)frontCamera{
NSArray * devices= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice * device in devices) {
if ([device position]==AVCaptureDevicePositionFront) {
return device;
}
}
return nil;
}
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
NSLog(#"Recording Finished");
ALAssetsLibrary * library=[[ALAssetsLibrary alloc]init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
NSLog(#"Error saving file to photos album");
}
}];
}
}
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
NSLog(#"Recording Started");
}
- (IBAction)stopRecording:(id)sender {
NSLog(#"Stop Recording called");
[output stopRecording];
}
@end
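To cover the other half of the original requirement (web view on top, camera preview on the bottom), one approach is simply to size the two frames to half the screen each. A rough sketch, assuming WKWebView and the previewLayer from the answer above (the URL is just a placeholder):

#import <WebKit/WebKit.h>

// In viewDidLoad, after configuring the capture session:
CGRect bounds = self.view.bounds;
CGRect topHalf = CGRectMake(0, 0, bounds.size.width, bounds.size.height / 2.0);
CGRect bottomHalf = CGRectMake(0, bounds.size.height / 2.0, bounds.size.width, bounds.size.height / 2.0);

// Web view in the upper half.
WKWebView *webView = [[WKWebView alloc] initWithFrame:topHalf];
[webView loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com"]]];
[self.view addSubview:webView];

// Camera preview layer in the lower half.
previewLayer.frame = bottomHalf;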

Capture images in the background?

I'm trying to capture images in the background from the camera without loading a camera or preview interface.
In my app photos are taken in the background with no preview screen, just the normal app screen and then shown to the user later.
Can someone please point me in the right direction?
You have to use AVCaptureSession & AVCaptureDeviceInput.
This is part of the code; it may help you:
@interface MyViewController : UIViewController
{
AVCaptureStillImageOutput *_output;
AVCaptureConnection *_videoConnection;
bool _isCaptureSessionStarted;
}
@property (retain, nonatomic) AVCaptureDevice *frontalCamera;
- (void)takePhoto;
Implementation:
@interface MyViewController ()
@end
@implementation MyViewController
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self) {
_isCaptureSessionStarted = false;
}
return self;
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Finding frontal camera
NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (int i = 0; i < cameras.count; i++) {
AVCaptureDevice *camera = [cameras objectAtIndex:i];
if (camera.position == AVCaptureDevicePositionFront) {
self.frontalCamera = camera;
[self.frontalCamera addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:nil];
[self.frontalCamera addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:nil];
}
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if (!self.frontalCamera.adjustingExposure && !self.frontalCamera.adjustingWhiteBalance) {
if (_isCaptureSessionStarted) {
[self captureStillImage];
}
}
}
- (void)takePhoto
{
if (self.frontalCamera != nil) {
// Add camera to session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:self.frontalCamera error:&error];
if (!error && [session canAddInput:input]) {
[session addInput:input];
// Capture still image
_output = [[AVCaptureStillImageOutput alloc] init];
// Captured image settings
[_output setOutputSettings:[[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];
if ([session canAddOutput:_output]) {
[session addOutput:_output];
_videoConnection = nil;
for (AVCaptureConnection *connection in _output.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
_videoConnection = connection;
break;
}
}
if (_videoConnection) {
break;
}
}
if (_videoConnection) {
[session startRunning];
NSLock *lock = [[[NSLock alloc] init] autorelease];
[lock lock];
_isCaptureSessionStarted = true;
[lock unlock];
}
}
} else {
NSLog(#"%#",[error localizedDescription]);
}
}
}
- (void) captureStillImage
{
[_output captureStillImageAsynchronouslyFromConnection:_videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
NSLock *lock = [[[NSLock alloc] init] autorelease];
[lock lock];
_isCaptureSessionStarted = false;
[lock unlock];
if (imageDataSampleBuffer != NULL) {
NSData *bitmap = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
// You can get image here via [[UIImage alloc] initWithData:bitmap]
}
}];
}