AVCaptureVideoPreviewLayer is pink on iPhone XR

We have an AVCaptureVideoPreviewLayer that should display the camera image.
On the iPhone X (and iPhone 7), the image is displayed correctly.
However, on the iPhone XR it is displayed in pink.
Here is the code I use to set up the camera session:
UIView* scanView = [[UIView alloc]initWithFrame:[self view].frame];
[self setCameraScanView:scanView];
[[self cameraScanView]setContentMode:UIViewContentModeScaleAspectFit];
[self setScanCaptureSession:[[AVCaptureSession alloc] init]];
[[self scanCaptureSession]setSessionPreset:AVCaptureSessionPresetLow];
AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureVideoDataOutput* videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[[videoOutput connectionWithMediaType:AVMediaTypeVideo]setEnabled:YES];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setSampleBufferDelegate:self
queue:dispatch_get_main_queue()];
[[self scanCaptureSession]addOutput:videoOutput];
NSError* error;
AVCaptureDeviceInput* input = [[AVCaptureDeviceInput alloc]
initWithDevice:camera error:&error];
[[self scanCaptureSession]addInput:input];
if(error) {
NSLog(@"There was an error setting up the device input!");
}
AVCaptureVideoPreviewLayer* previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:[self scanCaptureSession]];
[self setCameraLayer:previewLayer];
[[self cameraLayer] setContentsFormat:kCAContentsFormatRGBA16Float];
[[self cameraLayer] setOpaque:YES];
CALayer* rootLayer = [[self cameraScanView]layer];
[[self cameraLayer]setFrame:[self cameraScanView].bounds];
[[self cameraLayer]setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[rootLayer addSublayer:[self cameraLayer]];
[[self scanCaptureSession]commitConfiguration];
[[self view] addSubview:[self cameraScanView]];
Here is an image of how it looks on the iPhone XR.

Related

How to show same camera video in two views

I am trying to show the same camera video in two different views; however, I only get the video in one view. Could you help? The code is below:
-(void) showCameraPreview{
self.camerPreviewCaptureSession =[[AVCaptureSession alloc] init];
self.camerPreviewCaptureSession.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *videoInput1 = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[self.camerPreviewCaptureSession addInput:videoInput1];
AVCaptureVideoPreviewLayer *newCaptureVideoViewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.camerPreviewCaptureSession];
newCaptureVideoViewLayer.frame = self.viewPreview.bounds;
newCaptureVideoViewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[newCaptureVideoViewLayer setFrame:CGRectMake(0.0, 0.0, self.viewPreview.bounds.size.width, self.viewPreview.bounds.size.height )];
AVCaptureVideoPreviewLayer *newCameraViewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.camerPreviewCaptureSession];
newCameraViewLayer.frame = self.viewPreview1.bounds;
newCameraViewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[newCameraViewLayer setFrame:CGRectMake(0.0, 0.0, self.viewPreview1.bounds.size.width, self.viewPreview1.bounds.size.height )];
[self.viewPreview1.layer addSublayer:newCameraViewLayer];
[self.viewPreview.layer addSublayer:newCaptureVideoViewLayer];
[self.camerPreviewCaptureSession startRunning];
}

AVCaptureMetadataOutput Inverse Colors

I am making an app that scans a barcode with inverted colors (black background and white bars). I have to use AVFoundation. Currently I am using AVCaptureMetadataOutput, and I can get it to work perfectly with a normal barcode. I need to invert the colors (white -> black and black -> white). Can I add a CIColorInvert filter to the input in the AVCaptureSession?
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view from its nib.
mCaptureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
if([mCaptureSession canAddInput:videoInput]){
[mCaptureSession addInput:videoInput];
} else {
NSLog(@"Could not add video input: %@", [error localizedDescription]);
}
// set up metadata output and this class as its delegate so that if metadata (barcode 39) is detected it will send the data to this class
AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
if([mCaptureSession canAddOutput:metadataOutput]){
[mCaptureSession addOutput:metadataOutput];
[metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[metadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeCode39Code]];
} else {
NSLog(@"Could not add metadata output");
}
// sets up what the camera sees as a layer of the view
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:mCaptureSession];
//CGRect frame = CGRectMake(0.0 - 50, 0.0, 1024.0, 1024.0 + 720.0);
CGRect bounds=self.view.layer.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.bounds=bounds;
previewLayer.position=CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
NSArray *filters = [[NSArray alloc] initWithObjects:[CIFilter filterWithName:@"CIColorInvert"], nil];
[previewLayer setFilters:filters];
//[previewLayer setFrame:self.view.bounds];
[self.view.layer addSublayer:previewLayer];
//starts the camera session
[mCaptureSession startRunning];
}

AVCapture doesn't fill whole view

I'm trying to display live camera video in a view, but when I run the following code, it doesn't fill the whole screen. It only fills a 3.5-inch area (running on a 5.5-inch screen). I have set the view's Auto Layout constraints to 0 on each side.
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
[session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = frameForCapture.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImage = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImage setOutputSettings:outputSettings];
[session addOutput:stillImage];
[session startRunning];
When I change the view controller's size from 3.5-inch to 5.5-inch and run, it fills the screen. Auto Layout works when the size changes, but not on the initial run.
Layers don't adhere to autoresizing/Auto Layout the way views do; you'll need to set the layer's frame manually when the view resizes. A nice way to do this is to introduce a UIView subclass that contains the preview layer and sets the layer's frame to the view's bounds in -layoutSubviews.
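As a minimal sketch of that approach (the class name and property name here are illustrative, not from the original answer):

```objectivec
// Minimal preview-hosting view. Assumes @import UIKit; @import AVFoundation;
@interface CapturePreviewView : UIView
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation CapturePreviewView
- (void)layoutSubviews {
    [super layoutSubviews];
    // CALayers don't participate in Auto Layout, so match the layer's
    // frame to the view's bounds every time the view is laid out.
    [CATransaction begin];
    [CATransaction setDisableActions:YES]; // avoid implicit resize animation
    self.previewLayer.frame = self.bounds;
    [CATransaction commit];
}
@end
```

Add the preview layer as a sublayer of this view's layer once, and the frame then stays in sync through rotation and any Auto Layout-driven resize.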

Barcode Scanning in iOS 7

I've written an application that takes advantage of the new AVCaptureMetadataOutput APIs in iOS 7 for barcode scanning.
I have the following code in one of my view controllers:
- (void)viewDidLoad
{
[super viewDidLoad];
highlightView = [[UIView alloc] init];
[highlightView setAutoresizingMask:UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin];
[[highlightView layer] setBorderColor:[[UIColor greenColor] CGColor]];
[[highlightView layer] setBorderWidth:3.0];
[[self view] addSubview:highlightView];
session = [[AVCaptureSession alloc] init];
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
[session addInput:input];
output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
[output setMetadataObjectTypes:[output availableMetadataObjectTypes]];
[output setRectOfInterest:[[self view] bounds]];
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
[previewLayer setFrame:[[self view] bounds]];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
if ([[previewLayer connection] isVideoOrientationSupported]) {
[[previewLayer connection] setVideoOrientation:(AVCaptureVideoOrientation)[[UIApplication sharedApplication] statusBarOrientation]];
}
[[[self view] layer] insertSublayer:previewLayer above:[[self view] layer]];
[session startRunning];
[[self view] bringSubviewToFront:highlightView];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
CGRect highlightViewRect = CGRectZero;
AVMetadataMachineReadableCodeObject *barcode;
NSArray *barCodeTypes = @[AVMetadataObjectTypeUPCECode, AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode39Mod43Code, AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode93Code, AVMetadataObjectTypeCode128Code, AVMetadataObjectTypePDF417Code, AVMetadataObjectTypeQRCode, AVMetadataObjectTypeAztecCode];
for (AVMetadataObject *metadata in metadataObjects) {
if ([barCodeTypes containsObject:[metadata type]]) {
barcode = (AVMetadataMachineReadableCodeObject *)[previewLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
highlightViewRect = [barcode bounds];
break;
}
}
[highlightView setFrame:highlightViewRect];
[delegate barcodeScannerController:self didFinishScanningWithBarcode:[barcode stringValue]];
}
This code works in that it can detect various barcode types and convert barcodes into their string values. What I'm wondering is why, on iPhones, only barcodes near the center of the view are detected, while on iPads, only barcodes near the bottom of the view are detected. It's very peculiar behaviour, and in the case of the iPad not intuitive at all.

AVErrorMediaServicesWereReset in AVCaptureSessionRuntimeErrorNotification

I have a problem with AVCaptureSession's startRunning. It occurs only on the iPhone 5 with iOS 7. My app should record video and show it on an AVCaptureVideoPreviewLayer, but when I test on the iPhone 5, the first call fails with error = AVErrorMediaServicesWereReset.
This is my code, where I create the capture manager and call startRunning:
-(void)startVideoTranslation{
CGRect r = videoBackView.bounds;
videoPreviewView = [[UIView alloc] initWithFrame:r];
[videoBackView addSubview:videoPreviewView];
if (currentGameType == MPGameTypeCrocodile){
if (captureManager == nil) {
captureManager = [[AVCamCaptureManager alloc] init];
[captureManager setDelegate:self];
if ([captureManager setupSession]) {
//Create video preview layer and add it to the UI
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[captureManager session]];
CALayer *viewLayer = [videoPreviewView layer];
[viewLayer setMasksToBounds:YES];
CGRect bounds = [videoPreviewView bounds];
bounds.origin.y = bounds.origin.y+1;
[newCaptureVideoPreviewLayer setFrame:bounds];
if ([newCaptureVideoPreviewLayer.connection isVideoOrientationSupported]) {
[newCaptureVideoPreviewLayer.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
}
[newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
captureVideoPreviewLayer = newCaptureVideoPreviewLayer;
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(isErrorSession:) name:AVCaptureSessionRuntimeErrorNotification object:nil];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT,0), ^{
[[captureManager session] startRunning];
});
}
}
}
}
And I call this method here:
-(void)showPrepareView{
[self playSound:@"start"];
BGView.frame = videoBackView.bounds;
[prepareView addSubview:videoBackView];
[self startVideoTranslation];
[videoBackView addSubview:BGView];
}
In this code I use AVAudioPlayer in [self playSound:], and I add my preview layer to videoBackView.
Does anyone know the reason for this problem?
For now I have solved it by moving the following code into -(id)initWithFrame::
captureManager = [[AVCamCaptureManager alloc] init];
[captureManager setDelegate:self];
[captureManager setupSession];
[[captureManager session] startRunning];
But I don't understand why this works.
