iOS: How to merge an image onto the camera preview layer?

I want to add a PNG image (a blue rect) on top of the camera preview layer. I get the preview image from this function:
-(void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
UIImage * image1 = [self ImageFromSampleBuffer:sampleBuffer];
NSData *imageData1;
}
And this function sets the preview image on a UIImageView:
-(void) SetPreview
{
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
// Add device
AVCaptureDevice *device = [self cameraWithPosition:AVCaptureDevicePositionFront];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
if(!input)
{
NSLog(#"NO Input");
}
[session addInput:input];
//Output
output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
//Preview layer
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
cameraView = self.view;
previewLayer.frame = CGRectMake(cameraView.bounds.origin.x + 5, cameraView.bounds.origin.y + 5, cameraView.bounds.size.width - 10, cameraView.bounds.size.height - 10);
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[_vImage.layer addSublayer:previewLayer];
timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(snapshot) userInfo:nil repeats:YES];
[session startRunning];
}
How can I implement this feature?

I solved this problem by putting the "Back Img" on the view.
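For reference, a minimal sketch of that kind of overlay, reusing the _vImage view from the question (the asset name overlay.png and the frame are placeholders, not from the original post):
// Add a PNG layer after the preview layer so it draws on top of the camera feed.
CALayer *overlayLayer = [CALayer layer];
overlayLayer.contents = (__bridge id)[UIImage imageNamed:@"overlay.png"].CGImage;
overlayLayer.frame = CGRectMake(20, 20, 100, 100);
[_vImage.layer addSublayer:overlayLayer];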

Related

Set Background Camera behind UILabels

I have been trying for a long time and could not get it to work. Basically, I would like to display the live camera feed in the background, behind my labels and buttons. Here is the code I am working with to make the camera appear:
- (void)viewDidLoad {
[super viewDidLoad];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.bounds;
[self.view addSublayer:newCaptureVideoPreviewLayer.view];
[self.view sendSubviewToBack:newCaptureVideoPreviewLayer.view];
[session startRunning];
}
I do not know how to place it behind the labels in viewDidLoad. Any help would be much appreciated!
You just need to add the view in the background by using sendSubviewToBack:.
For more detail you can check Apple's AVCam example.
sendSubviewToBack:
Moves the specified subview so that it appears behind its siblings.
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:newCaptureVideoPreviewLayer];
[session startRunning];
Or, since the preview layer is a CALayer rather than a UIView, you can insert it at the back of the view's layer hierarchy directly:
[self.view.layer insertSublayer:newCaptureVideoPreviewLayer atIndex:0];
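If you prefer the sendSubviewToBack: route described above, one option (a sketch, not from the original answer) is to host the layer in a plain container view and send that view to the back:
// Host the preview layer in a container view so sendSubviewToBack: applies.
UIView *previewContainer = [[UIView alloc] initWithFrame:self.view.bounds];
newCaptureVideoPreviewLayer.frame = previewContainer.bounds;
[previewContainer.layer addSublayer:newCaptureVideoPreviewLayer];
[self.view addSubview:previewContainer];
[self.view sendSubviewToBack:previewContainer]; // labels and buttons stay in front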

Playing audio ruins my AVCaptureSession (iOS)

I am using AVCaptureSession to record audio and video and pass this data to
- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
if(!_isPaused) {
// pass the sample buffer to the encoder
}
}
Everything works nicely (the delegate is called and I am processing frames) until I open the control panel and play music (I pause video processing from the app delegate). After stopping the music and trying to continue recording, the delegate method is never called. I think that playing audio somehow "ruins" my AVCaptureSession or its delegate, and I need to fix this issue.
Here is the method where I initialize the AVCaptureSession:
- (void) startup
{
if (_session == nil)
{
NSLog(#"Starting camera");
// create capture device with video input
_session = [[AVCaptureSession alloc] init];
AVCaptureDevice* backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
[_session addInput:input];
// audio input from default mic
AVCaptureDevice* mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput* micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
[_session addInput:micinput];
_captureQueue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL);
AVCaptureVideoDataOutput* videoout = [[AVCaptureVideoDataOutput alloc] init];
[videoout setSampleBufferDelegate:self queue:_captureQueue];
NSDictionary* setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
nil];
videoout.videoSettings = setcapSettings;
[_session addOutput:videoout];
_videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
NSDictionary* actual = videoout.videoSettings;
AVCaptureAudioDataOutput* audioout = [[AVCaptureAudioDataOutput alloc] init];
[audioout setSampleBufferDelegate:self queue:_captureQueue];
[_session addOutput:audioout];
_audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];
[_session startRunning];
}
}
and here is my interface:
@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
AVCaptureSession* _session;
AVCaptureVideoPreviewLayer* _preview;
dispatch_queue_t _captureQueue;
AVCaptureConnection* _audioConnection;
AVCaptureConnection* _videoConnection;
}
@end
and in the app delegate I pause and resume recording like this:
-(void)applicationDidBecomeActive:(UIApplication *)application {
[[CameraController controller] resumeCapture]; // this only sets _isPaused = NO
}
-(void)applicationWillResignActive:(UIApplication *)application {
[[CameraController controller] pauseCapture]; // this only sets _isPaused = YES
}
I am at a dead end; thank you for any help.
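One pattern worth trying (a sketch of a common recovery approach, not from the original post): AVCaptureSession posts interruption notifications when something like music playback takes over the audio hardware, so you can restart the session when the interruption ends:
// Sketch: restart the capture session once an audio interruption ends.
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionInterruptionEndedNotification
                                                  object:_session
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    if (!_session.isRunning) {
        [_session startRunning];
    }
}];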

UIImagePickerController slow when there is an active AVCaptureSession

I have a UIImagePickerController that can be used for uploading profile images in my social networking application.
It works fine when it is used alone, i.e. no other camera interfering.
In another view, I am using AVCaptureSession and AVCaptureVideoPreviewLayer to embed the camera view inside the view. Here users can upload various photos that they have captured.
This also works fine when it is used alone, i.e. no other camera interfering.
(This is a Tab-Bar Application)
Whenever the AVCaptureVideoPreviewLayer is active and I enter the view with the UIImagePickerController, the image picker takes a very long time to load, and sometimes it just freezes.
This is how I initialise the AVSession/AVCapturePreviewLayer:
self.session = [[AVCaptureSession alloc] init];
self.session.sessionPreset = AVCaptureSessionPreset640x480;
self.captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
self.captureVideoPreviewLayer.frame = self.cameraView.bounds;
[self.captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[self.cameraView.layer addSublayer:self.captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
NSLog(#"ERROR: trying to open camera: %#", error);
}else
[self.session addInput:input];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
[self.session addOutput:self.stillImageOutput];
This is how I initialise the UIImagePickerController:
self.picker = [[UIImagePickerController alloc] init];
self.picker.delegate = self;
self.picker.allowsEditing = YES;
self.picker.sourceType = UIImagePickerControllerSourceTypeCamera;
[self presentViewController:self.picker animated:YES completion:NULL];
Why does the UIImagePickerController take forever to load when the previewLayer is active in another view?
How can I reduce the loading time for the UIImagePickerController?
OK, it seems that calling:
[self.session stopRunning];
in viewDidDisappear:(BOOL)animated
fixes this issue for me.
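A minimal sketch of that fix, using the session property from the question (restarting in viewWillAppear: is an addition, so the embedded camera still works when you return to this tab):
- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    // Release the camera so UIImagePickerController can start quickly elsewhere.
    [self.session stopRunning];
}
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Resume the embedded preview when this view comes back on screen.
    if (!self.session.isRunning) {
        [self.session startRunning];
    }
}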

AVCaptureSession output is not properly displayed and AVCaptureMetadataOutputObjectsDelegate method is not called

I have a single-view application in which I am trying to test iOS 7's AVCaptureMetadataOutput based on this explanation. My ViewController conforms to AVCaptureMetadataOutputObjectsDelegate and the code looks like this (almost exactly the same as Mattt's):
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// Testing the VIN Scanner before I make it part of the library
NSLog(#"Setting up the vin scanner");
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (input) {
[session addInput:input];
} else {
NSLog(#"Error: %#", error);
}
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
[session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
fromConnection:(AVCaptureConnection *)connection
{
NSString *code = nil;
for (AVMetadataObject *metadata in metadataObjects) {
if ([metadata.type isEqualToString:AVMetadataObjectTypeCode39Code]) {
code = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
break;
}
}
NSLog(#"code: %#", code);
}
When I run this on an iOS 7 device (I've tried an iPhone 4 and an iPhone 4S), Xcode logs "Setting up the vin scanner" but the camera (i.e. the AVCaptureSession) never opens.
Edit 1:
I added the following code to show the camera output on screen:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = self.view.frame;
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
But the display is very odd, does not conform to the screen, and the way it rotates does not make sense. The other issue is that when I focus the camera on a barcode, the metadata delegate method is never called.
The camera will not open the way it does for the UIImagePickerController. The problem is that your code does nothing with the output. You'll need to add a preview layer to display the output of the camera as it streams in.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = CGRectMake(0.0, 0.0, self.view.frame.size.width, self.view.frame.size.height);
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
[session startRunning];
Edit:
After taking a deeper look at your code I noticed a few more issues.
First, you also need to set the metadataObjectTypes you want to search for; right now you're not looking for any valid object types. This should be added after you add the output to the session. You can view the full list of available types in the documentation.
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
Second, your AVCaptureSession *session is a local variable in your viewDidLoad; move it so it is declared just after your @interface ViewController (), as shown below.
@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *session;
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Testing the VIN Scanner before I make it part of the library
NSLog(@"Setting up the vin scanner");
self.session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (input) {
[self.session addInput:input];
} else {
NSLog(#"Error: %#", error);
}
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[self.session addOutput:output];
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
[self.session startRunning];
}
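One more detail: the delegate method in the question checks for AVMetadataObjectTypeCode39Code, so for Code 39 VIN barcodes register that type rather than (or in addition to) the QR/EAN-13 examples above:
[output setMetadataObjectTypes:@[AVMetadataObjectTypeCode39Code]];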

iOS: AVFoundation Image Capture Dark

I'm developing an app which captures photos from my iPad's front camera.
The photos come out very dark.
Does anyone have an idea how to fix this issue?
Here is my code with some explanations:
1) I initialize my capture session:
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
captureSession = [[AVCaptureSession alloc] init];
NSArray *devices = [AVCaptureDevice devices];
AVCaptureDevice *frontCamera;
for (AVCaptureDevice *device in devices){
if ([device position] == AVCaptureDevicePositionFront) {
frontCamera = device;
}
}
if ([frontCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]){
NSError *error=nil;
if ([frontCamera lockForConfiguration:&error]){
frontCamera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
frontCamera.focusMode=AVCaptureFocusModeAutoFocus;
[frontCamera unlockForConfiguration];
}
}
NSError *error = nil;
AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
[captureSession addInput:frontFacingCameraDeviceInput];
[captureSession setSessionPreset:AVCaptureSessionPresetHigh];
captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
captureImageOutput =[[AVCaptureStillImageOutput alloc] init];
[captureSession addOutput:captureVideoOutput];
[captureSession addOutput:captureImageOutput];
}
2) When the user presses the Record button, it starts a timer and previews the content of the camera in a preview layer:
- (IBAction)but_record:(UIButton *)sender {
MainInt = 4;
timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countup) userInfo:nil repeats:YES];
previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:captureSession];
previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
CGRect rect = CGRectMake(0, 0, self.aView.bounds.size.width, self.aView.bounds.size.height);
previewLayer.frame = rect;
[self.aView.layer addSublayer:previewLayer];
[captureSession startRunning];
}
3) At the end of the timer, the photo is taken and saved:
- (void)countup {
MainInt -=1;
if (MainInt == 0) {
[timer invalidate];
timer = nil;
[captureSession stopRunning];
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in captureImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection) { break; }
}
[captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
stillImage = [[UIImage alloc] initWithData:imageData];
}];
[captureSession startRunning];
[captureSession stopRunning];
}
}
4) Finally, when the user presses the Save button, the image is saved to a specific album:
- (IBAction)but_save:(UIButton *)sender {
UIImage *img = stillImage;
[self.library saveImage:img toAlbum:@"mySpecificAlbum" withCompletionBlock:^(NSError *error) { /* handle any save error */ }];
}
In fact, all the code works properly but the resulting images are very dark...
This was happening to me as well, and it turned out I was trying to capture too soon, before the camera had enough time to stabilize. I had to add about 0.5 seconds of delay before the pictures would be normal brightness.
HTH
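A minimal sketch of that delay, assuming a hypothetical -captureStillImage method that wraps the captureStillImageAsynchronouslyFromConnection: call from the question:
// Give the sensor about half a second to settle after startRunning before capturing.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    [self captureStillImage]; // hypothetical wrapper around the capture code above
});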
I had the same issue on a 5th-generation iPod touch running iOS 7, but not on a 4th-generation iPod touch with iOS 6.1.
I found that the fix is to show a preview of the camera:
// Setup camera preview image
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
previewLayer.frame = _previewImage.bounds;
[_previewImage.layer addSublayer:previewLayer];
As instructed at https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW22
Note: I did not investigate accomplishing this without a preview
