Camera in iOS app Tutorial

I was wondering if anyone was willing to share how to put a Camera feature into an iOS app, or if anyone knew of a simple tutorial. NOT one with any buttons, just showing on the screen what the camera is seeing. I tried Apple's Documentation, but it was too complex for my needs.
Thank you so much!
EDIT: Any simple tutorial will do fine. Like I said, I don't need anything else besides displaying what the camera is seeing.

I don't know about a simple tutorial but adding a view that shows what the camera sees is super easy.
First:
Add a UIView in Interface Builder; this is where the camera feed will be shown.
Second:
Add the AVFoundation framework to your project, and add its import to your view controller's .m file:
#import <AVFoundation/AVFoundation.h>
Third:
Add these two variables to your interface's instance variable declarations:
AVCaptureVideoPreviewLayer *_previewLayer;
AVCaptureSession *_captureSession;
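For context, a minimal sketch of where these declarations might live (assuming your view controller class is named ViewController and the UIView from the first step is exposed as an outlet named cameraPreviewView, which the code below relies on):
@interface ViewController ()
{
    AVCaptureVideoPreviewLayer *_previewLayer;
    AVCaptureSession *_captureSession;
}
// Outlet connected to the UIView added in Interface Builder.
@property (weak, nonatomic) IBOutlet UIView *cameraPreviewView;
@end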
Fourth:
Add this code to your viewDidLoad (what each step does is explained in the comments):
//-- Setup Capture Session.
_captureSession = [[AVCaptureSession alloc] init];
//-- Create a video device and an input from that device. Add the input to the capture session.
AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if(videoDevice == nil)
assert(0);
//-- Add the device to the session.
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
error:&error];
if(input == nil)
assert(0);
[_captureSession addInput:input];
//-- Configure the preview layer
_previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[_previewLayer setFrame:CGRectMake(0, 0,
self.cameraPreviewView.frame.size.width,
self.cameraPreviewView.frame.size.height)];
//-- Add the layer to the view that should display the camera input
[self.cameraPreviewView.layer addSublayer:_previewLayer];
//-- Start the camera
[_captureSession startRunning];
Notes:
The asserts will make the program exit if a camera is not available.
This only shows a preview of what the camera sees. If you want to manipulate the input, take a picture, or record video, you need to configure additional things like the sessionPreset and the corresponding capture outputs and delegates, as sketched below. But in that case you should follow a proper tutorial or read the documentation.
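If you do go down that road, a minimal sketch of the extra configuration might look like this (the preset choice, queue name, and delegate conformance here are illustrative assumptions, not part of the answer above):
//-- Choose a session preset before starting the session.
_captureSession.sessionPreset = AVCaptureSessionPresetMedium;
//-- Add a video data output; the view controller must conform to
//-- AVCaptureVideoDataOutputSampleBufferDelegate to receive raw frames.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t frameQueue = dispatch_queue_create("com.example.frameQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:frameQueue];
if ([_captureSession canAddOutput:videoOutput])
[_captureSession addOutput:videoOutput];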

Related

Lag when setting AVCaptureConnection video orientation

The problem in question uses AVFoundation to set up a camera whose output is displayed in an AVCaptureVideoPreviewLayer and is also processed as a pixel buffer. For the pixel buffer to be processed by the -processSampleBuffer: method, it must be provided in the correct orientation, which depends on the device orientation.
As far as I know, this can be done either by rotating the pixel buffer as it is delivered to the sample buffer delegate method, accessing the raw pixel values in -captureOutput:didOutputSampleBuffer:fromConnection:, or by setting the videoOrientation property on the appropriate AVCaptureConnection, which ensures the pixel buffer is provided in the desired orientation. An outline of the setup is as follows:
- (void)setupCamera
{
AVCaptureSession *session = [AVCaptureSession new];
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[session addInput:deviceInput];
dispatch_queue_t videoOutputQueue = dispatch_queue_create("com.MyApp.videoQueue", DISPATCH_QUEUE_SERIAL);
dispatch_set_target_queue(videoOutputQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
AVCaptureVideoDataOutput *videoOutput = [AVCaptureVideoDataOutput new];
videoOutput.alwaysDiscardsLateVideoFrames = YES;
[videoOutput setSampleBufferDelegate:self queue:videoOutputQueue];
[session addOutput:videoOutput];
// more setup
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
connection.videoOrientation = [self getCurrentOrientation]; // setting this to a new value causes the preview layer to freeze momentarily
[self processSampleBuffer:sampleBuffer]; // some arbitrary image processing method
}
This works as intended as far as the orientation of the pixel buffer is concerned. However, whenever the device is rotated and connection.videoOrientation is given a new value, the preview layer freezes for a fraction of a second. Blocking the delegate method's thread (e.g. by adding a sleep) doesn't freeze the preview layer, so that's not the problem. Any help towards a solution is hugely appreciated!
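One observation about the code above (an untested assumption, not a confirmed fix): the orientation is written on every frame, even when it has not changed. A minimal sketch of guarding the assignment so the connection is only reconfigured on actual orientation changes:
AVCaptureVideoOrientation newOrientation = [self getCurrentOrientation];
if (connection.isVideoOrientationSupported && connection.videoOrientation != newOrientation) {
    connection.videoOrientation = newOrientation;
}
[self processSampleBuffer:sampleBuffer];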

PBJVision setCameraMode

How do I switch the camera mode from video to photo in PBJVision? Right now I have:
PBJVision *vision = [PBJVision sharedInstance];
vision.delegate = self;
[vision setCameraMode:PBJCameraModePhoto];
[vision setCameraOrientation:PBJCameraOrientationPortrait];
[vision setFocusMode:PBJFocusModeAutoFocus];
[vision setOutputFormat:PBJOutputFormatPreset];
[[PBJVision sharedInstance] capturePhoto];
You can change the camera mode by adding just one line; the answer already exists in your code. That is:
[vision setCameraMode:PBJCameraModeVideo];
And use these to record video:
[[PBJVision sharedInstance] startVideoCapture];
[[PBJVision sharedInstance] endVideoCapture];
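The recorded clip is then delivered through the delegate. As far as I recall the PBJVision API, the callback looks roughly like this (with PBJVisionVideoPathKey being the dictionary key PBJVision uses for the output file path):
- (void)vision:(PBJVision *)vision capturedVideo:(NSDictionary *)videoDict error:(NSError *)error
{
    NSString *videoPath = videoDict[PBJVisionVideoPathKey];
    // Use the file at videoPath, e.g. save it to the camera roll.
}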
It might also help to know the following:
Changing from one camera mode to another seems to take a bit of time.
When I used it like this, an error occurred (in my case, changing from video mode to photo mode):
[vision setCameraMode:PBJCameraModePhoto];
[vision capturePhoto];
The cause is that the session reconfiguration for the camera mode change has not completely finished yet:
- (void)capturePhoto
{
if (![self _canSessionCaptureWithOutput:_currentOutput] || _cameraMode != PBJCameraModePhoto) {
DLog(#"session is not setup properly for capture");
return; // <--- this is where it returned for me
}
....
}
So be careful when changing the camera mode and calling capture back-to-back; one way to sequence them safely is sketched below. :)
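A sketch of that sequencing, assuming your class is the vision delegate and using the -visionModeDidChange: callback that, if I recall correctly, PBJVisionDelegate declares (the _pendingPhotoCapture ivar is my own illustrative addition, not part of PBJVision):
- (void)switchToPhotoModeAndCapture
{
    _pendingPhotoCapture = YES; // illustrative BOOL ivar
    [[PBJVision sharedInstance] setCameraMode:PBJCameraModePhoto];
}

- (void)visionModeDidChange:(PBJVision *)vision
{
    if (_pendingPhotoCapture) {
        _pendingPhotoCapture = NO;
        [vision capturePhoto]; // the session is now configured for photo capture
    }
}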

How to get NSData from AVCaptureSession?

I am recording video in iOS using AVCaptureSession.
-(id)init
{
if ((self = [super init]))
{
[self setCaptureSession:[[AVCaptureSession alloc] init]];
}
return self;
}
-(void)addVideoPreviewLayer
{
[self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}
How can I create NSData of the recorded video while recording is still in progress?
Access the encoded frames? You cannot do that with the iOS SDK alone. You can record a bit to a file, access the encoded frames in that file, record a new file, access more ... if you need to do so.
However, if you are trying to get the raw frames while also writing, that's fairly straightforward. Instead of capturing output to a file, use -captureOutput:didOutputSampleBuffer:fromConnection: on your AVCaptureVideoDataOutputSampleBufferDelegate (or AVCaptureAudioDataOutputSampleBufferDelegate for audio). Just make sure to also route the data to something that is encoding/writing the buffers, otherwise you will lose the "...simultaneously with recording" aspect.
What you get isn't an NSData but a CMSampleBufferRef, which, depending on whether the buffer is audio or video, can be converted to NSData in various ways.
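For the video case, here is a hedged sketch of one conversion path; it copies the raw pixel bytes, and other representations (e.g. JPEG encoding via UIImage) are equally valid:
- (NSData *)dataFromVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL)
        return nil;
    // Lock the buffer so its base address stays valid while we copy.
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t length = CVPixelBufferGetBytesPerRow(imageBuffer) * CVPixelBufferGetHeight(imageBuffer);
    NSData *data = [NSData dataWithBytes:baseAddress length:length];
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    return data;
}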
Maybe these links will help you solve the problem:
http://www.ios-developer.net/iphone-ipad-programmer/development/camera/record-video-with-avcapturesession-2
http://indieambitions.com/idevblogaday/raw-video-data-app-quick-dirty/

Setting a background image/view to live camera view?

Is it possible to set a background for a particular view controller to show a live camera view? If so, could one lead me in the right direction to make this possible?
Yes, definitely possible. You can embed the live camera feed in a UIView, which you can place anywhere you like.
Start by reading here: AVFoundation Reference - this is your framework
The particular class you are looking for is AVCaptureVideoPreviewLayer, which works in unison with AVCaptureSession.
And this is an example project that covers everything you need: AVCam
Import:
#import <AVFoundation/AVFoundation.h>
To add the camera view to a controller's view, add this code in viewDidLoad:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
newCaptureVideoPreviewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:newCaptureVideoPreviewLayer];
[session startRunning];
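One caveat for the "background" use case (my note, not part of the answer above): addSublayer: puts the preview layer on top of any sublayers that already back your subviews, so if other controls must stay visible you may want to insert it at the back instead:
[self.view.layer insertSublayer:newCaptureVideoPreviewLayer atIndex:0];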
I think your best bet is to grab and understand this Apple sample code, called AVCam. You'll see in the code how to create an AVCaptureVideoPreviewLayer. You'll insert this as a sublayer of a UIView that you'll use as your "background".
Once you've got that working, that UIView will be just like any other part of your view hierarchy. You can treat it like a background UIImageView (albeit one that consumes a lot more battery power).

How to subview a camera view?

I am making an app that will let the user see themselves in a 'mirror' (the front-facing camera on the device). I know of multiple ways of making a UIImagePickerController with a view overlay, but I want my app to work the opposite way: the camera view should be a subview of the main view, without the shutter animation, without the ability to capture photos or record videos, and without being full screen. Any ideas?
The best way to accomplish this is to not use the built-in UIImagePickerController, but rather use the AVFoundation classes.
You want to create an AVCaptureSession and set the appropriate outputs and inputs. Once it's configured, you can get an AVCaptureVideoPreviewLayer which can be added to a view that you have configured in your view controller. The preview layer has a number of properties that allow you to control how the preview is displayed.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
[session addOutput:output];
//Setup camera input
NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
//You could check for front or back camera here, but for simplicity just grab the first device
AVCaptureDevice *device = [possibleDevices objectAtIndex:0];
NSError *error = nil;
// create an input and add it to the session
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; //Handle errors
//set the session preset
session.sessionPreset = AVCaptureSessionPresetMedium; //Or other preset supported by the input device
[session addInput:input];
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
//Set the preview layer frame
previewLayer.frame = self.cameraView.bounds;
//Now you can add this layer to a view of your view controller
[self.cameraView.layer addSublayer:previewLayer];
[session startRunning];
You can then use captureStillImageAsynchronouslyFromConnection:completionHandler: on the still image output to capture an image, as sketched below.
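For illustration, a hedged sketch of that call, assuming output is the AVCaptureStillImageOutput created above:
AVCaptureConnection *stillConnection = [output connectionWithMediaType:AVMediaTypeVideo];
[output captureStillImageAsynchronouslyFromConnection:stillConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *captureError) {
    if (imageDataSampleBuffer != NULL) {
        // AVCaptureStillImageOutput can hand back JPEG data directly.
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Hand the image to your UI or storage here.
    }
}];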
For more information on how AVFoundation is structured, and for more detailed examples, check out the Apple docs. Apple's AVCam demo lays all of this out as well.
