Creating a custom camera with a square view on iOS

I am trying to create a custom camera experience on iOS, and the following code snippet is as far as I have got. Basically I want the usual camera view (i.e. with the following buttons: capture, flash, grid, front/back, cancel), but with one difference: I want a square preview surface, not a rectangle. And then, what you see is what you get (WYSIWYG), so that no cropping is necessary; the user would have taken a square picture in the first place.
I have also been looking at the library https://github.com/danielebogo/DBCamera but I am not seeing how to customize it to my end. Any help? Thanks.
MY CODE SO FAR:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.

    // Capture session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    // Device
    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Input
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"No input: %@", error);
        return;
    }
    [session addInput:input];

    // Output
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings =
        @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [session addOutput:output];

    // Preview layer
    AVCaptureVideoPreviewLayer *previewLayer =
        [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];

    // Start the capture session
    [session startRunning];
}
This is the only custom code in a single-view project in Xcode.

You have two options for doing what you want: either stick with and customize a UIImagePickerController, or create your own using AVFoundation.
The UIImagePickerController does provide a fair bit of customization options, and this similar thread has some good information on that: link.
If you still want to make your own, I suggest heading over to the Apple documentation and checking out the demo project called AVCam: link. However, it's way more in-depth than you'll probably need, so I can recommend this video tutorial as well: link.
If going for the latter option, I would like to mention that to make the "actual camera" fill the frame of your previewLayer, you can set the videoGravity on the AVCaptureVideoPreviewLayer to AVLayerVideoGravityResizeAspectFill.
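For the square preview specifically, a minimal sketch could look like the following (assuming `session` is the configured capture session from the question). The layer is sized to a centered square, and AspectFill scales the feed to cover it:

```objc
// Square, centered preview: the layer clips the camera feed to a square,
// and AspectFill scales the feed so it covers the whole square.
CGFloat side = CGRectGetWidth(self.view.bounds);
AVCaptureVideoPreviewLayer *previewLayer =
    [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
previewLayer.frame = CGRectMake(0,
                                (CGRectGetHeight(self.view.bounds) - side) / 2.0,
                                side, side);
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.masksToBounds = YES; // clip any overflow outside the square
[self.view.layer addSublayer:previewLayer];
```

One caveat for true WYSIWYG: the sensor itself is not square, so the captured photo will still be rectangular; to match what the preview showed, the saved image needs a matching center-square crop.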

Working with a custom camera can be a bit of a pain, but it’ll pay dividends given that you’ll really be able to customize your app experience.
The easiest way to do it is to use TGCameraViewController.
Using TGCameraViewController, you can customize the whole camera view. It also provides the following functionality:
Easy way to access album (camera roll)
Flash auto, off and on
Focus
Front and back camera
You can also refer to the AVCamManual: Extending AVCam to Use Manual Capture document for creating your own custom camera.

Related

Camera access with Rectangle capabilities (Objc iOS)

I need some help with our CameraView inside our app. We have built a CameraView with Rectangle / Zoom capabilities (in a webView). When integrating this webCameraView in our app, accessing the camera does not work. It seems we have 2 options:
Make the app allow camera access from a webView (is this still not possible in 2021?)
Build a copy of the webCameraView directly in the app, with a rectangle frame, instead
Above is what the webCameraView looks like in the app. Rectangle / Zoom is displayed, but the app does not get access to the camera.
If it is not possible to access the camera directly in the app via a webView, I need to go for alternative nr 2.
I will try to build the same UI in Objective-C in the app directly. But good, up-to-date tutorials for Objective-C are hard to find; all the Swift boys have taken over the town! This is what I have done so far:
- (void)viewDidLoad {
    [super viewDidLoad];

    //-- Set up the capture session.
    _captureSession = [[AVCaptureSession alloc] init];

    //-- Create a video device and an input from that device. Add the input to the capture session.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice == nil)
        assert(0);

    //-- Add the device to the session.
    NSError *error;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                        error:&error];
    if (error)
        assert(0);
    [_captureSession addInput:input];

    //-- Configure the preview layer.
    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [_previewLayer setFrame:CGRectMake(0, 0,
                                       self.view.frame.size.width,
                                       self.view.frame.size.height)];

    //-- Add the layer to the view that should display the camera input.
    [self.view.layer addSublayer:_previewLayer];

    //-- Start the camera.
    [_captureSession startRunning];
}
The code above gives me access to the camera.
How do I create a draggable Rectangle / Zoom area in the middle of the screen in Objective-C?
Also, with the capability to take a photo (where the photo taken represents only the area inside the rectangle)?
Help!
The sample code you provided is for native camera control; I wonder what it has to do with the webview camera?
If you want to turn on the system camera in a webview and process the acquired content, you can try to look at WebRTC. This has version restrictions: it is available in WKWebView starting with iOS 14.3.
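As for the draggable rectangle part of the question, here is a minimal sketch under stated assumptions (all names are illustrative, not from the question's code): a bordered overlay UIView moved by a pan gesture, whose frame later becomes the crop region.

```objc
// Draggable crop rectangle: a bordered overlay view moved by a pan gesture.
// When the photo is taken, convert the overlay's frame into image
// coordinates and crop with CGImageCreateWithImageInRect
// (orientation handling omitted).
- (void)addCropOverlay {
    UIView *cropOverlay = [[UIView alloc] initWithFrame:CGRectMake(60, 200, 240, 240)];
    cropOverlay.layer.borderColor = [UIColor whiteColor].CGColor;
    cropOverlay.layer.borderWidth = 2.0;
    cropOverlay.backgroundColor = [UIColor clearColor];
    [self.view addSubview:cropOverlay];

    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    [cropOverlay addGestureRecognizer:pan];
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint t = [pan translationInView:self.view];
    pan.view.center = CGPointMake(pan.view.center.x + t.x,
                                  pan.view.center.y + t.y);
    [pan setTranslation:CGPointZero inView:self.view];
}
```

For the "photo represents only the inside of the rectangle" part, `-[AVCaptureVideoPreviewLayer metadataOutputRectOfInterestForRect:]` converts the overlay's frame (in layer coordinates) into normalized frame coordinates, which you can scale up to pixel coordinates for the crop.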

AVCaptureSessionPresetLow on iPhone 6

I'm a n00b to AVCaptureSession. I'm using OpenTok to implement video chat. I want to preserve bandwidth and the UI is designed so the video views are only 100 x 100 presently.
This is part of the code from an OpenTok example where it sets the preset:
- (void)setCaptureSessionPreset:(NSString *)preset {
    AVCaptureSession *session = [self captureSession];
    if ([session canSetSessionPreset:preset] &&
        ![preset isEqualToString:session.sessionPreset]) {
        [_captureSession beginConfiguration];
        _captureSession.sessionPreset = preset;
        _capturePreset = preset;
        [_videoOutput setVideoSettings:
            @{ (NSString *)kCVPixelBufferPixelFormatTypeKey :
                   @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) }];
        [_captureSession commitConfiguration];
    }
}
When I pass in AVCaptureSessionPresetLow (on an iPhone 6), canSetSessionPreset: returns NO. Is there any way I can configure AVCaptureSession to capture video with a frame size as close to 100 x 100 as possible?
Also, is this the correct strategy for trying to save bandwidth?
You cannot force the camera to a resolution it does not support.
A lower resolution frame size will lead to lower network traffic.
Lowering FPS is one other way.
A view size does not have to map to a resolution. You can always fit a frame in any size view.
Look at the Let-Build-OTPublisher app in the OpenTok SDK, and more specifically the TBExampleVideoCapture.m file, for how resolution and FPS are handled.
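As a sketch of both levers mentioned above (the names `session` and `device` are assumed to be your configured session and active capture device; 15 fps is just an example value):

```objc
// Probe for the smallest preset the device actually supports instead of
// assuming AVCaptureSessionPresetLow works everywhere.
for (NSString *preset in @[ AVCaptureSessionPreset352x288,
                            AVCaptureSessionPresetLow,
                            AVCaptureSessionPresetMedium ]) {
    if ([session canSetSessionPreset:preset]) {
        session.sessionPreset = preset;
        break;
    }
}

// Cap capture at ~15 fps to reduce the encoded bitrate.
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
    device.activeVideoMaxFrameDuration = CMTimeMake(1, 15);
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}
```

Even at the smallest supported preset the frames will be larger than 100 x 100; as noted above, you simply fit the frame into the 100 x 100 view, and let the lower resolution plus the FPS cap do the bandwidth saving.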

AVCaptureVideoPreviewLayer frame in Swift

I'm trying to get the video output on my screen in Swift, but the screen stays completely white. I found this tutorial in Objective-C and followed it (only in Swift-style syntax).
In it there is the line previewLayer.frame = myView.bounds;
But the .frame property seems to be read-only in Swift, and I think this might be why I don't see anything on the screen.
How can I set the frame for the previewLayer in Swift?
I see three points in that tutorial where you could end up not displaying the preview, and thus getting a white screen. Below are the Obj-C and Swift counterparts.
1) You might have missed adding the input to the capture session:
// [session addInput:input];
session.addInput(input)
2) You might not have initialized the preview layer's bounds to that of your view controller:
// UIView *myView = self.view;
// previewLayer.frame = myView.bounds;
previewLayer.frame = self.view.bounds
3) You might not have added the preview layer as a sublayer of your view:
// [self.view.layer addSublayer:previewLayer];
self.view.layer.addSublayer(previewLayer)

AVFoundation camera zoom

I use the AVFoundation framework to display video from the camera.
The code is the usual setup:
session = [[AVCaptureSession alloc] init];
...
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
...
[cameraView.layer addSublayer:captureVideoPreviewLayer];
...
So I want to add a zoom function to the camera.
I have found 2 solutions for implementing zoom.
The first is to use CGAffineTransform:
cameraView.transform = CGAffineTransformMakeScale(x, y);
The second is to put cameraView in a scroll view, set the minimum and maximum zoom scales, and return this view as the zooming view:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
return cameraView;
}
Which way gives the best zoom performance and quality? Are there any other solutions for zooming? Maybe I missed some AVFoundation methods for zooming.
Thank you.
Well, there is actually a CGFloat property called videoScaleAndCropFactor (on AVCaptureConnection).
You can find the documentation here.
I'm not sure if this is just for still-image output, but I've been working with it and it does well if you hook it up to a gesture or a slider and let that control the value.
You can find a demo of it here.
Good stuff. I'm trying to loop it so I can create a barcode scanner with a zoom. What I'm doing is rough, though, haha.
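A minimal sketch of the gesture-driven approach described above (`_stillImageOutput` and `_previewLayer` are assumed ivars holding your AVCaptureStillImageOutput and AVCaptureVideoPreviewLayer):

```objc
// Pinch-driven zoom via videoScaleAndCropFactor on the still-image
// connection, mirrored on the preview layer so what you see matches
// what gets captured.
- (void)handlePinch:(UIPinchGestureRecognizer *)pinch {
    AVCaptureConnection *connection =
        [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    CGFloat scale = connection.videoScaleAndCropFactor * pinch.scale;
    scale = MAX(1.0, MIN(scale, connection.videoMaxScaleAndCropFactor));
    connection.videoScaleAndCropFactor = scale;
    // Scale the preview to match; its superview should clip the overflow.
    _previewLayer.transform = CATransform3DMakeScale(scale, scale, 1.0);
    pinch.scale = 1.0;
}
```

On iOS 7 and later, setting `AVCaptureDevice.videoZoomFactor` (between `lockForConfiguration:` and `unlockForConfiguration`) zooms the live video stream itself, which is usually the simpler route since the preview and the capture then zoom together automatically.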

Cropping an AVCaptureSession preview

I'm displaying a video preview in a 320x320 capture window, and using videoGravity to have it fill the square:
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
I'm then grabbing a photo-quality image from the session and cropping it, starting at 0, 0. The problem I have is that the saved image shows more of the top (but not the left) of the frame than the preview does. I'd basically like to see only the top of the frame, but it seems I'm seeing the middle section; I hope I'm explaining myself properly.
Here is the code snippet if it helps:
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = _cameraView.bounds;
[_cameraView.layer addSublayer:captureVideoPreviewLayer];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
Thanks in advance for any help.
Cheers.
I couldn't figure out how to do this, so I decided to fix it by changing the cropping region instead of the preview region.
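One way to line the crop up with what the AspectFill preview actually shows is sketched below (`captureVideoPreviewLayer` is from the question, `image` is assumed to be the captured photo, and orientation handling is omitted):

```objc
// With AspectFill, the preview shows only a centered sub-rect of the sensor
// frame, which is why cropping from (0, 0) grabs more of the top than the
// preview showed. Convert the visible layer rect into normalized frame
// coordinates, then scale up to pixel coordinates for the crop.
CGRect visible = [captureVideoPreviewLayer
    metadataOutputRectOfInterestForRect:captureVideoPreviewLayer.bounds];
size_t w = CGImageGetWidth(image.CGImage);
size_t h = CGImageGetHeight(image.CGImage);
CGRect cropRect = CGRectMake(visible.origin.x * w, visible.origin.y * h,
                             visible.size.width * w, visible.size.height * h);
CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);
```

Note that `metadataOutputRectOfInterestForRect:` reports coordinates in the sensor's native (landscape) orientation, so for a portrait capture you may need to swap the rect's x/y and width/height before scaling.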