Achieving seamless front/back camera switching with AVFoundation - iOS

I'm developing a photo app using AVFoundation and when I switch from front to rear camera (and vice-versa), I've noticed my AVCaptureVideoPreviewLayer is briefly darkened immediately after the switch. It's not an enormous deal, but it does make the transition look very amateurish. Has anyone else experienced this/found a solution?
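For context, the switch itself is the usual input swap inside a single configuration block; a minimal sketch of that pattern (assuming a `session` that already has one camera input attached, error handling trimmed):

```swift
import AVFoundation

// Swap camera inputs inside one beginConfiguration/commitConfiguration
// block so the running session is not torn down and restarted.
func switchCamera(on session: AVCaptureSession) {
    guard let currentInput = session.inputs.first as? AVCaptureDeviceInput else { return }

    let newPosition: AVCaptureDevice.Position =
        (currentInput.device.position == .back) ? .front : .back

    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: newPosition),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

    session.beginConfiguration()
    session.removeInput(currentInput)
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    } else {
        session.addInput(currentInput)   // fall back to the previous input
    }
    session.commitConfiguration()
}
```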

Related

Camera preview is missing while using torch on web view

We have a QR-scanner feature implemented in a web view (WKWebView). If it's dark, we want to light up the QR code using the iPhone's flashlight. In general, the feature works, but only without the flashlight.
But we have these problems:
If we turn on the torch, the camera preview disappears.
If we turn off the torch, the camera preview appears again.
Do you have any idea why this happens, and how we can sort it out?
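For reference, the native-side torch toggle is the standard lock/configure/unlock pattern on AVCaptureDevice; a minimal sketch (assuming the default video device has a torch, no error recovery):

```swift
import AVFoundation

// Toggle the torch on the default video device.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch could not be configured: \(error)")
    }
}
```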

Video Preview Layer Running While in Split-View?

I currently have an app that displays the front facing camera atop a video preview layer. By default in iOS 9, the preview layer is interrupted/paused and will not resume until split-view is dismissed. Based on the nature of the app, maintaining the running camera preview layer while multitasking is essential.
Is there any way to force the capture session to continue previewing while in split view?
Update: Seems as if Apple does not allow any sort of camera use while the device has more than one application open. You can, however, invoke UIImagePickerController in order to take a photo while in split-view. Of course this solution only allows you to snap a single photo, and nothing more. Hope this helps someone!
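A minimal sketch of that single-photo fallback (the class name is illustrative):

```swift
import UIKit

// Single-photo fallback mentioned above: the system camera UI can still
// be presented in Split View even though a custom AVCaptureSession preview
// is interrupted.
class SplitViewCameraFallback: UIViewController,
                               UIImagePickerControllerDelegate,
                               UINavigationControllerDelegate {

    func presentSystemCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        let photo = info[.originalImage] as? UIImage
        // ... use `photo` ...
        picker.dismiss(animated: true)
    }
}
```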

AVCaptureSession VS UIImagePickerController camera preview

I'm developing an application similar to the Instagram iOS app. Instagram has a custom camera preview. I want to develop something similar, and the question is: which is better for this purpose, UIImagePickerController with a custom cameraOverlayView, or AVCaptureSession? Maybe someone has experience with this and can give me advice. It would be appreciated.
AVCaptureSession is more customisable than UIImagePickerController. In terms of speed, there is not much difference. With AVCaptureSession you can adjust whiteBalanceMode, focusMode and exposureMode when taking still images, and you can also specify the quality of the captured photo. With UIImagePickerController, the system camera view is presented and you can add an overlay on it; with AVCaptureSession, you manage the camera preview as a normal view and add your own controls over it. So I think AVCaptureSession will be more suitable for your requirement.
The link below gives more details about implementing AVCaptureSession.
https://developer.apple.com/library/prerelease/ios/samplecode/AVCam/Introduction/Intro.html
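If it helps, a minimal AVCaptureSession setup with a preview layer looks roughly like this (class and method names are illustrative, error handling trimmed):

```swift
import AVFoundation
import UIKit

// Minimal session: back camera in, photo output, preview layer on a view.
final class CameraController {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure(previewIn view: UIView) {
        session.beginConfiguration()
        session.sessionPreset = .photo

        if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                for: .video,
                                                position: .back),
           let input = try? AVCaptureDeviceInput(device: camera),
           session.canAddInput(input) {
            session.addInput(input)
        }
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
        session.commitConfiguration()

        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        // startRunning() can block, so call it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}
```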

iOS App: AVFoundation Video camera recording delegates

I am trying to develop an app that records the video using the rear camera.
I want the video to be tweaked (adding an overlay image, green screen) before saving it to a file.
I am using AVFoundation and have researched everywhere, but I still can't write the correct code to preview and record (even without the tweaking).
Some code that I get from the internet uses a BufferDelegate; the rest uses a RecordingDelegate.
Do I need to use both to achieve what I want, or just one of them?
Could you please point me to how I can achieve this?
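For what it's worth: the RecordingDelegate (AVCaptureFileOutputRecordingDelegate, used with AVCaptureMovieFileOutput) writes straight to a file and gives you no chance to alter frames, while the BufferDelegate (AVCaptureVideoDataOutputSampleBufferDelegate) hands you each frame so you can tweak it and then write it out yourself with an AVAssetWriter. A rough sketch of the buffer-delegate side (the class name is illustrative, the writer part is omitted):

```swift
import AVFoundation

// The "BufferDelegate" route: a video data output hands each frame to
// captureOutput(_:didOutput:from:), where it can be modified (overlay,
// green screen, ...) before being appended to an AVAssetWriter.
final class FrameTapper: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let videoOutput = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "camera.frames")

    func attach(to session: AVCaptureSession) {
        videoOutput.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Process `pixelBuffer` here (compositing, chroma key, ...),
        // then hand it to an AVAssetWriterInputPixelBufferAdaptor.
        _ = pixelBuffer
    }
}
```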

Custom ios camera shutter

Is there any ready-made custom iris/shutter view for the iOS camera?
I'm going to implement a camera app using AVFoundation, and there is no shutter in it.
Nope, if you are purely using AVFoundation for capture, you need to build it yourself.
The alternative is using UIImagePickerController with a custom cameraOverlayView if your camera app is simple enough.
But UIImagePickerController does not work correctly, because the shutter does not open after an interrupted call or when the app returns from the background.
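If you do build it yourself on top of AVFoundation, a very basic stand-in is to blink a black overlay over the preview when a photo is taken; a real iris effect would need a custom CAShapeLayer mask animation, but the idea is the same. A rough sketch (the function name is illustrative):

```swift
import UIKit

// Blink a black overlay over the preview view to simulate a shutter.
func flashShutter(over previewView: UIView) {
    let shutter = UIView(frame: previewView.bounds)
    shutter.backgroundColor = .black
    shutter.alpha = 0
    previewView.addSubview(shutter)

    UIView.animate(withDuration: 0.1, animations: {
        shutter.alpha = 1
    }, completion: { _ in
        UIView.animate(withDuration: 0.1, animations: {
            shutter.alpha = 0
        }, completion: { _ in
            shutter.removeFromSuperview()
        })
    })
}
```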
