I am working on a camera app. I am using AVCapturePhotoOutput on iOS 10.x devices and AVCaptureStillImageOutput on devices below 10.x.
I am using the capture settings below when capturing a photo:
let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 1080,
                     kCVPixelBufferHeightKey as String: 1080]
settings.previewPhotoFormat = previewFormat
settings.isHighResolutionPhotoEnabled = true
settings.flashMode = .on
settings.isAutoStillImageStabilizationEnabled = true
self.captureOutputPhoto?.capturePhoto(with: settings, delegate: self)
When I try to capture a photo with the settings above, the delegate method
captureOutput:didFinishProcessingPhotoSampleBuffer:previewPhotoSampleBuffer:resolvedSettings:bracketSettings:error:
reports an error the first time. I am a beginner with AVCapturePhotoSettings. The problem occurs after every successful photo capture with flash mode on.
From the Apple documentation:
You may not enable image stabilization if the flash mode is .on. (Enabling the flash takes priority over the isAutoStillImageStabilizationEnabled setting.)
I'm not sure whether it should throw an error, but you can try removing this line:
settings.isAutoStillImageStabilizationEnabled = true
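To avoid the conflict entirely, the two settings can be guarded against each other. Below is a minimal sketch, assuming a configured AVCapturePhotoOutput; the helper name is hypothetical:

```swift
import AVFoundation

// Hypothetical helper: build settings that never combine flash with
// auto still image stabilization, since flash takes priority anyway.
func makePhotoSettings(for output: AVCapturePhotoOutput,
                       flashMode: AVCaptureDevice.FlashMode) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    if output.supportedFlashModes.contains(flashMode) {
        settings.flashMode = flashMode
    }
    // Only enable stabilization when the flash is off and the output supports it.
    settings.isAutoStillImageStabilizationEnabled =
        flashMode == .off && output.isStillImageStabilizationSupported
    return settings
}
```

(supportedFlashModes is bridged as [AVCaptureDevice.FlashMode] in Swift 4; in earlier Swift versions it is an [NSNumber] array.)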
captureOutput:didFinishProcessingPhotoSampleBuffer:previewPhotoSampleBuffer:resolvedSettings:bracketSettings:error: is an Objective-C delegate method whose Swift version, photoOutput(_:didFinishProcessingPhoto:previewPhoto:resolvedSettings:bracketSettings:error:), is deprecated.
Implement the Swift method photoOutput(_:didFinishProcessingPhoto:error:) instead.
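A minimal sketch of the replacement callback (iOS 11+); the class name and the image handling are illustrative only:

```swift
import AVFoundation
import UIKit

extension CameraViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print("Capture error: \(error)")
            return
        }
        // fileDataRepresentation() returns the full-size photo as JPEG/HEIF data.
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        // Use `image` here (e.g. save it or show it in the UI).
        _ = image
    }
}
```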
I'm using this method for handling the flash settings. AVCaptureDevice is the camera you are using, and AVCaptureFlashMode is the flash mode you want to use.
func changeFlashSettings(device: AVCaptureDevice, mode: AVCaptureFlashMode) {
    do {
        // The device must be locked before its configuration can change.
        try device.lockForConfiguration()
        device.flashMode = mode
        device.unlockForConfiguration()
    } catch {
        print("Change Flash Configuration Error: \(error)")
    }
}
With this you can set the flash setting to on, off or auto. Hope this helps.
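Note that device.flashMode set via lockForConfiguration is the pre-iOS 10 approach; with AVCapturePhotoOutput, flash is requested per capture on the settings object instead. A sketch, assuming this runs inside a view controller that owns a configured photoOutput and conforms to AVCapturePhotoCaptureDelegate:

```swift
import AVFoundation

// With AVCapturePhotoOutput (iOS 10+), flash is a per-capture setting,
// so no device lock is needed.
let settings = AVCapturePhotoSettings()
settings.flashMode = .auto
photoOutput.capturePhoto(with: settings, delegate: self)
```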
Related
Problem: you use AVCaptureDevice.setExposureModeCustom to set a fast "shutter speed" (exposureDuration) and a high ISO, call AVCapturePhotoOutput to take a photo, and see in the resulting image that the exposureDuration/ISO were not used, even though the live video feed shows that it is using them by brightening/darkening as expected.
Turns out that AVCapturePhotoSettings.isAutoStillImageStabilizationEnabled is to blame: it defaults to true, and while it is true your exposure duration and ISO settings can be ignored or reset.
The solution is to set it to false whenever you're using a custom exposure setting, like this:
// self.customDuration is nil if we're on auto-exposure, non-nil if we are on
// manual exposure, i.e. we called AVCaptureDevice.setExposureModeCustom
let photoSettings: AVCapturePhotoSettings
if self.photoOutput.availablePhotoCodecTypes.contains(.hevc) {
    photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
} else {
    photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
}
// Auto still image stabilization destroys our settings for custom exposure
// (ISO, duration), so turn it off if we have any.
photoSettings.isAutoStillImageStabilizationEnabled = self.customDuration == nil
    ? self.photoOutput.isStillImageStabilizationSupported
    : false
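For completeness, the custom exposure referred to above is locked in on the device before capturing. A sketch; the helper name and the ISO clamping are assumptions:

```swift
import AVFoundation

// Hypothetical helper: lock a manual duration/ISO before capturing.
func setManualExposure(on device: AVCaptureDevice, duration: CMTime, iso: Float) {
    do {
        try device.lockForConfiguration()
        // Clamp the ISO into the range the active format supports.
        let clamped = min(max(iso, device.activeFormat.minISO), device.activeFormat.maxISO)
        device.setExposureModeCustom(duration: duration, iso: clamped, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Exposure configuration error: \(error)")
    }
}
```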
I'm showing a preview at 1080 x 1440 and getting the photo at maximum resolution (3024 x 4032) and quality on an iPhone 8 Plus with this code:
capturePhotoOutput?.capturePhoto(with: configurePhotoSettings(), delegate: self)
with photo settings:
private func configurePhotoSettings() -> AVCapturePhotoSettings {
    let photoSettings = AVCapturePhotoSettings()
    photoSettings.isHighResolutionPhotoEnabled = true
    photoSettings.isAutoStillImageStabilizationEnabled = (capturePhotoOutput?.isStillImageStabilizationSupported)!
    photoSettings.isAutoDualCameraFusionEnabled = (capturePhotoOutput?.isDualCameraFusionSupported)!
    return photoSettings
}
I'm doing this one capture at a time (like a sequential shooting mode), and the preview freezes briefly on each shot, even if I do nothing in didFinishProcessingPhoto.
I'm looking for a way to make capturing smooth, maybe on a background thread, but currently I'm stuck.
The reason the preview hangs is a feature called optical image stabilization.
You just need to turn it off to keep the preview smooth while capturing a photo:
photoSettings.isAutoStillImageStabilizationEnabled = false
I am using iOS 11, Swift 4, and capturing a picture with the AVFoundation library. I have a custom preview as shown, and my settings are as suggested. The problem is that when I capture and save the CMSampleBuffer, it is oriented landscape-left. I tried to change the AVCapturePhotoOutput orientation, but it resists change (changing photoOutputConnection.videoOrientation changes nothing).
if let photoOutputConnection = capturePhotoOutput.connection(with: AVMediaType.video) {
    if photoOutputConnection.isVideoOrientationSupported {
        print("video orientation = \(photoOutputConnection.videoOrientation)")
    } else {
        print("video orientation is not supported?!")
    }
}
Here is the preview (phone screen):
and here is the capture output taken from the Xcode debug Quick Look:
Here is session configuration :
self.capturePhotoOutput = AVCapturePhotoOutput()
capturePhotoOutput.isHighResolutionCaptureEnabled = true
// A Live Photo captures both a still image and a short movie centered on the moment of capture,
// which are presented together in user interfaces such as the Photos app.
capturePhotoOutput.isLivePhotoCaptureEnabled = capturePhotoOutput.isLivePhotoCaptureSupported
guard self.captureSession.canAddOutput(capturePhotoOutput) else { return }
// The sessionPreset property of the capture session defines the resolution and quality level of the video output.
// For most photo capture purposes, it is best set to AVCaptureSessionPresetPhoto to deliver high resolution photo quality output.
self.captureSession.sessionPreset = AVCaptureSession.Preset.photo
self.captureSession.addOutput(capturePhotoOutput)
self.captureSession.commitConfiguration()
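One thing worth checking: the orientation has to be set on the photo output's connection immediately before each capture, after the output has been attached to a running session. A sketch; the fixed .portrait value stands in for mapping the current device orientation:

```swift
import AVFoundation

// Sketch: update the output connection's orientation right before capturing.
if let connection = capturePhotoOutput.connection(with: .video),
   connection.isVideoOrientationSupported {
    connection.videoOrientation = .portrait // or derive from UIDevice.current.orientation
}
capturePhotoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
```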
I'm trying to capture photo using AVFoundation
Here is my code:
@objc func didTapCameraView() {
    self.cameraView.isUserInteractionEnabled = false
    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                         kCVPixelBufferWidthKey as String: 160,
                         kCVPixelBufferHeightKey as String: 160]
    settings.previewPhotoFormat = previewFormat
    if flashControlState == .off {
        settings.flashMode = .off
    } else {
        settings.flashMode = .on
    }
    cameraOutput.capturePhoto(with: settings, delegate: self)
}
but I'm getting the same error every time.
If I comment out the code inside the method, it builds fine.
Let me know what I missed.
Update:
Using an iPhone 6 on iOS 11
Build architectures
Snapshot while using physical device
I found the explanation on the Apple developer forum; check this link.
This bug will be fixed later (unfortunately it still exists in the Xcode 9 GM version).
Quote from Apple forum, we can use this workaround:
As a workaround you can use the SwiftPrivate versions of these API by prepending each API with double underscore (__). For example, change AVCaptureDevice.Format.supportedColorSpaces to AVCaptureDevice.Format.__supportedColorSpaces.
I tested it and it works, and they said such a workaround is considered legitimate and is not considered SPI use.
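Applied to the preview-format code above, the workaround looks like this (the double-underscore name is the SwiftPrivate variant of the same bridged API):

```swift
import AVFoundation

let settings = AVCapturePhotoSettings()
// Workaround for the Xcode 9 GM bridging crash: call the SwiftPrivate
// double-underscore variant of the bridged API.
let previewPixelType = settings.__availablePreviewPhotoPixelFormatTypes.first!
```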
I'm using the AVCaptureStillImageOutput to capture a photo on iOS.
To capture the photo I'm calling captureStillImageAsynchronously.
This requires a connection so I use:
let connection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo)
However, on certain devices, namely the iPad 2 on iOS 9.3.5, I see that the connection returned is nil.
I also tried iterating through all connections using:
stillImageOutput.connections
This shows there are no connections available.
Has anyone else encountered this issue? Is there a better way to obtain the connection? I realize I'm using a deprecated class; however, the new API is not available on iOS 9 and we still need to support this platform. BTW, the Camera app itself appears to work just fine on this device.
Also, I just noticed that canAddInput on AVCaptureSession is returning false.
The input is obtained as follows:
guard let captureDevices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice],
      let captureDevice = captureDevices.first(where: { $0.position == .back }),
      let captureDeviceInput = try? AVCaptureDeviceInput(device: captureDevice) else {
    return
}
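Connections are only created once both an input and the output have been added to the same session, so ordering matters. A sketch of the expected sequence, assuming the captureDeviceInput obtained in the guard above:

```swift
import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()
// canAddInput returns false if the input is already attached to another
// session or conflicts with the current session preset.
if session.canAddInput(captureDeviceInput) {
    session.addInput(captureDeviceInput)
}
let stillImageOutput = AVCaptureStillImageOutput()
if session.canAddOutput(stillImageOutput) {
    session.addOutput(stillImageOutput)
}
session.commitConfiguration()
session.startRunning()
// Only now should the video connection be non-nil.
let connection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo)
```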