Device camera data (not images but actual camera properties) - iOS

I need access to my device's camera data (not image data; I already have that), such as the pinhole fx and fy and anything else I can possibly get.
Currently, I'm using AVFoundation's AVCaptureSession with a custom UI. Previously I used UIImagePickerController, which has a delegate method called
imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any])
I was able to retrieve the taken photograph from the "info" dictionary. It also gave me very detailed information about the camera and its capabilities. As it stands, I don't know how to retrieve this same detailed information during AVCaptureSession photography. Please help.

You need to use AVCapturePhotoOutput, not AVCaptureStillImageOutput.
First, the code below needs the following variables in your class:
private let photoOutput = AVCapturePhotoOutput()
private var inProgressPhotoCaptureDelegates = [Int64 : MyAVPhotoCaptureDelegate]()
private let sessionQueue = DispatchQueue(label: "session queue", attributes: [], target: nil) // Communicate with the session
private var videoDeviceOrientation : AVCaptureVideoOrientation = .portrait // this needs to be updated as the device orientation changes
Add the photo output during AVCaptureSession setup:
// Add photo output.
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    self.photoOutput.isHighResolutionCaptureEnabled = true
}
Your capturePhoto() function should be set up as follows:
func capturePhoto(aspectRatio: Float, metaData: NSDictionary?) {
    sessionQueue.async {
        // Update the photo output's connection to match the video orientation of the video preview layer.
        if let photoOutputConnection = self.photoOutput.connection(with: AVMediaType.video) {
            photoOutputConnection.videoOrientation = self.videoDeviceOrientation
        }

        // Capture a JPEG photo with flash set to off and high resolution photo enabled.
        let photoSettings = AVCapturePhotoSettings()
        photoSettings.flashMode = .off
        photoSettings.isHighResolutionPhotoEnabled = true
        if !photoSettings.availablePreviewPhotoPixelFormatTypes.isEmpty {
            photoSettings.previewPhotoFormat = [kCVPixelBufferPixelFormatTypeKey as String : photoSettings.availablePreviewPhotoPixelFormatTypes.first!]
        }

        // Use a separate object for the photo capture delegate to isolate each capture life cycle.
        let photoCaptureDelegate = MyAVPhotoCaptureDelegate(requestedPhotoSettings: photoSettings, completed: { [unowned self] photoCaptureDelegate in
            // When the capture is complete, remove the reference to the photo capture delegate so it can be deallocated.
            self.sessionQueue.async { [unowned self] in
                self.inProgressPhotoCaptureDelegates[photoCaptureDelegate.requestedPhotoSettings.uniqueID] = nil
            }
        })

        /*
         The photo output keeps a weak reference to the photo capture delegate, so
         we store it in a dictionary to maintain a strong reference to this object
         until the capture is completed.
         */
        self.inProgressPhotoCaptureDelegates[photoCaptureDelegate.requestedPhotoSettings.uniqueID] = photoCaptureDelegate
        self.photoOutput.capturePhoto(with: photoSettings, delegate: photoCaptureDelegate)
    }
}
The MyAVPhotoCaptureDelegate class referenced in the capturePhoto() function above needs to be set up as follows:
class MyAVPhotoCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {

    private(set) var requestedPhotoSettings: AVCapturePhotoSettings
    private let completed: (MyAVPhotoCaptureDelegate) -> ()
    private var photoData: Data? = nil

    init(requestedPhotoSettings: AVCapturePhotoSettings, completed: @escaping (MyAVPhotoCaptureDelegate) -> ()) {
        self.requestedPhotoSettings = requestedPhotoSettings
        self.completed = completed
    }

    private func didFinish() {
        completed(self)
    }

    func photoOutput(_ captureOutput: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
    }

    func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let photoSampleBuffer = photoSampleBuffer {
            if let exif = CMGetAttachment(photoSampleBuffer, kCGImagePropertyExifDictionary as NSString, nil) {
                if let exifDictionary = exif as? NSMutableDictionary {
                    // view exif data
                }
            }
            photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer)
        }
        else {
            print("Error capturing photo: \(String(describing: error))")
            return
        }
    }

    func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings, error: Error?) {
        // Use PHPhotoLibrary to save photoData to the photo library
        ...
        // Signal completion so the strong reference held in inProgressPhotoCaptureDelegates can be removed.
        didFinish()
    }
}
Most of this code comes from my version of AVCam, with the specifics of my implementation removed. I have left out the code for saving to the photo library, but you can extract that from the sample code. You can view the EXIF data at the point where I have commented "view exif data".
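For example, at the "view exif data" point you could pull out individual fields. This is a minimal sketch; the keys shown are standard ImageIO EXIF keys (they require import ImageIO), and which of them are populated depends on the device and capture:
if let exifDictionary = exif as? NSMutableDictionary {
    // A few of the standard EXIF keys; every value is optional.
    let focalLength = exifDictionary[kCGImagePropertyExifFocalLength as String]      // lens focal length in mm
    let exposureTime = exifDictionary[kCGImagePropertyExifExposureTime as String]    // exposure duration in seconds
    let isoValues = exifDictionary[kCGImagePropertyExifISOSpeedRatings as String]    // array of ISO values
    print("focal length: \(String(describing: focalLength)) exposure: \(String(describing: exposureTime)) ISO: \(String(describing: isoValues))")
}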

Related

Swift base64 image encoding with EXIF data

I am currently using base64 encoding to convert and send multiple images in a JSON file from my Swift app to my API using:
let imageData = image.jpegData(compressionQuality: 1.0)
let sSideL = imageData.base64EncodedString(options: .lineLength64Characters)
While extending my API, I would now like to use the rich EXIF data provided by most smartphones, such as lens information, field of view, or the device model. Most important for my current purpose is the "Image Model" tag, in order to identify the device that took the picture.
I noticed that some EXIF data is left in the base64 data coming through my API, but it is limited to very basic information such as the orientation. Also, when I directly print the base64 string in Xcode and analyze it, it contains very little EXIF information. Technically it should be possible, because when I convert the same image in an online base64 converter and analyze the resulting string, I can see EXIF information like "Image Model", etc.
Is there a way to convert my UIImage to a base64 string keeping all EXIF details?
The API represents the main part of my system, so I would like to keep it as simple as possible and not add additional upload parameters.
EDIT
Here is my code to capture the UIImage:
extension CameraController: AVCapturePhotoCaptureDelegate {
    public func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                            resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Swift.Error?) {
        if let error = error {
            // ERROR
        }
        else if let buffer = photoSampleBuffer,
            let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil),
            let image = UIImage(data: data) {
            // SEND IMAGE TO SERVER
        }
        else {
            // UNKNOWN ERROR
        }
    }
}
You can use the newer (iOS 11+) delegate method:
public func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        // ERROR
    } else if let data = photo.fileDataRepresentation() {
        // SEND IMAGE DATA TO SERVER
    }
    else {
        // UNKNOWN ERROR
    }
}
or the method you are using:
public func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                        resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Swift.Error?) {
    if let error = error {
        // ERROR
    } else if let buffer = photoSampleBuffer,
        let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil) {
        // SEND IMAGE DATA TO SERVER
    }
    else {
        // UNKNOWN ERROR
    }
}
Like leo-dabus mentioned, you need to send the image data to the server; that data still has the metadata in it. If you first create a UIImage and convert it back to data, you have lost the metadata.
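So for the base64 use case here, one option is to encode the file data directly instead of round-tripping through UIImage. This is a sketch assuming the iOS 11+ callback shown above:
public func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard error == nil, let data = photo.fileDataRepresentation() else { return }
    // Encoding the original file data keeps the EXIF metadata intact,
    // unlike UIImage(data:) followed by jpegData(compressionQuality:), which strips most of it.
    let base64String = data.base64EncodedString(options: .lineLength64Characters)
    // SEND base64String TO SERVER
}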

Get used exposure duration and ISO values after the capture is complete from the AVCapturePhotoOutput

Background
I am using AVCaptureSession with AVCapturePhotoOutput to save captures as JPEG images.
let captureSession = AVCaptureSession()
let stillImageOutput = AVCapturePhotoOutput()
var captureDevice : AVCaptureDevice?
...
func setupCamera() {
    captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
    if captureDevice != nil {
        captureSession.addInput(try! AVCaptureDeviceInput(device: captureDevice!))
        if captureSession.canAddOutput(stillImageOutput) {
            captureSession.addOutput(stillImageOutput)
        }
    }
}
The AVCaptureDevice is set to automatically and continuously adjust exposure settings
func configureCamera() {
    do {
        try captureDevice?.lockForConfiguration()
        captureDevice?.exposureMode = AVCaptureDevice.ExposureMode.continuousAutoExposure
        captureDevice?.unlockForConfiguration()
    } catch let error as NSError {
        // Errors handled here...
    }
}
The capture is started by
func capture() {
    // Get an instance of AVCapturePhotoSettings class
    let photoSettings = AVCapturePhotoSettings()
    // Set photo settings
    photoSettings.isAutoStillImageStabilizationEnabled = true
    photoSettings.flashMode = .off
    // Call capturePhoto method by passing photo settings and a
    // delegate implementing AVCapturePhotoCaptureDelegate
    stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
}
The parent class is set as an AVCapturePhotoCaptureDelegate, and the photoOutput callback is handled by it:
// Delegate
func photoOutput(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?,
                 previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
    // Make sure there is a photo sample buffer
    guard error == nil,
        let photoSampleBuffer = photoSampleBuffer else {
        // Errors handled here
        return
    }
    // Convert the photo sample buffer to JPEG image data using AVCapturePhotoOutput
    guard let imageData =
        AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer) else {
        return
    }
    let capturedImage = UIImage.init(data: imageData, scale: 1.0)
    if let image = capturedImage {
        // save photo ...
    }
}
And everything works as it should, however...
Problem
I need to know the exposure duration and ISO values that were used for each capture. The values vary because the camera is set to automatically adjust exposure and it has to be like that.
I know the metadata of the capture holds these values but I can't figure out how to access them.
The exposure duration and ISO values are necessary for fine-tuning the exposure to achieve optimal results. After fine-tuning, the capture is started with these manual exposure values:
captureDevice?.setExposureModeCustom(duration: customTime, iso: customISO, completionHandler: nil)
Instead of getting the used ISO and exposure duration from the capture metadata, I read these values just before capturing a photo. When doing it this way, it is important to check that the exposure has finished adjusting.
Just before calling the capture:
check that the auto exposure is not adjusting
while (captureDevice?.isAdjustingExposure)! {
    usleep(100000) // wait 100 ms
}
Read the current exposure parameters
let current_exposure_duration : CMTime = (captureDevice?.exposureDuration)!
let current_exposure_ISO : Float = (captureDevice?.iso)!
And then take a photo
stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
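If you do want to pull the values out of the capture metadata instead, the same CMGetAttachment/EXIF approach shown in the first answer above applies. A minimal sketch inside the existing delegate, after the guard has unwrapped photoSampleBuffer (requires import ImageIO):
if let exif = CMGetAttachment(photoSampleBuffer, kCGImagePropertyExifDictionary as NSString, nil) as? NSDictionary {
    // Standard ImageIO EXIF keys: exposure time is in seconds, ISO comes back as an array of numbers.
    let exposureTime = exif[kCGImagePropertyExifExposureTime as String]
    let isoValues = exif[kCGImagePropertyExifISOSpeedRatings as String]
    print("exposure: \(String(describing: exposureTime)) ISO: \(String(describing: isoValues))")
}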

__availableRawPhotoPixelFormatTypes is empty on iPhone 7+ and iOS11

I'm trying to capture RAW files with AVFoundation. However, I'm getting an empty array in __availableRawPhotoPixelFormatTypes.
Here is my snippet:
if self._photoOutput == nil {
    self._photoOutput = AVCapturePhotoOutput()
    print(self._photoOutput!.__availableRawPhotoPixelFormatTypes)
}
And the output is an empty array: []
What may cause this?
Here are three things that will cause the availableRawPhotoPixelFormatTypes array to be empty:
1. You are reading the availableRawPhotoPixelFormatTypes property before adding your _photoOutput to an AVCaptureSession with a video source.
2. You are using the dual camera input. If so, you can't capture RAW images.
3. You are using the front camera. If so, you can't capture RAW images.
Here is some modified sample code from an excellent Apple guide (see link below). I have copied from several places and updated it slightly for brevity, simplicity and better overview:
let session = AVCaptureSession()
let photoOutput = AVCapturePhotoOutput()

private func configureSession() {
    // Get camera device.
    guard let videoCaptureDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
        print("Unable to get camera device.")
        return
    }
    // Create a capture input.
    guard let videoInput = try? AVCaptureDeviceInput(device: videoCaptureDevice) else {
        print("Unable to obtain video input for default camera.")
        return
    }
    // Make sure inputs and output can be added to session.
    guard session.canAddInput(videoInput) else { return }
    guard session.canAddOutput(photoOutput) else { return }
    // Configure the session.
    session.beginConfiguration()
    session.sessionPreset = .photo
    session.addInput(videoInput)
    // availableRawPhotoPixelFormatTypes is empty.
    session.addOutput(photoOutput)
    // availableRawPhotoPixelFormatTypes should not be empty.
    session.commitConfiguration()
}
private func capturePhoto() {
    // Photo settings for RAW capture.
    let rawFormatType = kCVPixelFormatType_14Bayer_RGGB
    // At this point the array should not be empty (session has been configured).
    guard photoOutput.availableRawPhotoPixelFormatTypes.contains(NSNumber(value: rawFormatType).uint32Value) else {
        print("No available RAW pixel formats")
        return
    }
    let photoSettings = AVCapturePhotoSettings(rawPixelFormatType: rawFormatType)
    photoOutput.capturePhoto(with: photoSettings, delegate: self)
}
// MARK: - AVCapturePhotoCaptureDelegate methods
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingRawPhoto rawSampleBuffer: CMSampleBuffer?,
                 previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
    guard error == nil, let rawSampleBuffer = rawSampleBuffer else {
        print("Error capturing RAW photo: \(String(describing: error))")
        return
    }
    // Do something with the rawSampleBuffer.
}
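As one example of "do something with the rawSampleBuffer", the buffer can be turned into DNG file data with the matching iOS 10-era class method. This is just a sketch; the output path is an illustration, not part of the original answer:
// Convert the RAW sample buffer into DNG data, which can then be written to disk
// or saved to the photo library.
if let dngData = AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: rawSampleBuffer,
                                                                 previewPhotoSampleBuffer: previewPhotoSampleBuffer) {
    let dngURL = FileManager.default.temporaryDirectory.appendingPathComponent("photo.dng") // hypothetical destination
    try? dngData.write(to: dngURL)
}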
Apple's Photo Capture Guide:
https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/PhotoCaptureGuide/index.html
The availableRawPhotoPixelFormatTypes property:
https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/1778628-availablerawphotopixelformattype
iPhone camera capabilities:
https://developer.apple.com/library/content/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html
One supplement to Thomas's answer:
If you change AVCaptureDevice.activeFormat, AVCaptureSession.sessionPreset will be set to inputPriority automatically. In that situation, availableRawPhotoPixelFormatTypes will be empty too.
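If you suspect this is what happened, one thing to try is checking and restoring the preset before requesting RAW. This is a sketch using the session and photoOutput from the sample above; note that switching the preset back discards your custom activeFormat:
// A custom activeFormat silently switches the preset to .inputPriority,
// and the RAW pixel format list can then come back empty (per the note above).
if session.sessionPreset == .inputPriority {
    session.beginConfiguration()
    session.sessionPreset = .photo   // this overrides the custom activeFormat
    session.commitConfiguration()
}
print(photoOutput.availableRawPhotoPixelFormatTypes)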

UIButton to capture image not responding

I am new to iOS development and I am trying to follow this tutorial to learn how to capture images and then save them on a server for my purposes:
https://medium.com/@rizwanm/swift-camera-part-2-c6de440a9404
The tutorial is great, and I am only initializing the camera and trying to snap the image (not doing the QR part).
Unfortunately, when I hit the capture button, the image is not taken or saved to my gallery, and I am not able to understand why. I use Xcode 8.3 and I am running iOS 10.3 on my iPhone.
I have connected the button to the view controller and called it in the function onTapTakePhoto, as shown in the code below. Please advise as to why this might be happening.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var previewView: UIView!
    @IBOutlet weak var CaptureButton: UIButton!

    // helps transfer data between one or more input devices like the camera
    var captureSession: AVCaptureSession?
    // instance variable; capturePhotoOutput will help us snap a live photo
    var capturePhotoOutput: AVCapturePhotoOutput?
    // helps render the camera view finder in the ViewController
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        CaptureButton.layer.cornerRadius = CaptureButton.frame.size.width / 2
        CaptureButton.clipsToBounds = true

        // set up the camera here
        let captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        // AVCaptureDeviceInput serves as a middleman to attach the input device to the capture session.
        // There is a chance the input device is unavailable, so wrap it in do/catch.
        do {
            // Get an instance of the AVCaptureDeviceInput class using the previous device object
            let input = try AVCaptureDeviceInput(device: captureDevice)
            // Initialize the captureSession object
            captureSession = AVCaptureSession()
            // Set the input device on the capture session
            captureSession?.addInput(input)

            // set up the preview view to see the live feed
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
            videoPreviewLayer?.frame = view.layer.bounds
            previewView.layer.addSublayer(videoPreviewLayer!)

            // finally start the capture session
            captureSession?.startRunning()

            // get an instance of the AVCapturePhotoOutput class
            capturePhotoOutput = AVCapturePhotoOutput()
            capturePhotoOutput?.isHighResolutionCaptureEnabled = true
            // set the output on the capture session
            captureSession?.addOutput(capturePhotoOutput)
        } catch {
            print(error)
            return
        }
    }

    @IBAction func onTapTakePhoto(_ sender: UIButton) {
        // Make sure capturePhotoOutput is valid
        guard let capturePhotoOutput = self.capturePhotoOutput else { return }
        // Get an instance of AVCapturePhotoSettings class
        let photoSettings = AVCapturePhotoSettings()
        // Set photo settings for our need
        photoSettings.isAutoStillImageStabilizationEnabled = true
        photoSettings.isHighResolutionPhotoEnabled = true
        photoSettings.flashMode = .auto
        // Call capturePhoto method by passing our photo settings and a delegate implementing AVCapturePhotoCaptureDelegate
        capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)
    }
}
// to get the captured image
extension ViewController: AVCapturePhotoCaptureDelegate {
    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        // Make sure we get a photo sample buffer
        guard error == nil,
            let photoSampleBuffer = photoSampleBuffer else {
            print("Error capturing photo: \(String(describing: error))")
            return
        }
        // Convert the photo sample buffer to JPEG image data using AVCapturePhotoOutput
        guard let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer) else {
            return
        }
        // Initialize a UIImage with our image data
        let capturedImage = UIImage.init(data: imageData, scale: 1.0)
        if let image = capturedImage {
            // Save our captured image to the photos album
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        }
    }
}
One thing that you should check is whether you have added the permissions to access the camera in the Info.plist file.
You have to make a key-value entry for the key Privacy - Photo Library Usage Description (if you are using the gallery) and Privacy - Camera Usage Description for using the camera itself.
Refer to this for more details.
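In the Info.plist source, those two entries correspond to the raw keys NSCameraUsageDescription and NSPhotoLibraryUsageDescription; for example (the description strings below are just placeholders):
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to take photos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app saves captured photos to your photo library.</string>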
If it doesn't work, set a breakpoint on the button's action and check the code flow. If the breakpoint is not hit after pressing the button, there could be an issue with the button's action outlet; try re-creating the outlet action.
First of all, with the help of breakpoints, check which section of your code is not being executed. There might be some problem related to your button connection.
OR
Try this one
https://www.youtube.com/watch?v=994Hsi1zs6Q
OR
this one in Objective-C
https://github.com/omergul/LLSimpleCamera
Thank you for the answers :)
The problem was a missing connection from the storyboard to my action button code. It feels like such a silly mistake in retrospect.
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if (info[UIImagePickerControllerOriginalImage] as? UIImage) != nil {
    }
    dismiss(animated: true, completion: {
        // write your set image code for the UIButton here.
    })
}

Swift 3 - Custom camera view - display still image of photo taken after button click

I am using Swift 3, Xcode 8.2.
I have a custom camera view which displays the video feed fine and a button that I want to act as a shutter. When the user taps on the button, I want a picture taken and it to be displayed on the screen. (e.g. like a Snapchat or Facebook Messenger style camera behavior)
Here is my code:
import UIKit
import AVFoundation

class CameraVC: UIViewController, AVCapturePhotoCaptureDelegate {

    // this is where the camera feed from the phone is going to be displayed
    @IBOutlet var cameraView: UIView!

    var shutterButton: UIButton = UIButton.init(type: .custom)

    // manages capture activity and coordinates the flow of data from input devices to capture outputs.
    var capture_session = AVCaptureSession()
    // a capture output for use in workflows related to still photography.
    var session_output = AVCapturePhotoOutput()
    // preview layer that we will have on our view so users can see the photo we took
    var preview_layer = AVCaptureVideoPreviewLayer()
    // still picture image is what we show as the picture taken, frozen on the screen
    var still_picture_image: UIImage!

    ... // more code in viewWillAppear that sets up the camera feed

    // called when the shutter button is pressed
    func shutterButtonPressed() {
        // get the actual video feed and take a photo from that feed
        session_output.capturePhoto(with: AVCapturePhotoSettings.init(format: [AVVideoCodecKey: AVVideoCodecJPEG]), delegate: self as AVCapturePhotoCaptureDelegate)
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        // take the session output, get the buffer, and create an image from that buffer
        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print("Here") // doesn't get here
        }
    }
It doesn't seem to print "Here" when running this code, and I can't find any Swift 3 tutorials on how to display this image. I'm guessing I want to take the imageData and assign it to my still_picture_image and overlay that over the camera feed somehow.
Any help or a point in the right direction would be a great help.
EDIT
After adding the following to my code:
if let error = error {
    print(error.localizedDescription)
}
But I still don't get any error printed.
Add the following code into your delegate method to print out the error being thrown:
if let error = error {
    print(error.localizedDescription)
}
Once you get your error resolved, I think this post should help you to extract the image: Taking photo with custom camera Swift 3
Okay, I figured out my problem:
First, drag a UIImageView onto the storyboard and have it take up the entire screen. This is where the still picture will be displayed after pressing the shutter button.
Create that variable in the code and link it:
#IBOutlet weak var stillPicture : UIImageView!
Then, in viewDidLoad, make sure that you insert the UIImageView on top of the camera view:
self.view.insertSubview(stillPicture, aboveSubview: your_camera_view)
This is the function that is called when the shutter button is clicked:
func shutterButtonPressed() {
    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                         kCVPixelBufferWidthKey as String: 160,
                         kCVPixelBufferHeightKey as String: 160]
    settings.previewPhotoFormat = previewFormat
    session_output.capturePhoto(with: settings, delegate: self)
}
Then, in your capture delegate:
func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
    if let error = error {
        print(error.localizedDescription)
    }
    // take the session output, get the buffer, and create an image from that buffer
    if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
        // this is the image that the user has taken!
        let takenImage: UIImage = UIImage(data: dataImage)!
        stillPicture?.image = takenImage
    } else {
        print("Error setting up photo capture")
    }
}
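One extra thing worth checking, which is a general UIKit point rather than something the original answer calls out: if the capture delegate callback happens to arrive off the main thread, the image view update should be dispatched to the main queue, for example:
// UIKit views must only be touched from the main thread.
DispatchQueue.main.async {
    self.stillPicture?.image = takenImage
}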
