Xcode Camera: Failed to read exposureBiasesByMode dictionary - ios

I recently got this error with the UIImagePickerController in Xcode Version 12.0.1
[Camera] Failed to read exposureBiasesByMode dictionary: Error Domain=NSCocoaErrorDomain Code=4864 "*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL" UserInfo={NSDebugDescription=*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL}
Has anyone else seen this error? How do you fix it?

If you configure your image picker with imagePicker.allowsEditing = true, you have to fetch the image using:

if let pickedImage = info[UIImagePickerController.InfoKey.editedImage] as? UIImage {
    capturedImage = pickedImage
}

If you instead use imagePicker.allowsEditing = false, fetch the image using:

if let pickedImage = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
    capturedImage = pickedImage
}

If you don't match these combinations, you may get this error.
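Both snippets belong inside imagePickerController(_:didFinishPickingMediaWithInfo:). A minimal sketch that handles either setting (capturedImage is assumed to be a property of the presenting view controller):

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // Prefer the edited image when allowsEditing is true; otherwise fall back to the original.
    if let edited = info[.editedImage] as? UIImage {
        capturedImage = edited
    } else if let original = info[.originalImage] as? UIImage {
        capturedImage = original
    }
    picker.dismiss(animated: true)
}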

In my case, I hit this bug while trying to use the image data and sync it with Files. Adding this key to Info.plist made all the difference and made the error go away:

<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>

I experienced the same issue. I imported AVKit instead of AVFoundation and tried to present the video in the native recorder view. That gave me an exception telling me to add NSMicrophoneUsageDescription to the Info.plist file, and after doing that, I was able to display the live video in a custom view.
So I believe the issue is iOS 14 being very picky about permissions, and something probably goes wrong with surfacing the correct exception when the video is not presented in the native view.
Anyway, this worked for me:
import AVKit
import MobileCoreServices
@IBOutlet weak var videoViewContainer: UIView!

private let imagePickerController = UIImagePickerController()

override func viewDidLoad() {
    super.viewDidLoad()
    initCameraView()
}

func initCameraView() {
    // Device setup
    imagePickerController.delegate = self
    imagePickerController.sourceType = .camera
    imagePickerController.mediaTypes = [kUTTypeMovie as String]
    imagePickerController.cameraCaptureMode = .video
    imagePickerController.cameraDevice = .rear

    // UI setup
    addChild(imagePickerController)
    videoViewContainer.addSubview(imagePickerController.view)
    imagePickerController.view.frame = videoViewContainer.bounds
    imagePickerController.allowsEditing = false
    imagePickerController.showsCameraControls = false
    imagePickerController.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
}
And then add the usage description for NSMicrophoneUsageDescription in the Info.plist file :-)
Hope it will work for you as well!
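For reference, that Info.plist entry looks something like this (the description string here is only an illustration; use your own wording):

<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used while recording video.</string>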

I managed to solve the problem. In fact, it is not directly related to react-native-image-crop-picker. The problem was that I was using react-native-actionsheet to give the user the option to open the camera or the gallery. When I opened the react-native-actionsheet and pressed one of the options, the camera was being presented on top of the react-native-actionsheet (a modal), which caused a conflict, because on iOS one modal apparently cannot be presented over another.
So, to solve the problem, I added a short timeout so that the modal can close before the camera is opened.

I got this error when I tried to copy from a URL I didn't have permission to copy, which was the mediaURL coming from the UIImagePickerControllerDelegate.
Basically, what I did was use UISaveVideoAtPathToSavedPhotosAlbum.
Like in this example ⤵️
// These APIs expect a file-system path, so use url.path rather than url.absoluteString.
if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(url.path) {
    UISaveVideoAtPathToSavedPhotosAlbum(url.path, self, #selector(video(_:didFinishSavingWithError:contextInfo:)), nil)
} else {
    return /* do something*/
}

// The completion callback must use the signature UISaveVideoAtPathToSavedPhotosAlbum expects.
@objc private func video(_ videoPath: String, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {}
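For context, the url above is the movie URL handed to the picker delegate; a rough sketch of how it might be obtained (illustrative, assuming a movie was recorded with the camera):

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // The temporary movie file recorded by the camera.
    if let url = info[.mediaURL] as? URL,
       UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(url.path) {
        UISaveVideoAtPathToSavedPhotosAlbum(url.path, self, #selector(video(_:didFinishSavingWithError:contextInfo:)), nil)
    }
    picker.dismiss(animated: true)
}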

I found the same error with Xcode 12 and iOS 14 when the image picker's source type is .camera.
But the app works fine: I can take a picture with the camera and put it in my collection view cell. So it may just be something in Xcode 12, I guess.
@objc func addPerson() {
    let picker = UIImagePickerController()
    if UIImagePickerController.isSourceTypeAvailable(.camera) {
        picker.sourceType = .camera
    } else {
        fatalError("Camera is not available, please use real device.")
    }
    picker.allowsEditing = true
    picker.delegate = self
    present(picker, animated: true)
}

I faced the same error with Xcode 12 and iOS 14.
In my case, I presented an action sheet to choose between the camera and the photo library first. I changed the code to open the camera only after that action sheet has been dismissed, and it works well.
Hope this helps with your issue.
enum MediaOptions: Int {
    case Photos
    case Camera
}

func selectImage(mediaType: MediaOptions) {
    self.mediaOption = mediaType
    let iPicker = UIImagePickerController()
    iPicker.delegate = self
    iPicker.allowsEditing = false
    if mediaType == .Camera {
        if UIImagePickerController.isSourceTypeAvailable(.camera) {
            iPicker.sourceType = .camera
            iPicker.allowsEditing = true
        }
    } else {
        iPicker.sourceType = .photoLibrary
    }
    self.present(iPicker, animated: true, completion: nil)
    self.imagePicker = iPicker
}

func choosePhoto() {
    let actionSheet = UIAlertController(title: "Choose", message: "", preferredStyle: .actionSheet)
    if UIImagePickerController.isSourceTypeAvailable(.camera) {
        actionSheet.addAction(UIAlertAction(title: "Camera", style: .default, handler: { (action) -> Void in
            actionSheet.dismiss(animated: true) {
                self.selectImage(mediaType: .Camera) // Just moved here - inside the dismiss callback
            }
        }))
    }
    if UIImagePickerController.isSourceTypeAvailable(.photoLibrary) {
        actionSheet.addAction(UIAlertAction(title: "Photo Library", style: .default, handler: { (action) -> Void in
            self.selectImage(mediaType: .Photos)
        }))
    }
    actionSheet.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
    self.present(actionSheet, animated: true, completion: nil)
}

In my case, I was missing the NSCameraUsageDescription key in Info.plist.
You should enter your purpose for using the camera as the description.
That fixed the crash for me.
Plus, if you don't state the purpose, your app is likely to be rejected during review.

If, like me, you also see this second message:
[access] This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.
then you have to add this to your Info.plist:
<key>NSCameraUsageDescription</key>
<string>so you can choose a photo or take a picture for object detection</string>
That solved the problem for me.

Related

Using camera with Mac Catalyst

In an app ported to Mac Catalyst, the camera interface always comes up blank.
I have checked that the capabilities include "Camera" and that the privacy setting is in Info.plist (the iPad app shows the camera fine), and I even tried forcing the front camera for UIImagePickerController.
if UIImagePickerController.isSourceTypeAvailable(.camera) {
    let imagePicker = UIImagePickerController()
    imagePicker.sourceType = .camera
    imagePicker.delegate = self
    imagePicker.cameraDevice = .front // added for Mac
    self.present(imagePicker, animated: true, completion: nil)
}
The error I got is: "[Generic] Could not create video device input: Error Domain=AVFoundationErrorDomain Code=-11814"
I have the same problem; I'm afraid it's a bug...
Everything works correctly on iOS and iPadOS.
For the moment I don't think there's a way to make it work...
You need to request authorization for media capture on macOS:

import AVFoundation

AVCaptureDevice.requestAccess(for: .video) { granted in
    guard granted else { return }
    // The completion handler may arrive on an arbitrary queue, so hop to the main queue before presenting UI.
    DispatchQueue.main.async {
        let ctrl = UIImagePickerController()
        ctrl.sourceType = .camera
        self.present(ctrl, animated: true, completion: nil)
    }
}
Add this to Info.plist:
<key>NSCameraUsageDescription</key>
<string> .... </string>
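For reference, the "Camera" capability mentioned above corresponds, for a sandboxed Catalyst app, to an entitlement along these lines in the app's .entitlements file (shown as an illustration; checking the box in Xcode adds it for you):

<key>com.apple.security.device.camera</key>
<true/>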

Open Camera Programmatically Through iOS App - Deep Link?

I want to create an IBAction to open the iOS native camera app in my app, but I can't seem to find the address for the camera app online.
I know for messages it's: UIApplication.shared.open(URL(string: "sms:")!, options: [:], completionHandler: nil)
Does anyone know which is the correct scheme?
I suggest you follow a cleaner way of doing this:
let cameraVc = UIImagePickerController()
cameraVc.sourceType = UIImagePickerControllerSourceType.camera
self.present(cameraVc, animated: true, completion: nil)
In that case you must add this to the Info.plist:
<key>NSCameraUsageDescription</key>
<string>whatever</string>
We suggest you use the following GitHub project:
https://github.com/shantaramk/AttachmentHandler
1. Drag and drop the AttachmentHandler folder into your project folder.
func showCameraActionSheet() {
    AttachmentHandler.shared.showAttachmentActionSheet(viewController: self)
    AttachmentHandler.shared.imagePickedBlock = { (image) in
        let chooseImage = image.resizeImage(targetSize: CGSize(width: 500, height: 600))
        self.imageList.insert(chooseImage, at: self.imageList.count - 1)
        self.collectionView.reloadData()
    }
}
Swift 5 version, after adding NSCameraUsageDescription in your Info.plist:
let cameraVc = UIImagePickerController()
cameraVc.sourceType = UIImagePickerController.SourceType.camera
self.present(cameraVc, animated: true, completion: nil)
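Note that as written, neither snippet sets a delegate, so the app never receives the captured photo. A minimal sketch of how one might hook that up (assuming the presenting view controller conforms to UIImagePickerControllerDelegate and UINavigationControllerDelegate):

let cameraVc = UIImagePickerController()
cameraVc.sourceType = .camera
cameraVc.delegate = self
self.present(cameraVc, animated: true, completion: nil)

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    if let photo = info[.originalImage] as? UIImage {
        // Use the captured photo here (e.g. assign it to an image view).
        print(photo.size)
    }
    picker.dismiss(animated: true)
}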

How to create a custom camera preview view with UIImagePickerController?

Sorry if this question is somewhat broad, but I have been searching for hours without a great solution that fits my specific needs for this problem.
For some background information, the iOS app that I am working on calls for a camera overlay (a png of a mask image) where the user has the ability to place their face in and capture a photo with the mask overlay on top of it. The camera is initially set to the front because the main purpose is to have somewhat of a selfie functionality. After tapping the default capture button, the camera preview image gets mirrored or reversed.
Everything looks great in the camera preview before the photo is captured; however, after it is captured, the image comes out mirrored or reversed. For example, if I am holding up my left hand while setting up the shot and then capture that image, the result will look like I am holding up my right hand.
During my research, I have found a couple of solutions that do not exactly fit my needs or fully explain how I could go about solving the problem. My project is exclusively written in Swift, and the management wants to keep it this way at all costs if possible. Below, I will list the solutions that I found:
This project is essentially everything that I need; however, it is written in Objective-C: https://github.com/lucasecf/LEMirroredImagePicker
All of the other solutions I came across either build a custom camera in its entirety (Capture, Cancel, Use Photo, Retake, Camera Preview, etc.) or use AVFoundation (I've never worked with it, and there are not many examples in Swift).
Here is the only code that I have so far pertaining to working with UIImagePickerController:
var picker = UIImagePickerController()

@IBAction func shootPhoto(_ sender: AnyObject) {
    if UIImagePickerController.availableCaptureModes(for: .front) != nil {
        picker = UIImagePickerController() // make a clean controller
        picker.allowsEditing = false
        picker.sourceType = UIImagePickerControllerSourceType.camera
        picker.cameraCaptureMode = .photo
        picker.cameraDevice = .front
        picker.showsCameraControls = true

        // customView stuff
        let customViewController = CustomOverlayViewController(
            nibName: "CustomOverlayViewController",
            bundle: nil)
        let customView: CustomOverlayView = customViewController.view as! CustomOverlayView
        customView.frame = self.picker.view.frame

        picker.modalPresentationStyle = .fullScreen
        present(picker, animated: true, completion: {
            self.picker.cameraOverlayView = customView
        })
    } else { // no camera found -- alert the user.
        let alertVC = UIAlertController(
            title: "No Camera",
            message: "Sorry, this device has no camera",
            preferredStyle: .alert)
        let okAction = UIAlertAction(
            title: "OK",
            style: .default,
            handler: nil)
        alertVC.addAction(okAction)
        present(alertVC, animated: true, completion: nil)
    }
}
So, if someone can either provide a full-fledged solution or guide me somehow (although I think I have seen everything on the internet already), that would be awesome!
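One possible direction, sketched here purely as an illustration rather than a confirmed fix: flip the captured image horizontally in the delegate so it matches what the mirrored front-camera preview showed (withHorizontallyFlippedOrientation() requires iOS 10 or later):

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    guard let captured = info[.originalImage] as? UIImage else {
        picker.dismiss(animated: true)
        return
    }
    // Flip horizontally so the result matches the mirrored preview the user composed against.
    let mirrored = captured.withHorizontallyFlippedOrientation()
    // Use `mirrored` from here on, e.g. show it in an image view or save it.
    print(mirrored.size)
    picker.dismiss(animated: true)
}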

iOS App architecture implementing MVVM, Networking and Bluetooth, how?

Problem:
I am currently facing a problem with developing an iOS Mobile Application in Swift that utilizes:
BTLE: Connecting to a peripheral device and sending/receiving data to/from it.
Networking: If the peripheral is connected to a network (wireless and/or ethernet), then the communication over BTLE "could" instead happen over the network.
Model-View-ViewModel architecture
RxSwift
About the App:
It starts with a Bluetooth Setup view, which walks the user through the process of pairing with the peripheral device (disjoint from the TabBarController).
After successfully pairing with the device, all configuration is requested by the iOS App from the device, which is sent as JSON.
This JSON contains the different model information (programming) that the app displays to the user for manipulation; it needs to be stored in an array, in a singleton-like manner, so that a view-model can request any index for display to the user.
After all the data is received, the Bluetooth view is dismissed and the TabBarViews are presented.
Current Examples:
A good example to relate this app to would be the Apple Watch and the corresponding iOS app that lets you configure everything; I need to implement roughly the same concept.
Another good example is the app from this blog post, where they are doing something similar to what I am trying to achieve. The difference I am running into, though, is their dependency injection setup for MVVM (as with other similar examples): I've used a storyboard, whereas they instantiate their view controllers programmatically in the AppDelegate.
And my problem...
How can I pass the data (efficiently) from BluetoothView to TabBarView without NSNotifications or PrepareForSegues? Keeping in mind that I am intending to use the library RxSwift for asynchronous event handling and event/data streams. I am trying to keep this App as stateless as possible.
Are the Servers in this blog post a good practice for retrieving view-models and/or updating them?
I find that, when using RxSwift, the "view-model" ends up being a single pure function that takes observables derived from the UI inputs and returns observables that are then bound to the output UI elements.
Something that really helped me wrap my head around Rx was the tutorial videos for cycle.js.
As for your specific conundrum...
What you are doing doesn't have to be "forward" movement. Look at it this way... The TabBarView needs some data, and it doesn't care where that data comes from. So give the TabBarView access to a function that returns an observable which contains the necessary data. That closure will present the Bluetooth View, make the connection, get the necessary data and then dismiss the Bluetooth View and call onNext with the required data.
Looking at this gist might help get across what I'm talking about. Granted, the gist uses PromiseKit instead of RxSwift, but the same principle applies (instead of fulfill, you would call onNext and then onCompleted). In the gist, the view controller that needs the data simply calls a function and subscribes to the result (in this case, the result contains a UIImage). It is the function's job to determine what image sources are available, ask the user which source they want to retrieve the image from, and present the appropriate view controller to get the image.
The current contents of the gist are below:
//
//  UIViewController+GetImage.swift
//
//  Created by Daniel Tartaglia on 4/25/16.
//  Copyright © 2016 MIT License
//

import UIKit
import PromiseKit

enum ImagePickerError: ErrorType {
    case UserCanceled
}

extension UIViewController {

    func getImage(focusView view: UIView) -> Promise<UIImage> {
        let proxy = ImagePickerProxy()

        let cameraAction: UIAlertAction? = !UIImagePickerController.isSourceTypeAvailable(.Camera) ? nil : UIAlertAction(title: "Camera", style: .Default) { _ in
            let controller = UIImagePickerController()
            controller.delegate = proxy
            controller.allowsEditing = true
            controller.sourceType = .Camera
            self.presentViewController(controller, animated: true, completion: nil)
        }

        let photobinAction: UIAlertAction? = !UIImagePickerController.isSourceTypeAvailable(.PhotoLibrary) ? nil : UIAlertAction(title: "Photos", style: .Default) { _ in
            let controller = UIImagePickerController()
            controller.delegate = proxy
            controller.allowsEditing = false
            controller.sourceType = .PhotoLibrary
            self.presentViewController(controller, animated: true, completion: nil)
        }

        let cancelAction = UIAlertAction(title: "Cancel", style: .Cancel, handler: nil)

        let alert = UIAlertController(title: nil, message: nil, preferredStyle: .ActionSheet)
        if let cameraAction = cameraAction {
            alert.addAction(cameraAction)
        }
        if let photobinAction = photobinAction {
            alert.addAction(photobinAction)
        }
        alert.addAction(cancelAction)

        let popoverPresentationController = alert.popoverPresentationController
        popoverPresentationController?.sourceView = view
        popoverPresentationController?.sourceRect = view.bounds

        presentViewController(alert, animated: true, completion: nil)

        let promise = proxy.promise
        return promise.always {
            self.dismissViewControllerAnimated(true, completion: nil)
            proxy.retainCycle = nil
        }
    }
}

private final class ImagePickerProxy: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    let (promise, fulfill, reject) = Promise<UIImage>.pendingPromise()
    var retainCycle: ImagePickerProxy?

    required override init() {
        super.init()
        retainCycle = self
    }

    @objc func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
        let image = (info[UIImagePickerControllerEditedImage] as? UIImage) ?? (info[UIImagePickerControllerOriginalImage] as! UIImage)
        fulfill(image)
    }

    @objc func imagePickerControllerDidCancel(picker: UIImagePickerController) {
        reject(ImagePickerError.UserCanceled)
    }
}
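As a rough illustration of the RxSwift variant described above (an adaptation, not part of the original gist, assuming RxSwift is available; it simplifies the source selection down to just the camera), the same idea with an Observable instead of a Promise might look like this:

import UIKit
import RxSwift

extension UIViewController {

    /// Presents the camera and emits the picked image, then completes.
    func rx_getImage() -> Observable<UIImage> {
        return Observable.create { [weak self] observer in
            let proxy = RxImagePickerProxy(observer: observer)
            let picker = UIImagePickerController()
            picker.sourceType = .camera
            picker.delegate = proxy
            self?.present(picker, animated: true, completion: nil)

            return Disposables.create {
                // Capturing the proxy here keeps it alive for the lifetime of the subscription.
                _ = proxy
                self?.dismiss(animated: true, completion: nil)
            }
        }
    }
}

private final class RxImagePickerProxy: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    private let observer: AnyObserver<UIImage>

    init(observer: AnyObserver<UIImage>) {
        self.observer = observer
        super.init()
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            observer.onNext(image)
        }
        observer.onCompleted()
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
        observer.onCompleted()
    }
}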

IBAction func on button not triggering

I have 3 buttons in a VC, all hooked up to IBAction functions. Two of them work fine, but the Submit button simply won't trigger.
I have made sure User Interaction is enabled. I have also tried adding sender: AnyObject as a parameter and re-hooking up the function to the button but still no luck. I have also cleaned the project. I am very baffled as to what is going on.
Here is how the VC looks:
Hooking the buttons up:
Accessibility of button:
Here is the code for each IBAction func:
@IBAction func captureImage() {
    self.saveVideoVar = false
    let imageFromSource = UIImagePickerController()
    imageFromSource.delegate = self
    imageFromSource.allowsEditing = false
    // if there is a camera
    if UIImagePickerController.isSourceTypeAvailable(UIImagePickerControllerSourceType.Camera) {
        imageFromSource.sourceType = UIImagePickerControllerSourceType.Camera
        self.presentViewController(imageFromSource, animated: true) {}
    } else {
        let title = "Error"
        let message = "Could not load camera"
        let alert = UIAlertController(title: title, message: message, preferredStyle: .Alert)
        alert.addAction(UIAlertAction(title: "OK", style: UIAlertActionStyle.Cancel, handler: nil))
        presentViewController(alert, animated: true, completion: nil)
    }
}

@IBAction func openImageLibrary() {
    self.saveVideoVar = false
    let imageFromSource = UIImagePickerController()
    imageFromSource.delegate = self
    imageFromSource.allowsEditing = false
    imageFromSource.sourceType = UIImagePickerControllerSourceType.PhotoLibrary
    // presents (loads) the library
    self.presentViewController(imageFromSource, animated: true) {}
}

// code to submit image and video to amazon S3
@IBAction func submitToS3() {
    print("x")
    if let img: UIImage = imageView.image! as UIImage {
        let path = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("image.png")
        let imageData: NSData = UIImagePNGRepresentation(img)!
        imageData.writeToFile(path as String, atomically: true)
        // once the image is saved we can use the path to create a local file URL
        let url: NSURL = NSURL(fileURLWithPath: path as String)
        nwyt.uploadS3(url)
    }
}
Screenshot of control-clicking the Submit button:
OH MY GOD! I feel stupid. There was a duplicate screen I had forgotten to delete that looked exactly the same but wasn't the one that was being displayed. I'm going to delete this in an hour. Below was the problem:
Check by setting background colors on the buttons so you can tell whether any view is covering the button.
I can see that there is an extra ":" in "submitToS3:", meaning that the function submitToS3 is expected to take an argument, which is not the case here.
To solve this, remove the submitToS3 connection, then drag and drop from the Submit button to the yellow icon at the top of the controller and link it to "submitToS3" (you should see it there). When you look back at the Received Actions view, you should no longer see the ":".
Try this:
// code to submit image and video to amazon S3
@IBAction func submitToS3(sender: UIButton) {
    print("x")
    if let img: UIImage = imageView.image! as UIImage {
        let path = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("image.png")
        let imageData: NSData = UIImagePNGRepresentation(img)!
        imageData.writeToFile(path as String, atomically: true)
        // once the image is saved we can use the path to create a local file URL
        let url: NSURL = NSURL(fileURLWithPath: path as String)
        nwyt.uploadS3(url)
    }
}
It seems the callback argument was missing. This should work. Fingers crossed!!
Please check the Enabled property in the Attributes inspector!
Check the button name!
Create a new button and a new method, then try hooking them up. I have faced this kind of problem before; it could be an Xcode issue if you are using Xcode 7.
