I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario but I checked just to make sure), but the camera does not capture live photos. The documentation suggests it should (source):
To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value.
I've set up my controller:
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
In the delegate I check for the Live Photo:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    if let live = info[.livePhoto] as? PHLivePhoto {
        // handle live photo
    } else if let takenImage = info[.originalImage] as? UIImage, let metadata = info[.mediaMetadata] as? [AnyHashable: Any] {
        // handle still photo
    }
}
But I never get the Live Photo.
I've tried adding NSMicrophoneUsageDescription to the Info.plist, thinking the picker needs microphone permission, but that did not help. Of course, I've already added NSCameraUsageDescription to grant camera permission.
Has anyone successfully captured Live Photos using UIImagePickerController?
Related
I'm extracting pixel colors from a CGImage using the code described in this answer.
However, I just realized that if I load an image that was created on another device, the pixel values look wrong. The first obvious problem is that the alpha is gone: CGImageAlphaInfo reports .noneSkipLast, but I know the image is RGBA. If I read it on the same device it was created on, it looks fine. The second problem is that there is some color bleeding, as if the image had been resized. Perhaps it's being compressed or something.
Here's an example:
Source image is this watermelon, 12x12
It was created on my iPad. But if I load it on my iPhone through iCloud using that code I linked, I get this:
The alpha channel is gone, and colors bleed.
If, from the same iPhone, I send the little watermelon to my Mac using AirDrop and then AirDrop it back (so it is supposedly the same image!), loading it now gives the correct image:
(the dark brown areas are where alpha is 0)
If you have a couple of iOS devices with iCloud enabled, you can reproduce this behavior in this app: http://swiftpixels.endavid.com
where I'm using that code to read pixel colors.
What could be the difference between those images? How can I read the correct image from iCloud? Should I look for hints in UIImage instead of CGImage?
Any clues? Thanks!
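For reference, the pixel-reading approach in the linked answer boils down to indexing into a raw byte buffer, and the alpha info determines what the fourth byte means. A minimal, CoreGraphics-free sketch of that indexing (the names `RGBA` and `pixel(in:...)` are mine, not from the linked answer):

```swift
// Hypothetical sketch: reading one pixel out of a tightly packed 8-bit,
// 4-channel buffer. The layout depends on CGImageAlphaInfo: under
// .premultipliedLast the fourth byte is real alpha, under .noneSkipLast it is
// padding and must be treated as opaque.
struct RGBA: Equatable {
    var r, g, b, a: UInt8
}

/// `alphaIsValid` stands in for checking CGImageAlphaInfo on the real CGImage.
func pixel(in buffer: [UInt8], width: Int, x: Int, y: Int, alphaIsValid: Bool) -> RGBA {
    let i = (y * width + x) * 4
    return RGBA(r: buffer[i],
                g: buffer[i + 1],
                b: buffer[i + 2],
                a: alphaIsValid ? buffer[i + 3] : 255)
}

// A 2x1 "image": a red pixel with alpha 0, then a green pixel with alpha 255.
let buffer: [UInt8] = [255, 0, 0, 0,   0, 255, 0, 255]
print(pixel(in: buffer, width: 2, x: 0, y: 0, alphaIsValid: true))
```

The point is that the same bytes decode to different colors depending on the reported alpha info, which is why an image that round-trips through a transcoder with a different pixel format "loses" its alpha.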
Update
For reference, I'm reading the image using a UIImagePickerController, using this code:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    if let image = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
        loadImageInMemory(image)
    }
    picker.dismiss(animated: true, completion: nil)
    self.presentingViewController?.dismiss(animated: true, completion: nil)
}

fileprivate func loadImageInMemory(_ image: UIImage) {
    /// skipping some preparation
    guard let cgImage = image.cgImage else {
        return
    }
    /// getImageData is a function like getPixelColor linked above,
    /// but for a Rect
    self.imageData = cgImage.getImageData(rect, width, height)
}
I also found this question which may be related: UIImagePickerController and iCloud photos
As Rob suggested in the comments below, I changed the Photos settings on my phone to "Download and Keep Originals" (instead of "Optimize iPhone Storage"), and that fixes the problem. So I guess the question is why iCloud tries to compress a PNG image that is just 1,355 bytes, and whether it's possible to access the original image from the UIImagePickerController.
In the Settings app, under “Photos” there is an “Optimize iPhone Storage” vs “Download and Keep Originals” option. The latter should synchronize the photos library without altering the images in any way.
It seems AirDrop can access the original image, though.
AirDrop is not relying upon photo library iCloud synchronization to send the image. That's why the asset makes it across, unaltered.
How can I make my app access the original as well?
It comes down to whether you really want to use the Photos library to synchronize your assets, especially since many people may not choose to sync their huge Photos library to iCloud at all. I'm not sure you want to rely upon that.
You might want to just use CloudKit yourself rather than relying upon the Photos library. Perhaps you should have an option in the app so a user can choose whether to use iCloud storage at all, and if so, do the iCloud storage within your app rather than relying upon the Photos library.
The reason the image looked blurry and lost its alpha seems to be that Photos streams JPEG-compressed images from iCloud by default, even when the original image would be smaller without compression, as in this pixel-art example.
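One way to check what your app actually received is to look at the magic bytes of the encoded data: PNG files start with 0x89 "PNG" and JPEG files with the 0xFF 0xD8 0xFF marker, so if JPEG bytes come back where you stored a PNG, the asset was transcoded along the way. A small, hypothetical helper (the name `detectFormat` is mine):

```swift
import Foundation

// Hypothetical helper: inspect the magic bytes of encoded image data to tell
// whether iCloud handed back your original PNG or a JPEG transcode.
enum ImageFormat {
    case png, jpeg, unknown
}

func detectFormat(of data: Data) -> ImageFormat {
    if data.starts(with: [0x89, 0x50, 0x4E, 0x47]) { return .png }   // "\x89PNG"
    if data.starts(with: [0xFF, 0xD8, 0xFF]) { return .jpeg }        // JPEG SOI marker
    return .unknown
}

// On iOS you would feed this the asset's raw data; here a stubbed PNG header:
let pngHeader = Data([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])
print(detectFormat(of: pngHeader))  // .png
```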
As pointed out by Rob, a way to verify that this is the case is to change your Photos settings. From the Settings app:
Settings -> Photos -> Download and Keep Originals
This fixes the issue, but of course it's not desirable. If you want to keep using Photos (instead of implementing your own iCloud solution) while keeping the Optimize iPhone Storage setting, you can use PhotoKit to retrieve the original image.
Replace this code:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    if let image = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
        loadImageInMemory(image)
    }
    picker.dismiss(animated: true, completion: nil)
    self.presentingViewController?.dismiss(animated: true, completion: nil)
}
with this code:
import Photos

// ...

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    // it will be loaded asynchronously
    loadImageFromPicker(info: info)
    picker.dismiss(animated: true, completion: nil)
    self.presentingViewController?.dismiss(animated: true, completion: nil)
}

private func loadImageFromPicker(info: [UIImagePickerController.InfoKey : Any]) {
    var phAsset: PHAsset?
    if #available(iOS 11.0, *) {
        phAsset = info[UIImagePickerController.InfoKey.phAsset] as? PHAsset
    } else {
        // Fallback on earlier versions
        if let referenceURL = info[UIImagePickerController.InfoKey.referenceURL] as? URL {
            let fetchResult = PHAsset.fetchAssets(withALAssetURLs: [referenceURL], options: nil)
            phAsset = fetchResult.firstObject
        }
    }
    guard let asset = phAsset else {
        return
    }
    // size doesn't matter, because resizeMode = .none
    let size = CGSize(width: 32, height: 32)
    let options = PHImageRequestOptions()
    options.version = .original
    options.deliveryMode = .highQualityFormat
    options.resizeMode = .none
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImage(for: asset, targetSize: size, contentMode: .aspectFit, options: options) { [weak self] (image, info) in
        if let s = self, let image = image {
            s.loadImageInMemory(image)
        }
    }
}
This code will work with both local images and iCloud images.
I'm creating an app that will capture a photo and reorient it based on the gyroscope data at the time of photo capture.
I've created functions to capture the gyro data from the phone in:
startUpdates()
and to stop capturing the data in:
stopUpdates()
I tried adding this into the UIImagePickerController:
if UIImagePickerController.isSourceTypeAvailable(.camera) {
    guard ptInitials.text != "" else {
        missingIdentifier(text: "Please add an identifier prior to taking a photo")
        return
    }
    startUpdates()
    imagePicker = UIImagePickerController()
    imagePicker.delegate = self
    imagePicker.allowsEditing = false
    imagePicker.sourceType = UIImagePickerControllerSourceType.camera
    imagePicker.cameraCaptureMode = .photo
    imagePicker.modalPresentationStyle = .fullScreen
    present(imagePicker, animated: true, completion: nil)
}
This starts the gyro capture as the image capture process begins.
I have it passing the gyro data to an optional double, livePhotoAxis: Double?, while it is measuring inside the startUpdates() function.
I was attempting to stop capturing the data once the picture is taken, so the last known gyro data would be kept in my variable and could be passed into other functions:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    stopUpdates()
    let chosenImage = info[UIImagePickerControllerOriginalImage] as! UIImage
    saveImageToDocuments(image: chosenImage, fileNameWithExtension: "image.jpg")
    dismiss(animated: true, completion: nil)
}
However, the problem is that the stopUpdates() method isn't called until after the user "confirms" they like their snapped photo (vs retaking it).
From what I've read, there is a private API, uiImagePickerControllerUserDidCaptureItem, that fires at the exact moment the photo is taken. I could use NSNotification to observe this and call stopUpdates() once the photo is actually taken.
Is this correct that this is a private API? Can I use that to capture this moment or will my app get rejected?
Likewise, is there a better way to turn off my gyro updates at the exact moment of photo capture?
Thanks!
I added this code and it fixed the problem, but I still don't know whether this is a private API that will get rejected from the App Store, or whether there is a better way to accomplish this.
NotificationCenter.default.addObserver(forName: NSNotification.Name(rawValue: "_UIImagePickerControllerUserDidCaptureItem"), object: nil, queue: nil) { note in
    self.stopUpdates()
}
Update: This doesn't fire at the EXACT moment of photo capture. It fires when the photo is "locked in". There are a few seconds (about 2) between the time the take-photo button is pressed and the moment the photo locks into place. As a result, if you move the phone during that time, the gyro data will be inaccurate.
Any thoughts on how to improve on this?
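One possible improvement, sketched under assumptions: instead of stopping updates at an imprecise moment, keep a rolling buffer of timestamped gyro samples and later look up the sample closest to the capture time. The names `GyroSample` and `nearestSample` are mine; on iOS the samples would come from CMMotionManager's update handler.

```swift
import Foundation

// Hypothetical sketch: buffer timestamped gyro samples while the camera UI is
// up, then pick the sample nearest the moment the shutter actually fired,
// even if you only learn that moment a couple of seconds later.
struct GyroSample {
    let timestamp: TimeInterval  // e.g. the motion data's timestamp
    let rotationZ: Double
}

func nearestSample(to captureTime: TimeInterval, in buffer: [GyroSample]) -> GyroSample? {
    return buffer.min { abs($0.timestamp - captureTime) < abs($1.timestamp - captureTime) }
}

// Samples collected while the camera UI was up:
let samples = [
    GyroSample(timestamp: 10.0, rotationZ: 0.1),
    GyroSample(timestamp: 10.5, rotationZ: 1.3),
    GyroSample(timestamp: 11.0, rotationZ: 0.2),
]
// The notification arrived late, but the shutter really fired near t = 10.6:
let sample = nearestSample(to: 10.6, in: samples)
print(sample?.rotationZ ?? 0)  // 1.3
```

This way the late notification only tells you when to stop buffering, not which sample to trust.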
In my application, I have to store and upload multiple photos to a server using the standard iOS UIImagePickerController. Currently, I am working on storing the UIImagePickerControllerReferenceURL for the photo library:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if let assetURL = info["UIImagePickerControllerReferenceURL"] as? URL {
        assetArray.append(assetURL) // assetArray globally stored as 'fileprivate var assetArray = [URL]()'
    }
}
However, I just read somewhere that the camera does not provide a UIImagePickerControllerReferenceURL key. Thus this method may not work when I get around to the camera part, so I am re-thinking my logic now.
I am using Kingfisher, so I am currently researching whether I can cache images using that framework, but I am not certain. I'm also reading that I could grab the camera image's URL using the Photos API, which I will attempt next. If I can, I will grab the URL of the photo taken by the camera and append it to the assetArray.
Had this been a single-photo upload, this would be relatively simple, but since I do not want to hinder the performance of the application by storing a large data/image array, this task has become complex.
Is it best to continue what I am doing:
1) Grab the URLs of each image I want to upload and store them in a URL array. Before uploading to the server, grab the data in .png form for each URL, and upload the images to the server.
2) Cache the actual images somewhere, so when uploading to the server the images will already be there (no need to convert from URL to image). The issue here is that holding many images may hinder app performance if not done correctly.
or
3) Some other method
Any help would be greatly appreciated. Thank you!
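For what it's worth, option 1 can be sketched without holding images in memory at all, assuming you can get each image's encoded Data (e.g. via jpegData/pngData on iOS): write each picked image to a temporary file and keep only the URLs. The function name `stageForUpload` and the directory name are mine.

```swift
import Foundation

// Hypothetical sketch of option 1: stage each picked image's encoded bytes in
// a temp directory and keep only file URLs, so the app never holds a large
// in-memory image array.
func stageForUpload(_ data: Data, name: String) throws -> URL {
    let dir = FileManager.default.temporaryDirectory
        .appendingPathComponent("pendingUploads", isDirectory: true)
    try FileManager.default.createDirectory(at: dir, withIntermediateDirectories: true)
    let url = dir.appendingPathComponent(name)
    try data.write(to: url, options: .atomic)
    return url
}

// At upload time, read the files back one at a time instead of all at once:
do {
    let url = try stageForUpload(Data([0x89, 0x50, 0x4E, 0x47]), name: "photo-1.png")
    let roundTrip = try Data(contentsOf: url)
    print(roundTrip.count)  // 4
} catch {
    print("staging failed: \(error)")
}
```

Reading one file at a time at upload keeps peak memory proportional to a single image, not the whole batch.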
I found a solution using Kingfisher. First, I created an ImageCache object for the VC handling the multiple photos (you need to import Kingfisher):
fileprivate var imageCache: ImageCache = ImageCache(name: "newTaskCache")
After I select the photo using UIImagePickerController, I save the unique ID in a string array, cachedImageIDs:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    let image: UIImage = (info[UIImagePickerControllerOriginalImage] as? UIImage)!
    let imageID: String?
    if let assetURL = info["UIImagePickerControllerReferenceURL"] as? URL {
        imageID = assetURL.absoluteString
        cachedImageIDs.append(imageID!)
    } else {
        // camera does not return a UIImagePickerControllerReferenceURL, so the unique
        // imageID will be posix time, until I figure out how to get the imageURL string
        imageID = String(describing: Date().posixTime())
        cachedImageIDs.append(imageID!)
    }
    imageCache.store(image, forKey: imageID!) {
        print("saved image")
        picker.dismiss(animated: true, completion: nil)
    }
}
When I need to retrieve the images, I do:
func getImages() {
    for id in cachedImageIDs {
        // retrieve image
        imageCache.retrieveImage(forKey: id, options: nil, completionHandler: { (image, cacheType) -> () in
            print(image)
        })
    }
}
When you pick a video on an iOS device you can scrub through the video by dragging a playhead at the top of the screen. Is it possible to get this current time later, when the video is chosen? Like in my imagePickerController function below, where I get the video URL.
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject : AnyObject]) {
    self.dismissViewControllerAnimated(true, completion: nil)
    let info2 = info as NSDictionary
    urlVideo = info2.objectForKey(UIImagePickerControllerMediaURL) as! NSURL
}
You cannot achieve this using the UIImagePickerController as it doesn't offer support for that. You will have to use a custom controller or create your own.
I am using UIImagePickerController to allow the user to take a picture. I want to allow him/her to edit it afterwards but, whatever I do, I get nothing.
Here is my code (I am using Xamarin):
UIImagePickerController imagePicker = new UIImagePickerController ();
// set our source to the camera
imagePicker.SourceType = UIImagePickerControllerSourceType.Camera;
// set what media types
//imagePicker.MediaTypes = UIImagePickerController.AvailableMediaTypes (UIImagePickerControllerSourceType.Camera);
// show the camera controls
imagePicker.ModalPresentationStyle = UIModalPresentationStyle.CurrentContext;
imagePicker.ShowsCameraControls = true;
imagePicker.AllowsEditing = true;
imagePicker.SetEditing (true,true);
imagePicker.PreferredContentSize = new SizeF(900,600);
imagePicker.CameraCaptureMode = UIImagePickerControllerCameraCaptureMode.Photo;
imagePicker.Title="taste.eat. image";
// attach the delegate
imagePicker.Delegate = new ImagePickerDelegate();
// show the picker
NavigationController.PresentViewController(imagePicker, true,null);
Am I missing something?
EDIT:
I have followed the tutorial and I am getting to the screen with the rectangle, but if I pan or zoom, it just snaps back to the center once I lift my finger. Is it possible to get to this screen from the Photos application?
When using UIImagePickerController's delegate
method - imagePickerController:didFinishPickingMediaWithInfo:, we get the image using
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
This code will always return the original image, even if editing is ON.
Try using
UIImage *image = [info objectForKey:UIImagePickerControllerEditedImage];
This will return the edited image if editing is ON.
Hope this helps.
The AllowsEditing property simply allows the user to crop to a square if picking an image and trim the video if picking a video.
Any other functionality needs to be implemented with custom UI and code.
See this question: iPhone SDK - How to customize the crop rect in UIImagePickerController with allowsEditing on?
What you are showing in the screenshot is not part of UIImagePickerController, unfortunately.
SWIFT 3
I was having a hard time returning the cropped image (simple mistake on my end). Instead of using UIImagePickerControllerOriginalImage, you need UIImagePickerControllerEditedImage. See below:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    // The info dictionary contains multiple representations of the image, and this uses the cropped image.
    let selectedImage = info[UIImagePickerControllerEditedImage] as! UIImage
    // Set yourImageView to display the selected image.
    yourImage.image = selectedImage
    // Dismiss the picker.
    dismiss(animated: true, completion: nil)
}
SWIFT 4+
There have been some changes after Swift 4. UIImagePickerControllerEditedImage changed to UIImagePickerController.InfoKey.editedImage.
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    let selectedImage = info[UIImagePickerController.InfoKey.editedImage] as! UIImage
    imageView.image = selectedImage
    dismiss(animated: true, completion: nil)
}
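One caveat worth noting: force-casting a single key (info[...] as! UIImage) crashes when that key is absent, e.g. when allowsEditing is false and no edited image exists. A small, hypothetical sketch of a safer lookup order (string values stand in for UIImage so the logic is self-contained):

```swift
// Hypothetical sketch: prefer the edited image but fall back to the original,
// instead of force-casting one key. On iOS the dictionary values would be
// UIImage; strings stand in here so the lookup logic runs anywhere.
func pickedImage(from info: [String: Any]) -> Any? {
    return info["UIImagePickerControllerEditedImage"]
        ?? info["UIImagePickerControllerOriginalImage"]
}

let info: [String: Any] = ["UIImagePickerControllerOriginalImage": "original"]
print(pickedImage(from: info) as? String ?? "none")  // "original"
```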
It will work like this:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    let selectedImage = info[UIImagePickerControllerEditedImage] as! UIImage
    userPhoto.image = selectedImage
    dismiss(animated: true, completion: nil)
}
There is no way to enable filters just by changing a property like allowsEditing = YES; that will only display a cropping tool. From your screenshot, it looks like you have integrated some buggy open-source library, and without looking at the source code it would be difficult to fix your center-cropping bug.
Better to post some concrete detail about your implementation, or switch to a standard open-source library.