Get photos that contain disparity data in iOS 11 from PhotoKit - ios

I want to get the user's photos that contain disparity data. My first thought was to use the smart album .smartAlbumDepthEffect, but since Portrait Mode photos taken in iOS 10 don't have disparity data, that won't work. Is there a way to query PhotoKit and specify that I only want to receive photos that have disparity data?
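(A sketch of one possible approach, not a confirmed PhotoKit feature: PhotoKit has no fetch predicate for disparity, so this assumes you fetch the Depth Effect smart album and then inspect each asset's image data for an auxiliary disparity map via ImageIO. The helper name is illustrative.)
import Photos
import ImageIO

// Sketch: fetch Portrait-mode assets, then keep only those whose image
// data actually embeds auxiliary disparity data (present from iOS 11 on).
func fetchAssetsWithDisparity(completion: @escaping ([PHAsset]) -> Void) {
    let albums = PHAssetCollection.fetchAssetCollections(
        with: .smartAlbum, subtype: .smartAlbumDepthEffect, options: nil)
    guard let album = albums.firstObject else { completion([]); return }
    let assets = PHAsset.fetchAssets(in: album, options: nil)

    DispatchQueue.global(qos: .userInitiated).async {
        var matches: [PHAsset] = []
        let options = PHImageRequestOptions()
        options.isSynchronous = true          // handlers run inline below
        options.isNetworkAccessAllowed = true // allow iCloud downloads
        assets.enumerateObjects { asset, _, _ in
            PHImageManager.default().requestImageData(for: asset, options: options) { data, _, _, _ in
                guard let data = data,
                      let source = CGImageSourceCreateWithData(data as CFData, nil),
                      CGImageSourceCopyAuxiliaryDataInfoAtIndex(
                          source, 0, kCGImageAuxiliaryDataTypeDisparity) != nil
                else { return }
                matches.append(asset)
            }
        }
        DispatchQueue.main.async { completion(matches) }
    }
}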

Related

Raw Depth map SDK for iPhone X

I did some searching and found various examples and documentation on iPhone X Face ID and how it can be used for things like authentication and animated emoji.
I wanted to check: is there an API/SDK to get the raw depth map from the iPhone X sensor into an app?
From my understanding, the depth calculation is done based on the projected pattern. This could be used to get the depth profile of any object in front of the sensor (possibly dependent on the texture of the object).
You'll need at least the iOS 11.1 SDK in Xcode 9.1 (both in beta as of this writing). With that, builtInTrueDepthCamera becomes one of the camera types you use to select a capture device:
let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
Then you can go on to set up an AVCaptureSession with the TrueDepth camera device, and can use that capture session to capture depth information much like you can with the back dual camera on iPhone 7 Plus and 8 Plus:
Turn on depth capture for photos with AVCapturePhotoOutput.isDepthDataDeliveryEnabled, then snap a picture with AVCapturePhotoSettings.isDepthDataDeliveryEnabled. You can read the depthData from the AVCapturePhoto object you receive after the capture, or turn on embedsDepthDataInPhoto if you just want to fire and forget (and read the data from the captured image file later). A sketch of this setup follows just below.
Get a live feed of depth maps with AVCaptureDepthDataOutput. That one is like the video data output; instead of recording directly to a movie file, it gives your delegate a timed sequence of image (or in this case, depth) buffers. If you're also capturing video at the same time, AVCaptureDataOutputSynchronizer might be handy for making sure you get coordinated depth maps and color frames together. (This path is sketched at the end of this answer.)
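Here's a minimal sketch of the photo path, assuming iOS 11.1+, a device with the TrueDepth camera, and granted camera permission (class and method names are illustrative):
import AVFoundation

class DepthPhotoCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video, position: .front) else { return }
        session.beginConfiguration()
        session.sessionPreset = .photo
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(photoOutput)
        // Depth delivery must be enabled on the output (after adding it to
        // the session) before it can be requested per photo.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func snap() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        settings.embedsDepthDataInPhoto = true  // also write depth into the file
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // AVDepthData (disparity or depth map) delivered with the photo.
        if let depth = photo.depthData {
            print("Got depth map:", depth.depthDataMap)
        }
    }
}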
As Apple's Device Compatibility documentation notes, you need to select the builtInTrueDepthCamera device to get any of these depth capture options. If you select the front-facing builtInWideAngleCamera, it becomes like any other selfie camera, capturing only photo and video.
Just to emphasize: from an API point of view, capturing depth with the front-facing TrueDepth camera on iPhone X is a lot like capturing depth with the back-facing dual cameras on iPhone 7 Plus and 8 Plus. So if you want a deep dive on how all this depth capture business works in general, and what you can do with captured depth information, check out the WWDC17 Session 507: Capturing Depth in iPhone Photography talk.
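And a sketch of the live-feed path from the second option above, under the same assumptions (TrueDepth device, camera permission granted):
import AVFoundation

class DepthStreamReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func configure() throws {
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video, position: .front) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // One depth map per callback; pair these with color frames via
        // AVCaptureDataOutputSynchronizer if you also capture video.
        print("Depth frame at \(timestamp.seconds)s")
    }
}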

Adding metadata to an image in an iOS app

I am creating an app that takes an image and saves it to the camera roll, while also adding a string of data to the image's metadata at the same time; this string would mostly be a combination of user-entered data from a couple of text fields. I pick up an image using the iPhone's camera, accessed via UIImagePickerController. Now I want to edit the metadata before saving. I have looked for a solution but could not find a proper one for Swift.
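For reference, here is a sketch of one common way to do this (`userString` stands in for the combined text-field data): re-encode the picked image with ImageIO, injecting the metadata, then save the resulting data through PhotoKit.
import Photos
import ImageIO
import MobileCoreServices
import UIKit

// Sketch: write `userString` into the EXIF UserComment field, then save.
// Note: the metadata dictionary replaces the corresponding top-level keys,
// so a production version should merge with the existing properties from
// CGImageSourceCopyPropertiesAtIndex first.
func save(image: UIImage, userString: String) {
    guard let jpeg = image.jpegData(compressionQuality: 0.9),
          let source = CGImageSourceCreateWithData(jpeg as CFData, nil)
    else { return }

    let metadata: [CFString: Any] = [
        kCGImagePropertyExifDictionary: [kCGImagePropertyExifUserComment: userString]
    ]

    let output = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(output as CFMutableData,
                                                      kUTTypeJPEG, 1, nil)
    else { return }
    CGImageDestinationAddImageFromSource(dest, source, 0, metadata as CFDictionary)
    CGImageDestinationFinalize(dest)

    PHPhotoLibrary.shared().performChanges({
        PHAssetCreationRequest.forAsset()
            .addResource(with: .photo, data: output as Data, options: nil)
    })
}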

Why do some photos taken with iPhone camera not have stored GPS coordinates?

I have an app that allows a user to upload photos from their camera roll only if the photo has stored GPS coordinates. I was wondering what causes some of these photos, taken with the iPhone camera, to not have a stored location.
Thanks in advance.
I can think of at least four ways to end up with photos that have no GPS coordinates attached:
Screenshots - these seem to have no geo information attached.
Downloaded / imported photos - from Mail, or from apps like 9gag, for example.
Photos taken while airplane mode is active.
I'm most uncertain about the last one:
Photos taken where there is simply no GPS signal available, e.g. deep inside a mountain?
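Incidentally, the upload gate described in the question reduces to a nil check on the asset's location (a sketch, assuming the app fetches PHAssets via PhotoKit):
import Photos

// A photo from any of the sources above will simply carry no location.
func canUpload(_ asset: PHAsset) -> Bool {
    return asset.location != nil
}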

Unable to read GPS coordinate of camera roll images

In one of the apps I'm developing, I am retrieving EXIF data from images. I want to get the GPS coordinates of images on an iOS device. I can get the GPS coordinates of an image taken with UIImagePickerControllerSourceTypeCamera, but when I try to select images from the gallery that are in the camera roll, it doesn't give me the GPS coordinates.
Thanks
Images taken with UIImagePickerController do not contain location data (unlike those taken with the Camera app).
Camera app developers have to capture the location data separately at the time of capture, add it to the metadata that is there, and then save the image with the revised metadata.
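A sketch of that "add it to the metadata" step, assuming a CLLocation captured alongside the photo (the helper name is illustrative); the resulting dictionary can then be written out with CGImageDestination as in the metadata question above:
import CoreLocation
import ImageIO

// Build the GPS dictionary ImageIO expects from a CLLocation captured
// at shutter time; merge this into the photo's existing metadata.
func gpsMetadata(for location: CLLocation) -> [CFString: Any] {
    let lat = location.coordinate.latitude
    let lon = location.coordinate.longitude
    let gps: [CFString: Any] = [
        kCGImagePropertyGPSLatitude: abs(lat),
        kCGImagePropertyGPSLatitudeRef: lat >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude: abs(lon),
        kCGImagePropertyGPSLongitudeRef: lon >= 0 ? "E" : "W",
    ]
    return [kCGImagePropertyGPSDictionary: gps]
}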

ALAsset of an image taken from Camera without saving it

Hi, I was wondering if there's a way to extract the ALAsset of an image taken from the camera, but without saving it...
I've come across various examples that used writeImageToSavedPhotosAlbum and then fetched the ALAsset, but I don't deem it necessary to save the image in the camera roll; I was just wondering if this could be done another way.
No ALAsset exists until the image has been successfully saved to the image library. Until then you just have a UIImage that came from the picker (see the sketch below). There is no mandate that the image be saved into the library; any decision about whether to save it should be based on what the app tells the user, and on whether the user would naturally expect to find the image in the library after taking or saving it in the app.
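For illustration, the in-memory path looks like this (a sketch using the modern UIImagePickerController delegate API; the class name is illustrative):
import UIKit

class PickerHandler: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // The picked image is just an in-memory UIImage; no ALAsset (or
        // PHAsset) exists unless you explicitly save it to the library.
        let image = info[.originalImage] as? UIImage
        print("Picked image size:", image?.size ?? .zero)
        picker.dismiss(animated: true)
    }
}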
