How to obtain photo data/metadata after being picked in PHPickerViewController? - ios

In the app I work on, I am tasked with a requirement to present the user with their device photos using a PHPickerViewController.
Once selected, I need access to:
The original image data
The file name of the image
The creation date of the image
This would be simple to do with a PHAsset if I had full access to the user's photos - I could use the PHPickerViewController result, which includes photo identifiers, to query PhotoKit for the PHAssets. However, there is no guarantee that I have full access. If the user has granted no access, or limited access where the limited selection does not include the photos picked in the PHPickerViewController, I won't be able to query for them. (I tried this, and as expected the result of the query is nil, which makes sense.)
The UIImage that I can obtain is only a proxy for the original image data (it is missing things like EXIF data), so it is not sufficient for the use case my app has. Even assuming there is an item provider for the data, I would still need to get information such as the file name and creation date.
So, is there a way to obtain this information?

The UIImage that I can obtain is only a proxy for the original image data (it is missing things like EXIF data), so it is not sufficient for the use case my app has
Go back to the NSItemProvider and ask it for the data representation of the image. Now extract the metadata in the usual way.
import ImageIO
import UniformTypeIdentifiers

// prov is the item provider from the picker result
prov.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, err in
    guard let data = data,
          let src = CGImageSourceCreateWithData(data as CFData, nil),
          let d = CGImageSourceCopyPropertiesAtIndex(src, 0, nil) as? [AnyHashable: Any]
    else { return }
    print("metadata", d)
}
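The question also asks for the file name and creation date. As a hedged addition (not part of the original answer): NSItemProvider exposes a suggestedName, and the capture date can usually be read from the EXIF sub-dictionary of the properties obtained above. Inside the same completion handler, something like this should work:

// Sketch only: `prov` and `d` come from the code above.
let fileName = prov.suggestedName ?? "unknown"   // file-name hint from the provider
if let exif = d[kCGImagePropertyExifDictionary as String] as? [AnyHashable: Any],
   let dateTaken = exif[kCGImagePropertyExifDateTimeOriginal as String] as? String {
    print(fileName, dateTaken)   // EXIF dates look like "2021:03:14 09:26:53"
}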

Related

Save Image (SwiftUI) to Realm database

I am using SwiftUI and Realm.
I want to save an Image to the database. The image is a photo taken from the device's camera and is not stored in the gallery.
It's my understanding that I need to convert the image to NSData, but I can only find syntax that works with UIImage.
I tried
let uiImage: UIImage = image.asUIImage()
but get this error:
Value of type 'Image?' has no member 'asUIImage'
What am I doing wrong here, and how can I save an Image to the Realm local database?
EDIT: Thanks for the suggested "possible duplicate", but no: the duplicate (which I have already seen prior to making my post) is for UIImage (UIKit). I am using Image (SwiftUI).
The struct Image is only a building block for the UI in SwiftUI; it is not an object that represents the literal image, but rather something that displays an image.
The common approach is to look at how you create the Image - where you get the actual image from - and to save that source, the image itself.
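For example (a minimal sketch, not from the original answer - it assumes the photo arrives from the camera as a UIImage):

import SwiftUI
import UIKit

// Keep the UIImage (the actual image); use SwiftUI's Image only for display.
func makeDisplayAndData(from captured: UIImage) -> (view: Image, data: Data?) {
    let view = Image(uiImage: captured)                    // what the UI shows
    let data = captured.jpegData(compressionQuality: 0.9)  // what you persist
    return (view, data)
}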
Just as a side note, storing data blobs in a Realm database can be extremely slow; a more common and faster approach is to write the files to disk and store only the file names in the database.
Elaborating on this approach, you can:
create a folder to store your images under the Library path in the user domain mask
You can read about the iOS sandbox file system and where you can store things in Apple's File System Programming Guide.
For our purposes, this method will suffice:
let imagesFolderUrl = try! FileManager.default
    .url(for: .applicationSupportDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
    .appendingPathComponent("images_database")
You should check if this directory exists and create it if it doesn't - there's plenty of information about this online.
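A minimal sketch of that check, reusing the imagesFolderUrl from above:

// Create the folder on first run; calling this repeatedly is harmless.
if !FileManager.default.fileExists(atPath: imagesFolderUrl.path) {
    try? FileManager.default.createDirectory(at: imagesFolderUrl,
                                             withIntermediateDirectories: true,
                                             attributes: nil)
}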
Then, when you have the image as Data, you give it a name, create a URL that points to where it will be stored, and write it:
let imageData: Data // the image bytes you want to persist (placeholder)
let imageName = "some new name for this particular image - maybe an image id or something"
let imageUrl = imagesFolderUrl.appendingPathComponent(imageName)
// Writing is a slow operation - perform it in the background, not on the UI thread.
try? imageData.write(to: imageUrl)
Then, you store the name of the image in the Realm database.
Later, when you pull a record from the Realm database and read the image name, you can construct the URL again and load the Data from it:
let record: RealmRecord // fetched from Realm (placeholder)
let imageName = record.imageName
let url = imagesFolderUrl.appendingPathComponent(imageName)
let data = try? Data(contentsOf: url)
That's an oversimplification, but it should give you the general idea.
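For completeness, a hedged sketch of what the Realm model and write might look like (RealmRecord and its imageName property are assumptions, not from the original answer; @Persisted requires RealmSwift 10.10+):

import RealmSwift

// Only the file name goes into Realm; the image bytes live on disk.
class RealmRecord: Object {
    @Persisted var imageName: String = ""
}

func saveRecord(named imageName: String) throws {
    let realm = try Realm()
    try realm.write {
        let record = RealmRecord()
        record.imageName = imageName
        realm.add(record)
    }
}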

Custom defined file metadata keys [Swift]

I'm currently trying to save a jpeg representation of a UIImage with additional custom metadata (e.g. Thermal temperature statistics etc.). These don't fit within the apple predefined keys (https://developer.apple.com/documentation/imageio/cgimageproperties), so solutions I've found don't apply to my scenario.
I've tried saving the metadata with the image as a dictionary of keys and values, but the image is saved without the additional metadata.
func saveImage(imageToSave: UIImage, metadata: NSMutableDictionary) {
    if let data: Data = imageToSave.jpegData(compressionQuality: ThermalImageView.JPEG_COMPRESSION) {
        let fileName = self.buildFileName()
        let source = CGImageSourceCreateWithData(data as CFData, nil)!
        let uniformTypeIdentifier = CGImageSourceGetType(source)!
        let destination = CGImageDestinationCreateWithURL(fileName as CFURL, uniformTypeIdentifier, 1, nil)!
        CGImageDestinationAddImageFromSource(destination, source, 0, metadata)
        CGImageDestinationFinalize(destination)
    }
}
When I try to read these values back with ExifTool (exiftool -j filename.jpg), the metadata is nowhere to be found. I expected this to happen as Apple seems to restrict what keys you can add to your metadata. So, is there a way to do this or should I go another route?
Thanks!
Edit: I think I may be barking up the wrong tree here. It seems like what I actually want to do is modify the header with additional metadata.
The Photos framework is the way to go when dealing with asset metadata; PHAsset is the object you'd want to tinker with. https://developer.apple.com/documentation/photokit/phasset
One related question is: https://forums.developer.apple.com/thread/60664
If you still want to have it in your current way, perhaps try this? https://github.com/Nikita2k/SimpleExif
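If you do stay with ImageIO, note that CGImageDestinationAddImageFromSource only honours dictionaries keyed by the standard ImageIO property keys; arbitrary top-level keys are silently dropped. One hedged workaround (a sketch only, not an endorsed approach) is to tuck your custom values into a standard field such as the EXIF user comment - the keys below are real ImageIO constants, but encoding the stats as JSON in a comment is only a convention of this example:

import Foundation
import ImageIO

// Hypothetical thermal statistics encoded as JSON text.
let customStats = ["minTemp": 21.4, "maxTemp": 87.9]
let json = try! JSONSerialization.data(withJSONObject: customStats)
let comment = String(data: json, encoding: .utf8)!

// Nest the value under the EXIF dictionary so ImageIO actually writes it.
let properties: [String: Any] = [
    kCGImagePropertyExifDictionary as String: [
        kCGImagePropertyExifUserComment as String: comment
    ]
]
let metadata = NSMutableDictionary(dictionary: properties)
// Pass `metadata` to the existing saveImage(imageToSave:metadata:);
// ExifTool should then report the values under the UserComment tag.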

Crop Captured RAW Photo and save to file iOS [duplicate]

I want to build an iOS 10 app that lets you shoot a RAW (.dng) image, edit it, and then saved the edited .dng file to the camera roll. By combining code from Apple's 2016 "AVCamManual" and "RawExpose" sample apps, I've gotten to the point where I have a CIFilter containing the RAW image along with the edits.
However, I can't figure out how to save the resulting CIImage to the camera roll as a .dng file. Is this possible?
A RAW file is "raw" output direct from a camera sensor, so the only way to get it is directly from a camera. Once you've processed a RAW file, what you have is by definition no longer "raw", so you can't go back to RAW.
To extend the metaphor presented at WWDC where they introduced RAW photography... a RAW file is like the ingredients for a cake. When you use Core Image to create a viewable image from the RAW file, you're baking the cake. (And as noted, there are many different ways to bake a cake from the same ingredients, corresponding to the possible options for processing RAW.) But you can't un-bake a cake — there's no going back to original ingredients, much less a way that somehow preserves the result of your processing.
Thus, the only way to store an image processed from a RAW original is to save the processed image in a bitmap image format. (Use JPEG if you don't mind lossy compression, PNG or TIFF if you need lossless, etc.)
If you're writing the results of an edit to PHPhotoLibrary, use JPEG (high quality / less compressed if you prefer), and Photos will store your edit as a derived result, allowing the user to revert to the RAW original. You can also describe the set of filters you applied in PHAdjustmentData saved with your edit — with adjustment data, another instance of your app (or Photos app extension) can reconstruct the edit using the original RAW data plus the filter settings you save, then allow a user to alter the filter parameters to create a different processed image.
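A rough sketch of that workflow (the asset, rendered JPEG data, formatIdentifier, and filter-settings payload are all placeholders, not anything prescribed by Photos):

import Photos

// Commit a processed JPEG as an edit to the original (RAW) asset, recording the
// filter settings in PHAdjustmentData so the edit can be reconstructed later.
func commitEdit(to asset: PHAsset, renderedJPEG jpegData: Data) {
    asset.requestContentEditingInput(with: nil) { input, _ in
        guard let input = input else { return }
        let output = PHContentEditingOutput(contentEditingInput: input)

        let settings = try! JSONSerialization.data(withJSONObject: ["exposure": 0.3])
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.rawedit",
                                                 formatVersion: "1.0",
                                                 data: settings)
        try? jpegData.write(to: output.renderedContentURL)

        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest(for: asset).contentEditingOutput = output
        }, completionHandler: { _, error in
            if let error = error { print(error) }
        })
    }
}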
Note: There is a version of the DNG format called Linear DNG that supports non-RAW (or "not quite RAW") images, but it's rather rare in practice, and Apple's imaging stack doesn't support it.
Unfortunately DNG isn't supported as an output format in Apple's ImageIO framework. See the output of CGImageDestinationCopyTypeIdentifiers() for a list of supported output types:
(
"public.jpeg",
"public.png",
"com.compuserve.gif",
"public.tiff",
"public.jpeg-2000",
"com.microsoft.ico",
"com.microsoft.bmp",
"com.apple.icns",
"com.adobe.photoshop-image",
"com.adobe.pdf",
"com.truevision.tga-image",
"com.sgi.sgi-image",
"com.ilm.openexr-image",
"public.pbm",
"public.pvr",
"org.khronos.astc",
"org.khronos.ktx",
"com.microsoft.dds",
"com.apple.rjpeg"
)
This answer comes late, but it may help others with the problem. This is how I saved a raw photo to the camera roll as a .dng file.
Just to note, I captured the photo using the camera with AVFoundation.
import Photos
import AVFoundation

// Read the photo data in as a Data object.
let photoData = AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer)

// Put it into a temporary file.
let temporaryDNGFileURL = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("temp.dng")!
try! photoData?.write(to: temporaryDNGFileURL)

// Get access to the photo library.
PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        // Perform changes to the library.
        PHPhotoLibrary.shared().performChanges({
            let options = PHAssetResourceCreationOptions()
            options.shouldMoveFile = true
            // Write the RAW file:
            PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: temporaryDNGFileURL, options: options)
        }, completionHandler: { success, error in
            if let error = error { print(error) }
        })
    } else {
        print("can't access photo album")
    }
}
Hope it helps.
The only way to get DNG data as of the writing of this response (iOS 10.1) is:
AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: CMSampleBuffer, previewPhotoSampleBuffer: CMSampleBuffer?)
Note that the OP refers to Core Image. As mentioned by rickster, CI works on processed image data and therefore only offers processed-image outputs (JPEG, TIFF):
CIContext.writeJPEGRepresentation(of:to:colorSpace:options:)
CIContext.writeTIFFRepresentation(of:to:format:colorSpace:options:)
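A brief usage sketch of the JPEG variant (the CIImage, output URL, and color-space choice are assumptions of this example):

import CoreImage

// Write the processed (non-RAW) result of a filter chain to disk as JPEG.
func saveProcessed(_ filteredImage: CIImage, to outputURL: URL) throws {
    let context = CIContext()
    let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
    try context.writeJPEGRepresentation(of: filteredImage,
                                        to: outputURL,
                                        colorSpace: colorSpace,
                                        options: [:])
}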

Save image to core data while selecting image from photo album

I have used Eureka to create a form and have installed the ImageRow plugin. I have added an image row, but I am struggling to save the image to Core Data.
The attribute in Core Data is set to Binary Data.
Based on how my user form works, I need to declare my image variable globally.
I think my problem is converting between NSData and UIImage. The tutorials I've found handle this through UIImagePNGRepresentation, but they know the image file name; I don't, since I select the file from Photos.
Global variable = var otsPhoto: NSData? = nil
Form field
<<< ImageRow() {
    $0.title = "Attachment"
    $0.sourceTypes = [.PhotoLibrary, .SavedPhotosAlbum, .Camera] //1
    $0.value = otsPhoto //2
    $0.clearAction = .yes(style: .destructive) //3
    $0.onChange { [unowned self] row in //4
        self.otsPhoto = row.value
    }
}
Returns 2 errors
1) cannot assign value type NSData? to type UIImage?
2) Cannot convert value of type (_) -> to expected argument
It is not advisable to store images in the SQLite file. The best approach would be to:
Copy the image picked with UIImagePickerController (or downloaded from your server) into your app's sandbox, for example the Documents directory - the app bundle itself is read-only, so the sandbox is where these files belong.
Get the path of the stored image (you will already know this, since you chose where in the sandbox to write it).
Store this path as a String in your table.
This way, even if the photo is deleted from Photos, there is still a copy of it in your app.
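A minimal sketch of that flow, assuming the picked image arrives as a UIImage and your Core Data entity has a String attribute such as imageFileName (both names are placeholders):

import UIKit

// Saves a picked image into Documents and returns the file name to store in Core Data.
func persist(_ pickedImage: UIImage) -> String? {
    let fileName = UUID().uuidString + ".png"
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent(fileName)
    guard let data = pickedImage.pngData(), (try? data.write(to: fileURL)) != nil else { return nil }
    return fileName   // store this String in the entity's imageFileName attribute
}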
Some resources
https://developer.apple.com/documentation/uikit/uiimagepickercontroller
https://developer.apple.com/documentation/foundation/nsfilemanager

How to pass Core Data objectID and use it with Continuity

Just trying to update some Core Data apps with Continuity, and I have run into a bit of an issue using the selected object's ID in the userInfo dictionary to display the correct data on the continuing device.
My first thought was to use the ObjectID, however on the receiving device this would never find a corresponding object in the Core Data store.
As it turns out, the URL representation of the objectID contains the UUID of the store itself, and because the two stores' UUIDs are different this is obviously going to fail.
So I guess I could replace the Core Data store's UUID in the URL with the continuing device's store UUID and use that, and no doubt it would work.
The URL appears to be of the format x-coredata://<store UUID>/<entity name>/<object ID>.
Does anyone know what the correct way would be to pass a reference to an object between two devices with core data stores that are synchronised via iCloud?
I'll answer this one myself and see if there are any better answers...
I pass the URL of the objectID (from objectID.URIRepresentation) using the Continuity API, and on the receiving device create a new URL using the following (url is the URL passed in the NSUserActivity.userInfo dictionary):
let storeUUID = self.identifierForStore()
// Switch the host component to be the local storeUUID
let newURL = NSURL(scheme: url.scheme!, host: storeUUID, path: url.path!)
func identifierForStore() -> NSString? {
    if let store = self.persistentStoreCoordinator?.persistentStores[0] as? NSPersistentStore {
        return store.identifier
    } else {
        return nil
    }
}
This seems to work just fine - hope it helps someone
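To complete the round trip on the receiving device, a sketch (newURL is the rebuilt URL from above; context stands for your NSManagedObjectContext):

// Turn the rebuilt URI back into a managed object on the receiving device.
if let url = newURL as URL?,
   let coordinator = self.persistentStoreCoordinator,
   let objectID = coordinator.managedObjectID(forURIRepresentation: url),
   let managedObject = try? context.existingObject(with: objectID) {
    print(managedObject)   // navigate to / display this object for the continued activity
}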
