Saving DNG (raw photos) taken on another camera to iOS Photo Library - ios

I'm writing an app that reads DNG files from my Ricoh GR II and saves them to iOS 10's Photo Library, with code like this:
let photoLibrary = PHPhotoLibrary.shared()
photoLibrary.performChanges({
    PHAssetChangeRequest.creationRequestForAsset(from: image)
}) { (success: Bool, error: Error?) -> Void in
    if success {
        print("Saving photo ok")
    } else {
        print("Error writing to photo library: \(error!.localizedDescription)")
    }
}
And I got the error below:
ImageIO: PluginForUTType:316: file format 'com.adobe.raw-image' doesn't support writing
Error writing to photo library: The operation couldn’t be completed. (Cocoa error -1.)
I guess maybe iOS only supports DNGs taken on an iPhone?

PHAssetChangeRequest.creationRequestForAsset(from: image) starts from a UIImage, so it wouldn't get the original file into your library anyway. A UIImage is the displayable result of reading and decoding an image file; by the time you have a UIImage it doesn't know whether it came from a JPEG or a DNG or a GIF or was rendered at run time via CGBitmapContext or whatever. When you try to save via creationRequestForAssetFromImage:, you're taking that end result and turning it back into a file — whatever kind of file that method wants. (Probably JPEG.)
If you want to put an actual DNG file into the photo library, you'll need to use a Photos framework method that takes original files, not decoded images. Furthermore, since not every Photos client can deal with RAW DNGs, Photos requires that every DNG file you put in the library be accompanied by a JPEG representation that non-RAW-supporting apps (sadly, including the Photos app itself) can see.
Luckily, there's an API for that.
PHPhotoLibrary.shared().performChanges({
    let creationRequest = PHAssetCreationRequest.forAsset()
    let creationOptions = PHAssetResourceCreationOptions()
    creationOptions.shouldMoveFile = true
    creationRequest.addResource(with: .photo, data: jpegData, options: nil)
    creationRequest.addResource(with: .alternatePhoto, fileURL: dngFileURL, options: creationOptions)
}, completionHandler: completionHandler)
PHAssetCreationRequest is for creating assets from the underlying resources: one or more image files, video files, some combination thereof (for Live Photos), etc. The photo and alternatePhoto resources are how you provide a DNG file and its accompanying JPEG preview. And the shouldMoveFile option is useful if you don't want to blow up your device storage by copying the file from your app sandbox into the Photos library storage, which matters for big resources like DNGs and 4K video.
(The code snippet is from Apple's Photo Capture guide.)
That said, while Apple's RAW processing supports images from all sorts of third-party cameras, its supported-camera list doesn't appear to include any Ricoh models.
That doesn't prevent you from storing Ricoh DNGs in the Photos library, though — it just means the only apps that will be able to usefully read them from the library will need their own Ricoh RAW processing support to see anything but the preview JPEG.
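As for producing the JPEG companion when iOS can't decode the RAW data: most DNGs carry an embedded JPEG preview, and ImageIO can often extract that even when it can't process the RAW image itself. A hedged sketch (the helper name is hypothetical, and whether your Ricoh files carry a usable preview is an assumption):
import Foundation
import ImageIO
import MobileCoreServices

// Hypothetical helper: pull the embedded JPEG preview out of a DNG without
// decoding the RAW data. `dngFileURL` is the file copied from the camera.
func embeddedPreviewJPEGData(for dngFileURL: URL) -> Data? {
    guard let source = CGImageSourceCreateWithURL(dngFileURL as CFURL, nil) else { return nil }
    let options: [CFString: Any] = [kCGImageSourceThumbnailMaxPixelSize: 2048]
    guard let preview = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else { return nil }
    let jpegData = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(jpegData, kUTTypeJPEG, 1, nil) else { return nil }
    CGImageDestinationAddImage(dest, preview, nil)
    return CGImageDestinationFinalize(dest) ? (jpegData as Data) : nil
}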

Related

How to get a thumbnail from MFi Card reader accessories?

I'm using an ExternalAccessory wrapper-based SDK to read the image source file, with pseudo-code like the following.
// Pseudo-code: read the file from the accessory in chunks until done.
while remainingToRead > 0 {
    let read = buffer.withUnsafeMutableBytes { bufferPointer in
        self.sessionController.readFile(handle, data: bufferPointer, len: UInt32(min(remainingToRead, UInt64(bufferSize))))
    }
    if read == 0 {
        completion(fileURL!, FileError.readFile)
        return // Read error
    }
    remainingToRead -= UInt64(read)
    fileHandle?.write(buffer)
}
What I want to achieve is to read the image thumbnail's stream directly from the external device, rather than the stream of the original image file. Does anyone have a good solution for this?
I tried reading the whole raw image file and then compressing it, but that hurts how quickly the image can be displayed.
Also, do I need to drop down to C to process the file stream in order to implement this feature?
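One hedged idea rather than a confirmed answer: EXIF/JPEG thumbnails normally sit near the start of a file, so it may be enough to read only the first chunk from the accessory and hand it to ImageIO, staying in Swift rather than dropping to C. Everything below (the 128 KB suggestion, the helper name) is an assumption:
import UIKit
import ImageIO

// Assumption: the accessory SDK can read just the first chunk of the file
// (say 128 KB), which usually contains the EXIF preview. `prefixData` is
// that partial read; this helper and its names are hypothetical.
func thumbnail(fromFilePrefix prefixData: Data, maxPixelSize: Int = 256) -> UIImage? {
    guard let source = CGImageSourceCreateWithData(prefixData as CFData, nil) else { return nil }
    let options: [CFString: Any] = [kCGImageSourceThumbnailMaxPixelSize: maxPixelSize]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else {
        return nil // No embedded thumbnail in the bytes we read
    }
    return UIImage(cgImage: cgImage)
}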

What is the fastest way to convert an imageURL from Firebase into a UIImage?

In my iOS app I need to take an imageURL string and convert it into a UIImage.
I wrote the below function to handle this:
func getImage(urlString: String) -> UIImage {
    let url = URL(string: urlString)!
    do {
        let data = try Data(contentsOf: url)
        let image = UIImage(data: data)!
        return image
    } catch {
        print(error, " This was the error in p2")
    }
    return UIImage(named: "media")!
}
The issue is that this takes too long. I believe it's a solid second or longer for this to complete.
I need that time to be significantly shorter.
Question: Is there a faster way to get the UIImage based on an imageURL from Firebase? (maybe a cocoa-pod? or better way to write the code?)
Additional questions:
Would this be any faster if the image in Firebase were of lower quality?
Would it be a viable solution to lower the quality of the image right before being passed into this function?
A lot of the prominent iOS apps (and web and mobile apps in general) that do a lot of image downloading take advantage of progressive JPEG. That way your user sees at least something while the image loads, and over time the image gets progressively better. As a lot of commenters have mentioned, you're not in control of Firebase the way you would be with your own backend server delivering the pictures, where you could make performance optimizations. Therefore one of the best things you can do is implement progressive JPEG in your app.
The first link below is a library that lets you use progressive JPEG in your iOS app. The second link is a detailed look at Facebook's approach to loading images faster.
https://www.airpair.com/ios/posts/loading-images-ios-faster-with-progressive-jpegs
https://code.fb.com/ios/faster-photos-in-facebook-for-ios/
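Separately, the getImage(urlString:) shown in the question blocks its thread with Data(contentsOf:). A minimal sketch of an asynchronous, cached variant using URLSession (the cache and the completion-handler shape are illustrative assumptions, not from the question):
import UIKit

// Illustrative cache, keyed by URL string so repeat loads skip the network.
let imageCache = NSCache<NSString, UIImage>()

func getImage(urlString: String, completion: @escaping (UIImage?) -> Void) {
    if let cached = imageCache.object(forKey: urlString as NSString) {
        completion(cached)
        return
    }
    guard let url = URL(string: urlString) else {
        completion(nil)
        return
    }
    // Download off the main thread; URLSession calls back on a background queue.
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let image = data.flatMap { UIImage(data: $0) }
        if let image = image {
            imageCache.setObject(image, forKey: urlString as NSString)
        }
        DispatchQueue.main.async { completion(image) }
    }.resume()
}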

Storage downloading costs too much for Firestore app in Swift 4

So my app works kind of like Instagram: a user can upload a photo, and whenever someone opens the app it downloads every picture that was uploaded to Firebase.
I understand that I may need to buy space or change my plan, but I haven't done that much, and one user burned through 1.7 GB in about an hour. Each photo costs around 17 MB to upload and download.
I'm not sure what I can do to reduce my download usage here.
The way I download from Firebase Storage is like this:
// Create a reference to the file you want to download
let islandRef = storageRef.child("images/island.jpg")

// Download in memory with a maximum allowed size of roughly 100 MB (1 * 10240 * 10240 bytes)
islandRef.getData(maxSize: 1 * 10240 * 10240) { data, error in
    if let error = error {
        // Uh-oh, an error occurred!
    } else {
        // Data for "images/island.jpg" is returned
        let image = UIImage(data: data!)
    }
}
And each time it loads a photo into a collection view controller, which means around 17 MB for each photo, which is a lot. Any suggestions? Thanks
So this is where you want to make a decision about the quality level of the photos you upload to Firebase. I can assure you that Instagram and any other social media platform only store versions of your pictures that are compressed and optimized for size.
You can easily compress your image by doing something like this:
let data = imageToUpload.jpegData(compressionQuality: 0.3)
You would then upload that new compressed version of the image to Firebase and dramatically improve your storage efficiency.
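Beyond the quality parameter, downscaling the pixel dimensions before upload usually saves even more, since a 12 MP photo stays large at any JPEG quality. A sketch along those lines (the 1080-point cap is an arbitrary example, not from the answer):
import UIKit

// Sketch: cap the longest side before compressing. The 1080-point maximum is
// an arbitrary example value, not something the answer above specifies.
func uploadData(for image: UIImage, maxDimension: CGFloat = 1080) -> Data? {
    let scale = min(1, maxDimension / max(image.size.width, image.size.height))
    let targetSize = CGSize(width: image.size.width * scale,
                            height: image.size.height * scale)
    let resized = UIGraphicsImageRenderer(size: targetSize).image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
    return resized.jpegData(compressionQuality: 0.3)
}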

Crop Captured RAW Photo and save to file iOS [duplicate]

I want to build an iOS 10 app that lets you shoot a RAW (.dng) image, edit it, and then save the edited .dng file to the camera roll. By combining code from Apple's 2016 "AVCamManual" and "RawExpose" sample apps, I've gotten to the point where I have a CIFilter containing the RAW image along with the edits.
However, I can't figure out how to save the resulting CIImage to the camera roll as a .dng file. Is this possible?
A RAW file is "raw" output direct from a camera sensor, so the only way to get it is directly from a camera. Once you've processed a RAW file, what you have is by definition no longer "raw", so you can't go back to RAW.
To extend the metaphor presented at WWDC where they introduced RAW photography... a RAW file is like the ingredients for a cake. When you use Core Image to create a viewable image from the RAW file, you're baking the cake. (And as noted, there are many different ways to bake a cake from the same ingredients, corresponding to the possible options for processing RAW.) But you can't un-bake a cake — there's no going back to original ingredients, much less a way that somehow preserves the result of your processing.
Thus, the only way to store an image processed from a RAW original is to save the processed image in a bitmap image format. (Use JPEG if you don't mind lossy compression, PNG or TIFF if you need lossless, etc.)
If you're writing the results of an edit to PHPhotoLibrary, use JPEG (high quality / less compressed if you prefer), and Photos will store your edit as a derived result, allowing the user to revert to the RAW original. You can also describe the set of filters you applied in PHAdjustmentData saved with your edit — with adjustment data, another instance of your app (or Photos app extension) can reconstruct the edit using the original RAW data plus the filter settings you save, then allow a user to alter the filter parameters to create a different processed image.
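To make that flow concrete, here's a rough sketch of saving an edit with adjustment data; the format identifier, filterSettingsData, and renderedJPEGData are placeholders you'd supply, not names from the Photos API:
import Photos

// Rough sketch of the edit-saving flow described above; placeholder names throughout.
func saveEdit(to asset: PHAsset, filterSettingsData: Data, renderedJPEGData: Data) {
    asset.requestContentEditingInput(with: nil) { input, _ in
        guard let input = input else { return }
        let output = PHContentEditingOutput(contentEditingInput: input)
        // Record how the edit was made so it can later be reconstructed or reverted.
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.raw-edit",
                                                 formatVersion: "1.0",
                                                 data: filterSettingsData)
        do {
            // Photos reads the processed (JPEG) image from renderedContentURL.
            try renderedJPEGData.write(to: output.renderedContentURL)
        } catch { return }
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest(for: asset).contentEditingOutput = output
        }, completionHandler: nil)
    }
}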
Note: There is a version of the DNG format called Linear DNG that supports non-RAW (or "not quite RAW") images, but it's rather rare in practice, and Apple's imaging stack doesn't support it.
Unfortunately DNG isn't supported as an output format in Apple's ImageIO framework. See the output of CGImageDestinationCopyTypeIdentifiers() for a list of supported output types:
(
"public.jpeg",
"public.png",
"com.compuserve.gif",
"public.tiff",
"public.jpeg-2000",
"com.microsoft.ico",
"com.microsoft.bmp",
"com.apple.icns",
"com.adobe.photoshop-image",
"com.adobe.pdf",
"com.truevision.tga-image",
"com.sgi.sgi-image",
"com.ilm.openexr-image",
"public.pbm",
"public.pvr",
"org.khronos.astc",
"org.khronos.ktx",
"com.microsoft.dds",
"com.apple.rjpeg"
)
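If you want to check the list on whatever OS version you're running, a quick sketch:
import Foundation
import ImageIO

// Print the UTIs that CGImageDestination can write on the current OS version.
let types = CGImageDestinationCopyTypeIdentifiers() as NSArray
for case let type as String in types {
    print(type)
}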
This answer comes late, but it may help others with the problem. This is how I saved a raw photo to the camera roll as a .dng file.
Just to note, I captured the photo using the camera with AVFoundation.
import Photos
import AVFoundation

// Read the photo data in as a Data object
let photoData = AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer)

// Put it into a temporary file
let temporaryDNGFileURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("temp.dng")
try! photoData?.write(to: temporaryDNGFileURL)

// Get access to the photo library
PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        // Perform changes to the library
        PHPhotoLibrary.shared().performChanges({
            let options = PHAssetResourceCreationOptions()
            options.shouldMoveFile = true
            // Write the RAW file:
            PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: temporaryDNGFileURL, options: options)
        }, completionHandler: { success, error in
            if let error = error { print(error) }
        })
    } else {
        print("can't access photo library")
    }
}
Hope it helps.
The only way to get DNG data as of the writing of this response (iOS 10.1) is:
AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: CMSampleBuffer, previewPhotoSampleBuffer: CMSampleBuffer?)
Note that the OP refers to Core Image. As rickster mentioned, CI works on processed image data, and therefore only offers processed image results (JPEG, TIFF):
CIContext.writeJPEGRepresentation(of:to:colorSpace:options:)
CIContext.writeTIFFRepresentation(of:to:format:colorSpace:options:)
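For example, a minimal sketch of the JPEG path; filteredImage and outputURL stand in for your own values:
import CoreImage

// Minimal sketch of the JPEG path. `filteredImage` holds the RAW edit and
// `outputURL` is a writable file URL; both are assumed inputs.
func writeProcessedJPEG(_ filteredImage: CIImage, to outputURL: URL) throws {
    let context = CIContext()
    let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
    try context.writeJPEGRepresentation(of: filteredImage, to: outputURL,
                                        colorSpace: colorSpace, options: [:])
}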

iOS saving photo to Camera Roll does not preserve EXIF/GPS metadata

I know a UIImage can be saved into Camera Roll with UIImageWriteToSavedPhotosAlbum, but this approach strips all metadata from the original file (EXIF, GPS data, etc). Is there any way to save the original file, rather than just the image data into the iOS device's Camera Roll?
Edit: I guess I should have been a bit more specific. The aim is to save a duplicate of an existing JPEG file into a user's Camera Roll. What's the most efficient way to do this?
Depending on how you have your image to save, you can choose one of the methods provided by ALAssetsLibrary.
– writeImageDataToSavedPhotosAlbum:metadata:completionBlock:
– writeImageToSavedPhotosAlbum:metadata:completionBlock:
(Depending on whether you have the image as an actual UIImage, or as NSData)
http://developer.apple.com/library/ios/#documentation/AssetsLibrary/Reference/ALAssetsLibrary_Class/Reference/Reference.html
Note that you have to set the correct keys in the metadata dictionary, or the values might not be saved correctly.
Here is an example for the GPS information:
NSDictionary *gpsDict = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithFloat:fabs(loc.coordinate.latitude)], kCGImagePropertyGPSLatitude,
    ((loc.coordinate.latitude >= 0) ? @"N" : @"S"), kCGImagePropertyGPSLatitudeRef,
    [NSNumber numberWithFloat:fabs(loc.coordinate.longitude)], kCGImagePropertyGPSLongitude,
    ((loc.coordinate.longitude >= 0) ? @"E" : @"W"), kCGImagePropertyGPSLongitudeRef,
    [formatter stringFromDate:[loc timestamp]], kCGImagePropertyGPSTimeStamp,
    [NSNumber numberWithFloat:fabs(loc.altitude)], kCGImagePropertyGPSAltitude,
    nil];
And here is a list of the keys:
http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Reference/CGImageProperties_Reference/Reference/reference.html#//apple_ref/doc/uid/TP40005103
UIImagePickerControllerDelegate is what you're looking for.
Starting in iOS 4.0, you can save still-image metadata, along with a still image, to the Camera Roll. To do this, use the writeImageToSavedPhotosAlbum:metadata:completionBlock: method of the Assets Library framework. See the description for the UIImagePickerControllerMediaMetadata key.
UIImagePickerControllerDelegate Protocol Reference
For a Swift solution that uses the Photos API (ALAssetLibrary is deprecated in iOS 9), you can see my solution to this problem here, including sample code.
With the Photos API, the key thing to note is that the .location property of a PHAsset does NOT embed the CLLocation metadata into the file itself, so using an EXIF viewer will not turn up any results.
To get around this, you must embed any metadata changes directly into the Data of the image itself before writing it to the camera roll using the Photos API (or, for iOS versions prior to 9, you must write a temporary file using the Data with the embedded metadata and create the PHAsset from the file's URL).
Also note that the act of converting image Data to a UIImage appears to strip metadata, so be careful with that.
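To make the embed-before-writing step concrete, here's a hedged ImageIO sketch; the gpsMetadata dictionary is assumed to use the kCGImagePropertyGPS* keys shown earlier, and all names are illustrative:
import Foundation
import ImageIO

// Sketch of embedding a GPS dictionary into image data before asset creation.
// `gpsMetadata` should use the kCGImagePropertyGPS* keys; names are illustrative.
func imageData(byEmbedding gpsMetadata: [CFString: Any], into originalData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(originalData as CFData, nil),
          let uti = CGImageSourceGetType(source) else { return nil }
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output, uti, 1, nil) else { return nil }
    let properties = [kCGImagePropertyGPSDictionary: gpsMetadata] as CFDictionary
    // Re-encode the image with the merged properties.
    CGImageDestinationAddImageFromSource(destination, source, 0, properties)
    return CGImageDestinationFinalize(destination) ? (output as Data) : nil
}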
