iOS: How to extract thumbnail and metadata from photo file on disk

The Apple doc at https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/capturing_still_and_live_photos/capturing_thumbnail_and_preview_images explains how to capture thumbnails.
At the bottom, it says
If you requested an embedded thumbnail image, that image isn't directly accessible from the AVCapturePhoto object—it's embedded in the image file data that you get by calling the photo object's fileDataRepresentation() method.
It seems impossible to separate the embedded thumbnail from the main photo, so what is the meaning of "embedded thumbnail"?
I want to save the AVCapturePhoto in JPG and raw DNG (requested embedded thumbnails for both) to App's Documents directory (I do not use PhotoKit) and then load it back to a UIImageView.
I save a photo like this:
if let data = capturePhoto.fileDataRepresentation() {
    try? data.write(to: documentsPath) // write(to:) throws, so it needs try
}
And load it back to a UIImage like this:
if let data = try? Data(contentsOf: path) {
    let image = UIImage(data: data)
}
But it would be better to load the embedded thumbnail first, and only load the full image file if the user taps to see the large version.
I also want to show the metadata, e.g., GPS location, flash status, ISO, shutter speed etc. I wonder how to do that.

AVFoundation's AVAsset wraps some metadata types, but apparently not EXIF data. If you want the thumbnail, you have to use the Image I/O framework (CGImageSource). This function fetches the embedded thumbnail if present, limiting the maximum side length to 512 pixels.
// Requires `import ImageIO`
public func getImageThumbnail(url: URL) -> CGImage? {
    guard let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    let thumbnailOptions: [String: Any] = [
        kCGImageSourceCreateThumbnailWithTransform as String: true,
        kCGImageSourceCreateThumbnailFromImageIfAbsent as String: false, // true creates one if no thumbnail is present
        kCGImageSourceThumbnailMaxPixelSize as String: 512
    ]
    return CGImageSourceCreateThumbnailAtIndex(imageSource, 0, thumbnailOptions as CFDictionary)
}
For all the rest of the metadata, you can use CGImageSourceCopyPropertiesAtIndex or CGImageSourceCopyMetadataAtIndex.
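As a minimal sketch of the properties route (assuming the file at url actually contains EXIF and GPS dictionaries; the function name is mine):
import ImageIO

// Sketch: read the properties dictionary and pull out a few of the
// EXIF/GPS fields the question asks about (ISO, shutter speed, flash, GPS).
func printImageMetadata(url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any]
    else { return }

    if let exif = properties[kCGImagePropertyExifDictionary as String] as? [String: Any] {
        print("ISO:", exif[kCGImagePropertyExifISOSpeedRatings as String] ?? "n/a")
        print("Shutter speed:", exif[kCGImagePropertyExifExposureTime as String] ?? "n/a")
        print("Flash:", exif[kCGImagePropertyExifFlash as String] ?? "n/a")
    }
    if let gps = properties[kCGImagePropertyGPSDictionary as String] as? [String: Any] {
        print("Latitude:", gps[kCGImagePropertyGPSLatitude as String] ?? "n/a")
        print("Longitude:", gps[kCGImagePropertyGPSLongitude as String] ?? "n/a")
    }
}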

Related

Save and retrieve photos with all their metadata locally on iOS using Swift

I am currently taking photos from the user's iPhone, converting them to JPEGs, saving them with FileManager, and then storing their paths with Core Data.
The way I do it, I can't show Live Photos or later export an image with its original metadata.
My question is: what is the correct way to save an image locally with all its metadata and Live Photo status, so it can be displayed and exported later, but only inside the app? I don't want the images to be viewable outside the application.
Taking The image:
Take a look at the UIImagePickerController class and use it to take pictures. Taking pictures with it will NOT automatically save the images to the iOS gallery.
The UIImagePickerController notifies its UIImagePickerControllerDelegate by calling func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]).
From the second parameter, the info dictionary, you can request most of the image's metadata, the Live Photo, and so on. The necessary keys are defined in the nested type UIImagePickerController.InfoKey.
For example, here is how I retrieve the image's coordinates if they are available, using the .phAsset key to get the backing PHAsset:
if let asset = info[.phAsset] as? PHAsset, let location = asset.location {
    let lat = location.coordinate.latitude
    let lon = location.coordinate.longitude
    print("Here's the lat and lon \(lat) + \(lon)")
}
Saving the image onto the phone within your app:
I use JSONEncoder. You can use this class to encode and later decode your custom image class.
Your custom image class, which holds the UIImage and all the other properties, must implement the Codable protocol. Since UIImage itself is not Codable, store the image as Data (for example, its JPEG representation) inside the class.
Afterwards, use JSONEncoder's encode() method to turn the object into Data and save it to an app-private location with the help of FileManager.
Later on, you can read all files from that location with FileManager, decode() them with JSONDecoder, and voilà, you have all your images again in the form of your custom image class.
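As a rough sketch of that approach (SavedImage is a hypothetical name; the image travels as JPEG Data because UIImage is not Codable):
import UIKit

// Hypothetical Codable wrapper: the UIImage is stored as JPEG data,
// and extra metadata travels alongside it as plain Codable properties.
struct SavedImage: Codable {
    let imageData: Data
    let latitude: Double?
    let longitude: Double?

    var image: UIImage? { UIImage(data: imageData) }
}

// Encode the wrapper to Data and write it to an app-private location.
func save(_ savedImage: SavedImage, fileName: String) throws {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent(fileName)
    try JSONEncoder().encode(savedImage).write(to: url)
}

// Read it back later and rebuild the image.
func load(fileName: String) throws -> SavedImage {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent(fileName)
    return try JSONDecoder().decode(SavedImage.self, from: Data(contentsOf: url))
}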
Saving the image to the users gallery "Exporting":
Here's an example of how I again save that image to the users gallery as a way of exporting it:
private static func saveImageToGallery(picture: UIImage, lat: Double?, lon: Double?) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.creationRequestForAsset(from: picture)
        if let lat = lat, let lon = lon {
            request.location = CLLocation(latitude: lat, longitude: lon)
        }
    }, completionHandler: { _, error in
        // Handle the error here, e.g. when the user has denied photo library access
    })
}
I hope this will be enough to guide you to the correct way.

Excessive memory usage when uploading assets (images, videos) to Firebase in Swift?

Suppose I have an array of UIImage called photos, they are to be uploaded to Firebase storage. I wish to do the following things:
Upload them to Firebase storage
Get the paths of the uploaded photos and store them in an array called uploadedAssets (paths, not download URLs; a path looks like "photos/folder_name/photo_id", where "folder_name" is randomly generated and "photo_id" is an integer representing the order of the photos)
Call a Cloud Function and pass uploadedAssets to it. The server then uses the paths to find the pictures and generates a thumbnail for each one.
Finally, store the original photos' download URLs and the thumbnails' download URLs in the database.
I have something that works, but it uses too much memory (300+ MB when uploading only 4 pictures):
// Swift
let dispatchGroup = DispatchGroup()
let dispatchQueue = DispatchQueue(label: "AssetQueue")
var uploadedAssets = [String]()
let folderName: String = UUID().uuidString
dispatchQueue.async {
    for i in 0..<photos.count {
        dispatchGroup.enter()
        let photo: UIImage = photos[i]
        let fileName: String = "\(folderName)/\(i)"
        let assetRef = Storage.storage().reference().child("photos/\(fileName)")
        let metaData = StorageMetadata()
        metaData.contentType = "image/jpeg"
        if let dataToUpload = UIImageJPEGRepresentation(photo, 0.75) {
            assetRef.putData(
                dataToUpload,
                metadata: metaData,
                completion: { _, error in
                    uploadedAssets.append("photos/\(fileName)")
                    dispatchGroup.leave()
                }
            )
        }
    }
}
dispatchGroup.notify(queue: dispatchQueue) {
    Alamofire.request(
        "https://<some_url>",
        method: .post,
        parameters: [
            "uploadedAssets": uploadedAssets
        ]
    )
}
The code that generates the thumbnails runs on the server side, so in my opinion it is irrelevant here and I won't post it. The snippet above consumes 300+ MB of memory when there are 4 photos to upload. After the photos are uploaded, the memory usage stays at 300+ MB and never drops. When I try to upload more, say another 4 photos, it can even go up to 450+ MB. I know that's not normal, but I can't figure out why it happens.
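No answer was posted in this thread, but one common cause is that every JPEG Data blob (plus the encoder's temporary buffers) stays alive for the whole loop. A hedged sketch of one mitigation, wrapping each encode in its own autoreleasepool (upload(_:at:) is a hypothetical helper standing in for the putData call):
// Sketch: encode each image inside its own autoreleasepool so the
// temporary buffers created during JPEG encoding are released per photo
// instead of accumulating until the enclosing scope ends.
for (i, photo) in photos.enumerated() {
    autoreleasepool {
        guard let dataToUpload = photo.jpegData(compressionQuality: 0.75) else { return }
        upload(dataToUpload, at: i) // hypothetical upload helper
    }
}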

Converting an image from HEIC to JPEG/JPG

I have an application where user can upload multiple images and all the images will be stored in a server and will be displayed on a web view in my iOS application.
Now, everything used to work just about fine until iOS 10, but suddenly we started seeing some pictures/images not being displayed. After a little debugging, we found out that the problem is caused by Apple's new image format (HEIC).
I tried changing back to the native UIImagePicker (which picks only one image) and those images are displayed; Apple, I guess, converts the image from HEIC to JPG when a user picks it. But this is not the case when I use 3rd-party libraries, which I need in order to implement a multiple image picker.
Though we are hard at work moving the conversion process to the server side so that users who have not updated the app won't face trouble, I also want to see if there is any way to convert the image format locally in my application.
There's a workaround to convert HEIC photos to JPEG before uploading them to the server :
NSData *jpgImageData = UIImageJPEGRepresentation(image, 0.7);
If you use PHAsset, then, in order to get the image object, you'll need to call this method from PHImageManager:
- (PHImageRequestID)requestImageForAsset:(PHAsset *)asset targetSize:(CGSize)targetSize contentMode:(PHImageContentMode)contentMode options:(nullable PHImageRequestOptions *)options resultHandler:(void (^)(UIImage *__nullable result, NSDictionary *__nullable info))resultHandler;
On server side you also have the ability to use this API or this website directly
I've done it this way:
let jpegData = Utility.getJpegData(imageData: imageData!)
/**
 Convert HEIC image data to JPEG format.
 */
public static func getJpegData(imageData: Data) -> Data? {
    // UIImage can decode HEIC, and jpegData(compressionQuality:) re-encodes as JPEG
    guard let image = UIImage(data: imageData) else { return nil }
    return image.jpegData(compressionQuality: 1.0)
}
In Swift 3, given the input path of an existing HEIC picture and the output path where the JPG file should be saved:
func fromHeicToJpg(heicPath: String, jpgPath: String) -> UIImage? {
    // UIImage(named:) only looks in the app bundle/asset catalog,
    // so load from the file path instead
    guard let heicImage = UIImage(contentsOfFile: heicPath),
          let jpgImageData = UIImageJPEGRepresentation(heicImage, 1.0) else { return nil }
    FileManager.default.createFile(atPath: jpgPath, contents: jpgImageData, attributes: nil)
    return UIImage(contentsOfFile: jpgPath)
}
It returns the UIImage of the jpgPath, or nil if something went wrong.
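Note that round-tripping through UIImage drops the EXIF/GPS metadata. If that matters, here is a hedged sketch (not from the answers above) that re-encodes with Image I/O and copies the source properties across:
import ImageIO
import UniformTypeIdentifiers

// Sketch: re-encode HEIC data as JPEG while carrying the original image
// properties (EXIF, GPS, ...) over to the output.
func jpegDataPreservingMetadata(from heicData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(heicData as CFData, nil) else { return nil }
    let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        output, UTType.jpeg.identifier as CFString, 1, nil
    ) else { return nil }
    // Copies the frame and merges the supplied properties into the output
    CGImageDestinationAddImageFromSource(destination, source, 0, properties)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return output as Data
}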
I have found the existing answers to be helpful but I have decided to post my take on the solution to this problem as well. Hopefully it's a bit clearer and "complete".
This solution saves the image to a file.
// Requires `import Photos` and `import UniformTypeIdentifiers` (for UTType, iOS 14+)
private let fileManager: FileManager
func save(asset: PHAsset, to destination: URL) {
let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options) { input, info in
guard let input = input, let url = input.fullSizeImageURL else {
return // you might want to handle this case
}
do {
try self.copy(input, at: url, to: destination)
// success!
} catch {
// failure, handle the error!
}
}
}
private func copy(
_ input: PHContentEditingInput, at url: URL, to destination: URL
) throws {
let uniformType = input.uniformTypeIdentifier ?? ""
switch uniformType {
case UTType.jpeg.identifier:
// Copy JPEG files directly
try fileManager.copyItem(at: url, to: destination)
default:
// Convert HEIC/PNG and other formats to JPEG and save to file
let image = UIImage(data: try Data(contentsOf: url))
guard let data = image?.jpegData(compressionQuality: 1) else {
return // you might want to handle this case
}
try data.write(to: destination)
}
}

Swift - Create a GIF from Images and turn it into NSData

This might be an amateur question, but although I have searched Stack Overflow extensively, I haven't been able to get an answer for my specific problem.
I was successful in creating a GIF file from an array of images by following a GitHub example:
// Requires `import AppKit` and `import ImageIO`; kUTTypeGIF comes from CoreServices
func createGIF(with images: [NSImage], name: NSURL, loopCount: Int = 0, frameDelay: Double) {
    let destinationURL = name
    let destinationGIF = CGImageDestinationCreateWithURL(destinationURL as CFURL, kUTTypeGIF, images.count, nil)!
    // This dictionary controls the delay between frames
    // If you don't specify this, CGImage will apply a default delay
    let properties = [
        (kCGImagePropertyGIFDictionary as String): [(kCGImagePropertyGIFDelayTime as String): frameDelay]
    ]
    for img in images {
        // Convert an NSImage to CGImage, fitting within the specified rect
        let cgImage = img.cgImage(forProposedRect: nil, context: nil, hints: nil)!
        // Add the frame to the GIF image
        CGImageDestinationAddImage(destinationGIF, cgImage, properties as CFDictionary)
    }
    // Write the GIF file to disk
    CGImageDestinationFinalize(destinationGIF)
}
Now, I would like to turn the actual GIF into NSData so I can upload it to Firebase, and be able to retrieve it on another device.
To achieve my goal, I have two options: either find out how to extract the created GIF from the code above (the file seems to be written directly to disk), or use the images from the function's parameters to create a new GIF and keep it in NSData form.
Does anybody have any ideas on how to do this?
Since nobody answered for over six months, I will just put the answer from @Sachin Vas's comment here:
You can get the data using NSData(contentsOf: URL)
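For example (a sketch; frames stands in for the [NSImage] array passed to the function above):
// Sketch: write the GIF to a temporary file, then load it back as Data
// (bridgeable to NSData) for the upload.
let tmpURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("animation.gif")
createGIF(with: frames, name: tmpURL as NSURL, frameDelay: 0.1)
let gifData = try Data(contentsOf: tmpURL)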

Get PHAsset from iOS Share Extension

I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the Photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between the extension and the container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image or video, iOS will duplicate it in that OutgoingTemp directory. The documentation doesn't say when it will be deleted; hopefully soon enough.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get a PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's filename; you can use this to match the PHAsset.
/// Assets handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix, e.g. IMG_193
private lazy var imageAssetDictionary: [String: PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssets(with: options)
    var assetDictionary = [String: PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i]
        let fileName = asset.value(forKey: "filename") as! String
        let fileNameWithoutSuffix = fileName.components(separatedBy: ".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItem(forTypeIdentifier: imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? Data {
        // handle Data
    } else if let imageURL = imageItem as? URL {
        // Prefix check: image is shared from the Photos app
        let imageFilePath = imageURL.path
        if imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.components(separatedBy: "/") where component.contains("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix if present, to get a file name like IMG_1309
                let fileName = component.components(separatedBy: ".").first!
                if let asset = self.imageAssetDictionary[fileName] {
                    self.handledAssets.append(asset)
                    self.imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}
