I want to know whether a video was taken with the front or the back camera. The info dictionary returns nil.
let asset: PHAsset! // fetched elsewhere
let manager = PHImageManager.default()
if asset.mediaType == .video {
    let options = PHVideoRequestOptions()
    options.version = .original
    options.isNetworkAccessAllowed = true
    manager.requestAVAsset(forVideo: asset, options: options) { (avAsset, audioMix, info) in
        // info comes back nil here
    }
}
Surprisingly, you are able to use PHImageManager.requestImageDataAndOrientation for VIDEO assets.
Xcode 12.5
iPhone SE 2020 (iOS 14.6)
Observations
let imageManager = PHImageManager.default()
imageManager.requestImageDataAndOrientation(for: videoPHAsset, options: nil) { (data, dataUTI, orientation, info) in
    if let data = data, let ciImage = CIImage(data: data) {
        print(ciImage.properties)
    }
}
Upon serializing ciImage.properties into JSON, I got the following results.
1. Video (back camera)
{"ColorModel":"RGB","PixelHeight":1920,"{Exif}":{"ColorSpace":1,"PixelXDimension":1080,"ExifVersion":[2,2,1],"FlashPixVersion":[1,0],"PixelYDimension":1920,"SceneCaptureType":0,"ComponentsConfiguration":[1,2,3,0]},"ProfileName":"HDTV","DPIHeight":72,"PixelWidth":1080,"{TIFF}":{"YResolution":72,"ResolutionUnit":2,"XResolution":72},"{JFIF}":{"DensityUnit":0,"YDensity":72,"JFIFVersion":[1,0,1],"XDensity":72},"Depth":8,"DPIWidth":72}
2. Video (front camera)
{"Depth":8,"{TIFF}":{"YResolution":72,"ResolutionUnit":2,"XResolution":72},"PixelHeight":1334,"{JFIF}":{"DensityUnit":0,"YDensity":72,"JFIFVersion":[1,0,1],"XDensity":72},"ProfileName":"sRGB IEC61966-2.1","PixelWidth":750,"ColorModel":"RGB","DPIHeight":72,"DPIWidth":72,"{Exif}":{"ColorSpace":1,"PixelXDimension":750,"ExifVersion":[2,2,1],"FlashPixVersion":[1,0],"PixelYDimension":1334,"SceneCaptureType":0,"ComponentsConfiguration":[1,2,3,0]}}
3. Screenshot
{"ColorModel":"RGB","{Exif}":{"PixelXDimension":750,"PixelYDimension":1334,"DateTimeOriginal":"2021:01:21 14:25:56","UserComment":"Screenshot"},"{PNG}":{"InterlaceType":0},"HasAlpha":true,"Depth":16,"{TIFF}":{"Orientation":1},"PixelHeight":1334,"ProfileName":"Display P3","PixelWidth":750,"Orientation":1}
4. Photo (has a TON of information and JSON serialization crashes, so I extracted only the "{Exif}" part)
{"ExifVersion":[2,3,1],"Flash":24,"LensModel":"iPhone SE (2nd generation) back camera 3.99mm f\\/1.8","OffsetTimeDigitized":"+05:30","SubsecTimeOriginal":"630","LensSpecification":[3.990000009536743,3.990000009536743,1.7999999523162842,1.7999999523162842],"ExposureMode":0,"CompositeImage":2,"LensMake":"Apple","FNumber":1.8,"OffsetTimeOriginal":"+05:30","PixelYDimension":3024,"ApertureValue":1.6959938128383605,"ExposureBiasValue":0,"MeteringMode":5,"ISOSpeedRatings":[400],"ShutterSpeedValue":4.6443251405083465,"SceneCaptureType":0,"FocalLength":3.99,"DateTimeOriginal":"2021:01:21 20:47:05","SceneType":1,"FlashPixVersion":[1,0],"ColorSpace":65535,"SubjectArea":[2013,1511,2217,1330],"PixelXDimension":4032,"FocalLenIn35mmFilm":28,"SubsecTimeDigitized":"630","OffsetTime":"+05:30","SensingMethod":2,"BrightnessValue":0.06030004492448258,"DateTimeDigitized":"2021:01:21 20:47:05","ComponentsConfiguration":[1,2,3,0],"WhiteBalance":0,"ExposureTime":0.04,"ExposureProgram":2}
The videos have "ExifVersion":[2,2,1] while the photo has "ExifVersion":[2,3,1]. The videos' EXIF doesn't provide any useful information about the camera at all, while the photo's EXIF does. All of the videos and photos were captured on the same phone.
At this point, I'm still unsure whether this information is encoded into the video file at all.
This may help you. If you fetch PHAssets filtered by PHAssetCollectionSubtype.smartAlbumSelfPortraits, those assets appear to be the videos taken with the front camera.
let collection = PHAssetCollection.fetchAssetCollections(with: PHAssetCollectionType.smartAlbum,
subtype: PHAssetCollectionSubtype.smartAlbumSelfPortraits,
options: nil)
Even if a video doesn't contain faces, it is categorized as a selfie on iOS for now. I tested this on an iPhone 12 mini running iOS 15.
Apple's documentation also says:
A smart album that groups all photos and videos captured using the
device’s front-facing camera.
https://developer.apple.com/documentation/photokit/phassetcollectionsubtype/smartalbumselfportraits
So maybe you can prefetch the selfie album's assets, then check whether an asset is contained in that album to detect whether it was taken with the front camera.
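A minimal sketch of that check (assuming Photos authorization is already granted; the helper name is illustrative):

```swift
import Photos

/// Returns true if the given asset belongs to the Selfies smart album,
/// i.e. (per the album's definition) it was captured with the front camera.
func isFrontCameraAsset(_ asset: PHAsset) -> Bool {
    let collections = PHAssetCollection.fetchAssetCollections(
        with: .smartAlbum,
        subtype: .smartAlbumSelfPortraits,
        options: nil)
    guard let selfieAlbum = collections.firstObject else { return false }

    // Fetch only this asset's localIdentifier inside the selfie album;
    // a non-empty result means the asset is in the album.
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "localIdentifier == %@", asset.localIdentifier)
    return PHAsset.fetchAssets(in: selfieAlbum, options: options).count > 0
}
```

If you need to classify many assets, it may be cheaper to fetch the album's contents once and keep the localIdentifiers in a Set rather than issuing one fetch per asset.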
I am currently taking photos from the user's iPhone, converting them to JPEGs, saving them with FileManager, and then storing their paths with Core Data.
The way I do it, I can't show Live Photos or later export the image with its original metadata.
My question is: what is the correct way to save an image locally with all its metadata and Live Photo status, so it can be displayed and exported later, but only inside the app? I don't want the images to be viewable outside the application.
Taking The image:
Take a look at the UIImagePickerController class and use it to take pictures. Taking pictures with it will NOT automatically save the images to the iOS gallery.
The mentioned UIImagePickerController will notify a delegate (UIImagePickerControllerDelegate) and call func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String: Any]).
From the second parameter, the info dictionary, you can request most of the image's metadata, the Live Photo, and so on. The necessary keys can be found in the UIImagePickerController extension called UIImagePickerController.InfoKey.
For example, here is how I retrieve the image's coordinates, if available:
if let imageURL = info[UIImagePickerControllerReferenceURL] as? URL {
    let result = PHAsset.fetchAssets(withALAssetURLs: [imageURL], options: nil)
    if let asset = result.firstObject, let location = asset.location {
        let lat = location.coordinate.latitude
        let lon = location.coordinate.longitude
        print("Here's the lat and lon \(lat) + \(lon)")
    }
}
Saving the image onto the phone within your app:
I use JSONEncoder. You can use this class to encode and then decode your custom image class again.
Your custom image class, which has the UIImage and all the other properties must then implement the Codable protocol.
Afterwards, use the JSONEncoder to create Data of the object via its encode() method and save it to an app private location with the help of a FileManager.
Later on, you may read all files from that location again with the FileManager, decode() them with a JSONDecoder, and voila, you have all your images again in the form of your custom image class.
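A hedged sketch of that approach (the StoredImage type, its properties, and the file names are illustrative; note that persisting the UIImage as JPEG data drops EXIF, which is why the metadata you care about is kept as separate Codable properties):

```swift
import UIKit

// Illustrative wrapper: the image is persisted as JPEG data, and the
// metadata we want to preserve lives in ordinary Codable properties.
struct StoredImage: Codable {
    let imageData: Data
    let latitude: Double?
    let longitude: Double?
    let isLivePhoto: Bool

    init?(image: UIImage, latitude: Double?, longitude: Double?, isLivePhoto: Bool) {
        guard let data = image.jpegData(compressionQuality: 0.9) else { return nil }
        self.imageData = data
        self.latitude = latitude
        self.longitude = longitude
        self.isLivePhoto = isLivePhoto
    }

    var image: UIImage? { UIImage(data: imageData) }
}

// Encode to the app's private Documents directory.
func save(_ stored: StoredImage, as name: String) throws {
    let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let data = try JSONEncoder().encode(stored)
    try data.write(to: dir.appendingPathComponent(name))
}

// Decode it back later.
func load(name: String) throws -> StoredImage {
    let dir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let data = try Data(contentsOf: dir.appendingPathComponent(name))
    return try JSONDecoder().decode(StoredImage.self, from: data)
}
```

Because the files live in the app's sandboxed Documents directory, they are not visible in the Photos gallery, which matches the "only inside the app" requirement.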
Saving the image to the user's gallery ("exporting"):
Here's an example of how I save that image back to the user's gallery as a way of exporting it:
private static func saveImageToGallery(picture: UIImage, lat: Double?, lon: Double?) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.creationRequestForAsset(from: picture)
        if let _lat = lat, let _lon = lon {
            request.location = CLLocation(latitude: _lat, longitude: _lon)
        }
    })
}
I hope this will be enough to guide you to the correct way.
I am using PHPhotoLibrary to access Camera Roll photos, but it is returning all images: downloaded images, screenshots, Facebook images, etc. I only need the images captured by the camera.
I believe that this link may help you:
How to get only images in the camera roll using Photos Framework
Through some experimentation we discovered a hidden property not
listed in the documentation (assetSource). Basically you have to do a
regular fetch request, then use a predicate to filter the ones from
the camera roll. This value should be 3.
Sample code:
// fetch all assets, then sub-fetch only the range we need
let fetchOptions = PHFetchOptions()
var results = NSMutableArray()
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)
assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
    results.addObject(obj)
}
var cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
I'm using Core Image in Swift to edit photos, and I have a problem when I save the photo: I'm not saving it with the correct orientation.
When I get the picture from the Photo Library, I save its orientation in a variable as UIImageOrientation, but I don't know how to set it back before saving the edited photo to the Photo Library. Any ideas?
Saving the orientation:
var orientation: UIImageOrientation = .Up
orientation = gotImage.imageOrientation
Saving the edited photo to the Photo Library:
@IBAction func savePhoto(sender: UIBarButtonItem) {
    let originalImageSize = CIImage(image: gotImage)
    filter.setValue(originalImageSize, forKey: kCIInputImageKey)
    // 1
    let imageToSave = filter.outputImage
    // 2
    let softwareContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])
    // 3
    let cgimg = softwareContext.createCGImage(imageToSave, fromRect: imageToSave.extent())
    // 4
    let library = ALAssetsLibrary()
    library.writeImageToSavedPhotosAlbum(cgimg,
        metadata: imageToSave.properties(),
        completionBlock: nil)
}
Instead of using the metadata version of writeImageToSavedPhotosAlbum, you can use:
library.writeImageToSavedPhotosAlbum(
cgimg,
orientation: orientation,
completionBlock:nil)
then you can pass in the orientation directly.
To satisfy Swift, you may need to typecast it first:
var orientation: ALAssetOrientation = ALAssetOrientation(rawValue: gotImage.imageOrientation.rawValue)!
As per my somewhat inconclusive answer here.
(As you have confirmed, this solution does indeed work in Swift - I derived it, untested, from working Objective-C code)
If you are interested in manipulating other information from image metadata, here are a few related answers I provided to other questions...
Updating UIImage orientation metaData?
Force UIImagePickerController to take photo in portrait orientation/dimensions iOS
How to get author of image in cocoa
Getting a URL from (to) a "picked" image, iOS
And a small test project on github that digs out image metadata from various sources (it's objective-C, but the principles are the same).
You are calling writeImageToSavedPhotosAlbum:metadata:completionBlock:... The docs on that method say:
You must specify the orientation key in the metadata dictionary to preserve the orientation of the image.
I'm trying to print all the image metadata in the imagePickerController: didFinishPickingMediaWithInfo function. When I use the info.objectForKey(UIImagePickerControllerReferenceURL) method, it returns nil, and if I try to use this result my app crashes. Does anyone know why it returns nil, and what else I can use to print all the image metadata when I pick an image? (Using UIImageJPEGRepresentation is not an option because the EXIF data is removed.)
This is my code:
func imagePickerController(picker: UIImagePickerController!, didFinishPickingMediaWithInfo info: NSDictionary!)
{
    let image = info.objectForKey(UIImagePickerControllerOriginalImage) as UIImage
    let refURL: NSURL = info.objectForKey(UIImagePickerControllerReferenceURL) as NSURL
    var localSourceRef: CGImageSourceRef = CGImageSourceCreateWithURL(refURL, nil)
    var localMetadata: NSDictionary = CGImageSourceCopyPropertiesAtIndex(localSourceRef, 0, nil)
    println("\n Photo data: \n")
    println(localMetadata)
}
So it sounds like there are actually two questions here:
1) Why is UIImagePickerControllerReferenceURL returning a nil reference?
2) How can you get location data from the photo?
So, the answer to (1) is usually that you receive the didFinishPickingMedia callback before the OS has written the file to the image library.
The answer to (2) is much trickier, as showcased by this question's answers:
Reading the GPS data from the image returned by the camera in iOS iphone
There are a number of variables you need to account for:
iOS will strip the GPS data out if you haven't requested access to location data, so you'll need to prompt for location access using CLLocationManager.
If the user has geotagging disabled, you'll never get GPS coords.
If the phone can't get a GPS lock, iOS won't record the GPS coords.
Per this answer: https://stackoverflow.com/a/10338012/490180 you should be able to retrieve the raw UIImage and then create the CGImageSourceRef from the data property off of UIImage's CGImage. This effectively removes the need for you to ever access the ReferenceURL.
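A minimal sketch of reading the full property dictionary without going through the ReferenceURL (assuming a modern delegate signature; on iOS 11+ the info dictionary's .imageURL key points at a temporary copy of the picked file, so ImageIO can read its properties directly, whereas re-encoding with UIImageJPEGRepresentation would strip them):

```swift
import UIKit
import ImageIO

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // .imageURL is a temporary file URL for the original image (iOS 11+).
    if let url = info[.imageURL] as? URL,
       let source = CGImageSourceCreateWithURL(url as CFURL, nil),
       let metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] {
        print(metadata)  // full property dictionary, including {Exif} if present
    }
    picker.dismiss(animated: true)
}
```

As the answers above note, GPS keys will still be absent unless the app has location permission and the photo was actually geotagged.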