In iOS 14, Apple's new PHPickerViewController hands its delegate only PHPickerResult objects, each wrapping an NSItemProvider. The 2020 WWDC video shows how to get from there to a UIImage:
let prov = result.itemProvider
prov.loadObject(ofClass: UIImage.self) { im, err in
    if let im = im as? UIImage {
        DispatchQueue.main.async {
            // display the image here
        }
    }
}
But what if the user chose a live photo or a video? How do we get that?
Live photos are easy; do it exactly the same way. You can, because PHLivePhoto, like UIImage, is a class that conforms to NSItemProviderReading. So:
let prov = result.itemProvider
prov.loadObject(ofClass: PHLivePhoto.self) { livePhoto, err in
    if let photo = livePhoto as? PHLivePhoto {
        DispatchQueue.main.async {
            // display the live photo here
        }
    }
}
Videos are harder. The problem is that you do not want to be handed the data; it does you no good and is likely to be huge. You want the data saved to disk so that you can access the video's URL, just as UIImagePickerController used to do. You can in fact ask the item provider to save its data for you, but it wants to let go of that data as soon as the completion handler returns. My solution is to access the URL synchronously, inside a DispatchQueue.main.sync block:
let prov = result.itemProvider
prov.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, err in
    if let url = url {
        DispatchQueue.main.sync {
            // display the video here
        }
    }
}
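An alternative worth noting (a minimal sketch, not part of the original recipe; the destination file name is my own choice): copy the provider's temporary file into your own temporary directory before the handler returns, so the movie survives after the provider cleans up.
let prov = result.itemProvider
prov.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, err in
    guard let url = url else { return }
    // Copy the provider's temporary file to a location we control,
    // because the provider may delete its copy once this handler returns.
    let dest = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("mov")
    do {
        try FileManager.default.copyItem(at: url, to: dest)
        DispatchQueue.main.async {
            // display the video from `dest` here
        }
    } catch {
        print("could not copy movie:", error)
    }
}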
Related
Suppose I have an array of UIImage called photos that is to be uploaded to Firebase Storage. I wish to do the following things:
Upload them to Firebase storage
Get the paths of the uploaded photos and store them in an array called uploadedAssets (paths, not download URLs; a path looks like "photos/folder_name/photo_id", where "folder_name" is randomly generated and "photo_id" is an integer representing the order of the photos)
Call a Cloud Function and pass uploadedAssets to it. The server then uses the paths to find all the pictures and generates a thumbnail for each one.
Finally, store the original photos' download URLs and the thumbnails' download URLs in the database.
I have something that's working, but it uses too much memory (300+ MB when uploading only 4 pictures):
// Swift
let dispatchGroup = DispatchGroup()
let dispatchQueue = DispatchQueue(label: "AssetQueue")
var uploadedAssets = [String]()
let folderName: String = UUID().uuidString
dispatchQueue.async {
    for i in 0..<photos.count {
        dispatchGroup.enter()
        let photo: UIImage = photos[i]
        let fileName: String = "\(folderName)/\(i)"
        let assetRef = Storage.storage().reference().child("photos/\(fileName)")
        let metaData = StorageMetadata()
        metaData.contentType = "image/jpeg"
        if let dataToUpload = UIImageJPEGRepresentation(photo, 0.75) {
            assetRef.putData(
                dataToUpload,
                metadata: metaData,
                completion: { (_, error) in
                    uploadedAssets.append("photos/\(fileName)")
                    dispatchGroup.leave()
                }
            )
        }
    }
}
dispatchGroup.notify(queue: dispatchQueue) {
    Alamofire.request(
        "https://<some_url>",
        method: .post,
        parameters: [
            "uploadedAssets": uploadedAssets
        ]
    )
}
The code that generates the thumbnails runs on the server side, so in my opinion it is irrelevant and I won't post it here. The snippet above consumes 300+ MB of memory when there are 4 photos to upload. After those photos have been uploaded successfully, memory usage stays at 300+ MB and never drops. When I try to upload more, say another 4 photos, it can even go up to 450+ MB. I know that's not normal, but I can't figure out why it happens.
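One pattern worth checking (a sketch of a common mitigation, not a confirmed diagnosis of the 300+ MB): drain an autorelease pool per loop iteration so the JPEG buffer produced by UIImageJPEGRepresentation can be released as soon as nothing, including the pending upload, still needs it, instead of accumulating for the lifetime of the dispatched block.
dispatchQueue.async {
    for i in 0..<photos.count {
        autoreleasepool {
            dispatchGroup.enter()
            let fileName = "\(folderName)/\(i)"
            let assetRef = Storage.storage().reference().child("photos/\(fileName)")
            let metaData = StorageMetadata()
            metaData.contentType = "image/jpeg"
            // The JPEG data is created inside this pool, so the autoreleased
            // buffer can be freed once the upload no longer holds it.
            if let dataToUpload = UIImageJPEGRepresentation(photos[i], 0.75) {
                assetRef.putData(dataToUpload, metadata: metaData) { _, _ in
                    uploadedAssets.append("photos/\(fileName)")
                    dispatchGroup.leave()
                }
            } else {
                dispatchGroup.leave() // keep enter/leave balanced if encoding fails
            }
        }
    }
}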
I'm trying to figure out the proper approach for sharing 10+ photos from an iOS app to an Apple Watch app using watchOS 2.
I want to transfer these images in the background so that the user doesn't have to open the iOS app in order to view the photos.
I've tried querying photos from Facebook and sending them to the watch via transferUserInfo() but the payload is too large:
FBSDKGraphRequest(graphPath: "me/photos?limit=2", parameters: ["fields": "name, source"]).startWithCompletionHandler({ (connection, result, error) -> Void in
    if (error != nil) {
        print(error.description)
    }
    else {
        var arr = [NSData]()
        for res in result["data"] as! NSArray {
            if let string = res["source"] as? String {
                if let url = NSURL(string: string) {
                    if let data = NSData(contentsOfURL: url) {
                        arr.append(data)
                    }
                }
            }
        }
        print(arr)
        if arr.count > 0 {
            self.session.transferUserInfo(["image": arr])
        }
    }
})
Any ideas how I should go about doing this?
The proper method is mentioned in the WCSession documentation:
Use the transferFile:metadata: method to transfer files in the background. Use this method in cases where you want to send more than a dictionary of values. For example, use this method to send images or file-based documents.
The images will be asynchronously delivered to the watch on a background thread. session:didReceiveFile: will be called when the watch successfully receives an image.
Make sure to include (date) metadata with the image, and remove any existing images from the watch which are no longer a part of the ten most recent Facebook uploads.
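A minimal sketch of that flow, written against current API names (the file name and the "date" metadata key are my own choices; `session` is the same WCSession the question already uses for transferUserInfo):
import WatchConnectivity

// iOS side: write each image to disk, then queue a background transfer.
func send(imageData: NSData, index: Int) {
    let fileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("photo-\(index).jpg")   // hypothetical file name
    if imageData.write(to: fileURL, atomically: true) {
        // Keep the returned WCSessionFileTransfer if you want to monitor or cancel it.
        let transfer = session.transferFile(fileURL, metadata: ["date": Date()])
        print("queued \(transfer.file.fileURL.lastPathComponent)")
    }
}

// Watch side (WCSessionDelegate): called when a file arrives in the background.
func session(_ session: WCSession, didReceive file: WCSessionFile) {
    // The system removes file.fileURL after this method returns, so move it first.
    let dest = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent(file.fileURL.lastPathComponent)
    do {
        try FileManager.default.moveItem(at: file.fileURL, to: dest)
    } catch {
        print("could not move received file:", error)
    }
    // file.metadata?["date"] carries the date attached on the phone side.
}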
I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between the extension and the container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image or video, iOS will duplicate it in that OutgoingTemp directory. Nowhere does the documentation say when it will be deleted; hopefully soon enough.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get the PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's file name; you can use this to match the PHAsset.
/// Assets handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()
/// Key is the matched asset's original file name without suffix, e.g. IMG_193
private lazy var imageAssetDictionary: [String: PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String: PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: image is shared from the Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix, if any, to get a name like IMG_1309.
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}
Facebook sharing requires an ALAsset such as the following:
let content = FBSDKShareVideoContent()
//The videos must be less than 12MB in size.
let bundle = NSBundle.mainBundle()
let path = bundle.URLForResource("a", withExtension: "mp4")
let video = FBSDKShareVideo()
// doesn't work; needs to be an "asset url" (ALAsset)
//video.videoURL = path
content.video = video
let dialog = FBSDKShareDialog()
dialog.shareContent = content
dialog.show()
How is it possible to take a local bundle document, or an NSData object, and convert it to an ALAsset?
(My initial thought was to save the video to the local camera roll, then load the list and select it, but that adds unnecessary interface steps.)
The documentation for an ALAsset states that
An ALAsset object represents a photo or a video managed by the Photo application.
so I'm pretty sure that you have to write the video to the camera roll before using it as an ALAsset. However, you don't need to open the camera roll and have the user pick the asset in order to use it. When writing to the ALAssetsLibrary using
library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: { (newURL, error) -> Void in ... })
you get the asset URL in the newURL completion block variable. Use it in the Facebook sharing call:
let content = FBSDKShareVideoContent()
content.video = FBSDKShareVideo(videoURL: newURL)
FBSDKShareAPI.shareWithContent(content, delegate: self)
NSLog("Facebook content shared \(content.video.videoURL)")
You can do the sharing inside the completion block if you so desire, or you can save the newURL from the completion block and use it somewhere else.
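Putting the pieces together, the save-then-share flow inside the completion block looks roughly like this (a sketch using the same Swift 2-era calls as above; movieURL is the local file URL of your video):
let library = ALAssetsLibrary()
library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: { (newURL, error) -> Void in
    if error == nil && newURL != nil {
        // newURL is an assets-library:// URL, which is what FBSDKShareVideo expects.
        let content = FBSDKShareVideoContent()
        content.video = FBSDKShareVideo(videoURL: newURL)
        FBSDKShareAPI.shareWithContent(content, delegate: self)
    }
})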
Has anyone figured out how to extract the video portion from a Live Photo? I'm working on an app to convert Live Photos into a GIF, and the first step is to get the video file from the Live Photo. It seems like it should be possible, because if you plug in your phone to a Mac you can see the separate image and video files. I've kinda run into a brick wall in the extraction process, and I've tried many ways to do it and they all fail.
The first thing I did was obtain a PHAsset for what I think is the video part of the Live Photo, by doing the following:
if let livePhoto = info["UIImagePickerControllerLivePhoto"] as? PHLivePhoto {
    let assetResources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
    for assetRes in assetResources {
        if (assetRes.type == .PairedVideo) {
            let assets = PHAsset.fetchAssetsWithLocalIdentifiers([assetRes.assetLocalIdentifier], options: nil)
            if let asset = assets.firstObject as? PHAsset {
To convert the PHAsset to an AVAsset I've tried:
asset.requestContentEditingInputWithOptions(nil, completionHandler: { (contentEditingInput, info) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let movieUrl = url.absoluteString + ".mov"
        let avAsset = AVURLAsset(URL: NSURL(fileURLWithPath: movieUrl), options: nil)
        debugPrint(avAsset)
        debugPrint(avAsset.duration.value)
    }
})
I don't think this one works, because the debug print of duration.value gives 0.
I've also tried it without appending ".mov" and it still doesn't work.
I also tried:
PHImageManager.defaultManager().requestAVAssetForVideo(asset, options: nil, resultHandler: { (avAsset, audioMix, info) -> Void in
    debugPrint(avAsset)
})
And the debugPrint(avAsset) prints nil so it doesn't work.
I'm kind of afraid they might have made this impossible; I seem to be going in circles, since the PHAsset I got is apparently still a Live Photo and not actually a video.
Use the PHAssetResourceManager to get the video file from the PHAssetResource.
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil, completionHandler: { error in
        // Video file has been written to the path specified via fileURL (error is nil on success)
})
NOTE: The Live Photo specific APIs were introduced in iOS 9.1
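For context, here is a fuller sketch of that call in the same Swift 2-era style as the question (the destination file name is my own choice):
let resources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
if let videoResource = resources.filter({ $0.type == .PairedVideo }).first {
    let filePath = NSTemporaryDirectory() + "livePhotoVideo.mov"   // hypothetical destination
    let fileURL = NSURL(fileURLWithPath: filePath)
    PHAssetResourceManager.defaultManager().writeDataForAssetResource(videoResource,
        toFile: fileURL, options: nil) { error in
            if error == nil {
                // The paired video now lives at fileURL; wrap it in an AVAsset if needed.
                let avAsset = AVURLAsset(URL: fileURL, options: nil)
                debugPrint(avAsset.duration)
            }
    }
}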
// suppose you have a PHAsset instance (you can get it via [PHAsset fetchAssetsWithOptions:...])
PHAssetResource *videoResource = nil;
NSArray *resourcesArray = [PHAssetResource assetResourcesForAsset:asset];
const NSInteger livePhotoAssetResourcesCount = 2;
const NSInteger videoPartIndex = 1;
if (resourcesArray.count == livePhotoAssetResourcesCount) {
    videoResource = resourcesArray[videoPartIndex];
}
if (videoResource) {
    // Note: "_fileURL" is a private key, so this may break in a future iOS version.
    NSString * const fileURLKey = @"_fileURL";
    NSURL *videoURL = [videoResource valueForKey:fileURLKey];
    // load the video URL using AVKit or AVFoundation
}
I accidentally did. I have an iOS app called Goodreader (available in the App Store) which features a Windows-like file manager. When importing a Live Photo, it saves it as a folder ending in .pvt containing the JPG and MOV files. There is only one caveat: you need to open the Live Photo from within the Messages app after you've sent it to yourself or somebody else to see the "import to Goodreader" option; it doesn't appear from the Photos app.