I'm storing images and videos in a Camera Roll album using PhotoKit, and want to allow the user to share them using UIActivityViewController. If I pass UIActivityViewController a UIImage instance, it works as expected, probably because the image data is passed in memory. However, videos need to be passed by URL because there's no video analogue to UIImage. When I pass a URL to a video, I get an error "Could not create sandbox extension". If I pass a URL to an image, I get a similar error.
Based on this, it seems as though I might be able to get around this error by exporting the assets to the Documents directory and passing UIActivityViewController the URL to the asset in Documents. However, I've read elsewhere that the Camera Roll can serve a similar purpose, and it stands to reason that the Camera Roll would be one of the few places that can hold data for sharing between apps.
Is there a way to pass UIActivityViewController URLs to Camera Roll assets without copying them to Documents? Is there a better way to be sharing images and video that are already in Camera Roll?
Implementation Details:
I'm generating URLs for assets using this:
func videoFor(asset: PHAsset, resultHandler: @escaping (AVAsset?, AVAudioMix?, [AnyHashable: Any]?) -> Void) {
    imageManager.requestAVAsset(forVideo: asset, options: nil, resultHandler: resultHandler)
}
func urlFor(asset: PHAsset, resultHandler: @escaping (URL?) -> Void) {
    if asset.mediaType == .video {
        videoFor(asset: asset) { (asset, audioMix, info) in
            let asset = asset as! AVURLAsset
            resultHandler(asset.url)
        }
    }
    else if asset.mediaType == .image {
        let options = PHContentEditingInputRequestOptions()
        options.canHandleAdjustmentData = { (adjustmentData: PHAdjustmentData) -> Bool in
            return true
        }
        asset.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput: PHContentEditingInput?, info: [AnyHashable: Any]) -> Void in
            resultHandler(contentEditingInput!.fullSizeImageURL as URL?)
        })
    }
    else {
        resultHandler(nil)
    }
}
Here is the full error I get in console when trying to share an image by URL:
Failed to determine whether URL /var/mobile/Media/DCIM/100APPLE/IMG_0201.JPG (n) is managed by a file provider
Could not create sandbox extension. Error: Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted" UserInfo={NSLocalizedDescription=Could not create sandbox extension of type com.apple.app-sandbox.read for URL /var/mobile/Media/DCIM/100APPLE/IMG_0201.JPG. Error: No such file or directory}
... and for a video:
Failed to determine whether URL /var/mobile/Media/DCIM/100APPLE/IMG_0202.M4V (n) is managed by a file provider
Could not create sandbox extension. Error: Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted" UserInfo={NSLocalizedDescription=Could not create sandbox extension of type com.apple.app-sandbox.read for URL /var/mobile/Media/DCIM/100APPLE/IMG_0202.M4V. Error: Operation not permitted}
I was stuck on the same problem today. Here is my solution. Hope this helps or guides you to the right path.
PHImageManager.default().requestExportSession(forVideo: video, options: nil, exportPreset: AVAssetExportPresetPassthrough) { (exportSession, info) in
    if let exportSession = exportSession {
        exportSession.outputURL = destinationURLForFile
        exportSession.outputFileType = AVFileType.m4v
        exportSession.exportAsynchronously() {
            // Load the share sheet using destinationURLForFile
        }
    }
}
What this does is export the video to the provided location, destinationURLForFile (I used the Documents directory). Make sure you delete any file that is already at that URL, otherwise the export may fail because it may not overwrite the existing file.
You can set the output file type based on the available types; I needed m4v.
Then export asynchronously and call the share sheet (or whatever sharing mechanism you have) from the completion handler.
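Putting those pieces together, a minimal sketch of the whole flow might look like this (the file name, the presenting view controller, and the error handling are assumptions, not part of the original answer):

import Photos
import AVFoundation
import UIKit

// Sketch: export a PHAsset's video to Documents, then share that file URL.
func shareVideo(_ asset: PHAsset, from viewController: UIViewController) {
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destinationURLForFile = documentsURL.appendingPathComponent("share.m4v")

    // The export session may not overwrite an existing file, so remove any leftover copy first.
    try? FileManager.default.removeItem(at: destinationURLForFile)

    PHImageManager.default().requestExportSession(forVideo: asset,
                                                  options: nil,
                                                  exportPreset: AVAssetExportPresetPassthrough) { exportSession, _ in
        guard let exportSession = exportSession else { return }
        exportSession.outputURL = destinationURLForFile
        exportSession.outputFileType = .m4v
        exportSession.exportAsynchronously {
            guard exportSession.status == .completed else { return }
            DispatchQueue.main.async {
                // The exported file lives in our own sandbox, so the share sheet can read it.
                let activityVC = UIActivityViewController(activityItems: [destinationURLForFile],
                                                          applicationActivities: nil)
                viewController.present(activityVC, animated: true)
            }
        }
    }
}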
I'm attempting to write a photo app that can take both RAW and JPEG images and save them to the camera roll. The functions jpegPhotoDataRepresentation and dngPhotoDataRepresentation seem to be the key to all the examples I've found, but both are deprecated in iOS 11, and the delegate callback for saving after "capturePhoto" is now
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
The main example I've been able to find of a working RAW iOS11 app is this:
https://ubunifu.co/swift/raw-photo-capture-sample-swift-4-ios-11
which works, however it only shoots RAW and saving is clumsy because it's not on the camera roll.
I've changed my photo settings to allow for both raw and processed capture with this line
photoSettings = AVCapturePhotoSettings(rawPixelFormatType: availableRawFormat.uint32Value, processedFormat: [AVVideoCodecKey : AVVideoCodecType.jpeg])
But once I've actually captured the photo, I have no idea how to access the processedFormat data. fileDataRepresentation seems to be the only way to get at the DNG data, but is there no way to get at the JPEG separately? The pre-iOS 11 code I've found from Apple suggests using PHPhotoLibrary and adding a resource, but that requires a data representation, and the only one I can access is the DNG file, which appears as a blank white image when saved to the library because the library can't handle RAW files on its own. Here's my photoOutput code in case it helps.
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let dir = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first! as String
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyyMMddHHmmss"
    formatter.locale = Locale.init(identifier: "en_US_POSIX")
    let filePath = dir.appending(String(format: "/%@.dng", formatter.string(from: Date())))
    let dngFileURL = URL(fileURLWithPath: filePath)
    let dngData = photo.fileDataRepresentation()!
    do {
        try dngData.write(to: dngFileURL, options: [])
    } catch {
        print("Unable to write DNG file.")
        return
    }
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        let creationOptions = PHAssetResourceCreationOptions()
        creationOptions.shouldMoveFile = true
        // dngData is the problem, this should be the jpeg representation
        creationRequest.addResource(with: .photo, data: dngData, options: nil)
        // This line works fine, the associated file is the correct RAW file, but the jpeg preview is garbage
        creationRequest.addResource(with: .alternatePhoto, fileURL: dngFileURL, options: creationOptions)
    }, completionHandler: nil)
}
Okay, following up on the comment from earlier and the Apple docs on Capturing Photos in RAW Format:
As you’ve noticed, if you want to shoot RAW and save it in the Photos library, you need to save DNG+processed versions together in the same asset so that Photos library clients that don’t support RAW still have a readable version of the asset. (That includes the Photos app itself...) Saving both RAW+processed means specifying that in the capture.
If you’re requesting RAW+processed capture (where processed is JPEG, or even better, HEIF), you’re getting two photos for every shot you take. That means your didFinishProcessingPhoto callback gets called twice: once to deliver the JPEG (or HEIF), again to deliver the RAW.
Since you need to add RAW+processed versions of the asset to Photos together, you should wait until the capture output delivers both versions before trying to create the Photos asset. You’ll notice the code snippets in that Apple doc stash the data for both versions in the didFinishProcessingPhoto callback:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if photo.isRawPhoto {
        // Save the RAW (DNG) fileDataRepresentation to a URL
    } else {
        // Hold JPEG/HEIF fileDataRepresentation in a property
    }
}
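A fleshed-out version of that skeleton might look like the sketch below. The property names compressedData and rawURL are assumptions chosen to match the snippet further down, and the temporary DNG file name is made up; this is not Apple's exact sample code.

import AVFoundation

class CaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    // Assumed properties for stashing both versions until the capture finishes:
    var compressedData: Data?   // JPEG/HEIF data, kept in memory
    var rawURL: URL?            // file URL where the DNG was written

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        if photo.isRawPhoto {
            // DNG has to be added to Photos from a file URL, so write it out now.
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent(UUID().uuidString)
                .appendingPathExtension("dng")
            try? data.write(to: url)
            rawURL = url
        } else {
            // JPEG/HEIF can be added to Photos straight from Data, so just keep it.
            compressedData = data
        }
    }
}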
Then, when the didFinishCaptureFor callback fires, they make sure they have both versions, and add them together to the Photos library.
Notice that when you add DNG and JPEG or HEIF versions of a photo together...
The JPEG/HEIF needs to be the primary photo resource, and the DNG the alternatePhoto resource.
You can add a JPEG/HEIF resource straight from Data in memory, but DNG needs to be added from a file URL.
So the Photos library part goes like this (again, inside the didFinishCaptureFor callback):
PHPhotoLibrary.shared().performChanges({
    // Add the compressed (HEIF) data as the main resource for the Photos asset.
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo, data: compressedData, options: nil)
    // Add the RAW (DNG) file as an alternate resource.
    let options = PHAssetResourceCreationOptions()
    options.shouldMoveFile = true
    creationRequest.addResource(with: .alternatePhoto, fileURL: rawURL, options: options)
}, completionHandler: self.handlePhotoLibraryError)
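For completeness, a sketch of the surrounding didFinishCaptureFor callback that wraps the block above (continuing the sketch earlier; compressedData, rawURL, and handlePhotoLibraryError are the same assumed names):

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings,
                 error: Error?) {
    guard error == nil else {
        print("Capture failed: \(error!)")
        return
    }
    // Only touch the Photos library once both versions have been delivered.
    guard let compressedData = compressedData, let rawURL = rawURL else { return }
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, data: compressedData, options: nil)
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = true
        creationRequest.addResource(with: .alternatePhoto, fileURL: rawURL, options: options)
    }, completionHandler: self.handlePhotoLibraryError)
}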
You can write a CGImage extension (like the one linked here) and then get a pixel buffer from that CGImage.
if let cgImage = photo.cgImageRepresentation() {
    let pixelBuffer = cgImage.pixelBuffer()
}
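The linked extension isn't reproduced in the answer; a minimal sketch of what such a CGImage-to-CVPixelBuffer helper could look like (a BGRA output buffer is assumed, and this is not the linked code):

import CoreGraphics
import CoreVideo

extension CGImage {
    /// Hypothetical helper: render this CGImage into a newly created 32BGRA CVPixelBuffer.
    func pixelBuffer() -> CVPixelBuffer? {
        let attrs: [CFString: Any] = [
            kCVPixelBufferCGImageCompatibilityKey: true,
            kCVPixelBufferCGBitmapContextCompatibilityKey: true
        ]
        var buffer: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                         kCVPixelFormatType_32BGRA,
                                         attrs as CFDictionary, &buffer)
        guard status == kCVReturnSuccess, let pixelBuffer = buffer else { return nil }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                                  CGBitmapInfo.byteOrder32Little.rawValue)
        else { return nil }

        // Draw the CGImage into the pixel buffer's backing memory.
        context.draw(self, in: CGRect(x: 0, y: 0, width: width, height: height))
        return pixelBuffer
    }
}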
I'm working on sharing videos using UIActivityViewController and have some questions about extracting URLs from PHAsset objects.
I use "requestAVAsset" in "PHImageManager" and cast the AVAsset object to AVURLAsset to access its url property. I've tried the following types of activities:
Copy to Drive - this opens Google Drive app (Success)
Google Drive - a dialog shows up for confirmation (Fail) (no video attached in the dialog)
Gmail - (Fail) (mail can be sent but no video attached)
Add to Notes - this adds the video to the built-in Notes app; a dialog should show up for confirmation (Fail) (app freezes after UIActivityViewController disappears and no dialog shows up)
Facebook/LINE - (Fail) (the progress bar never moves)
My questions:
Does the URL extracted by this method give access to the actual video resource file?
If yes, am I missing something? Are there bugs in my code (see below)?
Code to share contents (inside an UIViewController):
PHImageManager.default().requestAVAsset(forVideo: videoAsset, options: nil, resultHandler: {
    (asset: AVAsset?, audioMix: AVAudioMix?, info: [AnyHashable: Any]?) in
    if let urlAsset = asset as? AVURLAsset {
        print("Share url=\(urlAsset.url.absoluteURL)")
        let shareVC: UIActivityViewController = UIActivityViewController(activityItems: [urlAsset.url.absoluteURL], applicationActivities: nil)
        shareVC.completionWithItemsHandler = {
            (type: UIActivityType?, completed: Bool, returnedItems: [Any]?, err: Error?) in
            print("Share result: completed=\(completed), \(type)")
            if err != nil {
                print("\(err.debugDescription)")
            }
        }
        DispatchQueue.main.async {
            self.present(shareVC, animated: true, completion: nil)
        }
    }
})
Environment: iPhone 7 plus, iOS 10.1.1
Btw, I also tried two other methods for sharing.
Using "writeData" in "PHAssetResourceManager" to write the video to a temporary directory and then building the URL from the file path (sketched below).
Using "requestExportSession" in "PHImageManager" to export the video to a temporary directory and then building the URL from the file path.
These methods work fine; in my opinion that's because the extracted URL points to a file the app can access directly. But they aren't suitable for me, because I'd like to share not just a single file but multiple files in one action, and they take time to process the data.
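For reference, a hedged sketch of the first of those two methods (the temporary file handling and the function name are assumptions, not the asker's actual code):

import Photos

// Write a PHAsset's video resource to a temporary file and hand back its URL.
func exportVideoResource(for videoAsset: PHAsset,
                         completion: @escaping (URL?) -> Void) {
    guard let resource = PHAssetResource.assetResources(for: videoAsset)
        .first(where: { $0.type == .video }) else {
        completion(nil)
        return
    }
    let tempURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(resource.originalFilename)
    // writeData fails if a file already exists at the destination.
    try? FileManager.default.removeItem(at: tempURL)
    PHAssetResourceManager.default().writeData(for: resource, toFile: tempURL, options: nil) { error in
        completion(error == nil ? tempURL : nil)
    }
}

The resulting file URL is inside the app's own sandbox, which is why sharing it through UIActivityViewController works where the DCIM URL does not.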
I've got the JPEG and MOV files that make up some Live Photos, and now I want to import them into an app that lets the user save the Live Photos to their photo library. How can I go about doing this?
I've looked into this: https://github.com/genadyo/LivePhotoDemoSwift which basically allows you to record video and turn it into a live photo. But since I've already created the "live photos", can I save them to the camera roll right away or do I need to follow a different route?
You can create a PHLivePhoto from the separate elements of a Live Photo by using PHLivePhoto.requestLivePhotoWithResourceFileURLs; you will then be able to save it to the library.
func makeLivePhotoFromItems(imageURL: NSURL, videoURL: NSURL, previewImage: UIImage, completion: (livePhoto: PHLivePhoto) -> Void) {
    PHLivePhoto.requestLivePhotoWithResourceFileURLs([imageURL, videoURL], placeholderImage: previewImage, targetSize: CGSizeZero, contentMode: PHImageContentMode.AspectFit) {
        (livePhoto, infoDict) -> Void in
        if let lp = livePhoto {
            completion(livePhoto: lp)
        }
    }
}
makeLivePhotoFromItems(imgURL, videoURL: movURL, previewImage: prevImg) { (livePhoto) -> Void in
    // "livePhoto" is your PHLivePhoto object, save it/use it here
}
You will need the JPEG file URL, the MOV file URL, and a "preview" image (which is usually just the JPEG or a lighter version of it).
Full example working in a Playground here.
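If the goal is specifically to get the JPEG + MOV pair into the photo library, a hedged sketch using PHAssetCreationRequest (modern Swift API names; not part of the original answer) might look like this:

import Photos

// Sketch: save an existing JPEG + MOV pair to the library as a Live Photo (iOS 9.1+).
// imageURL and videoURL are the Live Photo's still image and paired video files.
func saveLivePhoto(imageURL: URL, videoURL: URL,
                   completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, fileURL: imageURL, options: nil)
        request.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
    }, completionHandler: completion)
}

Note that in practice the JPEG and MOV usually need matching Apple content-identifier metadata for Photos to pair them as a single Live Photo rather than treating them as separate assets.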
Has anyone figured out how to extract the video portion from a Live Photo? I'm working on an app to convert Live Photos into a GIF, and the first step is to get the video file from the Live Photo. It seems like it should be possible, because if you plug in your phone to a Mac you can see the separate image and video files. I've kinda run into a brick wall in the extraction process, and I've tried many ways to do it and they all fail.
The first thing I did was obtain a PHAsset for what I think is the video part of the Live Photo, by doing the following:
if let livePhoto = info["UIImagePickerControllerLivePhoto"] as? PHLivePhoto {
    let assetResources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
    for assetRes in assetResources {
        if (assetRes.type == .PairedVideo) {
            let assets = PHAsset.fetchAssetsWithLocalIdentifiers([assetRes.assetLocalIdentifier], options: nil)
            if let asset = assets.firstObject as? PHAsset {
To convert the PHAsset to an AVAsset I've tried:
asset.requestContentEditingInputWithOptions(nil, completionHandler: { (contentEditingInput, info) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let movieUrl = url.absoluteString + ".mov"
        let avAsset = AVURLAsset(URL: NSURL(fileURLWithPath: movieUrl), options: nil)
        debugPrint(avAsset)
        debugPrint(avAsset.duration.value)
    }
})
I don't think this one works because the debug print with the duration.value gives 0.
I've also tried without the ".mov" addition and it still doesn't work.
I also tried:
PHImageManager.defaultManager().requestAVAssetForVideo(asset, options: nil, resultHandler: { (avAsset, audioMix, info) -> Void in
    debugPrint(avAsset)
})
And the debugPrint(avAsset) prints nil so it doesn't work.
I'm kind of afraid they might have made this impossible; it feels like I'm going in circles, since the PHAsset I got appears to still be a Live Photo and not actually a video.
Use the PHAssetResourceManager to get the video file from the PHAssetResource.
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil, completionHandler: { error in
        // Video file has been written to the path specified via fileURL
})
NOTE: The Live Photo specific APIs were introduced in iOS 9.1
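For context, a hedged sketch of how that call might be wired into the question's code (Swift 2-era names to match the answer; the temporary file name is an assumption):

// Assuming `assetRes` is the .PairedVideo resource found in the question's loop above.
let path = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("livePhotoVideo.mov")
let fileURL = NSURL(fileURLWithPath: path)

PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil) { error in
    if error == nil {
        // fileURL now points at a plain .mov file, so AVURLAsset can read it directly.
        let avAsset = AVURLAsset(URL: fileURL, options: nil)
        debugPrint(avAsset.duration)
    }
}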
// suppose you have a PHAsset instance (you can get it via [PHAsset fetchAssetsWithOptions:...])
PHAssetResource *videoResource = nil;
NSArray *resourcesArray = [PHAssetResource assetResourcesForAsset:asset];
const NSInteger livePhotoAssetResourcesCount = 2;
const NSInteger videoPartIndex = 1;

if (resourcesArray.count == livePhotoAssetResourcesCount) {
    videoResource = resourcesArray[videoPartIndex];
}

if (videoResource) {
    NSString * const fileURLKey = @"_fileURL";
    NSURL *videoURL = [videoResource valueForKey:fileURLKey];
    // load video url using AVKit or AVFoundation
}
I accidentally did. I have an iOS app called Goodreader (available in the App Store) which features a Windows-like file manager. When importing a live photo, it saves it as a folder ending in .pvt containing the JPG and MOV files. There is only one caveat: you need to open the live photo from within the Messages app, after you've sent it to yourself or somebody else, to see the "import to Goodreader" option; it doesn't appear from the Photos app.
I want to get some extra info about the images I'll share with the Share extension. I can create a UIImage from the URL, but when I try to obtain an ALAsset I get nil. Has anyone had this problem?
itemProvider!.loadItemForTypeIdentifier(String(kUTTypeImage), options: nil, completionHandler: { (decoder: NSSecureCoding!, error: NSError!) -> Void in
    if ALAssetsLibrary.authorizationStatus() == ALAuthorizationStatus.Authorized {
        if let url = decoder as? NSURL {
            ALAssetsLibrary().assetForURL(url, resultBlock: { (myasset: ALAsset!) -> Void in
                println(url)
                println(fm.fileExistsAtPath(url.path!))
                println(myasset)
                let location = myasset?.valueForProperty(ALAssetPropertyLocation) as CLLocation?
                let date = myasset?.valueForProperty(ALAssetPropertyDate) as NSDate?
                self.extensionContext?.completeRequestReturningItems([AnyObject](), completionHandler: nil)
            }, failureBlock: { (myerror: NSError!) -> Void in
            })
        }
    }
})
The output is
file:///var/mobile/Media/DCIM/102APPLE/IMG_2977.JPG
true
nil
The immediate issue is that you are passing a file URL in place of an asset URL in this line: ALAssetsLibrary().assetForURL(url, resultBlock: { (myasset: ALAsset!) -> Void in.
Share extensions return a URL to a path on the iPhone's file system, something of the form file:///... These are not the same as the asset URLs that ALAssetsLibrary expects in the assetForURL method.
Unfortunately, though this makes the code more correct, it still doesn't fix the issue. I spent some time trying many different approaches. Writing a new image to the library via ALAssetsLibrary and the given file path returns an asset URL upon completion, which does work successfully, though you obviously don't want duplicate photos in your camera roll. (Note: there is no way to delete an ALAsset.) You could probably hold onto the file path and delete the new image when you are done with it, but that is an extremely messy approach.
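A hedged sketch of that write-then-resolve workaround (Swift 2-era style, deprecated ALAssetsLibrary API; fileURL is the file:// URL the extension received, and the variable names are assumptions):

import AssetsLibrary
import CoreLocation

let library = ALAssetsLibrary()
if let imageData = NSData(contentsOfURL: fileURL) {
    // Writing the data back into the library yields an assets-library:// URL...
    library.writeImageDataToSavedPhotosAlbum(imageData, metadata: nil) { (assetURL, error) in
        if assetURL != nil {
            // ...which assetForURL can actually resolve to a non-nil ALAsset.
            library.assetForURL(assetURL, resultBlock: { (myasset) in
                let location = myasset?.valueForProperty(ALAssetPropertyLocation) as? CLLocation
                let date = myasset?.valueForProperty(ALAssetPropertyDate) as? NSDate
                print(location)
                print(date)
            }, failureBlock: { (myerror) in
                print(myerror)
            })
        }
    }
}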
I ended up rewriting my approach given these limitations.