Extract video portion from Live Photo - iOS

Has anyone figured out how to extract the video portion from a Live Photo? I'm working on an app to convert Live Photos into GIFs, and the first step is to get the video file out of the Live Photo. It seems like it should be possible, because if you plug your phone into a Mac you can see the separate image and video files. I've run into a brick wall in the extraction process: I've tried many approaches and they all fail.
The first thing I did was obtain a PHAsset for what I think is the video part of the Live Photo, by doing the following:
if let livePhoto = info["UIImagePickerControllerLivePhoto"] as? PHLivePhoto {
    let assetResources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
    for assetRes in assetResources {
        if (assetRes.type == .PairedVideo) {
            let assets = PHAsset.fetchAssetsWithLocalIdentifiers([assetRes.assetLocalIdentifier], options: nil)
            if let asset = assets.firstObject as? PHAsset {
To convert the PHAsset to an AVAsset I've tried:
asset.requestContentEditingInputWithOptions(nil, completionHandler: { (contentEditingInput, info) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let movieUrl = url.absoluteString + ".mov"
        let avAsset = AVURLAsset(URL: NSURL(fileURLWithPath: movieUrl), options: nil)
        debugPrint(avAsset)
        debugPrint(avAsset.duration.value)
    }
})
I don't think this works, because the debug print of duration.value gives 0.
I've also tried it without appending ".mov", and it still doesn't work.
I also tried:
PHImageManager.defaultManager().requestAVAssetForVideo(asset, options: nil, resultHandler: { (avAsset, audioMix, info) -> Void in
    debugPrint(avAsset)
})
And debugPrint(avAsset) prints nil, so that doesn't work either.
I'm starting to worry that this has been made impossible. It feels like I'm going in circles, since the PHAsset I end up with still seems to be a Live Photo and not actually a video.

Use the PHAssetResourceManager to get the video file from the PHAssetResource.
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil, completionHandler: { error in
        // Video file has been written to the path specified via fileURL
})
NOTE: The Live Photo-specific APIs were introduced in iOS 9.1.
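Putting the steps together, here is a minimal sketch (Swift 2 syntax, iOS 9.1+) of how the pieces could fit. It assumes `asset` is the PHAsset backing the Live Photo and `outputURL` is a writable file URL of your choosing, so treat it as an outline rather than drop-in code:

import Photos

func exportPairedVideo(asset: PHAsset, toFile outputURL: NSURL, completion: (success: Bool) -> Void) {
    // Look for the paired-video resource among the asset's resources.
    let resources = PHAssetResource.assetResourcesForAsset(asset)
    guard let videoResource = resources.filter({ $0.type == .PairedVideo }).first else {
        completion(success: false)   // no paired video; probably not a Live Photo
        return
    }
    // Stream the resource's data straight to the chosen file URL.
    PHAssetResourceManager.defaultManager().writeDataForAssetResource(videoResource,
        toFile: outputURL, options: nil) { error in
        completion(success: error == nil)
    }
}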

// suppose you have a PHAsset instance (you can get it via [PHAsset fetchAssetsWithOptions:...])
PHAssetResource *videoResource = nil;
NSArray *resourcesArray = [PHAssetResource assetResourcesForAsset:asset];
const NSInteger livePhotoAssetResourcesCount = 2;
const NSInteger videoPartIndex = 1;
if (resourcesArray.count == livePhotoAssetResourcesCount) {
    videoResource = resourcesArray[videoPartIndex];
}
if (videoResource) {
    NSString * const fileURLKey = @"_fileURL";
    NSURL *videoURL = [videoResource valueForKey:fileURLKey];
    // load the video URL using AVKit or AVFoundation
}

I accidentally found a way. I have an iOS app called GoodReader (available in the App Store) which features a Windows-like file manager. When importing a Live Photo, it saves it as a folder ending in .pvt containing the JPG and MOV files. There is only one caveat: to see the "import to GoodReader" option you need to open the Live Photo from within the Messages app after sending it to yourself or somebody else, not from the Photos app.

Related

How do I grab the UTI or MIME Type from a video asset fetched using PHAsset.fetchAsset?

I've run into an issue where I can't seem to figure out a good way of grabbing the UTI or MIME type from a video asset after fetching it from the photo library.
let assets = PHAsset.fetchAssets(in: someCollection, options: videosOnly)
// iterating over each asset in assets...
PHImageManager.default().requestAVAsset(forVideo: asset, options: nil) { (avAsset, avAudioMix, info) in
    // I'm trying to find the UTI/MIME type here...
}
One iffy solution I found was to grab the pathExtension of the file by casting the avAsset as an AVURLAsset:
guard let avURLAsset = avAsset as? AVURLAsset else { return }
let videoExt = avURLAsset.url.pathExtension
This seems to get the corresponding file type ('m4v', 'mov', 'mp4') in basic test cases, but I'm worried that this is not a robust enough solution. I saw another post (Finding image type from NSData or UIImage) that details grabbing the image type by looking at the bytes of NSData, but it did not touch on video files.
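For what it's worth, if the path extension is all you have, one possible way to turn it into a UTI and MIME type is the MobileCoreServices tag-mapping functions. This is just a sketch of that idea, not something from the original post:

import MobileCoreServices

// Map a file extension such as "mov" to a UTI and a MIME type.
func utiAndMimeType(forPathExtension ext: String) -> (uti: String?, mime: String?) {
    guard let uti = UTTypeCreatePreferredIdentifierForTag(
        kUTTagClassFilenameExtension, ext as CFString, nil)?.takeRetainedValue() else {
        return (nil, nil)
    }
    var mime: String? = nil
    if let mimeCF = UTTypeCopyPreferredTagWithClass(uti, kUTTagClassMIMEType)?.takeRetainedValue() {
        mime = mimeCF as String
    }
    return (uti as String, mime)
}

// utiAndMimeType(forPathExtension: "mov") should yield
// ("com.apple.quicktime-movie", "video/quicktime")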
I have also tried the solution suggested at How to get MIME type for image or video in iOS 8 using PHAsset?:
let requestContentEditingOptions = PHContentEditingInputRequestOptions()
asset.requestContentEditingInput(with: requestContentEditingOptions) { (contentEditingInput, contentEditingInfo) in
    guard let videoUTI = contentEditingInput?.uniformTypeIdentifier else { completion(nil); return }
}
Unfortunately, the uniformTypeIdentifier property came up as nil when I attempted this. If anyone sees a potential problem with my implementation of this, I'd love to hear it.
Has anyone else run into this before? I would love to hear any ideas, or whether anyone thinks the pathExtension is a reasonable option for a production app.
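Not an approach the post mentions, but possibly worth exploring: on iOS 9+ each PHAssetResource exposes a uniformTypeIdentifier, so a sketch along these lines might avoid requestAVAsset entirely (the function name and fallback behaviour here are assumptions):

import Photos

// Return the UTI of the asset's video resource, if any (iOS 9+).
func videoUTI(for asset: PHAsset) -> String? {
    let resources = PHAssetResource.assetResources(for: asset)
    // Prefer the original video resource; otherwise fall back to the first resource.
    let videoResource = resources.first(where: { $0.type == .video }) ?? resources.first
    return videoResource?.uniformTypeIdentifier
}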

Get PHAsset from iOS Share Extension

I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between the extension and the container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image or video, iOS will duplicate it in that OutgoingTemp directory. The documentation doesn't say when it will be deleted; hopefully soon.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get a PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's filename; you can use this to match the PHAsset.
/// Assets that were handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix, e.g. IMG_193
private lazy var imageAssetDictionary: [String : PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String : PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: image is shared from the Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // cut the file's suffix if present, to get a file name like IMG_1309
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}

How to convert a video NSURL to an ALAsset?

Facebook sharing requires an ALAsset such as the following:
let content = FBSDKShareVideoContent()
//The videos must be less than 12MB in size.
let bundle = NSBundle.mainBundle()
let path = bundle.URLForResource("a", withExtension: "mp4")
let video = FBSDKShareVideo()
// doesn't work; needs to be an "asset url" (ALAsset)
//video.videoURL = path
content.video = video
let dialog = FBSDKShareDialog()
dialog.shareContent = content
dialog.show()
How is it possible to take a local bundle document, or an NSData object, and convert it to an ALAsset?
(My initial thinking was to save the video to the local camera roll, and then load the list and select it, but that adds unnecessary interface steps.)
The documentation for an ALAsset states that
An ALAsset object represents a photo or a video managed by the Photo application.
so I'm pretty sure that you have to write the video to the camera roll before using it as an ALAsset. However, you don't need to open the camera roll and have a user pick the asset in order to use it. When writing to the ALAssetLibrary using
library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: { (newURL, error) -> Void in ... })
you get the asset URL in the newURL completion-block variable. Use it in the Facebook sharing call:
let content = FBSDKShareVideoContent()
content.video = FBSDKShareVideo(videoURL: newURL)
FBSDKShareAPI.shareWithContent(content, delegate: self)
NSLog("Facebook content shared \(content.video.videoURL)")
You can do this sharing inside the completion block if you so desire, or you can save the newURL from the completion block and use it somewhere else.
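Putting the answer's two snippets together, a rough sketch of the combined flow might look like this (Swift 2 and the old FBSDK naming, assuming movieURL points at the bundled video and photo-library access has been granted):

import AssetsLibrary
import FBSDKShareKit

let library = ALAssetsLibrary()
library.writeVideoAtPathToSavedPhotosAlbum(movieURL) { newURL, error in
    guard let assetURL = newURL where error == nil else { return }
    // assetURL is now an assets-library:// URL, which FBSDKShareVideo accepts
    let video = FBSDKShareVideo()
    video.videoURL = assetURL
    let content = FBSDKShareVideoContent()
    content.video = video
    FBSDKShareAPI.shareWithContent(content, delegate: nil)
}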

Swift Share extension, ALAsset is nil

I want to get some extra info about the images I'll share with the share extension. I can create the UIImage from the URL, but when I try to obtain an ALAsset I get nil. Has anyone had this problem?
itemProvider!.loadItemForTypeIdentifier(String(kUTTypeImage), options: nil, completionHandler: { (decoder: NSSecureCoding!, error: NSError!) -> Void in
    if ALAssetsLibrary.authorizationStatus() == ALAuthorizationStatus.Authorized {
        if let url = decoder as? NSURL {
            ALAssetsLibrary().assetForURL(url, resultBlock: { (myasset: ALAsset!) -> Void in
                println(url)
                println(fm.fileExistsAtPath(url.path!))   // fm is an NSFileManager
                println(myasset)
                let location = myasset?.valueForProperty(ALAssetPropertyLocation) as CLLocation?
                let date = myasset?.valueForProperty(ALAssetPropertyDate) as NSDate?
                self.extensionContext?.completeRequestReturningItems([AnyObject](), completionHandler: nil)
            }, failureBlock: { (myerror: NSError!) -> Void in
            })
        }
    }
})
The output is
file:///var/mobile/Media/DCIM/102APPLE/IMG_2977.JPG
true
nil
The immediate issue is that you are passing a file URL in place of an asset URL in this line: ALAssetsLibrary().assetForURL(url, resultBlock: { (myasset: ALAsset!) -> Void in.
Share extensions return the URL of the path on the iPhone's file system, something of the form file:///... These are not the same as the asset URLs that assetForURL requires.
Unfortunately, though this makes the code more correct, it still doesn't fix the issue. I spent some time with many different approaches. Writing a new image to disk via the AssetsLibrary and the given file path will return an asset URL upon completion, which will work successfully, though you obviously don't want duplicate photos in your camera roll. (Note: there is no way to delete an ALAsset.) You could probably hold onto the file path and delete the new image when you are done with it, but that is an extremely messy approach.
I ended up rewriting my approach given these limitations.
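For completeness, a rough sketch of the write-a-copy workaround described above might look like the following (Swift 2 / ALAssetsLibrary era; the function name and error handling are illustrative, and the duplicate-photo caveat still applies):

import AssetsLibrary

func assetByWritingCopy(fileURL: NSURL, library: ALAssetsLibrary, completion: (ALAsset?) -> Void) {
    guard let data = NSData(contentsOfURL: fileURL) else { completion(nil); return }
    // Writing a copy into the library hands back an assets-library:// URL...
    library.writeImageDataToSavedPhotosAlbum(data, metadata: nil) { assetURL, error in
        guard let assetURL = assetURL where error == nil else { completion(nil); return }
        // ...which assetForURL does accept.
        library.assetForURL(assetURL,
            resultBlock: { completion($0) },
            failureBlock: { _ in completion(nil) })
    }
}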

Loading image from "My Photo Stream" using UIImagePicker results URL and PHAsset on iOS8

I am updating an app from iOS 7 to iOS 8 and struggling to get UIImagePicker to load a picture from the "My Photo Stream" category of photos. The UIImagePicker implementation is standard, and I retrieve the URL of the selected photo with:
NSURL *url = [info objectForKey:@"UIImagePickerControllerReferenceURL"];
I then use the new iOS 8 APIs to load this photo:
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
This fails: result.count is zero and no image is found. The URL has a different UUID than if I select the photo from "Moments" or "Camera Roll", but it looks well formed (the phone is running 8.1):
url NSURL * #"assets-library://asset/asset.JPG?id=4522DBD1-862C-42BE-AF7C-0D6C76CA7590&ext=JPG"
Anyone have some code to load photos from "My Photo Stream" or a way to disable it for the UIImagePicker display?
Using the older ALAssetsLibrary assetForURL API also fails to load these images.
I think you can try this to get the "My Photo Stream" album:
PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumMyPhotoStream options:nil];
Photos in the My Photo Stream album are uploaded to and managed by iCloud (photos captured within the last 30 days are kept there), while PHAsset and ALAssetsLibrary deal with locally stored assets. Here is a test I did on my device: I used an image picker to get a photo URL from the stream album, which was
"assets-library://asset/asset.JPG?id=60AE4E50-B835-47EB-B896-5974C21F8C9B&ext=JPG";
And using PHAsset on the same photo I get its localIdentifier:
"60AE4E50-B835-47EB-B896-5974C21F8C9B/L0/001"
So I think you can strip the id from the URL, find out which asset's local identifier contains it, and that gives you the PHAsset you want.
After not finding any answers to this anywhere, I created the following extension to PHAsset, which works well as of iOS 8.2, although I assume it's theoretically slow. Even though one of the prior comments says this was fixed in an iOS 8.2 beta, the bug was still present for me now that iOS 8.2 is released.
import Photos
import UIKit

extension PHAsset {
    class func fetchAssetWithALAssetURL(alURL: NSURL) -> PHAsset? {
        let phPhotoLibrary = PHPhotoLibrary.sharedPhotoLibrary()
        let assetManager = PHImageManager()
        var phAsset: PHAsset?
        let optionsForFetch = PHFetchOptions()
        optionsForFetch.includeHiddenAssets = true
        var fetchResult = PHAsset.fetchAssetsWithALAssetURLs([alURL], options: optionsForFetch)
        if fetchResult?.count > 0 {
            return fetchResult[0] as? PHAsset
        } else {
            var str = alURL.absoluteString!
            let startOfString = advance(find(str, "=")!, 1)
            let endOfString = advance(startOfString, 36)
            let range = Range<String.Index>(start: startOfString, end: endOfString)
            let localIDFragment = str.substringWithRange(range)
            let fetchResultForPhotostream = PHAssetCollection.fetchAssetCollectionsWithType(PHAssetCollectionType.Album, subtype: PHAssetCollectionSubtype.AlbumMyPhotoStream, options: nil)
            if fetchResultForPhotostream?.count > 0 {
                let photostream = fetchResultForPhotostream![0] as PHAssetCollection
                let fetchResultForPhotostreamAssets = PHAsset.fetchAssetsInAssetCollection(photostream, options: optionsForFetch)
                if fetchResultForPhotostreamAssets?.count >= 0 {
                    var stop: Bool = false
                    for var i = 0; i < fetchResultForPhotostreamAssets.count && !stop; i++ {
                        let phAssetBeingCompared = fetchResultForPhotostreamAssets[i] as PHAsset
                        if phAssetBeingCompared.localIdentifier.rangeOfString(localIDFragment, options: nil, range: nil, locale: nil) != nil {
                            phAsset = phAssetBeingCompared
                            stop = true
                        }
                    }
                    return phAsset
                }
            }
            return nil
        }
    }
}
Loading an image with a specified URL from My Photo Stream using ALAssetsLibrary is discussed here:
ALAssetsLibrary assetForURL: always returning nil for photos in "My Photo Stream" in iOS 8.1
In short, instead of using assetForURL you have to iterate through all items in a Photo Stream and compare their URL with yours. The link above contains a code example.
It looks awkward, is slower than assetForURL, and it is not easy to determine whether no file with the specified URL exists, but I did not find any other way to do it (other than migrating to the Photos framework).
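As a rough illustration of that iteration (not the code from the linked answer), something like the following could work in the ALAssetsLibrary world; `targetURL` is the reference URL from the image picker, and the exact casts may need adjusting for your Swift version:

import AssetsLibrary

func findPhotoStreamAsset(targetURL: NSURL, library: ALAssetsLibrary, completion: (ALAsset?) -> Void) {
    var matched = false
    library.enumerateGroupsWithTypes(ALAssetsGroupType(ALAssetsGroupPhotoStream), usingBlock: { group, stop in
        guard let group = group else {
            // A nil group marks the end of enumeration.
            if !matched { completion(nil) }
            return
        }
        group.enumerateAssetsUsingBlock { asset, _, innerStop in
            guard let asset = asset, assetURL = asset.valueForProperty(ALAssetPropertyAssetURL) as? NSURL else { return }
            if assetURL.absoluteString == targetURL.absoluteString {
                matched = true
                completion(asset)
                innerStop.memory = true
                stop.memory = true
            }
        }
    }, failureBlock: { _ in
        completion(nil)
    })
}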
For PHAsset, I use
[[PHImageManager defaultManager] requestImageForAsset:imageAsset
                                            targetSize:targetSize
                                           contentMode:PHImageContentModeAspectFit
                                               options:options
                                         resultHandler:^(UIImage *img, NSDictionary *info) {
    // code to handle the image
}];
I tried to use requestImageDataForAsset, but it returns nil for streamed photos.
This works for me on iOS 8.0:
import Photos

func imageFromAsset(nsurl: NSURL) {
    let asset = PHAsset.fetchAssetsWithALAssetURLs([nsurl], options: nil).firstObject as! PHAsset
    let targetSize = CGSizeMake(300, 300)
    let options = PHImageRequestOptions()
    PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: targetSize, contentMode: PHImageContentMode.AspectFit, options: options, resultHandler: { (result, info) in
        // imageE - UIImageView on scene
        self.imageE.image = result
    })
}