Facebook sharing requires an ALAsset such as the following:
let content = FBSDKShareVideoContent()
//The videos must be less than 12MB in size.
let bundle = NSBundle.mainBundle()
let path = bundle.URLForResource("a", withExtension: "mp4")
let video = FBSDKShareVideo()
// doesn't work; needs to be an "asset url" (ALAsset)
//video.videoURL = path
content.video = video
let dialog = FBSDKShareDialog()
dialog.shareContent = content
dialog.show()
How is it possible to take a local bundle document, or an NSData object, and convert it to an ALAsset?
(My initial thought was to save the video to the local camera roll, then load the list and select it, but that adds unnecessary interface steps.)
The documentation for an ALAsset states that
An ALAsset object represents a photo or a video managed by the Photo application.
so I'm pretty sure you have to write the video to the camera roll before using it as an ALAsset. However, you don't need to open the camera roll and have the user pick the asset in order to use it. When writing to the ALAssetsLibrary using
library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: { (newURL, error) -> Void in
    // use newURL here
})
you get the asset URL in the newURL parameter of the completion block. Use it in the Facebook sharing call:
let content = FBSDKShareVideoContent()
content.video = FBSDKShareVideo(videoURL: newURL)
FBSDKShareAPI.shareWithContent(content, delegate: self)
NSLog("Facebook content shared \(content.video.videoURL)")
You can do the sharing inside the completion block if you so desire, or you can save the newURL from the completion block and use it somewhere else.
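Putting the pieces together, here is a minimal sketch of the whole flow (assuming movieURL points at the bundled video and self conforms to FBSDKSharingDelegate):
import AssetsLibrary
import FBSDKShareKit

let movieURL = NSBundle.mainBundle().URLForResource("a", withExtension: "mp4")!
let library = ALAssetsLibrary()
library.writeVideoAtPathToSavedPhotosAlbum(movieURL) { newURL, error in
    guard let assetURL = newURL where error == nil else { return }
    // assetURL is an assets-library:// URL, which is what FBSDKShareVideo expects
    let content = FBSDKShareVideoContent()
    content.video = FBSDKShareVideo(videoURL: assetURL)
    FBSDKShareAPI.shareWithContent(content, delegate: self)
}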
Apple's new iOS 14 PHPickerViewController delegate receives only an NSItemProvider. The 2020 WWDC video shows how to get from there to a UIImage:
let prov = result.itemProvider
prov.loadObject(ofClass: UIImage.self) { im, err in
    if let im = im as? UIImage {
        DispatchQueue.main.async {
            // display the image here
        }
    }
}
But what about if the user chose a live photo or a video? How do we get that?
Live photos are easy; do it exactly the same way. You can do that because PHLivePhoto is a class. So:
let prov = result.itemProvider
prov.loadObject(ofClass: PHLivePhoto.self) { livePhoto, err in
    if let photo = livePhoto as? PHLivePhoto {
        DispatchQueue.main.async {
            // display the live photo here
        }
    }
}
Videos are harder. The problem is that you do not want to be handed the data; it does you no good and is likely to be huge. You want the data saved to disk so that you can access the video URL, just like what UIImagePickerController used to do. You can in fact ask the item provider to save its data for you, but it wants to let go of that data when the completion handler returns. My solution is to access the URL in a .sync function:
let prov = result.itemProvider
prov.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, err in
    if let url = url {
        DispatchQueue.main.sync {
            // display the video here
        }
    }
}
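If you'd rather not block with .sync, another option (a sketch of my own, not from the WWDC video) is to copy the file to a location you own before the completion handler returns; the copy outlives the provider's temporary file:
let prov = result.itemProvider
prov.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, err in
    guard let url = url else { return }
    // Copy to a location we own; the provider's file is deleted when this handler returns.
    let dest = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("mov")
    do {
        try FileManager.default.copyItem(at: url, to: dest)
        DispatchQueue.main.async {
            // display the video at dest here
        }
    } catch {
        print("copy failed: \(error)")
    }
}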
I would like to play a video file in my ViewController, which is loaded on every page of my PageViewController. As you can see, I use a plugin called Carlos to cache the videos (which initially need to be downloaded from a server) so that they do not have to be downloaded every time the user hits a new page. However, I can't figure out how to play this downloaded file (NSData). I would really like to know how I can get the URL of the downloaded file so that I can play it using AVPlayer.
Code (still using the URL from the server):
let omniCache = videoCache.cache
let request = omniCache.get(URL(string: video!)!)
request
    .onSuccess { videoFile in
        print("The file...")
        print(videoFile)
        // How can I get the local URL here instead of my server URL?
        if let videoURL = URL(string: self.video!) {
            if self.player == nil {
                let playerItemToBePlayed = AVPlayerItem(url: videoURL as URL)
                self.player = AVPlayer(playerItem: playerItemToBePlayed)
                let playerLayer = AVPlayerLayer(player: self.player)
                playerLayer.frame = self.view.frame
                self.controlsContainerView.layer.insertSublayer(playerLayer, at: 0)
            }
        }
    }
    .onFailure { error in
        print("An error occurred :( \(error)")
    }
Look at this code of yours:
videoFile in
print("The file..." )
print(videoFile)
if let videoURL = URL(string: self.video!){
So in the first line you print videoFile, which turns out to be the data of the file. But then you ignore it! You never mention videoFile again. Why do you ignore it? That is the data; you already have the data. Now play it!
If the data is a file, get its file URL and play it. If it is in memory — it definitely should not be, because a video held entirely in memory would crash your program — save it, and get that file URL and play it.
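A minimal sketch of that save-and-play step (assuming videoFile is the NSData delivered by the cache's onSuccess, and reusing the player setup from the question):
// Hypothetical: videoFile is the NSData/Data delivered by the cache.
let tmpURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("cachedVideo.mp4")
do {
    try (videoFile as Data).write(to: tmpURL)
    let playerItem = AVPlayerItem(url: tmpURL)
    self.player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: self.player)
    playerLayer.frame = self.view.frame
    self.controlsContainerView.layer.insertSublayer(playerLayer, at: 0)
    self.player?.play()
} catch {
    print("Could not write the video data to disk: \(error)")
}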
[I have to ask, however, why you are interposing this cache plug-in between yourself and such a simple task. Why don't you just download the remote video to disk, yourself?]
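For what it's worth, the direct download suggested in that aside is only a few lines (a sketch; remoteURL is an assumed placeholder for your server URL):
// Hypothetical remoteURL: the server URL of the video.
let task = URLSession.shared.downloadTask(with: remoteURL) { tempURL, response, error in
    guard let tempURL = tempURL, error == nil else { return }
    let destination = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("downloadedVideo.mp4")
    try? FileManager.default.removeItem(at: destination) // replace any stale copy
    do {
        try FileManager.default.moveItem(at: tempURL, to: destination)
        // destination is now a stable local file URL you can hand to AVPlayer
    } catch {
        print("Could not move the downloaded file: \(error)")
    }
}
task.resume()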
Since iOS 10, Apple has provided the support for downloading HLS (m3u8) video for offline viewing.
My question is: is it only possible to download HLS while it is being played, or can we start the download when the user presses a download button and show progress?
Has anyone implemented this in Objective-C? My previous app is written in Objective-C, and now I want to add support for downloading HLS rather than MP4 (previously I downloaded MP4 for offline viewing).
I am really desperate for this. Please share your thoughts, or any code if you have implemented it.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)

    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)

    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)

        // Create new AVAssetDownloadTask for the desired asset
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start task and begin download
        downloadTask?.resume()
    }
} // end method

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
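To the progress part of the question: no, the stream does not have to be playing. The same AVAssetDownloadDelegate also reports loaded time ranges as the download proceeds, so you can drive a progress bar from it; a sketch:
func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange,
                totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    // percentComplete is in 0.0...1.0; update your download button / progress UI here.
    print("Download progress: \(percentComplete)")
}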
If you don't understand what's going on, read up about it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
//get the saved link from the user defaults
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
let baseUrl = URL(fileURLWithPath: NSHomeDirectory()) //app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) //append the saved link to home path
Now use the path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the function above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
A live playlist uses a sliding window. You need to reload it periodically, after the target duration, and download only the new segments as they appear in the list (older ones are removed over time).
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this, and the code is fairly simple. You can find it here - https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.
I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between an extension and its container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image or video, iOS will duplicate it in that OutgoingTemp directory. Nowhere does the documentation say when it will be deleted; hopefully soon enough.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get a PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's filename; use this to match the PHAsset.
/// Assets handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix, e.g. IMG_193.
private lazy var imageAssetDictionary: [String: PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String: PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: the image was shared from the Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix, if any, to get a name like IMG_1309.
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}
Has anyone figured out how to extract the video portion from a Live Photo? I'm working on an app to convert Live Photos into a GIF, and the first step is to get the video file from the Live Photo. It seems like it should be possible, because if you plug your phone into a Mac you can see the separate image and video files. I've kind of run into a brick wall in the extraction process; I've tried many ways and they all fail.
The first thing I did was obtain a PHAsset for what I think is the video part of the Live Photo, by doing the following:
if let livePhoto = info["UIImagePickerControllerLivePhoto"] as? PHLivePhoto {
    let assetResources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
    for assetRes in assetResources {
        if assetRes.type == .PairedVideo {
            let assets = PHAsset.fetchAssetsWithLocalIdentifiers([assetRes.assetLocalIdentifier], options: nil)
            if let asset = assets.firstObject as? PHAsset {
                // ... (the conversion attempts below go here)
            }
        }
    }
}
To convert the PHAsset to an AVAsset I've tried:
asset.requestContentEditingInputWithOptions(nil, completionHandler: { (contentEditingInput, info) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let movieUrl = url.absoluteString + ".mov"
        let avAsset = AVURLAsset(URL: NSURL(fileURLWithPath: movieUrl), options: nil)
        debugPrint(avAsset)
        debugPrint(avAsset.duration.value)
    }
})
I don't think this one works, because the debug print of duration.value gives 0.
I've also tried it without the ".mov" addition, and it still doesn't work.
I also tried:
PHImageManager.defaultManager().requestAVAssetForVideo(asset, options: nil, resultHandler: { (avAsset, audioMix, info) -> Void in
    debugPrint(avAsset)
})
And the debugPrint(avAsset) prints nil, so it doesn't work.
I'm kind of afraid they might have made this impossible; it feels like I'm going in circles, since the PHAsset I got still seems to be a Live Photo and not actually a video.
Use the PHAssetResourceManager to get the video file from the PHAssetResource.
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil) { error in
    // Video file has been written to the path specified via fileURL
}
NOTE: The Live Photo specific APIs were introduced in iOS 9.1
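Combined with the fetch code from the question, a minimal end-to-end sketch (assuming livePhoto is the PHLivePhoto from the picker; the output file name is arbitrary):
let assetResources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
if let videoResource = assetResources.filter({ $0.type == .PairedVideo }).first {
    let fileURL = NSURL(fileURLWithPath: NSTemporaryDirectory() + "livePhoto.mov")
    PHAssetResourceManager.defaultManager().writeDataForAssetResource(videoResource,
        toFile: fileURL, options: nil) { error in
        if error == nil {
            // fileURL now holds the Live Photo's paired .mov file
        }
    }
}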
// Suppose you have a PHAsset instance (you can get it via [PHAsset fetchAssetsWithOptions:...])
PHAssetResource *videoResource = nil;
NSArray *resourcesArray = [PHAssetResource assetResourcesForAsset:asset];

const NSInteger livePhotoAssetResourcesCount = 2;
const NSInteger videoPartIndex = 1;

if (resourcesArray.count == livePhotoAssetResourcesCount) {
    videoResource = resourcesArray[videoPartIndex];
}

if (videoResource) {
    NSString * const fileURLKey = @"_fileURL";
    NSURL *videoURL = [videoResource valueForKey:fileURLKey];
    // load the video URL using AVKit or AVFoundation
}
I accidentally did. I have an iOS app called Goodreader (available in the App Store) which features a Windows-like file manager. When importing a Live Photo, it saves it as a folder ending in .pvt containing the JPG and MOV files. There is only one caveat: to see the "import to Goodreader" option, you need to open the Live Photo from within the Messages app after you've sent it to yourself or somebody else; the option doesn't appear from the Photos app.