I am using PHCachingImageManager().requestAVAssetForVideo to fetch videos from an iPhone, but I am not getting all of them: only the videos stored in Photos are being fetched, not everything on the device.
How can I get all the videos stored in iPhone storage?
Here is the code:
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "modificationDate", ascending: true)]
let assetResults = PHAsset.fetchAssetsWithMediaType(.Video, options: options)
for i in 0 ..< assetResults.count {
    let object: AnyObject = assetResults[i]
    if let asset = object as? PHAsset {
        let options = PHVideoRequestOptions()
        options.deliveryMode = .Automatic
        options.networkAccessAllowed = true
        options.version = .Current
        options.progressHandler = { (progress: Double,
                                     error: NSError?,
                                     stop: UnsafeMutablePointer<ObjCBool>,
                                     info: [NSObject : AnyObject]?) in
        }
        /* Now get the video */
        PHCachingImageManager().requestAVAssetForVideo(asset, options: options, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [NSObject : AnyObject]?) -> Void in
            dispatch_async(dispatch_get_main_queue(), {
                /* Did we get the URL to the video? */
                if let asset = asset as? AVURLAsset {
                    // use asset.URL here
                }
            })
        })
    }
}
You need to have a look inside PHFetchOptions. Maybe some of your video assets are hidden?
var fetchOptions = PHFetchOptions()
fetchOptions.includeHiddenAssets = true
From the docs:
// Whether hidden assets are included in fetch results. Defaults to NO
public var includeHiddenAssets: Bool
Hope it helps.
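If hidden assets are the issue, a minimal sketch (using the same Swift 2 era API as the question's code) could combine that option with the video fetch like this:

```swift
let fetchOptions = PHFetchOptions()
fetchOptions.includeHiddenAssets = true   // include assets the user has hidden
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "modificationDate", ascending: true)]
// Fetch every video, hidden ones included
let assetResults = PHAsset.fetchAssetsWithMediaType(.Video, options: fetchOptions)
print(assetResults.count)
```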
Related
I am trying to upload a video to Firebase Storage. To be able to do that I have to get the following:
let localFile = URL(string: "path/to/video")!
and then upload it with:
let uploadTask = riversRef.putFile(from: localFile)
My problem is that I don't know how to get that path. So far I have a custom image/video picker, and a function that can upload images. I just don't know how to upload videos; for that I believe I need to get the path.
Any ideas on how to get localFile?
UPDATE:
This is my custom Picker
import SwiftUI
import PhotosUI

class ImagePickerViewModel: ObservableObject {
    // MARK: Properties
    @Published var fetchedImages: [ImageAsset] = []
    @Published var selectedImages: [ImageAsset] = []

    init() {
        fetchImages()
    }

    // MARK: Fetching Images
    func fetchImages() {
        let options = PHFetchOptions()
        // MARK: Modify As Per Your Wish
        options.includeHiddenAssets = false
        options.includeAssetSourceTypes = [.typeUserLibrary]
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        PHAsset.fetchAssets(with: .image, options: options).enumerateObjects { asset, _, _ in
            let imageAsset: ImageAsset = .init(asset: asset)
            self.fetchedImages.append(imageAsset)
        }
        PHAsset.fetchAssets(with: .video, options: options).enumerateObjects { asset, _, _ in
            let imageAsset: ImageAsset = .init(asset: asset)
            self.fetchedImages.append(imageAsset)
        }
    }
}
On my view I have this:
.popupImagePicker(show: $showPicker) { assets in
    // MARK: Do Your Operation With PHAsset
    // I'm Simply Extracting Image
    // .init() Means Exact Size of the Image
    let manager = PHCachingImageManager.default()
    let options = PHImageRequestOptions()
    options.isSynchronous = true
    DispatchQueue.global(qos: .userInteractive).async {
        assets.forEach { asset in
            manager.requestImage(for: asset, targetSize: .init(), contentMode: .default, options: options) { image, _ in
                guard let image = image else { return }
                DispatchQueue.main.async {
                    self.pickedImages.append(image)
                }
            }
        }
    }
}
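To get a local file URL for a picked video (the localFile the question asks about), one approach, sketched here and untested, is to request the AVAsset for the PHAsset and read the URL off the resulting AVURLAsset; `pickedVideoURLs` is a hypothetical property for collecting the results:

```swift
// Sketch: resolve a file URL for each picked video asset.
// Assumes `assets` is the [PHAsset] from the picker and
// `self.pickedVideoURLs` is a hypothetical [URL] property.
let manager = PHImageManager.default()
let videoOptions = PHVideoRequestOptions()
videoOptions.isNetworkAccessAllowed = true // allow fetching from iCloud

assets.forEach { asset in
    guard asset.mediaType == .video else { return }
    manager.requestAVAsset(forVideo: asset, options: videoOptions) { avAsset, _, _ in
        // For locally stored videos this is usually an AVURLAsset
        if let urlAsset = avAsset as? AVURLAsset {
            DispatchQueue.main.async {
                self.pickedVideoURLs.append(urlAsset.url)
                // this URL can then be passed to putFile(from:)
            }
        }
    }
}
```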
In my project I show library images and videos to the user, but on some devices I get a crash like ArrayBuffer.getElementSlowPath. Can anyone guide me on how to replicate this issue? I got it from Crashlytics.
Here is my code for getting videos from PHAssets.
func getVideo(withCompletionHandler completion: @escaping CompletionHandler) {
    let fetchOptions = PHFetchOptions()
    let requestOptions = PHVideoRequestOptions()
    requestOptions.isNetworkAccessAllowed = false
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.video, options: fetchOptions)
    fetchResult.enumerateObjects({ (assest, index, isCompleted) in
        if assest.sourceType != PHAssetSourceType.typeiTunesSynced {
            PHImageManager.default().requestAVAsset(forVideo: assest, options: requestOptions, resultHandler: { (asset: AVAsset?, video: AVAudioMix?, dic: [AnyHashable: Any]?) in
                if let _ = asset as? AVURLAsset {
                    let objAssest = GallaryAssets()
                    objAssest.objAssetsType = assetsType.videoType
                    objAssest.createdDate = assest.creationDate
                    objAssest.assetsDuration = assest.duration
                    objAssest.assetsURL = (asset as! AVURLAsset).url
                    objAssest.localizationStr = assest.localIdentifier
                    objAssest.locationInfo = LocationInfo()
                    if let location = assest.location {
                        objAssest.locationInfo.Latitude = "\(location.coordinate.latitude)"
                        objAssest.locationInfo.Longitude = "\(location.coordinate.longitude)"
                    }
                    self.media.add(objAssest)
                }
                completion(self.media)
            })
        }
    })
}
I can't say for sure that this is the cause of your crash, but for anyone else finding this question on Google search results:
I encountered a similar cryptic stack trace with ArrayBuffer.getElementSlowPath in Crashlytics. After much debugging I reproduced the crash locally and it turned out that a typed Swift array had an element in it which was not of the expected type. This had happened because the array in question was actually bridged from Objective-C which is less strict about types in arrays.
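As an illustrative sketch (not the asker's code), this is the kind of mismatch that can slip through bridging: an NSArray containing a wrong-typed element is force-cast to a typed Swift array, and the failure surfaces only later, deep in array access code:

```swift
import Foundation

// An Objective-C style array that is supposed to contain only strings…
let bridged: NSArray = ["one", "two", 3] // …but a number slipped in

// Bridging NSArray to a Swift array is lazy: the cast itself can succeed,
// and the per-element type check happens when an element is actually read
let strings = bridged as! [String]
let last = strings[2] // traps here, inside the array's element accessor
```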
Maybe some video resources are coming from iCloud. Try passing a fresh PHVideoRequestOptions() (i.e. without disabling network access) and then check your flow again; hopefully that resolves the crash.
func getVideo(withCompletionHandler completion: @escaping CompletionHandler) {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.video, options: fetchOptions)
    fetchResult.enumerateObjects({ (assest, index, isCompleted) in
        if assest.sourceType != PHAssetSourceType.typeiTunesSynced {
            PHImageManager.default().requestAVAsset(forVideo: assest, options: PHVideoRequestOptions(), resultHandler: { (asset: AVAsset?, video: AVAudioMix?, dic: [AnyHashable: Any]?) in
                if let _ = asset as? AVURLAsset {
                    let objAssest = GallaryAssets()
                    objAssest.objAssetsType = assetsType.videoType
                    objAssest.createdDate = assest.creationDate
                    objAssest.assetsDuration = assest.duration
                    objAssest.assetsURL = (asset as! AVURLAsset).url
                    objAssest.localizationStr = assest.localIdentifier
                    objAssest.locationInfo = LocationInfo()
                    if let location = assest.location {
                        objAssest.locationInfo.Latitude = "\(location.coordinate.latitude)"
                        objAssest.locationInfo.Longitude = "\(location.coordinate.longitude)"
                    }
                    self.media.add(objAssest)
                    isCompleted.pointee = true
                }
            })
        }
    })
    completion(self.media)
}
Are you removing a controller on completion? enumerateObjects executes synchronously; if you call the completion before it finishes, and the fetchResult object gets deallocated before the function ends its execution, that could cause issues.
Try calling the completion block after the enumeration ends, and if you only want one result, set the isCompleted pointer to true, which stops the enumeration:
func getVideo(withCompletionHandler completion: @escaping CompletionHandler) {
    let fetchOptions = PHFetchOptions()
    let requestOptions = PHVideoRequestOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.video, options: fetchOptions)
    fetchResult.enumerateObjects({ (assest, index, isCompleted) in
        if assest.sourceType != PHAssetSourceType.typeiTunesSynced {
            PHImageManager.default().requestAVAsset(forVideo: assest, options: requestOptions, resultHandler: { (asset: AVAsset?, video: AVAudioMix?, dic: [AnyHashable: Any]?) in
                if let _ = asset as? AVURLAsset {
                    let objAssest = GallaryAssets()
                    objAssest.objAssetsType = assetsType.videoType
                    objAssest.createdDate = assest.creationDate
                    objAssest.assetsDuration = assest.duration
                    objAssest.assetsURL = (asset as! AVURLAsset).url
                    objAssest.localizationStr = assest.localIdentifier
                    objAssest.locationInfo = LocationInfo()
                    if let location = assest.location {
                        objAssest.locationInfo.Latitude = "\(location.coordinate.latitude)"
                        objAssest.locationInfo.Longitude = "\(location.coordinate.longitude)"
                    }
                    self.media.add(objAssest)
                    isCompleted.pointee = true
                }
            })
        }
    })
    completion(self.media)
}
I am trying to get all the photos from the camera roll using the Photos framework, but it is taking a lot of time to fetch them all.
Is there any way to add pagination, so I can fetch while scrolling?
var images = [UIImage]()
var assets = [PHAsset]()

fileprivate func assetsFetchOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    //fetchOptions.fetchLimit = 40 //uncomment to limit photos
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    fetchOptions.sortDescriptors = [sortDescriptor]
    return fetchOptions
}

fileprivate func fetchPhotos() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
    DispatchQueue.global(qos: .background).async {
        allPhotos.enumerateObjects({ (asset, count, stop) in
            //print(count)
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 200, height: 200)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.images.append(image)
                    self.assets.append(asset)
                }
                if count == allPhotos.count - 1 {
                    DispatchQueue.main.async {
                        self.collectionView?.reloadData()
                    }
                }
            })
        })
    }
}
allPhotos is of type PHFetchResult<PHAsset>, which is a lazy collection, i.e. it doesn't actually go out and get a photo until you ask for one; that is exactly what enumerateObjects is doing. You can grab the photos one at a time with the subscript operator, or get a range of objects with objects(at:) to page through the collection as needed.
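A rough sketch of that paging idea (pageSize and loadedCount are hypothetical names, and this is untested):

```swift
let pageSize = 40
var loadedCount = 0

// Load the next page of assets from an existing PHFetchResult<PHAsset>
func loadNextPage(from allPhotos: PHFetchResult<PHAsset>) {
    guard loadedCount < allPhotos.count else { return } // nothing left
    let upper = min(loadedCount + pageSize, allPhotos.count)
    let indexes = IndexSet(integersIn: loadedCount ..< upper)
    // objects(at:) only materializes the requested range
    let pageAssets = allPhotos.objects(at: indexes)
    loadedCount = upper
    // request thumbnails for just pageAssets here, e.g. from the
    // collection view's willDisplay or prefetch callbacks
}
```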
I'm trying to retrieve a PHAsset; however, PHAsset.fetchAssets(withALAssetURLs:options:) is deprecated (introduced in iOS 8, deprecated in iOS 11), so how can I properly retrieve a PHAsset?
I had the same the issue, first check permissions and request access:
let status = PHPhotoLibrary.authorizationStatus()
if status == .notDetermined {
PHPhotoLibrary.requestAuthorization({status in
})
}
Just hook that up to whatever triggers your UIImagePickerController. The delegate call should now include the PHAsset in the userInfo.
guard let asset = info[UIImagePickerControllerPHAsset] as? PHAsset else { return }
Here is my solution:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if #available(iOS 11.0, *) {
        let asset = info[UIImagePickerControllerPHAsset]
    } else {
        if let assetURL = info[UIImagePickerControllerReferenceURL] as? URL {
            let result = PHAsset.fetchAssets(withALAssetURLs: [assetURL], options: nil)
            let asset = result.firstObject
        }
    }
}
The PHAsset will not appear in the didFinishPickingMediaWithInfo: result unless the user has authorized photo library access, which did not happen for me just by presenting the picker. I added this in the Coordinator init():
let status = PHPhotoLibrary.authorizationStatus()
if status == .notDetermined {
PHPhotoLibrary.requestAuthorization({status in
})
}
I am not sure what you want. Are you trying to target iOS 8?
This is how I fetch photos; it works on iOS (8.0 and later), macOS (10.11 and later), and tvOS (10.0 and later).
The code is commented where it may be confusing: the first function sets the options for fetching the photos, and the second function actually fetches them.
//import the Photos framework
import Photos

//in these arrays I store my images and assets
var images = [UIImage]()
var assets = [PHAsset]()

fileprivate func setPhotoOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    fetchOptions.fetchLimit = 15
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    fetchOptions.sortDescriptors = [sortDescriptor]
    return fetchOptions
}

fileprivate func fetchPhotos() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: setPhotoOptions())
    DispatchQueue.global(qos: .background).async {
        allPhotos.enumerateObjects({ (asset, count, stop) in
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 200, height: 200)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.images.append(image)
                    self.assets.append(asset)
                }
                if count == allPhotos.count - 1 {
                    DispatchQueue.main.async {
                        //basically, here you can do what you want
                        //(after you finish retrieving your assets)
                        //I am reloading my collection view
                        self.collectionView?.reloadData()
                    }
                }
            })
        })
    }
}
Edit based on OP's clarification
You need to set the delegate UIImagePickerControllerDelegate, then implement the following function:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
Within that method, get the image like this:
let image: UIImage = info[UIImagePickerControllerEditedImage] as! UIImage
I would like to fetch all photos that are saved on the device, save them into my app, and then eventually (if the user allows it) delete the originals.
This is the whole class I created for this task:
class ImageAssetsManager: NSObject {
    let imageManager = PHCachingImageManager()

    func fetchAllImages() {
        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.Image.rawValue)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        if #available(iOS 9.0, *) {
            options.fetchLimit = 5
        } else {
            // Fallback on earlier versions
        }
        let imageAssets = PHAsset.fetchAssetsWithOptions(options)
        print(imageAssets.count)
        self.getAssets(imageAssets)
    }

    func getAssets(assets: PHFetchResult) {
        var assetsToDelete: [PHAsset] = []
        assets.enumerateObjectsUsingBlock { (object, count, stop) in
            if let asset = object as? PHAsset {
                let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
                let options = PHImageRequestOptions()
                options.deliveryMode = .FastFormat
                options.synchronous = true
                self.imageManager.requestImageForAsset(asset, targetSize: imageSize, contentMode: .AspectFill, options: options, resultHandler: { [weak self] image, info in
                    self?.addAssetToSync(image, info: info)
                    assetsToDelete.append(asset)
                })
            }
        }
        self.deleteAssets(assetsToDelete)
    }

    func addAssetToSync(image: UIImage?, info: [NSObject : AnyObject]?) {
        guard let image = image else {
            return
        }
        guard let info = info else {
            return
        }
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
            let imageData = UIImageJPEGRepresentation(image, 0.95)!
            let fileUrl = info["PHImageFileURLKey"] as! NSURL
            dispatch_async(dispatch_get_main_queue(), {
                let photoRootItem = DatabaseManager.sharedInstance.getPhotosRootItem()
                let ssid = DatabaseManager.sharedInstance.getSsidInfoByName(ContentManager.sharedInstance.ssid)
                let item = StorageManager.sharedInstance.createFile(imageData, name: fileUrl.absoluteString.fileNameWithoutPath(), parentFolder: photoRootItem!, ssid: ssid!)
            })
        })
    }

    func deleteAssets(assetsToDelete: [PHAsset]) {
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            PHAssetChangeRequest.deleteAssets(assetsToDelete)
        }, completionHandler: { success, error in
            guard let error = error else { return }
        })
    }
}
It's working, but my problem is that it only works for a limited number of photos. When I try it with all of them I get memory warnings and then the app crashes. I know why: I am loading all the photos into memory at once, and that's too much. I could fetch images with that fetch limit in a loop, but I am not sure that is the best solution.
I was hoping for a solution that processes a few photos, releases the memory, and repeats until the end; that change would be somewhere in enumerateObjectsUsingBlock. I am not sure if it helps, but I don't even need the image itself; I just need to copy the image file from the device path to my app's sandbox path.
What's the best solution for this? How do I avoid memory warnings and leaks? Thanks.
Change your dispatch_async calls to dispatch_sync. Then you will process photos one at a time as you walk through enumerateObjectsUsingBlock, instead of trying to process them all at the same time.
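Applied to the question's addAssetToSync, the change might look like this (a sketch in the question's Swift 2 style, untested):

```swift
func addAssetToSync(image: UIImage?, info: [NSObject : AnyObject]?) {
    guard let image = image, info = info else { return }
    // dispatch_sync blocks until this photo is fully processed, so only
    // one image's data is held in memory at a time during enumeration.
    // (Caution: never dispatch_sync onto a queue you are already running
    // on; this assumes enumeration happens off the main thread.)
    dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
        let imageData = UIImageJPEGRepresentation(image, 0.95)!
        let fileUrl = info["PHImageFileURLKey"] as! NSURL
        dispatch_sync(dispatch_get_main_queue(), {
            // hand imageData off to storage as before
        })
    })
}
```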