How to fetch images from latest to oldest into a collection view - iOS

I have a collection view that loads images from the photo library. Currently all the images are listed, but the oldest images appear first. I want to display them from latest to oldest. How can I do this?
The code below is what I am currently using. How do I update it so the images are listed from latest to oldest?
if PHPhotoLibrary.authorizationStatus() == PHAuthorizationStatus.Authorized
{
    let collection: PHFetchResult = PHAssetCollection.fetchAssetCollectionsWithType(.SmartAlbum, subtype: .SmartAlbumUserLibrary, options: nil)
    var i = 0
    repeat
    {
        if (collection.count > 0)
        {
            if let first_Obj: AnyObject = collection.objectAtIndex(i)
            {
                self.assetCollection = first_Obj as! PHAssetCollection
            }
            i += 1
        }
    } while (i < collection.count)
}

You can use PHFetchOptions and its sortDescriptors property (see the PHFetchOptions class reference).
Try this:
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "endDate", ascending: false)]
let collection:PHFetchResult = PHAssetCollection.fetchAssetCollectionsWithType(.SmartAlbum, subtype: .SmartAlbumUserLibrary, options: options)
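That sorts the collections themselves. To show the photos inside the chosen collection newest-first, you would also sort the asset fetch. A minimal sketch in the same Swift 2 style as the question, assuming assetCollection is the property from the question's code:
let assetOptions = PHFetchOptions()
assetOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
// Fetch the assets of the chosen collection, newest first; objectAtIndex(0) is the latest photo.
let assets = PHAsset.fetchAssetsInAssetCollection(self.assetCollection, options: assetOptions)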

Swift 4.2
let allPhotosOptions = PHFetchOptions()
allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
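With creationDate descending, index 0 of fetchResult is the newest asset. A minimal sketch of driving that result into a collection view; the cell class PhotoCell, its imageView, and the reuse identifier are placeholder names, not from the original post:
func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
    return fetchResult.count
}

func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    // Placeholder cell class; adjust to your own cell.
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell", for: indexPath) as! PhotoCell
    let asset = fetchResult.object(at: indexPath.item) // item 0 is the latest photo
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 200, height: 200),
                                          contentMode: .aspectFill,
                                          options: nil) { image, _ in
        cell.imageView.image = image
    }
    return cell
}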

Related

How do I get all videos only from library?

I'm trying to get all the videos in the camera roll on a user's phone when they try to upload a video, but I'm not sure how.
I've done this to get all pictures, and noticed that if I change .image to .video it gets all the videos, but they are still presented as images and you can't play the video:
func fetchImagesFromDeviceLibary() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: getAssetFetchOptions())
    DispatchQueue.global(qos: .background).async {
        // Enumerate objects
        allPhotos.enumerateObjects({ (asset, count, stop) in
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 600, height: 600)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.videos.append(image)
                    self.assets.append(asset)
                    if self.selectedVideo == nil {
                        self.selectedVideo = image
                    }
                    if count == allPhotos.count - 1 {
                        DispatchQueue.main.async {
                            self.collectionView.reloadData()
                        }
                    }
                }
            })
        })
    }
}
func getAssetFetchOptions() -> PHFetchOptions {
    let options = PHFetchOptions()
    options.fetchLimit = 50
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    options.sortDescriptors = [sortDescriptor]
    return options
}
How would I get all the videos and display them on screen so that you can interact with them?
After changing the fetchAssets call from .image to .video, make the corresponding changes in getAssetFetchOptions().
func getAssetFetchOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    // For images only
    // fetchOptions.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)
    // For videos only
    // fetchOptions.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.video.rawValue)
    // For images and videos
    // fetchOptions.predicate = NSPredicate(format: "mediaType == %d || mediaType == %d", PHAssetMediaType.image.rawValue, PHAssetMediaType.video.rawValue)
    // For videos up to some duration, here 10 seconds
    fetchOptions.predicate = NSPredicate(format: "mediaType = %d AND duration < 10", PHAssetMediaType.video.rawValue)
    fetchOptions.fetchLimit = 50
    // Debug: fetch with these options and print what matches
    let imagesAndVideos = PHAsset.fetchAssets(with: fetchOptions)
    print("LIST: \(imagesAndVideos)")
    return fetchOptions
}
Hope this will work for you too.
Create an AVPlayer instance:
let videoURL = URL(fileURLWithPath: "your local video path")
// Create an AVPlayer, passing it the local video URL
let player = AVPlayer(url: videoURL)
let controller = AVPlayerViewController()
controller.player = player
present(controller, animated: true) {
    player.play()
}
Do not forget to import AVKit and AVFoundation.
Also, keep the AVPlayer instance in a property rather than a local variable, so it is not deallocated while the video is playing.
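If the goal is to play one of the fetched video assets rather than a file URL you already have, a sketch using PHImageManager's requestPlayerItem(forVideo:) avoids extracting the URL yourself; the helper function and parameter names here are mine, not from the answer:
import AVKit
import Photos

func play(videoAsset: PHAsset, from presenter: UIViewController) {
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true // allow fetching from iCloud if needed
    PHImageManager.default().requestPlayerItem(forVideo: videoAsset, options: options) { playerItem, _ in
        DispatchQueue.main.async {
            guard let playerItem = playerItem else { return }
            let controller = AVPlayerViewController()
            controller.player = AVPlayer(playerItem: playerItem)
            presenter.present(controller, animated: true) {
                controller.player?.play()
            }
        }
    }
}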

How to fetch all assets from iPhone apart from camera roll?

How do I fetch assets from all albums, including both images and videos?
let fetchOptions = PHFetchOptions()
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
let allPhotos = PHAsset.fetchAssets(with: .image, options: fetchOptions)
This code fetches only images from the camera roll.
You can use this code to fetch all assets from the iPhone.
PHPhotoLibrary.shared().register(self as PHPhotoLibraryChangeObserver)
if fetchResult == nil {
    let allPhotosOptions = PHFetchOptions()
    allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
}
We can also fetch all photos, including synced albums and user albums:
let fetchOptions = PHFetchOptions()
fetchOptions.includeAssetSourceTypes = [.typeUserLibrary, .typeiTunesSynced, .typeCloudShared]
let fetchResult = PHAsset.fetchAssets(with: fetchOptions)
fetchResult will then contain all photos and videos from all albums.
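For the "apart from camera roll" part of the question, a sketch that walks the user-created albums (collection type .album) instead of the smart albums and collects their assets; the helper name is mine, not from the answer:
import Photos

func fetchAssetsFromUserAlbums() -> [PHAsset] {
    var assets: [PHAsset] = []
    let albums = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: nil)
    albums.enumerateObjects { collection, _, _ in
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        // Fetch both photos and videos that belong to this album.
        PHAsset.fetchAssets(in: collection, options: options).enumerateObjects { asset, _, _ in
            assets.append(asset)
        }
    }
    return assets
}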

PHFetchOptions() Photos Only Using PHAsset & PHAssetCollection

I am using a chunk from Apple's sample code here:
override func awakeFromNib() {
    // Create a PHFetchResult object for each section in the table view.
    let allPhotosOptions = PHFetchOptions()
    allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let allPhotos = PHAsset.fetchAssetsWithOptions(allPhotosOptions)
    let smartAlbums = PHAssetCollection.fetchAssetCollectionsWithType(.SmartAlbum, subtype: .AlbumRegular, options: nil)
    let topLevelUserCollections = PHCollectionList.fetchTopLevelUserCollectionsWithOptions(nil)
    // Store the PHFetchResult objects and localized titles for each section.
    self.sectionFetchResults = [allPhotos, smartAlbums, topLevelUserCollections]
    self.sectionLocalizedTitles = ["", NSLocalizedString("Smart Albums", comment: ""), NSLocalizedString("Albums", comment: "")]
    PHPhotoLibrary.sharedPhotoLibrary().registerChangeObserver(self)
}
This lists all Albums successfully.
What I need:
I want to list only albums with photos and exclude videos. Also, exclude videos from getting listed inside albums, such as inside "All Photos".
What I tried:
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.Image.rawValue)
This causes a crash saying 'Unsupported predicate in fetch options: mediaType == 1'.
I know this is a very late answer, but in case someone like me gets here: use this method to fetch assets of a specific media type, in this case PHAssetMediaType.image:
func fetchAssets(with mediaType: PHAssetMediaType, options: PHFetchOptions?) -> PHFetchResult<PHAsset>
In Swift 5 this is written as:
let allPhotosOptions = PHFetchOptions()
allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
fetchResult = PHAsset.fetchAssets(with: .image, options: allPhotosOptions)
So far this works fine for me:
let allAlbums = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .albumRegular, options: nil)
let someAlbum = allAlbums.object(at: 0)
let onlyPhotoOption = PHFetchOptions()
onlyPhotoOption.predicate = NSPredicate(format: "mediaType == %i", PHAssetMediaType.image.rawValue)
let photos = PHAsset.fetchAssets(in: someAlbum, options: onlyPhotoOption)
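Building on that, a sketch for the other half of the question, listing only albums that actually contain photos; the helper name is mine, not from the answers:
import Photos

func photoOnlyAlbums() -> [PHAssetCollection] {
    let photoOptions = PHFetchOptions()
    photoOptions.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)

    var albumsWithPhotos: [PHAssetCollection] = []
    let albums = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .albumRegular, options: nil)
    albums.enumerateObjects { collection, _, _ in
        // An album holding only videos yields an empty result here and is skipped.
        if PHAsset.fetchAssets(in: collection, options: photoOptions).count > 0 {
            albumsWithPhotos.append(collection)
        }
    }
    return albumsWithPhotos
}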

How to add latest photo of photo library to uiimageview using Swift?

Is there a way to load the latest photo from my photo library into a UIImageView? I've tried nearly everything, but nothing works.
Swift 3
let fetchOptions = PHFetchOptions()
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
let fetchResult = PHAsset.fetchAssets(with: .image, options: fetchOptions)
let last = fetchResult.lastObject
if let lastAsset = last {
    let options = PHImageRequestOptions()
    options.version = .current
    PHImageManager.default().requestImage(
        for: lastAsset,
        targetSize: imageView.bounds.size,
        contentMode: .aspectFit,
        options: options,
        resultHandler: { image, _ in
            DispatchQueue.main.async {
                self.imageView.image = image
            }
        }
    )
}
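An equivalent variation of the same idea (not from the original answer): sort descending and limit the fetch to a single asset, so firstObject is the newest photo.
let newestOptions = PHFetchOptions()
newestOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
newestOptions.fetchLimit = 1
// firstObject is now the most recently created image asset, if any.
let newestAsset = PHAsset.fetchAssets(with: .image, options: newestOptions).firstObject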
Swift 2
@IBOutlet weak var imageView: UIImageView!

override func viewDidLoad() {
    super.viewDidLoad()
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
    let fetchResult = PHAsset.fetchAssetsWithMediaType(.Image, options: fetchOptions)
    let last = fetchResult.lastObject
    if let lastAsset = last {
        let options = PHImageRequestOptions()
        options.version = .Current
        options.deliveryMode = .HighQualityFormat
        PHImageManager.defaultManager().requestImageForAsset(
            lastAsset as! PHAsset,
            targetSize: imageView.bounds.size,
            contentMode: .AspectFit,
            options: options,
            resultHandler: { image, _ in
                dispatch_async(dispatch_get_main_queue(), {
                    self.imageView.image = image
                })
            }
        )
    }
}

Error: 'Unsupported predicate in fetch options: mediaType == 2'

I'm trying to use smartAlbum to generate an array of either only videos or only photos or both.
You can see my code below:
PHFetchResult *collectionList = [PHCollectionList fetchMomentListsWithSubtype:PHCollectionListSubtypeMomentListCluster options:nil];
PHFetchOptions *options = nil;
if (self.xSelected) {
    options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType = %d", PHAssetMediaTypeImage];
}
if (self.ySelected) {
    options = [[PHFetchOptions alloc] init];
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType = %d", PHAssetMediaTypeVideo];
}
[collectionList enumerateObjectsUsingBlock:^(PHCollectionList *collection, NSUInteger idx, BOOL *stop) {
    PHFetchResult *momentsInCollection = [PHAssetCollection fetchMomentsInMomentList:collection options:options];
    for (id moment in momentsInCollection) {
        PHAssetCollection *castedMoment = (PHAssetCollection *)moment;
        [_smartAlbums insertObject:castedMoment atIndex:0];
    }
}];
This however is constantly breaking on the first line inside the block and giving the following error:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Unsupported predicate in fetch options: mediaType == 2'
I did a little research and found this link.
I'm wondering if this is an Apple bug or if it's just something wrong with my code.
It seems to have worked for the people who referred to that answer, which is odd because it's basically the same thing.
Thanks in advance,
Anish
EDIT:
I think I found the answer here in Apple's documentation. It looks like mediaType is a key only for PHAsset and not for PHAssetCollection. So now I guess the question is how to get a PHAssetCollection with only videos or only images.
Use these methods to get photo and video asset arrays separately.
Get All Videos
func getAllVideos(completion: @escaping (_ videoAssets: [PHAsset]?) -> Void) {
    var videoAssets: [PHAsset] = []
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.video.rawValue)
    let allVideo = PHAsset.fetchAssets(with: .video, options: fetchOptions)
    allVideo.enumerateObjects { (asset, index, bool) in
        videoAssets.append(asset)
    }
    completion(videoAssets)
}
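A quick usage sketch; the property names below are placeholders, not from the answer:
getAllVideos { assets in
    guard let assets = assets else { return }
    DispatchQueue.main.async {
        self.videoAssets = assets        // placeholder property
        self.collectionView.reloadData() // placeholder collection view
    }
}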
Get All Photos
func getAllPhotos(completion: @escaping (_ photosAssets: [PHAsset]?) -> Void) {
    var photosAssets: [PHAsset] = []
    let fetchOptions = PHFetchOptions()
    let scale = UIScreen.main.scale
    // Pixel dimensions are compared as integers in the predicate below.
    let screenWidth = Int(UIScreen.main.bounds.width * scale)
    let screenHeight = Int(UIScreen.main.bounds.height * scale)
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    fetchOptions.predicate = NSPredicate(format: "mediaType = %d || (mediaSubtype & %d) != 0 && (pixelHeight != %d AND pixelWidth != %d) OR (pixelHeight != %d AND pixelWidth != %d)", PHAssetMediaType.image.rawValue, PHAssetMediaSubtype.photoLive.rawValue, screenHeight, screenWidth, screenWidth, screenHeight)
    let allPhotos = PHAsset.fetchAssets(with: .image, options: fetchOptions)
    allPhotos.enumerateObjects { (asset, index, bool) in
        photosAssets.append(asset)
    }
    completion(photosAssets)
}
Note: there is no need to put mediaType in the predicate when you already pass the media type to fetchAssets(with:options:); that parameter filters by type on its own.
In Swift 4, this is how I used the fetchAssets() method from the Photos framework to get all videos from the photo library.
You can also get videos from a specific folder using a predicate.
func fetchAllVideos() {
    // let albumName = "blah"
    let fetchOptions = PHFetchOptions()
    // fetchOptions.predicate = NSPredicate(format: "title = %@", albumName)
    // Uncomment the two lines above if you want videos from a custom folder
    fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.video.rawValue)
    let allVideo = PHAsset.fetchAssets(with: .video, options: fetchOptions)
    allVideo.enumerateObjects { (asset, index, bool) in
        // videoAssets.append(asset)
        let imageManager = PHCachingImageManager()
        imageManager.requestAVAsset(forVideo: asset, options: nil, resultHandler: { (asset, audioMix, info) in
            if asset != nil {
                let avasset = asset as! AVURLAsset
                let urlVideo = avasset.url
                print(urlVideo)
            }
        })
    }
}
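One caveat worth adding (my note, not part of the original answer): requestAVAsset(forVideo:) can hand back nil for videos stored only in iCloud unless network access is allowed, so passing options instead of nil is a safer sketch:
// Sketch: replaces the `options: nil` argument in the call above.
let videoOptions = PHVideoRequestOptions()
videoOptions.isNetworkAccessAllowed = true
// imageManager.requestAVAsset(forVideo: asset, options: videoOptions) { (asset, audioMix, info) in ... }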
Hope this helps!
