How to check whether a given PHAsset is an iCloud asset? - ios

I'm trying to get PHAsset objects and want to segregate the iCloud assets. Here is my code:
PHFetchResult *cloudAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAny options:nil];
[cloudAlbums enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
    if (collection != nil) {
        PHFetchResult *result = [PHAsset fetchAssetsInAssetCollection:collection options:fetchOptions];
        [result enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
            // check whether the asset is an iCloud asset
        }];
    }
}];
Please tell me how to determine whether a PHAsset is an iCloud asset.

It's a bit of a hack: I had to dig into the resource array and debug to find the information I needed, but it works. Note that this relies on an undocumented key, and I'm not sure whether Apple will reject the app because of it. Give it a try and see what happens!
// asset is a PHAsset object for which you want to get the information
NSArray *resourceArray = [PHAssetResource assetResourcesForAsset:asset];
BOOL isLocallyAvailable = [[resourceArray.firstObject valueForKey:@"locallyAvailable"] boolValue]; // If this returns NO, the asset is in iCloud and not yet saved locally
You can also get other useful information from the asset resource, such as the original filename, file size, file URL, etc.
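For example, a minimal sketch that reads the documented PHAssetResource properties (the file size and URL mentioned above are only reachable through undocumented keys, so treat them with the same caution as the locallyAvailable hack):
// asset is the PHAsset you are inspecting
PHAssetResource *resource = [PHAssetResource assetResourcesForAsset:asset].firstObject;
NSString *originalFilename = resource.originalFilename;    // documented property
NSString *typeIdentifier = resource.uniformTypeIdentifier; // documented property
NSLog(@"%@ (%@)", originalFilename, typeIdentifier);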

There are actually two kinds of situations:
1. The photo was captured on this device and uploaded to iCloud. In that case you can use the progressHandler to check whether it needs to be downloaded from iCloud.
__block BOOL isPhotoInICloud = NO;
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
    isPhotoInICloud = YES;
};
[[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFit options:options resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
    if (isPhotoInICloud) {
        // Photo is in iCloud.
    }
}];
2. The photo is in iCloud but was uploaded from another device, and you have not saved it to your local photo library. In that case the progressHandler block is never invoked. I don't know why, but it's true, and I think it's a bug in the PhotoKit framework.
Using PHImageResultIsInCloudKey is also awkward for this situation, because you only learn its value inside requestImageForAsset's resultHandler block, which is after the photo request has already been initiated.
So, at least in my opinion, there is no reliable way to check up front whether a photo is stored only in iCloud.
Maybe there is a better way; please let me know.
Thanks very much!
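One workaround worth trying, as a sketch that assumes an asset whose data cannot be returned without network access is iCloud-only (the framework does not guarantee this): request only the image data with network access disabled and inspect PHImageResultIsInCloudKey in the handler.
PHImageRequestOptions *checkOptions = [[PHImageRequestOptions alloc] init];
checkOptions.networkAccessAllowed = NO; // never trigger an iCloud download for this check
checkOptions.synchronous = YES;         // the handler runs before the call returns
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:checkOptions resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // nil data plus the in-cloud flag usually means the original exists only in iCloud
    BOOL isICloudOnly = (imageData == nil) && [info[PHImageResultIsInCloudKey] boolValue];
    NSLog(@"iCloud-only asset: %d", isICloudOnly);
}];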

When you request an image, the info dictionary contains a key that tells you whether the asset is in iCloud.
[cloudAlbums enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
    PHFetchResult *result = [PHAsset fetchAssetsInAssetCollection:collection options:fetchOptions];
    [result enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.resizeMode = PHImageRequestOptionsResizeModeFast;
        options.synchronous = YES;
        __block BOOL isICloudAsset = NO;
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:imageSize contentMode:PHImageContentModeAspectFit options:options resultHandler:^(UIImage *result, NSDictionary *info) {
            if ([info[PHImageResultIsInCloudKey] boolValue]) {
                isICloudAsset = YES;
            }
        }];
    }];
}];

Here is the Swift 3 version
func checkVideoType() {
    if selectedAsset != nil {
        guard selectedAsset.mediaType == .video else {
            print("Not a valid video media type")
            return
        }
        requestID = checkIsiCloud(assetVideo: selectedAsset, cachingImageManager: catchManager)
    }
}

func checkIsiCloud(assetVideo: PHAsset, cachingImageManager: PHCachingImageManager) -> PHImageRequestID {
    let opt = PHVideoRequestOptions()
    opt.deliveryMode = .mediumQualityFormat
    opt.isNetworkAccessAllowed = true // allow iCloud videos to be fetched and played
    return cachingImageManager.requestAVAsset(forVideo: assetVideo, options: opt) { (asset, audioMix, info) in
        DispatchQueue.main.async {
            if info!["PHImageFileSandboxExtensionTokenKey"] != nil {
                self.iCloudStatus = false
                self.playVideo(videoAsset: asset!)
            } else if info![PHImageResultIsInCloudKey] != nil {
                self.iCloudStatus = true
            } else {
                self.iCloudStatus = false
                self.playVideo(videoAsset: asset!)
            }
        }
    }
}

Following is a method you can implement to acquire all videos in the Videos folder of the Photos app. It uses a predicate on the PHFetchOptions to return only videos stored on the device itself, not in iCloud:
// Collect all videos in the Videos folder of the Photos app
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchOptions *fetchOptions = [PHFetchOptions new];
            fetchOptions.predicate = [NSPredicate predicateWithFormat:@"(sourceType & %d) != 0", PHAssetSourceTypeUserLibrary];
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:fetchOptions];
            PHAssetCollection *collection = smartAlbums.firstObject;
            if (![collection isKindOfClass:[PHAssetCollection class]]) collection = nil;
            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
            i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    return i;
}
Apple's documentation on PHFetchResult states that only a subset of attributes can be used with a predicate; so, if the above code does not work for you, remove the PHFetchOptions predicate and pass nil for the options in the collection fetch instead:
// Collect all videos in the Videos folder of the Photos app
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            PHAssetCollection *collection = smartAlbums.firstObject;
            if (![collection isKindOfClass:[PHAssetCollection class]]) collection = nil;
            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
            i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    return i;
}
Then, add this method:
// Filter out videos that are stored only in iCloud
- (NSArray *)phAssets {
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:self.assetsFetchResults.count];
    [[self assetsFetchResults] enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        if (asset.sourceType == PHAssetSourceTypeUserLibrary)
            [assets addObject:asset];
    }];
    return [NSArray arrayWithArray:assets];
}

This code should work.
If you call it very frequently, make sure to cancel requests you no longer need using the returned PHImageRequestID.
- (PHImageRequestID)checkIsCloud:(PHAsset *)asset cachingImageManager:(PHCachingImageManager *)cachingImageManager {
    if (asset.mediaType == PHAssetMediaTypeVideo) {
        PHVideoRequestOptions *options = [PHVideoRequestOptions new];
        options.deliveryMode = PHVideoRequestOptionsDeliveryModeMediumQualityFormat;
        return [cachingImageManager requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset * _Nullable avAsset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
            if (asset != self.asset) return;
            dispatch_async(dispatch_get_main_queue(), ^{
                if (info[@"PHImageFileSandboxExtensionTokenKey"]) {
                    self.iCloudStatus = KICloudStatusNone;
                } else if ([info[PHImageResultIsInCloudKey] boolValue]) {
                    self.iCloudStatus = KICloudStatusNormal;
                } else {
                    self.iCloudStatus = KICloudStatusNone;
                }
            });
        }];
    } else {
        return [cachingImageManager requestImageDataForAsset:asset options:nil resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
            if (asset != self.asset) return;
            dispatch_async(dispatch_get_main_queue(), ^{
                if ([info[PHImageResultIsInCloudKey] boolValue]) {
                    self.iCloudStatus = KICloudStatusNormal;
                } else {
                    self.iCloudStatus = KICloudStatusNone;
                }
            });
        }];
    }
}
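To illustrate the cancellation advice above, a small sketch (the currentRequestID property is hypothetical, introduced only for this example):
// Cancel the previous in-flight check before starting a new one.
if (self.currentRequestID != PHInvalidImageRequestID) {
    [cachingImageManager cancelImageRequest:self.currentRequestID];
}
self.currentRequestID = [self checkIsCloud:asset cachingImageManager:cachingImageManager];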

Related

iOS 8 PhotoKit, some albums can't be found

When I use PhotoKit to list my albums, not all albums show up in my project; albums imported from a computer can't be found. I will post my code below.
//get system albums
PHImageRequestOptions * options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;
options.synchronous = YES;
KVPhotoAlbum * userAlum = nil;
PHCachingImageManager * manager = [[PHCachingImageManager alloc] init];
PHFetchResult * systemAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];
for (id object in systemAlbums) {
if ([object isKindOfClass:[PHAssetCollection class]]) {
PHAssetCollection * collection = (PHAssetCollection*)object;
if (collection.assetCollectionSubtype == PHAssetCollectionSubtypeSmartAlbumVideos) {
continue;
}
//get photo
PHFetchResult * imageResults = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
if (imageResults.count) {
KVPhotoAlbum * album = [[KVPhotoAlbum alloc] init];
album.collection = collection;
album.title = [collection valueForKey:@"localizedTitle"];
album.photoCount = imageResults.count;
if (collection.assetCollectionSubtype == PHAssetCollectionSubtypeSmartAlbumUserLibrary) {
//camera
userAlum = album;
}else {
[_albums addObject:album];
}
PHAsset * asset = [imageResults lastObject];
[manager requestImageForAsset:asset targetSize:CGSizeMake(60, 60) contentMode:PHImageContentModeDefault options:options resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
album.image = result;
}];
}
}
}
//get albums created by user
PHFetchResult * customAlbums = [PHCollectionList fetchTopLevelUserCollectionsWithOptions:nil];
for (id object in customAlbums) {
if ([object isKindOfClass:[PHAssetCollection class]]) {
PHAssetCollection * collection = (PHAssetCollection*)object;
if (collection.assetCollectionSubtype == PHAssetCollectionSubtypeSmartAlbumVideos) {
continue;
}
//get photo
PHFetchResult * imageResults = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
if (imageResults.count) {
KVPhotoAlbum * album = [[KVPhotoAlbum alloc] init];
album.collection = collection;
album.title = [collection valueForKey:@"localizedTitle"];
album.photoCount = imageResults.count;
[_albums addObject:album];
PHAsset * asset = [imageResults lastObject];
[manager requestImageForAsset:asset targetSize:CGSizeMake(60, 60) contentMode:PHImageContentModeDefault options:options resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
album.image = result;
}];
}
}
}
That's a lot of code, so let me add more details.
I solved it myself; here is how.
There is an album type, PHAssetCollectionTypeAlbum, that can find the albums imported from a computer:
PHFetchResult * importAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAny options:nil];
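If you only want the imported ones, a small sketch that filters on the collection subtype (depending on how the albums were synced, they may show up as PHAssetCollectionSubtypeAlbumImported or PHAssetCollectionSubtypeAlbumSyncedAlbum; both values appear in the enum listed further below):
[importAlbums enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
    // Keep only collections that were imported or synced from a computer.
    if (collection.assetCollectionSubtype == PHAssetCollectionSubtypeAlbumImported ||
        collection.assetCollectionSubtype == PHAssetCollectionSubtypeAlbumSyncedAlbum) {
        NSLog(@"Imported album: %@", collection.localizedTitle);
    }
}];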

UI getting blocked while fetching image path using Photos Framework in iOS

- (void)updateFetchRequest {
PHFetchOptions *options = [PHFetchOptions new];
switch (self.imagePickerController.mediaType) {
case QBImagePickerMediaTypeImages:
options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %ld", PHAssetMediaTypeImage];
break;
case QBImagePickerMediaTypeVideos:
options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %ld", PHAssetMediaTypeVideo];
break;
default:
break;
}
self.fetchResult = [PHAsset fetchAssetsInAssetCollection:self.assetCollections[0] options:options];
PHContentEditingInputRequestOptions *editoptions = [[PHContentEditingInputRequestOptions alloc] init];
[editoptions setCanHandleAdjustmentData:^BOOL(PHAdjustmentData *adjustmentData) {
return [adjustmentData.formatIdentifier isEqualToString:AdjustmentFormatIdentifier] && [adjustmentData.formatVersion isEqualToString:@"1.0"];
}];
NSLog(@"fetch array %@", self.fetchResult);
PHImageRequestOptions *option = [PHImageRequestOptions new];
option.networkAccessAllowed = YES;
option.synchronous = YES;
option.version = PHImageRequestOptionsVersionOriginal;
PHAssetCollection *assetCollection = self.assetCollections[0];
__weak __typeof(self) weakSelf = self; // New C99 uses __typeof(..)
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
NSLog(#"Work Dispatched");
PHAsset *asset;
SyncAlbumNames = [NSString stringWithFormat:#"%#",assetCollection.localizedTitle];
for (int i=0; i<self.fetchResult.count; i++) {
asset = weakSelf.fetchResult[i];
NSLog(#"asset is %#",asset);
[asset requestContentEditingInputWithOptions:editoptions
completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
NSURL *imageURL = contentEditingInput.fullSizeImageURL;
NSLog (#"imageUrl %#",imageURL);
// [weakSelf getLocalId:[NSString stringWithFormat:#"%#",imageURL]];
[self performSelectorOnMainThread:#selector(getLocalId:)
withObject:[NSString stringWithFormat:#"%#",imageURL]
waitUntilDone:YES];
if(uparray.count == 0){
[arrayAssets addObject:asset];
[newArray addObject:asset];
if(arrayAssets.count !=0){
[dict setValue:[NSArray arrayWithArray:arrayAssets]
forKey:assetCollection.localizedTitle];
NSLog(#"dict count is %lu",(unsigned long)dict.count);
}
[SyncAlbum getArraySubTypes];
NSLog(#"arrayAssets count %d",arrayAssets.count);
NSLog(#"fetchresult count %d",weakSelf.fetchResult.count);
__typeof(weakSelf) strongSelf = weakSelf;
if (strongSelf) {
if(arrayAssets.count + 2 + alcount == strongSelf.fetchResult.count){
[strongSelf SyncAlbums];
}
// When finished call back on the main thread:
dispatch_async(dispatch_get_main_queue(), ^{
// Return data and update on the main thread
// Task 3: Deliver the data to a 3rd party component (always do this on the main thread, especially UI).
});
}
//add count to array asset
}else{
NSLog(#"asset already uploaded");
alcount++;
//asset already upload count here..
}
}];
}
});
}
My UI gets blocked when executing this method. I am using SQLite database. Can anyone help me with this?
I believe the problem is
option.synchronous = YES;
If you cmd+click on synchronous, the header comment says:
// return only a single result, blocking until available (or failure). Defaults to NO
Also, setting
PHImageRequestOptionsDeliveryMode
to HighQualityFormat will force synchronous = YES.
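To avoid the blocking behaviour, one option is to leave the request asynchronous and only touch the UI on the main queue. A rough sketch of that pattern, using an image request as the example (asset, targetSize, and imageView are placeholders from your own code):
PHImageRequestOptions *asyncOptions = [PHImageRequestOptions new];
asyncOptions.synchronous = NO; // the default, shown here for contrast
asyncOptions.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic; // quick result first, better one later
asyncOptions.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFit options:asyncOptions resultHandler:^(UIImage *result, NSDictionary *info) {
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = result; // UI work stays on the main thread
    });
}];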

In iOS, how to apply lazy loading while fetching images and videos from assets?

I am fetching images and videos from the gallery. It works fine when there are only a few, but when the number of images and videos is large, it blocks the main thread.
-(void)viewDidAppear:(BOOL)animated{
[NSThread sleepForTimeInterval:1];
// call the same method on a background thread
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[self getAllVideosAndImages];
});
}
To get images and videos from assets:
-(void)getAllVideosAndImages{
NSMutableArray *assetGroups = [[NSMutableArray alloc] init];
assetGroups = [[NSMutableArray alloc] init];
library = [[ALAssetsLibrary alloc] init];
NSUInteger groupTypes = ALAssetsGroupAll;
void (^assetEnumerator)
( ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *result, NSUInteger index, BOOL *stop) {
if(result != nil) {
if([[result valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo]||[[result valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypePhoto]) {
ALAssetRepresentation *rep = [result defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref) {
//GET DATE
NSDate *myDate = [result valueForProperty:ALAssetPropertyDate];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"MMM dd,YYYY"];
NSString *realDateStr = [dateFormatter stringFromDate:myDate];
//GET IMG AND VIDEO
UIImage *largeimage = [UIImage imageWithCGImage:iref];
NSData *webData = UIImagePNGRepresentation(largeimage);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *localFilePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"img%zd.mov",imgName]];
imgName=imgName+1;
[webData writeToFile:localFilePath atomically:YES];
Para_ImageSet *objImgSet=[Para_ImageSet new];
objImgSet.imagePath=localFilePath;
objImgSet.imageDate=realDateStr;
// [yArrayOfAllImges addObject:objImgSet];
if ([[result valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo]) {
objImgSet.is_video=YES;
}
else
{
objImgSet.is_video=NO;
}
if ([tempDate isEqualToString:@""]) {
yArrayOfImagesWithDate=[NSMutableArray new];
[yArrayOfImagesWithDate addObject:objImgSet];
tempDate=realDateStr;
}else
{
//GATE IMAGE COUNT
if ([realDateStr isEqualToString:tempDate]) {
//DO NOTHING
[yArrayOfImagesWithDate addObject:objImgSet];
}
else
{
[yArrayOFNNestedImgs addObject:yArrayOfImagesWithDate];
NSLog(#" ----------ImagesWithDate =%zd, %#",yArrayOfImagesWithDate.count,tempDate);
yArrayOfImagesWithDate=[NSMutableArray new];
[yArrayOfImagesWithDate addObject:objImgSet];
tempDate=realDateStr;
}
}
}
}
}
[self.tableView reloadData];
};
void (^ assetGroupEnumerator) ( ALAssetsGroup *, BOOL *)= ^(ALAssetsGroup *group, BOOL *stop) {
if(group != nil) {
[group enumerateAssetsUsingBlock:assetEnumerator];
[assetGroups addObject:group];
// NSNumber *countObj = [NSNumber numberWithInt:count];
NSLog(#" ----------ImagesWithDate =%zd, %#",yArrayOfImagesWithDate.count,tempDate);
[yArrayOFNNestedImgs addObject:yArrayOfImagesWithDate];
NSLog(#" ----------Count =%zd",yArrayOFNNestedImgs.count);
[NSThread sleepForTimeInterval:10];
// update UI on the main thread
dispatch_async(dispatch_get_main_queue(), ^{
[self.tableView reloadData];
});
}
};
[library enumerateGroupsWithTypes:groupTypes
usingBlock:assetGroupEnumerator
failureBlock:^(NSError *error) { NSLog(@"A problem occurred"); }];
}
Please suggest. Thanks in advance.
[NSThread sleepForTimeInterval:1]; blocks the main thread for a second; it's not needed.
You could use [self performSelectorInBackground:@selector(getAllVideosAndImages) withObject:nil]; instead.
Also, wrap the first [self.tableView reloadData]; in
dispatch_async(dispatch_get_main_queue(), ^{
...
}); too
In order to lazily load the resources, here is the recommended approach:
Collect the URLs of all resources and store them in your container (array / objects etc.).
Fire an NSURLSessionTask (iOS 7 and later), which runs asynchronously on a background queue. Below iOS 7 you can use NSURLConnection's sendAsynchronousRequest API, but it is deprecated in iOS 9, so you should move off it soon.
Process the downloaded resources while still on the background queue - store audio/image files, create a UIImage from NSData, and so on.
Come back to the main queue, then update the relevant UI with the downloaded content. If it is audio/video, play it. If it is a UIImage, display it.
The entire approach is described in my tutorial.
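A minimal sketch of steps 2-4 under those assumptions (resourceURL and imageView are placeholders for your own objects):
NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:resourceURL completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error || !data) { return; }                // handle the failure as needed
    UIImage *image = [UIImage imageWithData:data]; // process while still on a background queue
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;                   // deliver to the UI on the main queue
    });
}];
[task resume];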

With the iOS Photos Framework how do I list all PHAssetCollections available?

With the iOS Photos Framework how do I list all PHAssetCollections available?
I'd like to find the "Photo Roll" collection so that I can retrieve all photos from that collection, specifically. How do I do that with the iOS 8+ using the new PhotosFramework?
If you look at the PHAssetCollection types and subtypes (listed below), you can see that the Camera Roll does not have its own explicit type.
You can get to it with:
PHFetchResult *result = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary
options:nil];
PHAssetCollection *assetCollection = result.firstObject;
NSLog(#"%#", assetCollection.localizedTitle); // Camera Roll
In general this is how to get everything
PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.wantsIncrementalChangeDetails = YES;
options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage];
PHFetchResult *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];
for (PHAssetCollection *sub in albums)
{
PHFetchResult *fetchResult = [PHAsset fetchAssetsInAssetCollection:sub options:options];
}
#pragma mark - PHAssetCollection types
typedef NS_ENUM(NSInteger, PHAssetCollectionType) {
PHAssetCollectionTypeAlbum = 1,
PHAssetCollectionTypeSmartAlbum = 2,
PHAssetCollectionTypeMoment = 3,
} NS_ENUM_AVAILABLE_IOS(8_0);
typedef NS_ENUM(NSInteger, PHAssetCollectionSubtype) {
// PHAssetCollectionTypeAlbum regular subtypes
PHAssetCollectionSubtypeAlbumRegular = 2,
PHAssetCollectionSubtypeAlbumSyncedEvent = 3,
PHAssetCollectionSubtypeAlbumSyncedFaces = 4,
PHAssetCollectionSubtypeAlbumSyncedAlbum = 5,
PHAssetCollectionSubtypeAlbumImported = 6,
// PHAssetCollectionTypeAlbum shared subtypes
PHAssetCollectionSubtypeAlbumMyPhotoStream = 100,
PHAssetCollectionSubtypeAlbumCloudShared = 101,
// PHAssetCollectionTypeSmartAlbum subtypes
PHAssetCollectionSubtypeSmartAlbumGeneric = 200,
PHAssetCollectionSubtypeSmartAlbumPanoramas = 201,
PHAssetCollectionSubtypeSmartAlbumVideos = 202,
PHAssetCollectionSubtypeSmartAlbumFavorites = 203,
PHAssetCollectionSubtypeSmartAlbumTimelapses = 204,
PHAssetCollectionSubtypeSmartAlbumAllHidden = 205,
PHAssetCollectionSubtypeSmartAlbumRecentlyAdded = 206,
PHAssetCollectionSubtypeSmartAlbumBursts = 207,
PHAssetCollectionSubtypeSmartAlbumSlomoVideos = 208,
PHAssetCollectionSubtypeSmartAlbumUserLibrary = 209,
// Used for fetching, if you don't care about the exact subtype
PHAssetCollectionSubtypeAny = NSIntegerMax
} NS_ENUM_AVAILABLE_IOS(8_0);
Use the code snippet below to get all the smart albums and all photos:
// Get all Smart Albums
PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];
[smartAlbums enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
NSLog(#"album title %#", collection.localizedTitle);
}];
// Get all photos
PHFetchResult *allPhotosResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
// Get assets from the PHFetchResult object
[allPhotosResult enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
NSLog(#"asset %#", asset);
CGSize size=CGSizeMake(90, 90);
PHImageManager *imageManager;
[imageManager requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
yourImageView.image=result;
}];
}];
For reference: https://developer.apple.com/library/prerelease/ios/samplecode/UsingPhotosFramework/Introduction/Intro.html

Accessing the image gallery through code for displaying in image view

I've just started a new project where I want the user to be able to pick one of the images in the device's gallery.
I am trying to achieve this by using an image view and a UIStepper.
I want to write all the images in the gallery into an array and have the image view navigate through the array with the + and - buttons of the stepper (moving the current array position +1 or -1 depending on the click).
OK, as per our prior discussion, here is the project: AssetLibraryPhotosViewer.
I haven't done extensive testing, but it does seem to run OK both on the simulator and on a real device.
@Exothug, to give you an idea of how to enumerate the device library and access full-screen photos:
ALAssetsLibrary* assetLibrary = [[ALAssetsLibrary alloc] init];
[assetLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
if (group) {
[group enumerateAssetsUsingBlock:^(ALAsset* asset, NSUInteger index, BOOL* innerstop) {
if (asset) {
ALAssetRepresentation *rep = [asset defaultRepresentation];
CGImageRef iref = [rep fullScreenImage];
if (iref) {
UIImage *image = [UIImage imageWithCGImage:iref scale:rep.scale
orientation:(UIImageOrientation)rep.orientation];
// process the image here
}
}
}];
}
} failureBlock:^(NSError *error) {
NSLog(#"failure: %#", [error localizedDescription]);
}];
You can process each image by simply adding it to your array, but depending on the number of images in the library that may not be the most efficient approach. An alternative is to store the images' URLs or indexes, iterate through the library, and fetch each image only when it is needed for display in your image view.
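A minimal sketch of that alternative with ALAssetsLibrary, assuming you collected the asset URLs during enumeration and keep the library in a property (assetURLs, assetLibrary, and imageView are illustrative names):
// During enumeration, store [rep url] in self.assetURLs instead of the full image.
// Later, fetch a single image only when the stepper lands on its index:
- (void)loadImageAtIndex:(NSUInteger)index {
    NSURL *url = self.assetURLs[index];
    [self.assetLibrary assetForURL:url resultBlock:^(ALAsset *asset) {
        UIImage *image = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = image;
        });
    } failureBlock:^(NSError *error) {
        NSLog(@"Could not load asset: %@", error);
    }];
}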
Maybe try something like this; specify a directory if you want a specific group of images.
NSMutableArray *result = [NSMutableArray array];
[[[NSBundle mainBundle] pathsForResourcesOfType:@"png" inDirectory:nil] enumerateObjectsUsingBlock:^(NSString *obj, NSUInteger idx, BOOL *stop) {
NSString *path = [obj lastPathComponent];
[result addObject:path];
}];
I think you just need to retrieve the Camera Roll photos from your device.
If so, try this:
ALAssetsLibrary
void (^assetEnumerator)(ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *result, NSUInteger index, BOOL *stop) {
if(result != NULL) {
NSLog(@"See Asset: %@", result);
}
};
void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop) {
if(group != nil) {
[group enumerateAssetsUsingBlock:assetEnumerator];
}
};
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupAll
usingBlock:assetGroupEnumerator
failureBlock: ^(NSError *error) {
NSLog(#"Failure");
}];
OR
//Get camera roll images
- (void)updateLastPhotoThumbnail {
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
NSInteger numberOfAssets = [group numberOfAssets];
if (numberOfAssets > 0) {
NSLog(#"numberOfPictures: %d",numberOfAssets);
//NSInteger lastIndex = numberOfAssets - 1;
int i = 0;
for (i = 0; i <= numberOfAssets-1; i++) {
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:i] options:0 usingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
UIImage *thumbnail = [UIImage imageWithCGImage:[result thumbnail]];
NSLog(#"theObject!!!! -- (%d) %#",i,thumbnail);
[cameraRollPictures addObject:thumbnail];
}];
}
}
} failureBlock:^(NSError *error) {
NSLog(#"error: %#", error);
}];
