PHAsset assetResourcesForAsset fails when called too often - ios

I need to retrieve the names of all the PHAssets in the Camera Roll, individually and in a short time.
To get the file name, I use the documented originalFilename property of the PHAssetResource associated with the PHAsset.
This works fine for the first assets, but at some point (after around 400 assets), it starts failing and returns nil every time.
Here is code that shows this behavior (running on an iPhone 7 with ~800 photos in the Camera Roll):
PHFetchResult *result = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                                                 subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary
                                                                 options:nil];
PHAssetCollection *assetCollection = result.firstObject;
PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
for (int i = 0; i < [assetsFetchResult count]; i++) {
    PHAsset *asset = assetsFetchResult[i];
    NSArray *resources = [PHAssetResource assetResourcesForAsset:asset];
    NSString *name = (resources.count > 0) ? [(PHAssetResource *)resources.firstObject originalFilename] : nil;
    NSLog(@"%i: %@", i, name);
}
When using undocumented techniques to get the file name, such as [asset valueForKey:@"filename"] or the PHImageFileURLKey entry of the info dictionary returned by PHImageManager, everything works well (although the name differs from originalFilename, and these approaches aren't reliable since they're not documented).
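For reference, here is roughly what those undocumented fallbacks look like (a sketch; the "filename" key and PHImageFileURLKey are private/undocumented and may change in any iOS release):
// Undocumented fallback 1: private "filename" key on PHAsset.
NSString *privateName = [asset valueForKey:@"filename"];

// Undocumented fallback 2: PHImageFileURLKey from the image request's info dictionary.
[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                  options:nil
                                            resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSURL *fileURL = info[@"PHImageFileURLKey"];
    NSLog(@"%@", fileURL.lastPathComponent);
}];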
Why is the official method so unreliable?
Is there something I'm doing wrong?

Related

PHAsset Null but local identifier is valid

I am trying to load an asset from a local identifier. The local identifier seems to be correct, but the asset is null and I can't figure out why. I have similar code in another part of my app that works fine.
Queue *queue = [NSEntityDescription insertNewObjectForEntityForName:@"Queue" inManagedObjectContext:self.context];
for (int reyrt = 0; reyrt < self.storeGIF.count; reyrt++) {
    queue.queuetextimagePath = [self.storeGIF objectAtIndex:reyrt];
}
__block float fileSize;
NSArray *identifiers = @[queue.queuetextimagePath];
PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsWithLocalIdentifiers:identifiers options:nil];
PHAsset *asset = [assetsFetchResult firstObject];
NSLog(@"identifiers: %@", identifiers);
NSLog(@"assetsFetchResult: %@", assetsFetchResult);
NSLog(@"asset: %@", asset);
if (!asset) {
    NSLog(@"can't retrieve PHAsset from localIdentifier:%@", identifiers);
}
}
Here is the NSLog output when using the above code.
2019-07-19 09:26:36.366739-0400 myApp[1440:328144] identifiers: (
"857AC3DA-C047-4D88-911B-C5FE227E2B96/L0/001"
)
2019-07-19 09:26:36.366945-0400 myApp[1440:328144] assetsFetchResult: <PHFetchResult: 0x281129900> count=0
2019-07-19 09:26:36.366982-0400 myApp[1440:328144] asset: (null)
2019-07-19 09:26:36.367061-0400 myApp[1440:328144] can't retrieve PHAsset from localIdentifier:(
"857AC3DA-C047-4D88-911B-C5FE227E2B96/L0/001"
)
The problem is that you should drop the /L0/001 suffix before requesting the object. I don't understand why the local identifier strings contain these suffixes, but the fetch works if you drop the suffix and use only the plain UUID part.
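A minimal sketch of that trimming step (assuming the identifier always has the UUID/L0/001 shape):
// Keep only the UUID portion of the local identifier before fetching.
NSString *identifier = @"857AC3DA-C047-4D88-911B-C5FE227E2B96/L0/001";
NSString *uuid = [identifier componentsSeparatedByString:@"/"].firstObject;
PHFetchResult *fetched = [PHAsset fetchAssetsWithLocalIdentifiers:@[uuid] options:nil];
PHAsset *asset = fetched.firstObject; // should no longer be nil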

What sort of assets does PHAssetMediaTypeAudio fetch?

I am testing PHAsset, in particular the following method:
+ (PHFetchResult *)fetchAssetsWithMediaType:(PHAssetMediaType)mediaType
options:(PHFetchOptions *)options
I am unclear what PHAssetMediaTypeAudio and fetchAssetsWithMediaType:PHAssetMediaTypeUnknown actually retrieve. This is my test code for illustration:
- (void)testRetrieveAssetsFromPhotoLibrary {
    // this gets me photo library images
    PHFetchResult *imagesResults =
        [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    NSLog(@"Number of images: %i", (int)imagesResults.count);
    // this gets me photo library videos
    PHFetchResult *videoResults =
        [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:nil];
    NSLog(@"Number of video files: %i", (int)videoResults.count);
    // what does this get me??
    // not iTunes-synced music nor Voice Memo recordings...
    // prints 0...
    PHFetchResult *audioResults =
        [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeAudio options:nil];
    NSLog(@"Number of audio files: %i", (int)audioResults.count);
    // what does this get me too??
    // prints 0...
    PHFetchResult *otherResults =
        [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeUnknown options:nil];
    NSLog(@"Number of other files: %i", (int)otherResults.count);
}

How to get only images in the camera roll using Photos Framework

The following code also loads images that are located in iCloud or in Photo Streams. How can we limit the search to only images in the camera roll?
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
To cover the Camera Roll and Photo Stream albums, Apple added the following PHAssetCollectionSubtype values in iOS 8.1:
PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) - fetches the Photo Stream album.
PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) - fetches the Camera Roll album.
Haven't tested if this is backward-compatible with iOS 8.0.x though.
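Putting those together, a sketch in Objective-C (assuming photo-library authorization has already been granted):
// Fetch the Camera Roll smart album, then only its image assets.
PHFetchResult *collections =
    [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                             subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary
                                             options:nil];
PHAssetCollection *cameraRoll = collections.firstObject;
PHFetchOptions *onlyImages = [PHFetchOptions new];
onlyImages.predicate = [NSPredicate predicateWithFormat:@"mediaType = %d", PHAssetMediaTypeImage];
PHFetchResult *cameraRollImages = [PHAsset fetchAssetsInAssetCollection:cameraRoll options:onlyImages];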
Through some experimentation we discovered a hidden property not listed in the documentation (assetSource). Basically you do a regular fetch request, then use a predicate on that property to keep only the assets from the camera roll; for camera-roll assets the value should be 3.
Sample code:
// fetch all assets, then filter down to only the camera roll ones
var results = NSMutableArray()
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
    results.addObject(obj)
}
var cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
If, like me, you were searching for Objective-C code and the answers you found used the deprecated AssetsLibrary instead of the newer Photos framework, then this will help you:
Swift
func getAllPhotosFromCameraRoll() -> [UIImage] {
    // TODO: Add `NSPhotoLibraryUsageDescription` to Info.plist
    PHPhotoLibrary.requestAuthorization { print($0) } // TODO: Move this line of code to somewhere before attempting to access photos
    var images = [UIImage]()
    let requestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .exact
    requestOptions.deliveryMode = .highQualityFormat
    requestOptions.isSynchronous = true
    let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    let manager = PHImageManager.default()
    for i in 0..<fetchResult.count {
        let asset = fetchResult.object(at: i)
        manager.requestImage(
            for: asset,
            targetSize: PHImageManagerMaximumSize,
            contentMode: .default,
            options: requestOptions,
            resultHandler: { (image: UIImage?, info: [AnyHashable: Any]?) -> Void in
                if let image = image {
                    images.append(image)
                }
            })
    }
    return images
}
Objective-C
Global Variables:
NSArray *imageArray;
NSMutableArray *mutableArray;
The method below will help you:
- (void)getAllPhotosFromCamera
{
    imageArray = [[NSArray alloc] init];
    mutableArray = [[NSMutableArray alloc] init];
    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    requestOptions.synchronous = YES;
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    NSLog(@"%d", (int)result.count);
    PHImageManager *manager = [PHImageManager defaultManager];
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];
    // result contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in result) {
        // Because the request is synchronous, the handler runs before the loop continues.
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
            ima = image;
            [images addObject:ima];
        }];
    }
    imageArray = [images copy]; // or use the NSMutableArray `images` directly
}
If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to require that the image be local:
networkAccessAllowed (property)
A Boolean value that specifies whether Photos can download the requested image from iCloud.
Discussion
If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download’s progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler’s info dictionary indicates that the image is not available unless you enable network access.
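A sketch of using that flag to detect iCloud-only assets (assuming `asset` is a PHAsset you fetched earlier):
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = NO; // the default, but explicit: never download from iCloud
[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:PHImageManagerMaximumSize
                                          contentMode:PHImageContentModeDefault
                                              options:options
                                        resultHandler:^(UIImage *image, NSDictionary *info) {
    if ([info[PHImageResultIsInCloudKey] boolValue]) {
        NSLog(@"image is only in iCloud; no local copy was returned");
    }
}];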
This can help. You can use your own data model instead of the AlbumModel I used.
func getCameraRoll() -> AlbumModel {
    var cameraRollAlbum: AlbumModel!
    let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
    cameraRoll.enumerateObjects { (collection, _, _) in
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)
        if assets.count > 0 {
            cameraRollAlbum = AlbumModel(name: collection.localizedTitle!, count: assets.count, collection: collection, assets: assets)
        }
    }
    return cameraRollAlbum
}
Here is an Objective-C version, adapted from Apple's sample code.
- (NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    for (int i = 0; i < fetchResult.count; i++) {
        unsigned long temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if (temp > pictures) {
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];
    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        return nil;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i < [assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}
Here is where you can grab the following details:
PHFetchResult *fetchResult = self.sectionFetchResults[1];
PHCollection *collection = fetchResult[6];
value 1,6 used to get camera images
value 1,0 used to get screenshots
value 1,1 used to get hidden
value 1,2 used to get selfies
value 1,3 used to get recently added
value 1,4 used to get videos
value 1,5 used to get recently deleted
value 1,7 used to get favorites
Apple demo link
Declare your properties:
@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
I've been banging my head over this too. I've found no way to filter for only on-device assets with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I'm able to use requestContentEditingInputWithOptions or requestImageDataForAsset to determine whether an asset is on the device, but this is asynchronous and seems to use far too many resources to do for every asset in the list. There must be a better way.
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
for (int i = 0; i < [fetchResult count]; i++) {
    PHAsset *asset = fetchResult[i];
    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if ([info[PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
            NSLog(@"asset is in cloud");
        } else {
            NSLog(@"asset is on device");
        }
    }];
}
If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns YES if the full original is available on device.
Admittedly this is also fragile, but testing shows it works for all of the assetSource types (Photo Stream, iTunes sync, etc.).
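A sketch of that check (assuming `asset` is a fetched PHAsset):
if ([asset canPerformEditOperation:PHAssetEditOperationContent]) {
    NSLog(@"full original appears to be available on device");
} else {
    NSLog(@"content edit not possible here, likely not fully local");
}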

Adding a list of URLs to an array?

I'm using MHVideoPhotoGallery to create galleries of images that are stored on my website. The current way to add images (as shown in the example on GitHub) is
MHGalleryItem *photo1 = [MHGalleryItem.alloc initWithURL:@"*ENTER IMAGE URL HERE*"
                                             galleryType:MHGalleryTypeImage];
MHGalleryItem *photo2 = [MHGalleryItem.alloc initWithURL:@"*ENTER IMAGE URL HERE*"
                                             galleryType:MHGalleryTypeImage];
MHGalleryItem *photo3 = [MHGalleryItem.alloc initWithURL:@"*ENTER IMAGE URL HERE*"
                                             galleryType:MHGalleryTypeImage];
self.galleryDataSource = @[@[photo1, photo2, photo3]];
But I want to add hundreds of images, and this is not an ideal way to do it. What would be an easier way to accomplish this?
Thanks!
You have to start with a list of the URLs. What I would do is put this in a text file in my bundle. In code, when the app runs, I would open the text file (as an NSString) and split it into an NSArray. Now I've got an NSArray of the URLs. I would then cycle through the NSArray. So now we're inside a loop. For each item in the array, I would initialize an MHGalleryItem and add it to a previously created NSMutableArray with addObject:. Thus we have a two- or three-line loop that runs through all the URLs.
The following is pseudo-code and untested (so it might contain errors), but it should give the general idea of the structure I'm suggesting:
NSMutableArray *temp = [NSMutableArray new];
NSString *s =
    [NSString stringWithContentsOfFile:
         [[NSBundle mainBundle] pathForResource:@"urls" ofType:@"txt"]
                              encoding:NSUTF8StringEncoding
                                 error:nil];
NSArray *urls = [s componentsSeparatedByString:@"\n"];
for (NSString *url in urls) {
    MHGalleryItem *item = [[MHGalleryItem alloc] initWithURL:url
                                                 galleryType:MHGalleryTypeImage];
    [temp addObject:item];
}
self.galleryDataSource = temp;
Loop. If you're putting numbers at the end of your variable names, you need a loop and/or an array.
NSMutableArray *photos = [NSMutableArray new];
NSArray *photoPaths = [NSArray arrayWithContentsOfFile:plistContainingPhotoPaths];
for (NSString *path in photoPaths) {
    NSURL *photoURL = [NSURL URLWithString:path];
    MHGalleryItem *photo = [[MHGalleryItem alloc] initWithURL:photoURL
                                                  galleryType:MHGalleryTypeImage];
    [photos addObject:photo];
}
And don't use dot syntax for alloc, or your code will burst into flames.
Use a naming protocol on your website such as:
www.mywebsite.com/appImageGallery/insertImageNumber
And replace insertImageNumber with the number of your image. Then add this for loop to get all of the images and add them to the array.
NSMutableArray *mutableGalleryDataSource = [self.galleryDataSource mutableCopy];
for (int i = 0; i < numberOfImagesOnWebsite; i++) { // replace numberOfImagesOnWebsite with the number of images on your website
    MHGalleryItem *newItem = [MHGalleryItem.alloc initWithURL:[@"www.mywebsite.com/appImageGallery/" stringByAppendingString:[NSString stringWithFormat:@"%i", i]]
                                                  galleryType:MHGalleryTypeImage];
    [mutableGalleryDataSource addObject:newItem];
}
self.galleryDataSource = mutableGalleryDataSource;
There is also an -addObjectsFromArray: method on NSMutableArray.
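For instance (a one-line sketch; `extraItems` is a hypothetical NSArray of MHGalleryItem objects built elsewhere):
// Append a whole batch of gallery items in one call.
[mutableGalleryDataSource addObjectsFromArray:extraItems];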

How to get the MPMediaItem (audiobook) individual chapters

I know I can get all the audiobooks from the iPod library with:
MPMediaPropertyPredicate *abPredicate =
    [MPMediaPropertyPredicate predicateWithValue:[NSNumber numberWithInt:MPMediaTypeAudioBook]
                                     forProperty:MPMediaItemPropertyMediaType];
MPMediaQuery *abQuery = [[MPMediaQuery alloc] init];
[abQuery addFilterPredicate:abPredicate];
[abQuery setGroupingType:MPMediaGroupingAlbum];
NSArray *books = [abQuery collections];
And I can get the parts/files for each book by using this:
[book items];
What I can't figure out is how to get the separate chapters that make up each part.
I know you can see this in the iPod application by tapping the "track" button in the upper right corner while playing a book. This flips the player around and shows the list of chapters.
Is apple using a private API to get this?
To get the individual chapters you need to create an AVAsset from the MPMediaItem's AssetURL property.
NSURL *assetURL = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
NSArray *locales = [asset availableChapterLocales];
NSLocale *locale = locales.firstObject;
NSArray *chapters = [asset chapterMetadataGroupsWithTitleLocale:locale
                                  containingItemsWithCommonKeys:@[AVMetadataCommonKeyArtwork]];
Get the URL and create the asset, check the available chapter locales, and get the chapters from the asset. The result is an array of AVTimedMetadataGroups that each contain a CMTimeRange and an array of AVMetadataItems. Each AVMetadataItem holds a piece of metadata (e.g. chapter title, chapter artwork).
According to the documentation, the only supported common key is AVMetadataCommonKeyArtwork.
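A sketch of walking the returned groups (assuming chapter titles are present in the metadata; per the note above, artwork is the only documented common key):
for (AVTimedMetadataGroup *group in chapters) {
    CMTimeRange range = group.timeRange; // where this chapter starts and how long it runs
    for (AVMetadataItem *item in group.items) {
        if ([item.commonKey isEqualToString:AVMetadataCommonKeyTitle]) {
            NSLog(@"Chapter \"%@\" starts at %.1fs", item.stringValue, CMTimeGetSeconds(range.start));
        }
    }
}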
