[PHCollectionList canContainCustomKeyAssets]: unrecognized selector sent to instance - ios

Fetching PHAssets is now crashing for me on the latest versions of iOS (9.2 and 9.3). It previously worked fine.
The error I am getting is:
[PHCollectionList canContainCustomKeyAssets]: unrecognized selector sent to instance
Terminating app due to uncaught exception 'NSInvalidArgumentException'
The line throwing the exception is:
PHFetchResult *fetchImage = [PHAsset fetchKeyAssetsInAssetCollection:(PHAssetCollection*)collection options:fetchOptions];
Here is more code, for reference:
Class PHPhotoLibrary_class = NSClassFromString(@"PHPhotoLibrary");
if (PHPhotoLibrary_class) {
    PHFetchResult *fetchResult = self.collectionsFetchResults[indexPath.section];
    PHCollection *collection = fetchResult[indexPath.row];
    cell.textLabel.text = collection.localizedTitle;

    PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
    fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];

    PHFetchResult *fetchImage = [PHAsset fetchKeyAssetsInAssetCollection:(PHAssetCollection *)collection options:fetchOptions];
    PHAsset *asset = [fetchImage firstObject];

    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat scale = [UIScreen mainScreen].scale;
    CGFloat dimension = 90.0;
    CGSize size = CGSizeMake(dimension * scale, dimension * scale);

    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:size
                                               contentMode:PHImageContentModeAspectFill
                                                   options:options
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        if (result) {
            CGSize itemSize = CGSizeMake(60, 60);
            UIGraphicsBeginImageContextWithOptions(itemSize, NO, 2);
            CGRect imageRect = CGRectMake(0.0, 0.0, itemSize.width, itemSize.height);
            [result drawInRect:imageRect];
            cell.imageView.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
        } else {
            UIImage *placeholder = [UIImage imageNamed:@"image-placeholder.jpg"];
            CGSize itemSize = CGSizeMake(60, 60);
            UIGraphicsBeginImageContextWithOptions(itemSize, NO, 2);
            CGRect imageRect = CGRectMake(0.0, 0.0, itemSize.width, itemSize.height);
            [placeholder drawInRect:imageRect];
            cell.imageView.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
        }
    }];
}

The issue actually lies here:
PHCollection *collection = fetchResult[indexPath.row];
and then here:
PHFetchResult *fetchImage = [PHAsset fetchKeyAssetsInAssetCollection:(PHAssetCollection *)collection options:fetchOptions];
You are fetching a PHCollection and then casting it to PHAssetCollection without checking whether that is actually its class.
PHCollection has two concrete subclasses, PHAssetCollection and PHCollectionList, and a PHCollectionList (a folder of albums) is what is throwing the error here, because it does not respond to the selectors the key-asset fetch sends to the collection.
Wrap the code after PHCollection *collection = fetchResult[indexPath.row]; in a check for PHAssetCollection, and it should solve the issue:
if ([collection isKindOfClass:[PHAssetCollection class]]) {
    // code
}
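If you also want to handle the folder case instead of skipping it, a rough sketch (not from the original answer; the variable names collection and fetchOptions come from the question's code, and what you do inside each branch is an assumption) might look like this:

// Branch on the two concrete PHCollection subclasses.
if ([collection isKindOfClass:[PHAssetCollection class]]) {
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *keyAssets = [PHAsset fetchKeyAssetsInAssetCollection:assetCollection
                                                                options:fetchOptions];
    PHAsset *asset = keyAssets.firstObject;
    // ... request the thumbnail for `asset` as before ...
} else if ([collection isKindOfClass:[PHCollectionList class]]) {
    // A folder of albums; it has no assets of its own.
    // You could show a folder icon, or fetch the collections it contains:
    PHFetchResult *nested = [PHCollection fetchCollectionsInCollectionList:(PHCollectionList *)collection
                                                                   options:nil];
    NSLog(@"Folder '%@' contains %lu collections", collection.localizedTitle, (unsigned long)nested.count);
}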

First, get all of the assets using PHAsset, like this:
dispatch_async(dispatch_get_main_queue(), ^{
    AllPhotos = [[NSMutableArray alloc] init];
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    for (PHAsset *asset in result) {
        [AllPhotos addObject:asset];
    }
    [self.collectionView reloadData];
    // code here
});
Now, use the collection view data source method (cellForItemAtIndexPath:) like this:
PHImageManager *manager = [PHImageManager defaultManager];
PHAsset *asset = [AllPhotos objectAtIndex:indexPath.row];

PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
requestOptions.synchronous = NO;
//CGFloat scale = 1.5; //or an even smaller one

if (!manager) {
    manager = [[PHCachingImageManager alloc] init];
}
if (!requestOptions) {
    requestOptions = [[PHImageRequestOptions alloc] init];
}

[manager requestImageForAsset:asset
                   targetSize:CGSizeMake(imagesize.width, imagesize.height)
                  contentMode:PHImageContentModeAspectFill
                      options:requestOptions
                resultHandler:^void(UIImage *image, NSDictionary *info) {
    [imageview setImage:image];
}];
[cell.contentView addSubview:imageview];

Related

ALAssetsLibrary methods deprecated

This is deprecated, what could be the updated code?
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
ALAssetRepresentation *rep = [asset defaultRepresentation];
Use the code below to get all of the pictures from the gallery.
First, you need to import the Photos framework:
#import <Photos/Photos.h>
Request authorization before fetching images:
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    switch (status) {
        case PHAuthorizationStatusAuthorized:
            [self performSelectorOnMainThread:@selector(getAllPictures) withObject:nil waitUntilDone:NO];
            // [self getAllPictures];
            NSLog(@"PHAuthorizationStatusAuthorized");
            break;
        case PHAuthorizationStatusRestricted:
            NSLog(@"PHAuthorizationStatusRestricted");
            break;
        case PHAuthorizationStatusDenied:
            NSLog(@"PHAuthorizationStatusDenied");
            break;
        default:
            break;
    }
}];
- (void)getAllPictures
{
    NSLog(@"Started...");
    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.synchronous = YES;

    PHFetchOptions *allPhotosOptions = [PHFetchOptions new];
    allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];

    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:allPhotosOptions];
    for (PHAsset *asset in result) {
        NSMutableDictionary *dic = [[NSMutableDictionary alloc] init];
        [dic setValue:asset forKey:@"assest"];
        [YOUR_ARRAY insertObject:dic atIndex:0];
        dic = nil;
    }
    NSLog(@"Completed...");
}
You can retrieve an image with the code below:
PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.synchronous = YES;

PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageForAsset:YOUR_ARRAY[INDEX_ARRAY][@"assest"]
                   targetSize:CGSizeMake(self.view.frame.size.width / 3, 200)
                  contentMode:PHImageContentModeDefault
                      options:requestOptions
                resultHandler:^void(UIImage *image, NSDictionary *info) {
    YOUR_IMAGE_VIEW.image = image;
}];
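The original question asked specifically about replacing assetForURL:resultBlock:; here is a minimal sketch of the direct PhotoKit equivalent, assuming the same referenceURL from the question's snippet is still available (note that fetchAssetsWithALAssetURLs: was itself deprecated later in favour of local identifiers):

// A sketch of replacing -[ALAssetsLibrary assetForURL:resultBlock:failureBlock:].
// `referenceURL` is the ALAsset URL from the original code; everything else is assumed.
PHFetchResult *assets = [PHAsset fetchAssetsWithALAssetURLs:@[referenceURL] options:nil];
PHAsset *asset = assets.firstObject;
if (asset) {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.networkAccessAllowed = YES; // allow iCloud downloads
    [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                      options:options
                                                resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // Roughly what -[ALAssetRepresentation getBytes:...] gave you before.
        NSLog(@"Got %lu bytes of image data (%@)", (unsigned long)imageData.length, dataUTI);
    }];
}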

Can't read some files from camera roll

I wrote a simple library to get photos from the camera roll. Unfortunately, I can't read some of them: I can't preview them or convert them to NSData.
PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.includeAssetSourceTypes = PHAssetSourceTypeUserLibrary;

PHFetchResult *allPhotosResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];

PHImageRequestOptions *requestOptionForPhotos = [[PHImageRequestOptions alloc] init];
requestOptionForPhotos.networkAccessAllowed = YES;

for (PHAsset *asset in allPhotosResult) {
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(100, 100)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:requestOptionForPhotos
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        NSData *data = UIImagePNGRepresentation(result);
        NSString *base = [data base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed]; // nil for some of the photos
    }];
}
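A diagnostic sketch, not part of the original thread: inspecting the info dictionary in the result handler usually reveals why result is nil for particular assets (a delivery error, a cancelled request, an image still in iCloud, or only a degraded first pass). This is a variant of the request inside the loop above, so asset and requestOptionForPhotos are assumed to be in scope:

[[PHImageManager defaultManager] requestImageForAsset:asset
                                            targetSize:CGSizeMake(100, 100)
                                           contentMode:PHImageContentModeAspectFill
                                               options:requestOptionForPhotos
                                         resultHandler:^(UIImage *result, NSDictionary *info) {
    if (result == nil) {
        NSError *error = info[PHImageErrorKey];                       // the request failed
        BOOL cancelled = [info[PHImageCancelledKey] boolValue];       // the request was cancelled
        BOOL inCloud   = [info[PHImageResultIsInCloudKey] boolValue]; // full image not local yet
        NSLog(@"No image for %@: error=%@ cancelled=%d inCloud=%d",
              asset.localIdentifier, error, cancelled, inCloud);
        return;
    }
    if ([info[PHImageResultIsDegradedKey] boolValue]) {
        // This is only a low-quality preview; expect a second callback with the full image.
    }
    NSData *data = UIImagePNGRepresentation(result);
    NSString *base = [data base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"Encoded %lu characters", (unsigned long)base.length);
}];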

Can't read PHAsset content asynchronous

I want to read a specified file (a photo from the camera roll) asynchronously, but it does not work for me.
The variable tempData stays nil until I change requestOptionForPhotos.synchronous to YES; then everything is OK, but I don't want to perform this code synchronously.
Is it possible that I'm blocking access to the photo by requesting the same file on another thread? I'm a newbie in Objective-C and iOS programming and I don't know how this works.
NSURL *assetUrl = [[NSURL alloc] initWithString:filepath];
PHFetchResult *collection = [PHAsset fetchAssetsWithALAssetURLs:[NSArray arrayWithObject:assetUrl] options:nil];

PHImageRequestOptions *requestOptionForPhotos = [[PHImageRequestOptions alloc] init];
requestOptionForPhotos.networkAccessAllowed = YES;
requestOptionForPhotos.synchronous = NO;

__block BOOL isFinished = NO;
__block NSData *tempData = nil;

for (PHAsset *asset in collection) {
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(80, 80)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:requestOptionForPhotos
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        tempData = UIImagePNGRepresentation(result);
        isFinished = YES;
    }];
}
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_async(queue, ^{
    NSURL *assetUrl = [[NSURL alloc] initWithString:filepath];
    PHFetchResult *collection = [PHAsset fetchAssetsWithALAssetURLs:[NSArray arrayWithObject:assetUrl] options:nil];

    PHImageRequestOptions *requestOptionForPhotos = [[PHImageRequestOptions alloc] init];
    requestOptionForPhotos.networkAccessAllowed = YES;
    requestOptionForPhotos.synchronous = NO;

    __block BOOL isFinished = NO;
    __block NSData *tempData = nil;

    for (PHAsset *asset in collection) {
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                    targetSize:CGSizeMake(80, 80)
                                                   contentMode:PHImageContentModeAspectFill
                                                       options:requestOptionForPhotos
                                                 resultHandler:^(UIImage *result, NSDictionary *info) {
            tempData = UIImagePNGRepresentation(result);
            isFinished = YES;
        }];
    }
});
Try this code to get the image asynchronously, and put a breakpoint in the result handler to check whether you are getting the image or not.
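Note that the dispatch_async wrapper above does not change when the result handler fires; tempData is still nil until the callback runs. Here is a sketch of an alternative, assuming the caller can accept a completion block instead of reading tempData immediately (the method name is made up):

// A hypothetical wrapper: deliver the data via a completion block instead of
// reading a __block variable right after kicking off the request.
- (void)loadPhotoDataForURL:(NSString *)filepath completion:(void (^)(NSData *data))completion {
    NSURL *assetUrl = [NSURL URLWithString:filepath];
    PHFetchResult *assets = [PHAsset fetchAssetsWithALAssetURLs:@[assetUrl] options:nil];
    PHAsset *asset = assets.firstObject;
    if (!asset) { completion(nil); return; }

    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
    options.synchronous = NO;                                                   // stays asynchronous
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;  // single callback

    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(80, 80)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:options
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        completion(result ? UIImagePNGRepresentation(result) : nil);
    }];
}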

Objective-C: Getting PNG Thumbnail from Movie with NSData

I have the following code to attempt to get a screenshot of a video file from NSData. I can confirm the NSData is valid and not nil; however, both dataString and movieURL are coming back nil.
- (UIImage *)imageFromMovie:(NSData *)movieData {
    // set up the movie player
    NSString *dataString = [[NSString alloc] initWithData:movieData encoding:NSUTF8StringEncoding];
    NSURL *movieURL = [NSURL URLWithString:dataString];

    // get the thumbnail
    AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:movieURL options:nil];
    AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
    generate1.appliesPreferredTrackTransform = YES;
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 2);
    CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
    UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
    return (one);
}
EDIT: Here's a look at where/how I'm getting the NSData from the UIImagePicker
if ([mediaType isEqualToString:@"ALAssetTypeVideo"]) {
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary assetForURL:[[info objectAtIndex:x] valueForKey:UIImagePickerControllerReferenceURL] resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        unsigned long DataSize = (unsigned long)[rep size];
        Byte *buffer = (Byte *)malloc(DataSize);
        NSUInteger buffered = [rep getBytes:buffer fromOffset:0.0 length:DataSize error:nil];
        // here's the NSData
        NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    } failureBlock:^(NSError *err) {
        NSLog(@"Error: %@", [err localizedDescription]);
    }];
}
Possibly you have problems with the encoding.
NSString's -initWithData:encoding: method returns nil if the data does not represent valid data for that encoding (https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSString_Class/#//apple_ref/occ/instm/NSString/initWithData:encoding:).
Try using the correct encoding with -initWithData:encoding:.
You are trying to convert the raw movie data into a URL string; that's why you are getting a nil URL.
In your implementation, you can get the thumbnail in the following way:
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:[[info objectAtIndex:x] valueForKey:UIImagePickerControllerReferenceURL] options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
CGImageRelease(oneRef); // copyCGImageAtTime: returns a +1 reference that ARC does not manage
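If all you have is the NSData itself rather than the picker's reference URL, a minimal sketch (not from the original answers) is to write the data to a temporary movie file and point AVAssetImageGenerator at that file URL, since AVURLAsset cannot be created from raw data directly:

// A sketch: persist the NSData to a temporary .mov and generate the thumbnail from the file URL.
- (UIImage *)imageFromMovieData:(NSData *)movieData {
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"thumbnail-source.mov"];
    if (![movieData writeToFile:path atomically:YES]) {
        return nil;
    }

    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path] options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;

    NSError *error = nil;
    CMTime time = CMTimeMake(1, 2);
    CGImageRef imageRef = [generator copyCGImageAtTime:time actualTime:NULL error:&error];
    if (!imageRef) {
        NSLog(@"Thumbnail generation failed: %@", error);
        return nil;
    }

    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef];
    CGImageRelease(imageRef); // copyCGImageAtTime: returns a +1 reference
    return image;
}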
Download my sample project before reading this answer from:
https://drive.google.com/open?id=0B_exgT43OZJOWl9HMDJCR0cyTW8
I know it's been a long time since you posted this question, but I found it, can answer it, and am reasonably confident that, unless you used Apple's sample code that does exactly what you're asking, you still need an answer. I base that solely on the fact that this is hard to figure out.
Nonetheless, I have a basic, working project that addresses your question; however, before looking at it, check out a video I made of it running on my iPhone 6s Plus:
https://www.youtube.com/embed/GiF-FFKvy5M
As you can see, the poster frame for every asset in my iPhone's video collection is displayed in a UICollectionViewCell. In the UICollectionViewController (or wherever your UICollectionView data source/delegate lives):
void (^renderThumbnail)(NSIndexPath *, CustomCell *) = ^(NSIndexPath *indexPath, CustomCell *cell) {
    [[PHImageManager defaultManager] requestAVAssetForVideo:AppDelegate.assetsFetchResults[indexPath.section]
                                                    options:nil
                                              resultHandler:^(AVAsset * _Nullable asset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
        cell.asset = [asset copy];
        cell.frameTime = [NSValue valueWithCMTime:kCMTimeZero];
    }];
};

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    PHAsset *phAsset = AppDelegate.assetsFetchResults[indexPath.section];
    CustomCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath];
    cell.representedAssetIdentifier = phAsset.localIdentifier;

    CGFloat hue = (CGFloat)indexPath.section / 5;
    cell.backgroundColor = [UIColor colorWithHue:hue saturation:1.0f brightness:0.5f alpha:1.0f];

    if ([cell.representedAssetIdentifier isEqualToString:phAsset.localIdentifier]) {
        NSPurgeableData *data = [self.thumbnailCache objectForKey:phAsset.localIdentifier];
        [data beginContentAccess];
        UIImage *image = [UIImage imageWithData:data];
        if (image != nil) {
            cell.contentView.layer.contents = (__bridge id)image.CGImage;
            NSLog(@"Cached image found");
        } else {
            renderThumbnail(indexPath, cell);
        }
        [data endContentAccess];
        [data discardContentIfPossible];
    }

    // Request an image for the asset from the PHCachingImageManager.
    /*[AppDelegate.imageManager requestImageForAsset:phAsset
                                          targetSize:cell.contentView.bounds.size
                                         contentMode:PHImageContentModeAspectFill
                                             options:nil
                                       resultHandler:^(UIImage *result, NSDictionary *info) {
        // Set the cell's thumbnail image if it's still showing the same asset.
        if ([cell.representedAssetIdentifier isEqualToString:phAsset.localIdentifier]) {
            cell.thumbnailImage = result;
        }
    }];*/

    return cell;
}
In the UICollectionViewCell subclass:
@implementation CustomCell

- (void)prepareForReuse {
    [super prepareForReuse];
    _asset = nil;
    _frameTime = nil;
    _thumbnailImage = nil;
    [self.contentView.layer setContents:nil];
    [[self contentView] setContentMode:UIViewContentModeScaleAspectFit];
    [[self contentView] setClipsToBounds:YES];
}

- (void)dealloc {
}

- (void)setAsset:(AVAsset *)asset {
    _asset = asset;
}

- (void)setFrameTime:(NSValue *)frameTime {
    _frameTime = frameTime;
    dispatch_queue_t concurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_async(concurrentQueue, ^{
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:_asset];
        imageGenerator.appliesPreferredTrackTransform = YES;
        imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
        imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
        [imageGenerator generateCGImagesAsynchronouslyForTimes:@[frameTime]
                                             completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
            dispatch_sync(dispatch_get_main_queue(), ^{
                self.thumbnailImage = [UIImage imageWithCGImage:image scale:25.0 orientation:UIImageOrientationUp];
            });
        }];
    });
}

- (void)setThumbnailImage:(UIImage *)thumbnailImage {
    _thumbnailImage = thumbnailImage;
    self.contentView.layer.contents = (__bridge id)_thumbnailImage.CGImage;
}

@end
The NSCache is set up like this:
self.thumbnailCache = [[NSCache alloc] init];
self.thumbnailCache.name = @"Thumbnail Cache";
self.thumbnailCache.delegate = self;
self.thumbnailCache.evictsObjectsWithDiscardedContent = true;
self.thumbnailCache.countLimit = AppDelegate.assetsFetchResults.count;
The PHAssets were acquired this way:
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            self->_assetCollection = smartAlbums.firstObject;
            if (![self->_assetCollection isKindOfClass:[PHAssetCollection class]]) self->_assetCollection = nil;

            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];

            i = [PHAsset fetchAssetsInAssetCollection:self->_assetCollection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    return i;
}

Cannot get a clear image requested from Photo Album using PhotoKit

I am using PhotoKit to fetch the photos in one of the system albums like this:
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];

PHFetchResult *fetchResult = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];
[fetchResult enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
    NSLog(@"ALBUM NAME: %@", collection.localizedTitle);
    if ([collection.localizedTitle isEqualToString:@"Camera Roll"]) {
        PHFetchResult *photos = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
        NSLog(@"PHOTOS: %ld", photos.count);
        _photos = nil;
        _photos = @[].mutableCopy;
        for (PHAsset *asset in photos) {
            [self loadImageFromPHAsset:asset];
        }
    }
}];
Then, in my custom helper method -(void)loadImageFromPHAsset:(PHAsset *)asset, I have this:
- (void)loadImageFromPHAsset:(PHAsset *)asset
{
    PHImageManager *manager = [PHImageManager defaultManager];
    CGSize targetSize = _layout.itemSize;
    [manager requestImageForAsset:asset
                       targetSize:targetSize
                      contentMode:PHImageContentModeAspectFill
                          options:nil
                    resultHandler:^(UIImage *result, NSDictionary *info) {
        [_photos addObject:result];
    }];
    [self.collectionView reloadData];
}
So I have an array of images and I present them in my UICollectionViewCell:
UIImage *image = [_photos objectAtIndex:indexPath.row];
cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
cell.imageView.image = image;
But the image I get is very blurry. How can I make it clear?
First of all, the targetSize should be multiplied by the screen's scale, like this:
CGFloat scale = [UIScreen mainScreen].scale;
CGSize targetSize = CGSizeMake(_layout.itemSize.width*scale, _layout.itemSize.height*scale);
Secondly, try these options to get an image of the exact size:
//Async call returned on main thread, can return multiple times
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;
Lastly, if you have performance concerns, use PHCachingImageManager; I have answered a question about the performance improvements it gives here.
Hope this helps you.
Edit:
I did not realise that the images were stored in an array. Try setting the synchronous flag in image options.
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.synchronous = YES;
options.resizeMode = PHImageRequestOptionsResizeModeExact;
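For reference, here is a rough sketch of the PHCachingImageManager approach mentioned above; the fetch, the use of _layout.itemSize from the question, and where you call each part from are assumptions, not code from the original answer:

// A sketch: pre-warm thumbnails with PHCachingImageManager.
PHCachingImageManager *cachingManager = [[PHCachingImageManager alloc] init];

PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
NSMutableArray<PHAsset *> *assets = [NSMutableArray array];
for (PHAsset *asset in fetchResult) {
    [assets addObject:asset];
}

CGFloat scale = [UIScreen mainScreen].scale;
CGSize targetSize = CGSizeMake(_layout.itemSize.width * scale, _layout.itemSize.height * scale);

// Warm the cache once, e.g. in viewDidLoad.
[cachingManager startCachingImagesForAssets:assets
                                 targetSize:targetSize
                                contentMode:PHImageContentModeAspectFill
                                    options:nil];

// Then, per cell, request from the same manager (the asset would be assets[indexPath.row]).
[cachingManager requestImageForAsset:assets.firstObject
                          targetSize:targetSize
                         contentMode:PHImageContentModeAspectFill
                             options:nil
                       resultHandler:^(UIImage *result, NSDictionary *info) {
    // assign `result` to the cell's image view here
}];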
