Getting a fixed-size image from PHCachingImageManager - iOS

I'm trying to get thumbnails from my PHCachingImageManager so that I can put them into the built-in imageView in my UITableViewCells.
NSLog(@"thumbnail size: %@", NSStringFromCGSize(AssetGridThumbnailSize));

[self.imageManager requestImageForAsset:asset
                             targetSize:AssetGridThumbnailSize
                            contentMode:PHImageContentModeAspectFill
                                options:nil
                          resultHandler:^(UIImage *result, NSDictionary *info) {
    NSLog(@"image size: %@", NSStringFromCGSize(result.size));
    // Only update the thumbnail if the cell tag hasn't changed. Otherwise, the cell has been re-used.
    if (cell.tag == currentTag) {
        cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
        cell.imageView.image = result;
    }
}];
I can see from the logs that AssetGridThumbnailSize is 80 x 80 pixels (40 x 40 points at retina scale):
thumbnail size: {80, 80}
and I've set the contentMode to PHImageContentModeAspectFill, but the images I get back are all different sizes, which makes the UITableView look very chaotic.
How can I make the PHCachingImageManager give me back an image of the right size?

While writing the question, I figured out the answer, so I decided to post it anyway in the hope that it helps someone else.
The targetSize is just a suggestion. In order to really control the size of the returned images you have to pass in a PHImageRequestOptions object with resizeMode = PHImageRequestOptionsResizeModeExact.
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;

[self.imageManager requestImageForAsset:asset
                             targetSize:AssetGridThumbnailSize
                            contentMode:PHImageContentModeAspectFill
                                options:options
                          resultHandler:^(UIImage *result, NSDictionary *info) {
    // result is now returned at exactly AssetGridThumbnailSize
    if (cell.tag == currentTag) {
        cell.imageView.image = result;
    }
}];

Related

iOS Photokit - PHAsset pixelWidth and pixelHeight does not match high-resolution image

My company is having a big problem getting correct size metadata when fetching PHAssets.
We have developed an iOS application that lets customers choose pictures from their library, gets the pixel size of each one, calculates coordinates for fitting them to the gadgets we sell, and then uploads a high-quality version of each picture to our server so the gadgets can be printed.
For some of our customers, the problem is that the pixel size of some of the uploaded high-quality versions does not match the pixelWidth and pixelHeight reported by the PHAsset object.
For example, we have a picture that:
is reported as 2096x3724 by the PHAsset object,
but produces a 1536x2730 image when the full-size version is requested.
The picture is not in iCloud and was sent from an iPhone 6 SE running iOS 10.2.
This is the code we use to request the full-size image:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;

PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageForAsset:imageAsset
                        targetSize:PHImageManagerMaximumSize
                       contentMode:PHImageContentModeDefault
                           options:imgOpts
                     resultHandler:^(UIImage *result, NSDictionary *info) {
    NSData *imageData = UIImageJPEGRepresentation(result, 0.92f);
    // UPLOAD OF imageData TO SERVER HERE
}];
We also tried the requestImageDataForAsset: method, but with no luck:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;

PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageDataForAsset:imageAsset
                               options:imgOpts
                         resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // UPLOAD OF imageData TO SERVER HERE
}];
Getting the exact size from the high-resolution version of every picture before uploading is not an option for us, because it would badly degrade performance when a large number of assets are selected from the library.
Are we missing something or doing something wrong?
Is there a way to get an asset's pixel size without loading the full-resolution image into memory?
Thanks for helping
This is due to a bug in the Photos framework. Details about the bug can be found here.
Sometimes, after a photo is edited, a smaller version is created. This only occurs for some larger photos.
Calling either requestImageForAsset: (with PHImageManagerMaximumSize) or requestImageDataForAsset: (with PHImageRequestOptionsDeliveryModeHighQualityFormat) will read the data from the smaller file version, when trying to retrieve the edited version (PHImageRequestOptionsVersionCurrent).
The info dictionary in the callback of the above methods contains the path to the file that was read. For example:
PHImageFileURLKey = "file:///[...]DCIM/100APPLE/IMG_0006/Adjustments/IMG_0006.JPG";
Inspecting that folder, I was able to find another image, FullSizeRender.jpg - this one has the full size and contains the latest edits. Thus, one way would be to try and read from the FullSizeRender.jpg, when such a file is present.
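To illustrate that first idea, here is a rough sketch (not from the original answer) of checking for a FullSizeRender.jpg next to the adjusted file. It relies on the undocumented PHImageFileURLKey and on Apple's internal on-disk layout, and app sandboxing may prevent reading the path directly, so the PHAssetResourceManager approach described next is generally more reliable:

// Hypothetical sketch: look for FullSizeRender.jpg beside the adjusted image.
// PHImageFileURLKey is undocumented and the file layout is an implementation
// detail, so always fall back to the regular Photos APIs if this fails.
NSURL *reportedURL = info[@"PHImageFileURLKey"];
NSURL *fullSizeURL = [[reportedURL URLByDeletingLastPathComponent]
                      URLByAppendingPathComponent:@"FullSizeRender.jpg"];
NSData *fullSizeData = nil;
if (fullSizeURL != nil && [[NSFileManager defaultManager] fileExistsAtPath:fullSizeURL.path]) {
    fullSizeData = [NSData dataWithContentsOfURL:fullSizeURL];
}
if (fullSizeData == nil) {
    // Fall back to the data returned by requestImageDataForAsset:.
}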
Starting with iOS 9, it's also possible to fetch the latest edit, at highest resolution, using the PHAssetResourceManager:
// if (@available(iOS 9.0, *)) {

// check if a high quality edit is available
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:_asset];
PHAssetResource *hqResource = nil;
for (PHAssetResource *res in resources) {
    if (res.type == PHAssetResourceTypeFullSizePhoto) {
        // from my tests so far, this is only present for edited photos
        hqResource = res;
        break;
    }
}

if (hqResource) {
    PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
    options.networkAccessAllowed = YES;

    long long fileSize = [[hqResource valueForKey:@"fileSize"] longLongValue];
    NSMutableData *fullData = [[NSMutableData alloc] initWithCapacity:fileSize];

    [[PHAssetResourceManager defaultManager] requestDataForAssetResource:hqResource
                                                                 options:options
                                                     dataReceivedHandler:^(NSData * _Nonnull data) {
        // append the data that we're receiving
        [fullData appendData:data];
    } completionHandler:^(NSError * _Nullable error) {
        // handle completion, using `fullData` or `error`
        // uti == hqResource.uniformTypeIdentifier
        // orientation == UIImageOrientationUp
    }];
}
else {
    // use `requestImageDataForAsset:`, `requestImageForAsset:` or `requestDataForAssetResource:` with a different `PHAssetResource`
}
Can you try this to fetch Camera Roll pics:
__weak __typeof(self) weakSelf = self;
PHFetchResult<PHAssetCollection *> *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumSelfPortraits options:nil];
[albums enumerateObjectsUsingBlock:^(PHAssetCollection * _Nonnull album, NSUInteger idx, BOOL * _Nonnull stop) {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.wantsIncrementalChangeDetails = YES;
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    PHFetchResult<PHAsset *> *assets = [PHAsset fetchAssetsInAssetCollection:album options:options];
    if (assets.count > 0) {
        [assets enumerateObjectsUsingBlock:^(PHAsset * _Nonnull asset, NSUInteger idx, BOOL * _Nonnull stop) {
            if (asset != nil) {
                [[PHImageManager defaultManager] requestImageForAsset:asset
                                                           targetSize:PHImageManagerMaximumSize
                                                          contentMode:PHImageContentModeAspectFill
                                                              options:nil
                                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [weakSelf addlocalNotificationForFilters:result];
                        // [weakSelf.buttonGalery setImage:result forState:UIControlStateNormal];
                    });
                }];
                *stop = YES;
            }
            else {
                [weakSelf getlatestAferSelfie];
            }
        }];
    }
}];

How to store UIImage in NSMutableArray without loss of quality

I have a problem. I'm using the Photos framework, and when I do this (images is an NSMutableArray):
[[PHImageManager defaultManager] requestImageForAsset:lastAsset
                                            targetSize:CGSizeMake(150, 300)
                                           contentMode:PHImageContentModeAspectFit
                                               options:nil
                                         resultHandler:^(UIImage *result, NSDictionary *info) {
    [images addObject:result];
    NSLog(@"%@ %@ %@", result, [images objectAtIndex:0], [images lastObject]);
}];
I get these results for the sizes of my UIImages in the console:
700,700 60,40 700,700
I don't understand why, and when I display an image from my NSMutableArray, which is named "images":
view.image = [images objectAtIndex:index];
all my images are in very bad quality (I think they are all stored as 60x40 in my images NSMutableArray).
I want to recover them at their original quality (700x700, not 60x40) so I can display them with good quality.
Thanks a lot everyone !!
I have found the solution to my problem.
The solution is to add these options:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.version = PHImageRequestOptionsVersionUnadjusted;
options.synchronous = YES;
to my request.
Note the deliveryMode, which is very important for keeping good image quality.
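For completeness, a rough sketch (not from the original answer) of passing those options into the request from the question above; lastAsset and images are the names used in the question:

// Sketch: the same request as in the question, now with the options attached.
// synchronous = YES blocks the calling thread, so avoid doing this for many
// assets on the main queue.
[[PHImageManager defaultManager] requestImageForAsset:lastAsset
                                            targetSize:CGSizeMake(150, 300)
                                           contentMode:PHImageContentModeAspectFit
                                               options:options
                                         resultHandler:^(UIImage *result, NSDictionary *info) {
    [images addObject:result];
}];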
Thanks for the help everyone!

Getting a poster frame from a video with PhotoKit (iOS 8)

I want to extract a UIImage from a video asset to use as a poster. According to the documentation for PHImageManager, I should be able to use requestImageForAsset:targetSize:contentMode:options:resultHandler:. Quoting from the documentation:
You can use this method for both photo and video assets—for a video asset, an image request provides a thumbnail image or poster frame.
That hasn't been my experience though. Using requestImageForAsset:targetSize:contentMode:options:resultHandler: with a video asset, the callback block always returns nil for the image and a nil error. The info dictionary returned is as follows (nothing I could make sense of):
{
PHImageFileSandboxExtensionTokenKey = "31c0997752ae82ee32953503bd6d9a2436c50fac;00000000;00000000;000000000000001a;com.apple.app-sandbox.read;00000001;01000003;00000000000756cf;/private/var/mobile/Media/DCIM/100APPLE/IMG_0008.MOV";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/100APPLE/IMG_0008.MOV";
PHImageFileUTIKey = "dyn.ah62d4uv4ge804550";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultRequestIDKey = 26;
PHImageResultWantedImageFormatKey = 9999;
}
Below is the method I wrote in a PHAsset category to extract an image from a video PHAsset. Has anyone been able to make this work?
@implementation PHAsset (util)

- (PHImageRequestID)fullSizeImage:(void (^)(UIImage *image, NSError *error))resultHandler {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    PHImageContentMode contentMode = PHImageContentModeAspectFill;
    PHImageManager *imageManager = [PHImageManager defaultManager];
    CGSize targetSize = PHImageManagerMaximumSize;
    return [imageManager requestImageForAsset:self
                                   targetSize:targetSize
                                  contentMode:contentMode
                                      options:nil
                                resultHandler:^(UIImage *result, NSDictionary *info) {
        NSError *error = (NSError *)[info objectForKey:PHImageErrorKey];
        if (result == nil) {
            NSLog(@"ERROR while fetching fullSizeImage %@, info:\n%@", error, info);
        }
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            resultHandler(result, error);
        }];
    }];
}

@end
I got it to work after substituting
CGSize targetSize = PHImageManagerMaximumSize ;
with...
CGSize targetSize = CGSizeMake(self.pixelWidth*ratio, self.pixelHeight*ratio) ;
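As a hedged sketch of how that substituted line might be completed (the ratio variable is not defined in the answer above; here it is assumed to be a scale factor, with maxDimension as a hypothetical cap chosen by the caller):

// Sketch: scale the requested size so the longer side stays under a cap.
// maxDimension is a hypothetical value; adjust it to the poster size you need.
CGFloat maxDimension = 1024.0;
CGFloat longestSide = (CGFloat)MAX(self.pixelWidth, self.pixelHeight);
CGFloat ratio = MIN(1.0, maxDimension / longestSide);
CGSize targetSize = CGSizeMake(self.pixelWidth * ratio, self.pixelHeight * ratio);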
I haven't seen any relevant documentation, so this is probably a bug in iOS 8.0.x (at the time of this writing, the iOS 8.1 beta is available but I haven't tested on it).

How to fetch squared thumbnails from PHImageManager?

Does anybody have an idea how to fetch square thumbnails from PHImageManager? The PHImageContentModeAspectFill option has no effect.
[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:CGSizeMake(80, 80)
             contentMode:PHImageContentModeAspectFill
                 options:nil
           resultHandler:^(UIImage *result, NSDictionary *info) {
    // sadly result is not a squared image
    imageView.image = result;
}];
Update:
The bug in cropping images as they were retrieved from PHImageManager was fixed in iOS 8.3, so for that version of iOS and later, my original example works, as shown below.
It seems the bugs are still there up to and including iOS 8.4; I can reproduce them with standard iPhone 6s back-camera images when taking a full-size square crop. They are properly fixed in iOS 9.0, where even large crops of a 63-megapixel panorama work fine.
The approach Apple defines is to pass a CGRect in the co-ordinate space of the image, where the origin is (0,0) and the maximum is (1,1). You pass this rect in the PHImageRequestOptions object, along with a resizeMode of PHImageRequestOptionsResizeModeExact, and then you should get back a cropped image.
- (void)showSquareImageForAsset:(PHAsset *)asset
{
    NSInteger retinaScale = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(100 * retinaScale, 100 * retinaScale);

    PHImageRequestOptions *cropToSquare = [[PHImageRequestOptions alloc] init];
    cropToSquare.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
    CGRect square = CGRectMake(0, 0, cropSideLength, cropSideLength);
    CGRect cropRect = CGRectApplyAffineTransform(square,
                                                 CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                            1.0 / asset.pixelHeight));
    cropToSquare.normalizedCropRect = cropRect;

    [[PHImageManager defaultManager]
        requestImageForAsset:(PHAsset *)asset
                  targetSize:retinaSquare
                 contentMode:PHImageContentModeAspectFit
                     options:cropToSquare
               resultHandler:^(UIImage *result, NSDictionary *info) {
        self.imageView.image = result;
    }];
}
This example makes its cropRect with a side length equal to the smaller of the asset's width and height, and then transforms it into the image's coordinate space using CGRectApplyAffineTransform. You may want to set the origin of square to something other than (0,0), since often you want the crop square centred along the axis of the image being cropped, but I'll leave that as an exercise for the reader. :-)
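As a rough sketch (not part of the original answer) of centring the crop square along the longer axis before normalising it:

// Sketch: centre the square crop on the longer axis of the asset,
// then convert it into the normalized (0..1) crop coordinate space.
CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
CGFloat originX = (asset.pixelWidth - cropSideLength) / 2.0;
CGFloat originY = (asset.pixelHeight - cropSideLength) / 2.0;
CGRect square = CGRectMake(originX, originY, cropSideLength, cropSideLength);
CGRect cropRect = CGRectApplyAffineTransform(square,
                                             CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                        1.0 / asset.pixelHeight));
cropToSquare.normalizedCropRect = cropRect;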
Original Answer:
John's answer got me most of the way there, but using his code I was getting stretched and squashed images. Here's how I got an imageView to display square thumbnails fetched from the PHImageManager.
Firstly, ensure that the contentMode property for your UIImageView is set to ScaleAspectFill. The default is ScaleToFill, which doesn't work correctly for displaying square thumbnails from PHImageManager, so make sure you change this whether you've instantiated the UIImageView in code or in the storyboard.
// view dimensions are based on points, but we're requesting pixels from PHImageManager
NSInteger retinaMultiplier = [UIScreen mainScreen].scale;
CGSize retinaSquare = CGSizeMake(imageView.bounds.size.width * retinaMultiplier, imageView.bounds.size.height * retinaMultiplier);

[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:retinaSquare
             contentMode:PHImageContentModeAspectFill
                 options:nil
           resultHandler:^(UIImage *result, NSDictionary *info) {
    // The result is not square, but correctly displays as a square using AspectFill
    imageView.image = result;
}];
Specifying PHImageRequestOptionsResizeModeExact for the resizeMode is not required: it will not give you a cropped image unless you also supply a normalizedCropRect, it provides no benefit here, and using it means you lose the benefit of quickly returned cached images.
The UIImage returned in result will have the same aspect ratio as the source, but is scaled correctly for use in a UIImageView that is set to aspect fill, so it displays as a square. If you're just displaying it, this is the way to go. If you need to crop the image for print or export outside of the app, this isn't what you want - look into the use of normalizedCropRect for that. (edit: see below for an example of what should work...)
In addition, make sure that you set the content mode of the UIImageView to UIViewContentModeScaleAspectFill and that you set clipsToBounds = YES, with the following two lines:
imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES;
Edit to add normalizedCropRect usage example
WARNING - this doesn't work, but should according to Apple's documentation.
The approach Apple defines is to pass a CGRect in the co-ordinate space of the image, where the origin is (0,0) and the maximum is (1,1). You pass this rect in the PHImageRequestOptions object, along with a resizeMode of PHImageRequestOptionsResizeModeExact, and then you should get back a cropped image. The problem is that you don't: it comes back with the original aspect ratio and the full image.
I've verified that the crop rect is created correctly in the image's co-ordinate space, and followed the instruction to use PHImageRequestOptionsResizeModeExact, but the result handler will still be passed an image in the original aspect ratio. This seems to be a bug in the framework, and when it is fixed, the following code should work.
- (void)showSquareImageForAsset:(PHAsset *)asset
{
    NSInteger retinaScale = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(100 * retinaScale, 100 * retinaScale);

    PHImageRequestOptions *cropToSquare = [[PHImageRequestOptions alloc] init];
    cropToSquare.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
    CGRect square = CGRectMake(0, 0, cropSideLength, cropSideLength);
    CGRect cropRect = CGRectApplyAffineTransform(square,
                                                 CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                            1.0 / asset.pixelHeight));
    cropToSquare.normalizedCropRect = cropRect;

    [[PHImageManager defaultManager]
        requestImageForAsset:(PHAsset *)asset
                  targetSize:retinaSquare
                 contentMode:PHImageContentModeAspectFit
                     options:cropToSquare
               resultHandler:^(UIImage *result, NSDictionary *info) {
        self.imageView.image = result;
    }];
}
All I can suggest is that if you have this problem, you file a radar with Apple to request that they fix it!
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;

NSInteger retinaMultiplier = [UIScreen mainScreen].scale;
CGSize retinaSquare = CGSizeMake(imageView.bounds.size.width * retinaMultiplier, imageView.bounds.size.height * retinaMultiplier);

[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:retinaSquare
             contentMode:PHImageContentModeAspectFill
                 options:options
           resultHandler:^(UIImage *result, NSDictionary *info) {
    imageView.image = [UIImage imageWithCGImage:result.CGImage scale:retinaMultiplier orientation:result.imageOrientation];
}];
To get an exact square, you'll have to indicate that you want an exact size by passing options, like so:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
// No, really, we want this exact size
options.resizeMode = PHImageRequestOptionsResizeModeExact;

[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:CGSizeMake(160, 160)
             contentMode:PHImageContentModeAspectFill
                 options:options
           resultHandler:^(UIImage *result, NSDictionary *info) {
    // Happily, result is now a squared image
    imageView.image = result;
}];
let squareSize = CGSize(width: 100, height: 100)

let options = PHImageRequestOptions()
options.resizeMode = .exact
options.deliveryMode = .highQualityFormat

PHImageManager.default().requestImage(for: asset, targetSize: squareSize, contentMode: .aspectFill, options: options) { (image, _) in
    // Use the image.
}
This is working fine for me:
__block UIImage *imgThumb;
CGSize size = CGSizeMake(45, 45); // size for square image

[self.imageManager requestImageForAsset:<your_current_phAsset>
                             targetSize:size
                            contentMode:PHImageContentModeAspectFill
                                options:nil
                          resultHandler:^(UIImage *result, NSDictionary *info) {
    imgThumb = result;
}];

iOS 8 PhotoKit. Get maximum-size image from iCloud Photo Sharing albums

How do I get access to the full-size images from iCloud? Every time I try to get this picture, I get an image of size 256x342. I don't see any progress either.
Code:
PHFetchResult *result = [PHAsset fetchAssetsWithLocalIdentifiers:@[assetIdentifier] options:nil];
PHImageManager *manager = [PHImageManager defaultManager];
[result enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    options.synchronous = YES;
    options.networkAccessAllowed = YES;
    options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
        NSLog(@"%f", progress);
    };
    [manager requestImageForAsset:asset
                       targetSize:PHImageManagerMaximumSize
                      contentMode:PHImageContentModeDefault
                          options:options
                    resultHandler:^(UIImage *resultImage, NSDictionary *info) {
        UIImage *image = resultImage;
        NSLog(@"%@", NSStringFromCGSize(resultImage.size));
    }];
}];
Until I tap the picture in the Photos app, the picture is of poor quality. But as soon as I tap it, it is downloaded to the device and comes back at full-size quality.
I think the below should get the full resolution image data:
[manager requestImageDataForAsset:asset
                          options:options
                    resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    UIImage *image = [UIImage imageWithData:imageData];
    //...
}];
The entire Photos Framework (PhotoKit) is covered in the WWDC video: https://developer.apple.com/videos/wwdc/2014/#511
Hope this helps.
Edit:
The resultHandler can be called twice. This is explained in the video I linked to, at around 30:00. It could be that you are only getting the thumbnail, and the full image will arrive the second time it's called.
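As a rough sketch of telling the two deliveries apart using the documented PHImageResultIsDegradedKey (asset, manager and options are the names from the question above):

// Sketch: ignore the quick degraded delivery and act only on the final one.
[manager requestImageForAsset:asset
                   targetSize:PHImageManagerMaximumSize
                  contentMode:PHImageContentModeDefault
                      options:options
                resultHandler:^(UIImage *resultImage, NSDictionary *info) {
    BOOL isDegraded = [info[PHImageResultIsDegradedKey] boolValue];
    if (!isDegraded) {
        // Second (final) call: the full-quality image is available here.
    }
}];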
I'm having some of the same issues. It is either a bug or poor documentation. I've been able to get around the issue by specifying a requested size of 2000x2000. The problem with this is that I do get the full-size image, but sometimes it comes back marked as degraded, so I keep waiting for a different image which never arrives. This is what I do to get around those issues.
self.selectedAsset = asset;
self.collectionView.allowsSelection = NO;

PHImageRequestOptions *options = [[[PHImageRequestOptions alloc] init] autorelease];
options.synchronous = NO;
options.version = PHImageRequestOptionsVersionCurrent;
options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.networkAccessAllowed = YES;
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *dict) {
    NSLog(@"progress %lf", progress); // never gets called
};

[self.delegate choosePhotoCollectionVCIsGettingPhoto:YES]; // show activity indicator

__block BOOL isStillLookingForPhoto = YES;
currentImageRequestId = [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:CGSizeMake(2000, 2000) contentMode:PHImageContentModeAspectFill options:options resultHandler:^(UIImage *result, NSDictionary *info) {
    NSLog(@"result size:%@", NSStringFromCGSize(result.size));

    BOOL isRealDealForSure = NO;
    NSNumber *n = info[@"PHImageResultIsPlaceholderKey"]; // undocumented key so I don't count on it
    if (n != nil && [n boolValue] == NO) {
        isRealDealForSure = YES;
    }

    if ([info[PHImageResultIsInCloudKey] boolValue]) {
        NSLog(@"image is in the cloud"); // never seen this (because I allowed network access)
    }
    else if ([info[PHImageResultIsDegradedKey] boolValue] && !isRealDealForSure) {
        // do something with the small image... but keep waiting
        [self.delegate choosePhotoCollectionVCPreviewSmallPhoto:result];
        self.collectionView.allowsSelection = YES;
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(3.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            // random time of 3 seconds to get the full resolution in case the degraded key is lying to me.
            // The user can move on but we will keep waiting.
            if (isStillLookingForPhoto) {
                self.selectedImage = result;
                [self.delegate choosePhotoCollectionVCPreviewFullPhoto:self.selectedImage]; // remove activity indicator and let the user move on
            }
        });
    }
    else {
        // do something with the full result and get rid of the activity indicator.
        if (asset == self.selectedAsset) {
            isStillLookingForPhoto = NO;
            self.selectedImage = result;
            [self.delegate choosePhotoCollectionVCPreviewFullPhoto:self.selectedImage];
            self.collectionView.allowsSelection = YES;
        }
        else {
            NSLog(@"ignored asset because another was pressed");
        }
    }
}];
To get the full-size image you need to check the info dictionary.
I used this to test whether the returned result is the full image or a degraded version.
if ([[info valueForKey:@"PHImageResultIsDegradedKey"] integerValue] == 0) {
    // Do something with the FULL SIZED image
} else {
    // Do something with the degraded image
}
or you could use this to check if you got back what you asked for.
if ([[info valueForKey:@"PHImageResultWantedImageFormatKey"] integerValue] == [[info valueForKey:@"PHImageResultDeliveredImageFormatKey"] integerValue]) {
    // Do something with the FULL SIZED image
} else {
    // Do something with the degraded image
}
There are a number of other undocumented but useful keys, e.g.:
PHImageFileOrientationKey = 3;
PHImageFileSandboxExtensionTokenKey = "/private/var/mobile/Media/DCIM/100APPLE/IMG_0780.JPG";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/100APPLE/IMG_0780.JPG";
PHImageFileUTIKey = "public.jpeg";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultRequestIDKey = 1;
PHImageResultWantedImageFormatKey = 9999;
Have fun.
Linasses, I believe it's related to your setting PHImageRequestOptionsDeliveryModeOpportunistic.
Note that this is not even supported in asynchronous mode (the default).
Try PHImageRequestOptionsDeliveryModeHighQualityFormat instead.
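A minimal sketch of that suggested change, assuming the rest of the request stays as in the question above:

// Sketch: ask for a single, full-quality delivery instead of the
// opportunistic degraded-then-final sequence.
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES; // still needed for iCloud-only originals
options.resizeMode = PHImageRequestOptionsResizeModeNone;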
