Does anybody have an idea how to fetch square thumbnails from PHImageManager? The PHImageContentModeAspectFill option has no effect.
[[PHImageManager defaultManager]
     requestImageForAsset:(PHAsset *)_asset
               targetSize:CGSizeMake(80, 80)
              contentMode:PHImageContentModeAspectFill
                  options:nil
            resultHandler:^(UIImage *result, NSDictionary *info) {
                // sadly result is not a squared image
                imageView.image = result;
            }];
Update:
The bug in cropping images as they were retrieved from PHImageManager was fixed in iOS 8.3, so for that version of iOS and later, my original example works, as follows.
(It seems the bugs were still present up to and including iOS 8.4: I could reproduce them with standard iPhone 6s back-camera images when taking a full-size square crop. They are properly fixed in iOS 9.0, where even large crops of a 63-megapixel panorama work fine.)
The approach Apple defines is to pass a CGRect in the co-ordinate space of the image, where the origin is (0,0) and the maximum is (1,1). You pass this rect in the PHImageRequestOptions object, along with a resizeMode of PHImageRequestOptionsResizeModeExact, and then you should get back a cropped image.
- (void)showSquareImageForAsset:(PHAsset *)asset
{
    CGFloat retinaScale = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(100 * retinaScale, 100 * retinaScale);

    PHImageRequestOptions *cropToSquare = [[PHImageRequestOptions alloc] init];
    cropToSquare.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
    CGRect square = CGRectMake(0, 0, cropSideLength, cropSideLength);
    // transform from pixel co-ordinates into the normalized (0,0)-(1,1) space
    CGRect cropRect = CGRectApplyAffineTransform(square,
                                                 CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                            1.0 / asset.pixelHeight));
    cropToSquare.normalizedCropRect = cropRect;

    [[PHImageManager defaultManager]
         requestImageForAsset:asset
                   targetSize:retinaSquare
                  contentMode:PHImageContentModeAspectFit
                      options:cropToSquare
                resultHandler:^(UIImage *result, NSDictionary *info) {
                    self.imageView.image = result;
                }];
}
This example makes its cropRect with side length equal to the smaller of the width and height of the asset, and then transforms it into the co-ordinate space of the image using CGRectApplyAffineTransform. You may want to set the origin of square to something other than (0,0), as you often want the crop square centred along the axis of the image which is being cropped; a minimal sketch of that follows. :-)
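For example (my addition, not part of the original answer), centring the square crop in normalized co-ordinates might look like this:
// Sketch: centre the square crop along the longer axis.
CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
CGRect square = CGRectMake((asset.pixelWidth - cropSideLength) / 2.0,
                           (asset.pixelHeight - cropSideLength) / 2.0,
                           cropSideLength, cropSideLength);
CGRect cropRect = CGRectApplyAffineTransform(square,
                                             CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                        1.0 / asset.pixelHeight));
cropToSquare.normalizedCropRect = cropRect;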
Original Answer:
John's answer got me most of the way there, but using his code I was getting stretched and squashed images. Here's how I got an imageView to display square thumbnails fetched from the PHImageManager.
Firstly, ensure that the contentMode property of your UIImageView is set to ScaleAspectFill. The default is ScaleToFill, which doesn't work correctly for displaying square thumbnails from PHImageManager, so make sure you change this whether you've instantiated the UIImageView in code or in the storyboard.
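In code (assuming a UIImageView called imageView), that is just:
imageView.contentMode = UIViewContentModeScaleAspectFill; // default is UIViewContentModeScaleToFill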
// view dimensions are based on points, but we're requesting pixels from PHImageManager
CGFloat retinaMultiplier = [UIScreen mainScreen].scale;
CGSize retinaSquare = CGSizeMake(imageView.bounds.size.width * retinaMultiplier,
                                 imageView.bounds.size.height * retinaMultiplier);

[[PHImageManager defaultManager]
     requestImageForAsset:(PHAsset *)_asset
               targetSize:retinaSquare
              contentMode:PHImageContentModeAspectFill
                  options:nil
            resultHandler:^(UIImage *result, NSDictionary *info) {
                // The result is not square, but correctly displays as a square using AspectFill
                imageView.image = result;
            }];
Specifying PHImageRequestOptionsResizeModeExact for the resizeMode is not required: it will not give you a cropped image unless you also supply a normalizedCropRect, and it should not be used here because there's no benefit and you lose the quickly returned cached images.
The UIImage returned in result will have the same aspect ratio as the source, but scaled correctly for use in a UIImageView that is set to aspect fill, so it displays as a square. If you're just displaying it, this is the way to go. If you need to crop the image for print or export outside of the app, this isn't what you want; look into the use of normalizedCropRect for that. (Edit: see below for an example of what should work.)
In addition to this, make sure that you set the content mode of the UIImageView to UIViewContentModeScaleAspectFill and that you set clipsToBounds = YES, with the following two lines:
imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES;
Edit to add normalizedCropRect usage example
WARNING - this doesn't work, but should according to Apple's documentation.
The approach Apple defines is to pass a CGRect in the co-ordinate space of the image, where the origin is (0,0) and the maximum is (1,1). You pass this rect in the PHImageRequestOptions object, along with a resizeMode of PHImageRequestOptionsResizeModeExact, and then you should get back a cropped image. The problem is that you don't, it comes back as the original aspect ratio and the full image.
I've verified that the crop rect is created correctly in the image's co-ordinate space, and followed the instruction to use PHImageRequestOptionsResizeModeExact, but the result handler will still be passed an image in the original aspect ratio. This seems to be a bug in the framework, and when it is fixed, the following code should work.
- (void)showSquareImageForAsset:(PHAsset *)asset
{
    CGFloat retinaScale = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(100 * retinaScale, 100 * retinaScale);

    PHImageRequestOptions *cropToSquare = [[PHImageRequestOptions alloc] init];
    cropToSquare.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
    CGRect square = CGRectMake(0, 0, cropSideLength, cropSideLength);
    // transform from pixel co-ordinates into the normalized (0,0)-(1,1) space
    CGRect cropRect = CGRectApplyAffineTransform(square,
                                                 CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                            1.0 / asset.pixelHeight));
    cropToSquare.normalizedCropRect = cropRect;

    [[PHImageManager defaultManager]
         requestImageForAsset:asset
                   targetSize:retinaSquare
                  contentMode:PHImageContentModeAspectFit
                      options:cropToSquare
                resultHandler:^(UIImage *result, NSDictionary *info) {
                    self.imageView.image = result;
                }];
}
All I can suggest is that if you have this problem, you file a radar with Apple to request that they fix it!
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;

CGFloat retinaMultiplier = [UIScreen mainScreen].scale;
CGSize retinaSquare = CGSizeMake(imageView.bounds.size.width * retinaMultiplier,
                                 imageView.bounds.size.height * retinaMultiplier);

[[PHImageManager defaultManager]
     requestImageForAsset:(PHAsset *)_asset
               targetSize:retinaSquare
              contentMode:PHImageContentModeAspectFill
                  options:options
            resultHandler:^(UIImage *result, NSDictionary *info) {
                imageView.image = [UIImage imageWithCGImage:result.CGImage
                                                      scale:retinaMultiplier
                                                orientation:result.imageOrientation];
            }];
To get an exact square, you'll have to indicate that you want an exact size by passing options, like so:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
// No, really, we want this exact size
options.resizeMode = PHImageRequestOptionsResizeModeExact;

[[PHImageManager defaultManager]
     requestImageForAsset:(PHAsset *)_asset
               targetSize:CGSizeMake(160, 160)
              contentMode:PHImageContentModeAspectFill
                  options:options
            resultHandler:^(UIImage *result, NSDictionary *info) {
                // Happily, result is now a squared image
                imageView.image = result;
            }];
let squareSize = CGSize(width: 100, height: 100)

let options = PHImageRequestOptions()
options.resizeMode = .exact
options.deliveryMode = .highQualityFormat

PHImageManager.default().requestImage(for: asset, targetSize: squareSize, contentMode: .aspectFill, options: options) { (image, _) in
    // Use the image.
}
This is working fine for me:
__block UIImage *imgThumb;
CGSize size = CGSizeMake(45, 45); // size for square image

[self.imageManager requestImageForAsset:<your_current_phAsset>
                             targetSize:size
                            contentMode:PHImageContentModeAspectFill
                                options:nil
                          resultHandler:^(UIImage *result, NSDictionary *info) {
                              imgThumb = result;
                          }];
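One caveat (my note, not part of the answer above): the result handler can be called asynchronously, so imgThumb may still be nil immediately after this call returns. If you need the thumbnail before continuing, one option is a synchronous request, sketched here:
// Sketch: force a synchronous request so imgThumb is set before the call returns.
// Avoid doing this on the main thread for large images.
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.synchronous = YES;
[self.imageManager requestImageForAsset:<your_current_phAsset>
                             targetSize:size
                            contentMode:PHImageContentModeAspectFill
                                options:options
                          resultHandler:^(UIImage *result, NSDictionary *info) {
                              imgThumb = result; // set before the request call returns
                          }];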
Related
My company is having a big problem with getting correct size metadata when fetching PHAssets.
We have developed an iOS application that lets customers choose pictures from the library, get the size (in pixels) of each of them, calculate coordinates for adjusting them to gadgets we sell, and then upload a high-quality version of each picture to our server to print the gadgets.
For some of our customers, the problem is that the pixel size of some of the high-quality versions sent does not match the pixelWidth and pixelHeight given by the PHAsset object.
To make an example, we have a picture that:
is reported to be 2096x3724 by PHAsset object
but, when full size image is requested, a 1536x2730 picture is generated
The picture is not in iCloud, and is sent by an iPhone SE running iOS 10.2.
This is the code to get the full-size image version:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;

PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];

[imageManager requestImageForAsset:imageAsset
                        targetSize:PHImageManagerMaximumSize
                       contentMode:PHImageContentModeDefault
                           options:imgOpts
                     resultHandler:^(UIImage *result, NSDictionary *info) {
    NSData *imageData = UIImageJPEGRepresentation(result, 0.92f);
    // UPLOAD OF imageData TO SERVER HERE
}];
We also tried the requestImageDataForAsset: method, but with no luck:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;

PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];

[imageManager requestImageDataForAsset:imageAsset
                               options:imgOpts
                         resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // UPLOAD OF imageData TO SERVER HERE
}];
Getting the exact size from the high-resolution version of every picture before uploading is not an option for us, because it would significantly degrade performance when selecting a large number of assets from the library.
Are we missing or doing something wrong?
Is there a way to get asset size in pixel without loading full-resolution image into memory?
Thanks for helping
This is due to a bug in Photos framework. Details about the bug can be found here.
Sometimes, after a photo is edited, a smaller version is created. This only occurs for some larger photos.
Calling either requestImageForAsset: (with PHImageManagerMaximumSize) or requestImageDataForAsset: (with PHImageRequestOptionsDeliveryModeHighQualityFormat) will read the data from the smaller file version, when trying to retrieve the edited version (PHImageRequestOptionsVersionCurrent).
The info dictionary in the callback of the above methods will contain the path to the image file. As an example:
PHImageFileURLKey = "file:///[...]DCIM/100APPLE/IMG_0006/Adjustments/IMG_0006.JPG";
Inspecting that folder, I was able to find another image, FullSizeRender.jpg - this one has the full size and contains the latest edits. Thus, one way would be to try and read from the FullSizeRender.jpg, when such a file is present.
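As a very rough sketch of that idea (my addition; note that PHImageFileURLKey is an undocumented key and this folder layout is not guaranteed), you could derive the FullSizeRender.jpg path from the URL returned in the info dictionary:
// Hypothetical: look for FullSizeRender.jpg next to the adjusted image.
NSURL *fileURL = info[@"PHImageFileURLKey"]; // undocumented key, may change or disappear
NSURL *fullSizeURL = [[fileURL URLByDeletingLastPathComponent] URLByAppendingPathComponent:@"FullSizeRender.jpg"];
if ([[NSFileManager defaultManager] fileExistsAtPath:fullSizeURL.path]) {
    // read the full-size edited image from fullSizeURL instead
}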
Starting with iOS 9, it's also possible to fetch the latest edit, at highest resolution, using the PHAssetResourceManager:
// iOS 9.0+ only; wrap in if (@available(iOS 9.0, *)) { ... } on older deployment targets
// check if a high quality edit is available
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:_asset];
PHAssetResource *hqResource = nil;
for (PHAssetResource *res in resources) {
    if (res.type == PHAssetResourceTypeFullSizePhoto) {
        // from my tests so far, this is only present for edited photos
        hqResource = res;
        break;
    }
}

if (hqResource) {
    PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
    options.networkAccessAllowed = YES;

    // note: fileSize is not public API; it is read here via key-value coding
    long long fileSize = [[hqResource valueForKey:@"fileSize"] longLongValue];
    NSMutableData *fullData = [[NSMutableData alloc] initWithCapacity:(NSUInteger)fileSize];

    [[PHAssetResourceManager defaultManager] requestDataForAssetResource:hqResource
                                                                 options:options
                                                     dataReceivedHandler:^(NSData * _Nonnull data) {
        // append the data that we're receiving
        [fullData appendData:data];
    } completionHandler:^(NSError * _Nullable error) {
        // handle completion, using `fullData` or `error`
        // uti == hqResource.uniformTypeIdentifier
        // orientation == UIImageOrientationUp
    }];
}
else {
    // use `requestImageDataForAsset:`, `requestImageForAsset:` or `requestDataForAssetResource:` with a different `PHAssetResource`
}
Can you try this to fetch Camera Roll pics:
__weak __typeof(self) weakSelf = self;
PHFetchResult<PHAssetCollection *> *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumSelfPortraits options:nil];
[albums enumerateObjectsUsingBlock:^(PHAssetCollection * _Nonnull album, NSUInteger idx, BOOL * _Nonnull stop) {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.wantsIncrementalChangeDetails = YES;
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];

    PHFetchResult<PHAsset *> *assets = [PHAsset fetchAssetsInAssetCollection:album options:options];
    if (assets.count > 0)
    {
        [assets enumerateObjectsUsingBlock:^(PHAsset * _Nonnull asset, NSUInteger idx, BOOL * _Nonnull stop) {
            if (asset != nil)
            {
                [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info)
                {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [weakSelf addlocalNotificationForFilters:result];
                        // [weakSelf.buttonGalery setImage:result forState:UIControlStateNormal];
                    });
                }];
                *stop = YES;
            }
            else {
                [weakSelf getlatestAferSelfie];
            }
        }];
    }
}];
While working with PHAsset in Swift, I am facing a common problem which must have a good design/solution. For example, I have a collection of PHAssets, let's say assetCollection. Now I want to get the total size of the assetCollection, which is the sum of the sizes of all the assets in it.
I know there is an asynchronous API to get an individual asset's size https://stackoverflow.com/a/26551990/1084174 (in Objective-C),
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    float imageSize = imageData.length;
    // convert to megabytes
    imageSize = imageSize / (1024 * 1024);
    NSLog(@"%f", imageSize);
}];
But when it's a collection, how do I design the solution?
What's in my mind is: I can run the async call inside a loop for each asset in assetCollection, summing the sizes into a total variable until I get the last result (maybe using a global variable); total will then be the final collection size. But I think there must be a better design/solution to such a common problem.
Any suggestions would be appreciated.
After spending weeks on it, I have come to a better solution. I am sharing my solution, but there might still be some better design; please share if you find one.
Solution
I created a wrapper collection, let's say sizedAssetCollection, with a property called size. I also implemented add() and remove() methods. When I add or remove an item through my add/remove methods, I update the size on the fly. Like this:
- (void)add:(PHAsset *)asset {
    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        float imageSize = imageData.length;
        imageSize = imageSize / (1024 * 1024); // convert to megabytes
        self.size += imageSize;
    }];
}

- (void)remove:(PHAsset *)asset {
    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        float imageSize = imageData.length;
        imageSize = imageSize / (1024 * 1024); // convert to megabytes
        self.size -= imageSize;
    }];
}
Finally I get the expected result in size variable.
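As a design note (my addition, not part of the original answer): if you only need the total for a fixed set of assets in one pass, a dispatch group gives you a single callback once every request has finished. A minimal sketch, assuming an NSArray of PHAssets called assets:
// Sketch: sum asset sizes and get one callback when all requests complete.
dispatch_group_t group = dispatch_group_create();
__block long long totalBytes = 0;
for (PHAsset *asset in assets) {
    dispatch_group_enter(group);
    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        totalBytes += imageData.length; // handlers arrive on the main thread by default
        dispatch_group_leave(group);
    }];
}
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    NSLog(@"total size: %.2f MB", totalBytes / (1024.0 * 1024.0));
});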
I am using the Photos framework to select photos from the Camera Roll. After selecting the assets from the grid, I use PHImageManager to access each of the selected images and then store these images in an array to show in a collection view of mine.
I am using this piece of code to achieve that:
- (void)extractFullSizeImagesFromAssets {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.version = PHImageRequestOptionsVersionCurrent;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    options.resizeMode = PHImageRequestOptionsResizeModeExact;
    options.networkAccessAllowed = YES;

    for (int i = 0; i < self.assets.count; i++) {
        PHAsset *asset = [self.assets objectAtIndex:i];
        // cast to CGFloat so the aspect ratio isn't truncated by integer division
        CGSize fullSizeImage = CGSizeMake(1000, ((CGFloat)asset.pixelHeight / asset.pixelWidth) * 1000);

        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                   targetSize:fullSizeImage
                                                  contentMode:PHImageContentModeAspectFit
                                                      options:options
                                                resultHandler:^(UIImage *image, NSDictionary *info) {
            // [self.arr_images addObject:image];
            [_arr_fullSizeImages addObject:image];
        }];
    }
}
Now my array arr_fullSizeImages contains the extracted images in a different, random order from the one in which I selected the assets. For example, if I select 5 images from the Camera Roll, then sometimes the image that was at index 3 in the Camera Roll ends up at index 5 in arr_fullSizeImages.
I am not able to track down the reason for this behaviour. Please identify the source of the mistake and how to solve this error.
Thanks.
This is the expected behaviour, as requestImageForAsset is executed asynchronously by default.
If you want a synchronous behaviour (and no random order), just set
options.synchronous = YES;
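Applied to the loop from the question (a sketch using the question's own names), that looks like:
// Sketch: synchronous requests complete in order, so images are appended in selection order.
// Avoid calling this on the main thread, especially if network access may be needed.
options.synchronous = YES;
for (PHAsset *asset in self.assets) {
    CGSize targetSize = CGSizeMake(1000, ((CGFloat)asset.pixelHeight / asset.pixelWidth) * 1000);
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:targetSize
                                              contentMode:PHImageContentModeAspectFit
                                                  options:options
                                            resultHandler:^(UIImage *image, NSDictionary *info) {
        [_arr_fullSizeImages addObject:image]; // appended in selection order
    }];
}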
I'm trying to get thumbnails from my PHCachingImageManager so that I can put them into the built-in imageView in my UITableViewCells.
NSLog(@"thumbnail size: %@", NSStringFromCGSize(AssetGridThumbnailSize));

[self.imageManager
     requestImageForAsset:asset
               targetSize:AssetGridThumbnailSize
              contentMode:PHImageContentModeAspectFill
                  options:nil
            resultHandler:^(UIImage *result, NSDictionary *info) {
                NSLog(@"image size: %@", NSStringFromCGSize(result.size));
                // Only update the thumbnail if the cell tag hasn't changed. Otherwise, the cell has been re-used.
                if (cell.tag == currentTag) {
                    cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
                    cell.imageView.image = result;
                }
            }];
I can see that AssetGridThumbnailSize = 80 x 80 (40 x 40 points at 2x Retina) from the logs:
thumbnail size: {80, 80}
and I've set the contentMode to PHImageContentModeAspectFill, but when I get the images back they are all different sizes, which makes the UITableView look very chaotic.
How can I make the PHCachingImageManager give me back an image of the right size?
While posting the question, I figured out the answer, so I decided to continue posting in the hope that it will help someone else.
The targetSize is just a suggestion. In order to really control the size of the returned images you have to pass in a PHImageRequestOptions object with resizeMode = PHImageRequestOptionsResizeModeExact.
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;

[self.imageManager requestImageForAsset:asset
                             targetSize:AssetGridThumbnailSize
                            contentMode:PHImageContentModeAspectFill
                                options:options
                          resultHandler:^(UIImage *result, NSDictionary *info) {
    // result now comes back at exactly AssetGridThumbnailSize
    cell.imageView.image = result;
}];
I want to extract a UIImage from a video asset, to use as a poster. According to the doc for the PHImageManager, I should be able to use requestImageForAsset:targetSize:contentMode:options:resultHandler:. Quoting from the doc:
You can use this method for both photo and video assets—for a video asset, an image request provides a thumbnail image or poster frame.
That hasn't been my experience, though. Using requestImageForAsset:targetSize:contentMode:options:resultHandler: with a video asset, the callback block always returns nil for the image and a nil error. The info dictionary returned is as follows (nothing I could make sense of):
{
PHImageFileSandboxExtensionTokenKey = "31c0997752ae82ee32953503bd6d9a2436c50fac;00000000;00000000;000000000000001a;com.apple.app-sandbox.read;00000001;01000003;00000000000756cf;/private/var/mobile/Media/DCIM/100APPLE/IMG_0008.MOV";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/100APPLE/IMG_0008.MOV";
PHImageFileUTIKey = "dyn.ah62d4uv4ge804550";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultRequestIDKey = 26;
PHImageResultWantedImageFormatKey = 9999;
}
Below is the method I wrote in a PHAsset category to extract an image from a video PHAsset. Has anyone been able to make this work?
@implementation PHAsset (util)

- (PHImageRequestID)fullSizeImage:(void (^)(UIImage *image, NSError *error))resultHandler {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init]; // note: created but not passed below
    PHImageContentMode contentMode = PHImageContentModeAspectFill;
    PHImageManager *imageManager = [PHImageManager defaultManager];
    CGSize targetSize = PHImageManagerMaximumSize;

    return [imageManager requestImageForAsset:self targetSize:targetSize contentMode:contentMode options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
        NSError *error = (NSError *)[info objectForKey:PHImageErrorKey];
        if (result == nil) {
            NSLog(@"ERROR while fetching fullSizeImage %@, info:\n%@", error, info);
        }
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            resultHandler(result, error);
        }];
    }];
}

@end
I got it to work after substituting
CGSize targetSize = PHImageManagerMaximumSize;
with...
CGSize targetSize = CGSizeMake(self.pixelWidth * ratio, self.pixelHeight * ratio); // ratio is your chosen scale factor
I haven't seen any relevant documentation, so it's probably a bug in iOS 8.0.x (at the time of this writing, the iOS 8.1 beta is available, but I haven't tested on it).
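Putting the fix together (my sketch; the 0.5 ratio is an arbitrary choice, since PHImageManagerMaximumSize was what failed):
// Sketch: request a poster frame at half the video's pixel dimensions.
CGFloat ratio = 0.5; // hypothetical scale factor
CGSize targetSize = CGSizeMake(self.pixelWidth * ratio, self.pixelHeight * ratio);
[[PHImageManager defaultManager] requestImageForAsset:self
                                           targetSize:targetSize
                                          contentMode:PHImageContentModeAspectFill
                                              options:nil
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
    // result should now contain a poster frame for the video
}];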