I have a problem. I am using the Photos framework, and when I run this (images is an NSMutableArray):
[[PHImageManager defaultManager] requestImageForAsset:lastAsset
                                           targetSize:CGSizeMake(150, 300)
                                          contentMode:PHImageContentModeAspectFit
                                              options:nil
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                                            [images addObject:result];
                                            NSLog(@"%@ %@ %@", result, [images objectAtIndex:0], [images lastObject]);
                                        }];
I get these results for the sizes of my UIImages in my console:
700,700 60,40 700,700
I don't understand why. And when I display an image from my NSMutableArray named "images":
view.image = [images objectAtIndex:index];
all my images display in very bad quality (I think they are all stored at 60 x 40 in my images array).
I want to retrieve them at their original quality (700 x 700, not 60 x 40) so that I can display them properly.
Thanks a lot everyone!
I have found the solution to my problem.
The solution is to add these options:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.version = PHImageRequestOptionsVersionUnadjusted;
options.synchronous = YES;
to my request.
Note the deliveryMode option in particular, which is very important for keeping good image quality.
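For reference, a minimal sketch of the full request with these options wired in (lastAsset and images are the names from the question):

[[PHImageManager defaultManager] requestImageForAsset:lastAsset
                                           targetSize:CGSizeMake(150, 300)
                                          contentMode:PHImageContentModeAspectFit
                                              options:options
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                                            // With synchronous = YES, the handler runs once with
                                            // the final high-quality image instead of first
                                            // delivering a small degraded preview.
                                            [images addObject:result];
                                        }];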
Thanks for the help everyone!
I use GMImagePicker, and when I select more than 50 images from the camera roll the application crashes with an error like:
Received memory warning.
Please help me solve this problem. The app uses a very large amount of memory.
Here is my code:
- (void)assetsPickerController:(GMImagePickerController *)picker didFinishPickingAssets:(NSArray *)assetArray {
    self.requestOptions = [[PHImageRequestOptions alloc] init];
    self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    // this one is key
    self.requestOptions.synchronous = true;
    // self.assets = [NSMutableArray arrayWithArray:assets];
    PHImageManager *manager = [PHImageManager defaultManager];
    Albumimages = [NSMutableArray arrayWithCapacity:[assetArray count]];
    // assetArray contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in assetArray) {
        // Do something with the asset
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:self.requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
                            ima = image;
                            [Albumimages addObject:ima];
                        }];
    }
    NSLog(@"%@", Albumimages);
    [self dismissViewControllerAnimated:YES completion:nil];
}
The application crashes inside the for loop.
It will obviously crash, because you are loading 50 full-size photos at once. Think in terms of RAM allocation: assume each photo is 5 MB in size, so 50 * 5 MB = 250 MB. The OS will not provide that much RAM, and you are receiving memory warnings because of this. Note that WhatsApp and other apps allow 10 images at most; maybe you could try the same approach.
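If you do need many images at once, one mitigation is to request a bounded target size instead of PHImageManagerMaximumSize, so each decoded image stays small. A rough sketch using the names from the question's code (the 600 x 600 figure is an arbitrary assumption; pick whatever your UI actually needs):

// Request a bounded size rather than the full-resolution original.
CGSize boundedSize = CGSizeMake(600, 600); // arbitrary example size
for (PHAsset *asset in assetArray) {
    [manager requestImageForAsset:asset
                       targetSize:boundedSize
                      contentMode:PHImageContentModeAspectFit
                          options:self.requestOptions
                    resultHandler:^(UIImage *image, NSDictionary *info) {
                        if (image) {
                            [Albumimages addObject:image];
                        }
                    }];
}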
I am using the Photos framework to select photos from the camera roll. After selecting assets from the grid, I use PHImageManager to access each of the selected images and store them in an array to show in a collection view of mine.
I am using this piece of code to achieve that:
- (void)extractFullSizeImagesFromAssets {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.version = PHImageRequestOptionsVersionCurrent;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    options.resizeMode = PHImageRequestOptionsResizeModeExact;
    options.networkAccessAllowed = TRUE;
    for (int i = 0; i < self.assets.count; i++) {
        PHAsset *asset = [self.assets objectAtIndex:i];
        // Cast to CGFloat so the aspect ratio is not truncated by integer division.
        CGSize fullSizeImage = CGSizeMake(1000, ((CGFloat)asset.pixelHeight / asset.pixelWidth) * 1000);
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                   targetSize:fullSizeImage
                                                  contentMode:PHImageContentModeAspectFit
                                                      options:options
                                                resultHandler:^(UIImage *image, NSDictionary *info) {
                                                    // [self.arr_images addObject:image];
                                                    [_arr_fullSizeImages addObject:image];
                                                }];
    }
}
Now my array arr_fullSizeImages contains the extracted images in a different, random order from the one in which I selected the assets. For example, if I select 5 images from the camera roll, the image that was at index 3 in the camera roll sometimes ends up at index 5 in arr_fullSizeImages.
I am not able to track down the reason for this behaviour. Please identify the source of the mistake and how to solve this error.
Thanks.
This is the expected behaviour, because requestImageForAsset: executes asynchronously by default.
If you want synchronous behaviour (and no random order), just set
options.synchronous = YES;
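A minimal sketch of the question's loop with that one change; note that synchronous requests block the calling thread, so this should run off the main thread:

options.synchronous = YES; // deliver the final image before returning

for (int i = 0; i < self.assets.count; i++) {
    PHAsset *asset = [self.assets objectAtIndex:i];
    CGSize fullSizeImage = CGSizeMake(1000, ((CGFloat)asset.pixelHeight / asset.pixelWidth) * 1000);
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:fullSizeImage
                                              contentMode:PHImageContentModeAspectFit
                                                  options:options
                                            resultHandler:^(UIImage *image, NSDictionary *info) {
                                                // The handler completes before the loop advances,
                                                // so insertion order matches selection order.
                                                [_arr_fullSizeImages addObject:image];
                                            }];
}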
I'm trying to get thumbnails from my PHCachingImageManager so that I can put them into the built-in imageView in my UITableViewCells.
NSLog(#"thumbnail size: %#", NSStringFromCGSize(AssetGridThumbnailSize));
[self.imageManager
requestImageForAsset:asset
targetSize:AssetGridThumbnailSize
contentMode:PHImageContentModeAspectFill
options:nil
resultHandler:^(UIImage *result, NSDictionary *info) {
NSLog(#"image size: %#", NSStringFromCGSize(result.size));
// Only update the thumbnail if the cell tag hasn't changed. Otherwise, the cell has been re-used.
if (cell.tag == currentTag) {
cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
cell.imageView.image = result;
}
}];
I can see from the logs that AssetGridThumbnailSize is 80 x 80 (40 x 40 points at retina scale):
thumbnail size: {80, 80}
and I've set the contentMode to PHImageContentModeAspectFill, but the images I get back are all different sizes, which makes the UITableView look very chaotic.
How can I make the PHCachingImageManager give me back an image of the right size?
While posting the question, I figured out the answer, so I decided to finish posting in the hope that it helps someone else.
The targetSize is just a suggestion. To really control the size of the returned images, you have to pass in a PHImageRequestOptions object with resizeMode = PHImageRequestOptionsResizeModeExact.
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;
[self.imageManager requestImageForAsset:asset
                             targetSize:AssetGridThumbnailSize
                            contentMode:PHImageContentModeAspectFill
                                options:options
                          resultHandler:^(UIImage *result, NSDictionary *info) {
                              // result now comes back at exactly AssetGridThumbnailSize.
                              if (cell.tag == currentTag) {
                                  cell.imageView.image = result;
                              }
                          }];
I have an app in which I retrieve and display images from the iDevice. I use the following code:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES;

PHCachingImageManager *manager = [[PHCachingImageManager alloc] init];
[manager requestImageForAsset:asset
                   targetSize:CGSizeMake(asset.pixelWidth, asset.pixelHeight)
                  contentMode:PHImageContentModeAspectFit
                      options:options
                resultHandler:^(UIImage *result, NSDictionary *info) {
                    // Do something with the result
                }];
My problem is that when the image I am trying to retrieve is not on the user's device but only in iCloud (Settings -> iCloud -> Photos -> Optimise iPhone Storage), requestImageForAsset: returns nil as the result, along with the following NSError:
NSError * domain: @"NSCocoaErrorDomain" - code: 18446744073709551615
The documentation for PHCachingImageManager says that:
When you need an image for an individual asset, call the requestImageForAsset:targetSize:contentMode:options:resultHandler: method, and pass the same parameters you used when preparing that asset.
If the image you request is among those already prepared, the PHCachingImageManager object immediately returns that image. Otherwise, Photos prepares the image on demand and caches it for later use.
So in theory my code should work. Any ideas what might be causing this?
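One way to narrow this down is to inspect the info dictionary in the result handler: the Photos framework reports iCloud status and errors through the PHImageResultIsInCloudKey and PHImageErrorKey keys. A minimal diagnostic sketch using the names from the code above:

[manager requestImageForAsset:asset
                   targetSize:CGSizeMake(asset.pixelWidth, asset.pixelHeight)
                  contentMode:PHImageContentModeAspectFit
                      options:options
                resultHandler:^(UIImage *result, NSDictionary *info) {
                    if (!result) {
                        // PHImageResultIsInCloudKey reports whether the asset data still
                        // lives in iCloud; PHImageErrorKey carries the underlying NSError.
                        NSLog(@"in cloud: %@ error: %@",
                              info[PHImageResultIsInCloudKey],
                              info[PHImageErrorKey]);
                    }
                }];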
I am using the following method to request a number of photos and add them to an array for later use:
- (void)fetchImages {
    self.assets = [[PHFetchResult alloc] init];
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    self.assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++) {
        PHAsset *asset = self.assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                   targetSize:size
                                                  contentMode:PHImageContentModeAspectFit
                                                      options:photoRequestOptions
                                                resultHandler:^(UIImage *result, NSDictionary *info) {
                                                    if (result) {
                                                        [self.photosToVideofy addObject:result];
                                                    }
                                                }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
This works fine when the number of photos is fewer than 50. Beyond that, memory jumps to 150-160 MB, I get the message Connection to assetsd was interrupted or assetsd died, and the app crashes.
How can I release the assets (the PHFetchResult) from memory after I get the ones I want? (Do I need to?)
I would like to be able to add up to 150 photos.
Any ideas?
Thanks
You should not put the results from a PHFetchResult into an array. The idea of PHFetchResult is to point to many images in the Photos library without storing them all in RAM (I'm not sure exactly how it does this). Just use the PHFetchResult object like an array and it handles the memory issues for you. For example, connect a collection view controller directly to the PHFetchResult object and use the PHImageManager to request images only for visible cells, as sketched after the quotation below.
From the Apple documentation:
"Unlike an NSArray object, however, a PHFetchResult object dynamically loads its contents from the Photos library as needed, providing optimal performance even when handling a large number of results."
https://developer.apple.com/library/ios/documentation/Photos/Reference/PHFetchResult_Class/
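A minimal sketch of that pattern, assuming a UICollectionView data source and a hypothetical MyImageCell class with an imageView property; self.fetchResult and self.thumbnailSize are placeholder names:

// Data source backed directly by the PHFetchResult; no intermediate array.
- (NSInteger)collectionView:(UICollectionView *)collectionView
     numberOfItemsInSection:(NSInteger)section {
    return self.fetchResult.count;
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    MyImageCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"cell"
                                                                  forIndexPath:indexPath];
    PHAsset *asset = self.fetchResult[indexPath.item];
    // Request a thumbnail only when the cell is actually needed on screen.
    // (Production code should also guard against cell reuse, as in the
    // cell.tag check shown earlier in this thread.)
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:self.thumbnailSize
                                              contentMode:PHImageContentModeAspectFill
                                                  options:nil
                                            resultHandler:^(UIImage *result, NSDictionary *info) {
                                                cell.imageView.image = result;
                                            }];
    return cell;
}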
Your code inside the fetchImages method needs some refactoring; take a look at this suggestion:
- (void)fetchImages {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++) {
        PHAsset *asset = assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                   targetSize:size
                                                  contentMode:PHImageContentModeAspectFit
                                                      options:photoRequestOptions
                                                resultHandler:^(UIImage *result, NSDictionary *info) {
                                                    if (result) {
                                                        [self.photosToVideofy addObject:result];
                                                    }
                                                }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
But the problem is memory consumption. Let's do some calculations.
A single image, using ARGB at 4 bytes per pixel:
640 x 480 x 4 = 1.2 MB
Now you want to store 150 such images in RAM, so:
150 x 1.2 MB = 180 MB
For example, an iPhone 4 with 512 MB of RAM will crash if you use more than about 300 MB, and the limit can be lower if other apps are also consuming a lot of RAM.
I think you should consider storing the images to files instead of keeping them in RAM, as sketched below.
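A minimal sketch of that idea, replacing the addObject: call inside the result handler; JPEG output to the temporary directory is one assumed option, and self.photoPaths is a hypothetical NSMutableArray of file paths standing in for photosToVideofy:

if (result) {
    // Compress and write to disk, keeping only the lightweight path in memory.
    NSData *jpegData = UIImageJPEGRepresentation(result, 0.9); // 0.9 quality is arbitrary
    NSString *fileName = [NSString stringWithFormat:@"photo-%ld.jpg", (long)i];
    NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
    if ([jpegData writeToFile:filePath atomically:YES]) {
        [self.photoPaths addObject:filePath]; // hypothetical array of paths
    }
}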
This might be intentional (I can't tell without looking at the rest of your code), but self.photosToVideofy is never released: since you access it in a block, the object to which you pass the block ([PHImageManager defaultManager]) will always have a reference to the array.
Try explicitly clearing the array when you're done with the images. The array itself still won't be released, but the objects it contains will be (or can be, if they're not referenced anywhere else).
The best solution is to remove the array from the block, but that would require changing the logic of your code.
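For example, once the video has been generated:

// Release the decoded images; the array itself stays allocated but empty.
[self.photosToVideofy removeAllObjects];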
You have to set
photoRequestOptions.synchronous = NO;
instead of
photoRequestOptions.synchronous = YES;
This worked for me on iOS 10.2.