GMImagePicker causes application crash - iOS

I use GMImagePicker, and when I select more than 50 images from the camera roll the application crashes with an error like:
Received memory warning.
Memory usage is very high. Please help me solve this problem.
Here is my code:
- (void)assetsPickerController:(GMImagePickerController *)picker didFinishPickingAssets:(NSArray *)assetArray {
    self.requestOptions = [[PHImageRequestOptions alloc] init];
    self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    // this one is key
    self.requestOptions.synchronous = true;
    // self.assets = [NSMutableArray arrayWithArray:assets];
    PHImageManager *manager = [PHImageManager defaultManager];
    Albumimages = [NSMutableArray arrayWithCapacity:[assetArray count]];
    // assetArray contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in assetArray) {
        // Do something with the asset
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:self.requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
            ima = image;
            [Albumimages addObject:ima];
        }];
    }
    NSLog(@"%@", Albumimages);
    [self dismissViewControllerAnimated:YES completion:nil];
}
The application crashes inside the for loop.

It will obviously crash, because you are picking 50 full-size photos at once. Think in terms of RAM allocation: if each photo is about 5 MB, then 50 * 5 MB = 250 MB. The OS will not grant that much memory, which is why you are receiving the memory warning. Note that WhatsApp and other apps cap the selection at around 10 images; maybe you could take the same approach.
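Another way around it is to stop asking for PHImageManagerMaximumSize in the first place. Below is a minimal sketch, not a drop-in fix: it reuses manager, self.requestOptions, Albumimages and assetArray from the question, and assumes a 1024-point thumbnail is large enough for the UI. Wrapping each synchronous request in an autorelease pool lets decoded buffers drain per iteration.

CGSize targetSize = CGSizeMake(1024, 1024); // assumption: adjust to what your UI actually needs
for (PHAsset *asset in assetArray) {
    @autoreleasepool {
        // synchronous request: the handler runs before the call returns,
        // so the pool drains temporary decode buffers every pass
        [manager requestImageForAsset:asset
                           targetSize:targetSize
                          contentMode:PHImageContentModeAspectFit
                              options:self.requestOptions
                        resultHandler:^(UIImage *image, NSDictionary *info) {
            if (image) {
                [Albumimages addObject:image];
            }
        }];
    }
}

A 1024x1024 thumbnail decodes to roughly 4 MB versus ~48 MB for a full 12-megapixel image, so 50 selections stay comfortably inside the memory budget.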

Related

Very slow to load images from PHAsset

I am using the following code to fetch images and AVAssets from PHAssets. There are two arrays in the code:
galleryArr: stores images for the collection view.
mutableDataArr: stores images (for image assets) and videos (for AVAssets) to upload to the server.
It is very slow to fetch all the images from the PHAsset array.
I googled this, and most people say to remove the line [options setSynchronous:YES];, but if I remove it the completion handler is called twice and the array ends up with duplicate objects (objects are appended to the array inside the completion handler).
for (int i = 0; i < assets.count; i++) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
        options.resizeMode = PHImageRequestOptionsResizeModeExact;
        [options setNetworkAccessAllowed:YES];
        [options setSynchronous:YES];
        PHImageManager *manager = PHImageManager.defaultManager;

        PHVideoRequestOptions *videoOptions = [[PHVideoRequestOptions alloc] init];
        videoOptions.networkAccessAllowed = YES;

        __weak typeof(self) weakSelf = self;
        if (assets[i].mediaType == PHAssetMediaTypeVideo) {
            [manager requestAVAssetForVideo:[assets objectAtIndex:i]
                                    options:videoOptions
                              resultHandler:^(AVAsset * _Nullable asset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
                if ([asset isKindOfClass:[AVURLAsset class]])
                {
                    [weakSelf.mutableDataArr addObject:asset];
                }
            }];
        }
        [manager requestImageForAsset:[assets objectAtIndex:i]
                           targetSize:CGSizeMake(1024, 1024) // PHImageManagerMaximumSize
                          contentMode:PHImageContentModeAspectFit
                              options:options
                        resultHandler:^(UIImage *image, NSDictionary *info) {
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (assets[i].mediaType != PHAssetMediaTypeVideo) {
                        [weakSelf.mutableDataArr addObject:image];
                    }
                    [galleryArr addObject:image];
                    if (i + 1 == assets.count) {
                        [SVProgressHUD dismiss];
                        [weakSelf.galleryCollectionView reloadData];
                    }
                });
            }
        }];
    });
}
Any suggestions, please?
Just one thought, it looks like you are loading all the images from the array before removing your progress HUD and displaying the gallery. As the number of images could be very large and presuming you are using a collection view or similar, that's quite an overhead before anything is displayed.
I did something like this a while ago and instead of looping through the array and loading everything up front, I let the cells request images as they needed them. This makes it very fast and efficient as cells can display immediately with a loading icon, then flip to the image when it was available. Efficiency comes from only loading images the user is actually going to see.
To make things performant, and by performant I mean I could scroll as fast as I liked without the display freezing, each cell would first check an in-memory cache for the image, then trigger a request for the image on a background thread.
When the image was returned, the cell would add it to the in-memory cache and then, if the cell had not been reused for a different image (due to fast scrolling), it would display the image.
Further, I used an NSCache for the in-memory cache, so that if the app started to use a lot of memory, images would be automatically dropped and reloaded the next time a cell wanted one.
The summary is: use a memory-aware cache, and only load what you actually need.
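A rough sketch of that pattern follows. GalleryCell, its representedAssetIdentifier property, self.assets, self.imageCache (an NSCache) and self.placeholderImage are assumed names, not from the question; the Photos calls and info keys are real. Filtering on PHImageResultIsDegradedKey also answers the original duplicate-callback complaint: with the default opportunistic delivery, the handler fires once with a degraded preview and once with the final image, so filter instead of forcing synchronous requests.

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    GalleryCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"GalleryCell"
                                                                  forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];
    cell.representedAssetIdentifier = asset.localIdentifier;

    UIImage *cached = [self.imageCache objectForKey:asset.localIdentifier];
    if (cached) {
        cell.imageView.image = cached; // cache hit: display immediately
        return cell;
    }

    cell.imageView.image = self.placeholderImage; // loading icon
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:CGSizeMake(200, 200)
                                              contentMode:PHImageContentModeAspectFill
                                                  options:nil
                                            resultHandler:^(UIImage *result, NSDictionary *info) {
        // ignore the low-quality first pass of opportunistic delivery
        if (!result || [info[PHImageResultIsDegradedKey] boolValue]) return;
        [self.imageCache setObject:result forKey:asset.localIdentifier];
        // only display if the cell hasn't been reused for another asset
        if ([cell.representedAssetIdentifier isEqualToString:asset.localIdentifier]) {
            cell.imageView.image = result;
        }
    }];
    return cell;
}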

iOS PhotoKit - PHAsset pixelWidth and pixelHeight do not match high-resolution image

My company is having a big problem with getting correct size metadata when fetching PHAssets.
We have developed an iOS application that lets customers choose pictures from the library, gets the size (in pixels) of each of them, calculates coordinates for fitting them to the gadgets we sell, and then uploads a high-quality version of each picture to our server to print the gadgets.
For some of our customers, the problem is that the pixel size of some of the high-quality versions sent does not match the pixelWidth and pixelHeight given by the PHAsset object.
To give an example, we have a picture that:
is reported to be 2096x3724 by the PHAsset object,
but, when the full-size image is requested, comes back as a 1536x2730 picture.
The picture is not in iCloud, and it was sent from an iPhone 6 SE running iOS 10.2.
This is the code to get the full-size image version:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageForAsset:imageAsset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:imgOpts resultHandler:^(UIImage *result, NSDictionary *info) {
    NSData *imageData = UIImageJPEGRepresentation(result, 0.92f);
    // UPLOAD OF imageData TO SERVER HERE
}];
We also tried the requestImageDataForAsset: method, but with no luck:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageDataForAsset:imageAsset options:imgOpts resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // UPLOAD OF imageData TO SERVER HERE
}];
Getting the exact size from the high-resolution version of every picture before uploading is not an option for us, because it would degrade performance considerably when a large number of assets is selected from the library.
Are we missing something, or doing something wrong?
Is there a way to get the asset's pixel size without loading the full-resolution image into memory?
Thanks for helping.
This is due to a bug in Photos framework. Details about the bug can be found here.
Sometimes, after a photo is edited, a smaller version is created. This only occurs for some larger photos.
Calling either requestImageForAsset: (with PHImageManagerMaximumSize) or requestImageDataForAsset: (with PHImageRequestOptionsDeliveryModeHighQualityFormat) will read the data from the smaller file version, when trying to retrieve the edited version (PHImageRequestOptionsVersionCurrent).
The info dictionary in the callback of the above methods contains the path to the image file. As an example:
PHImageFileURLKey = "file:///[...]DCIM/100APPLE/IMG_0006/Adjustments/IMG_0006.JPG";
Inspecting that folder, I was able to find another image, FullSizeRender.jpg - this one has the full size and contains the latest edits. Thus, one way would be to try and read from the FullSizeRender.jpg, when such a file is present.
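For illustration, a sketch of that workaround. It leans on two things that are observed rather than documented: the PHImageFileURLKey entry in the info dictionary and the FullSizeRender.jpg file name. Treat it as fragile across iOS versions, and note the sandbox may refuse direct reads of some paths.

// Sketch only: PHImageFileURLKey is undocumented and the FullSizeRender.jpg
// name is inferred from inspecting the Adjustments folder.
NSURL *renditionURL = info[@"PHImageFileURLKey"];
NSURL *fullSizeURL = [[renditionURL URLByDeletingLastPathComponent]
                      URLByAppendingPathComponent:@"FullSizeRender.jpg"];
if ([[NSFileManager defaultManager] fileExistsAtPath:fullSizeURL.path]) {
    NSData *fullSizeData = [NSData dataWithContentsOfURL:fullSizeURL];
    // upload fullSizeData instead of the smaller rendition, if readable
}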
Starting with iOS 9, it's also possible to fetch the latest edit, at highest resolution, using the PHAssetResourceManager:
// requires iOS 9.0+
// check if a high-quality edit is available
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:_asset];
PHAssetResource *hqResource = nil;
for (PHAssetResource *res in resources) {
    if (res.type == PHAssetResourceTypeFullSizePhoto) {
        // from my tests so far, this is only present for edited photos
        hqResource = res;
        break;
    }
}

if (hqResource) {
    PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
    long long fileSize = [[hqResource valueForKey:@"fileSize"] longLongValue];
    NSMutableData *fullData = [[NSMutableData alloc] initWithCapacity:fileSize];
    [[PHAssetResourceManager defaultManager] requestDataForAssetResource:hqResource
                                                                 options:options
                                                     dataReceivedHandler:^(NSData * _Nonnull data) {
        // append the data as it arrives
        [fullData appendData:data];
    } completionHandler:^(NSError * _Nullable error) {
        // handle completion, using `fullData` or `error`
        // uti == hqResource.uniformTypeIdentifier
        // orientation == UIImageOrientationUp
    }];
}
else {
    // use `requestImageDataForAsset:`, `requestImageForAsset:` or
    // `requestDataForAssetResource:` with a different `PHAssetResource`
}
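To address the question's last point directly: once you have the raw bytes (fullData above, or the imageData from requestImageDataForAsset:), you can read the pixel dimensions from the image header without decoding the bitmap, using ImageIO. A minimal sketch:

// Reads pixel size from metadata only; needs #import <ImageIO/ImageIO.h>.
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)fullData, NULL);
if (source) {
    NSDictionary *props = (__bridge_transfer NSDictionary *)
        CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSNumber *pixelWidth  = props[(NSString *)kCGImagePropertyPixelWidth];
    NSNumber *pixelHeight = props[(NSString *)kCGImagePropertyPixelHeight];
    NSLog(@"actual size: %@x%@", pixelWidth, pixelHeight);
    CFRelease(source);
}

Comparing these values against the asset's pixelWidth/pixelHeight before uploading is cheap enough to run for every selected asset.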
Can you try this to fetch camera roll pics:
__weak __typeof(self) weakSelf = self;
// Note: PHAssetCollectionSubtypeSmartAlbumSelfPortraits targets the Selfies
// album; use PHAssetCollectionSubtypeSmartAlbumUserLibrary for the camera roll.
PHFetchResult<PHAssetCollection *> *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumSelfPortraits options:nil];
[albums enumerateObjectsUsingBlock:^(PHAssetCollection * _Nonnull album, NSUInteger idx, BOOL * _Nonnull stop) {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.wantsIncrementalChangeDetails = YES;
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    PHFetchResult<PHAsset *> *assets = [PHAsset fetchAssetsInAssetCollection:album options:options];
    if (assets.count > 0)
    {
        [assets enumerateObjectsUsingBlock:^(PHAsset * _Nonnull asset, NSUInteger idx, BOOL * _Nonnull stop) {
            if (asset != nil)
            {
                [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info)
                {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [weakSelf addlocalNotificationForFilters:result];
                        // [weakSelf.buttonGalery setImage:result forState:UIControlStateNormal];
                    });
                }];
                *stop = YES;
            }
            else {
                [weakSelf getlatestAferSelfie];
            }
        }];
    }
}];

PHCachingImageManager returns NSError for iCloud images

I have an app in which I retrieve and display images from the iDevice. I use the following code:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES;
PHCachingImageManager *manager = [[PHCachingImageManager alloc] init];
[manager requestImageForAsset:asset
                   targetSize:CGSizeMake(asset.pixelWidth, asset.pixelHeight) // width first, then height
                  contentMode:PHImageContentModeAspectFit
                      options:options
                resultHandler:^(UIImage *result, NSDictionary *info) {
    // Do something with the result
}];
My problem is that when the image I am trying to retrieve is not on the user's device, but only in iCloud (Settings -> iCloud -> Photos -> Optimise iPhone Storage), requestImageForAsset: returns nil as the result, along with the following NSError:
NSError * domain: @"NSCocoaErrorDomain" - code: 18446744073709551615 (i.e. -1 as an unsigned 64-bit value)
The documentation for PHCachingImageManager says that:
When you need an image for an individual asset, call the requestImageForAsset:targetSize:contentMode:options:resultHandler: method, and pass the same parameters you used when preparing that asset.
If the image you request is among those already prepared, the PHCachingImageManager object immediately returns that image. Otherwise, Photos prepares the image on demand and caches it for later use.
So in theory my code should work. Any ideas what might be causing this?
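One way to narrow this down is to inspect the info dictionary when result is nil, and to attach a progressHandler so you can see whether an iCloud download is actually being attempted. A diagnostic sketch using only documented keys (PHImageErrorKey, PHImageResultIsInCloudKey, PHImageCancelledKey):

// Diagnostic sketch; the progress handler is called on an arbitrary queue.
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
    NSLog(@"iCloud download progress: %.2f error: %@", progress, error);
};
[manager requestImageForAsset:asset
                   targetSize:CGSizeMake(asset.pixelWidth, asset.pixelHeight)
                  contentMode:PHImageContentModeAspectFit
                      options:options
                resultHandler:^(UIImage *result, NSDictionary *info) {
    if (!result) {
        NSLog(@"error: %@ inCloud: %@ cancelled: %@",
              info[PHImageErrorKey],
              info[PHImageResultIsInCloudKey],
              info[PHImageCancelledKey]);
    }
}];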

Photos Framework requestImageDataForAsset occasionally fails

I'm using the Photos framework on iOS 8.1 and requesting the image data for an asset using requestImageDataForAsset... Most of the time it works, and I get the image data plus a dictionary containing what you see below. But sometimes the call completes with nil data and a dictionary containing only three generic-looking entries.
The calls are performed sequentially and on the same thread. It is not specific to any particular image. The error will happen on images I've successfully opened in the past. Has anyone encountered this?
+ (NSData *)retrieveAssetDataPhotosFramework:(NSURL *)urlMedia resolution:(CGFloat)resolution imageOrientation:(ALAssetOrientation *)imageOrientation {
    __block NSData *iData = nil;
    PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[urlMedia] options:nil];
    PHAsset *asset = [result firstObject];
    PHImageManager *imageManager = [PHImageManager defaultManager];
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.synchronous = YES;
    options.version = PHImageRequestOptionsVersionCurrent;
    @autoreleasepool {
        [imageManager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
            iData = [imageData copy];
            NSLog(@"requestImageDataForAsset returned info(%@)", info);
            *imageOrientation = (ALAssetOrientation)orientation;
        }];
    }
    assert(iData.length != 0);
    return iData;
}
This is the desired result where I get image data and the dictionary of meta data:
requestImageDataForAsset returned info({
PHImageFileDataKey = <PLXPCShMemData: 0x1702214a0> bufferLength=1753088 dataLength=1749524;
PHImageFileOrientationKey = 1;
PHImageFileSandboxExtensionTokenKey = "6e14948c4d0019fbb4d14cc5e021199f724f0323;00000000;00000000;000000000000001a;com.apple.app-sandbox.read;00000001;01000003;000000000009da80;/private/var/mobile/Media/DCIM/107APPLE/IMG_7258.JPG";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/107APPLE/IMG_7258.JPG";
PHImageFileUTIKey = "public.jpeg";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultWantedImageFormatKey = 9999;
})
Here's what I get occasionally: the image data is nil, and the dictionary doesn't contain much.
requestImageDataForAsset returned info({
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultWantedImageFormatKey = 9999;
})
I had a problem with similar symptoms where requestImageDataForAsset returned nil image data but was also accompanied by a console error message like this:
[Generic] Failed to load image data for asset <PHAsset: 0x13d041940> 87CCAFDC-A0E3-4AC9-AD1C-3F57B897A52E/L0/001 mediaType=1/0, sourceType=2, (113x124), creationDate=2015-06-29 04:56:34 +0000, location=0, hidden=0, favorite=0 with format 9999
In my case, the problem suddenly started happening on a specific device, only with assets in iCloud shared albums, after upgrading from iOS 10.x to 11.0.3, and it persisted through 11.2.5. Thinking that requestImageDataForAsset might be trying to use files locally cached in /var/mobile/Media/PhotoData/PhotoCloudSharingData/ (from the info dictionary's PHImageFileURLKey key) and that the cache might be corrupt, I thought about how to clear that cache.
Toggling the 'iCloud Photo Sharing' switch in iOS' Settings -> Accounts & Passwords -> iCloud -> Photos seems to have done the trick. requestImageDataForAsset is now working for those previously failing assets.
Update 9th March 2018
I can reproduce this problem now. It seems to occur after restoring a backup from iTunes:
1. Use the iOS app and retrieve photos from an iCloud shared album.
2. Back up the iOS device using iTunes.
3. Restore the backup using iTunes.
4. Use the app again to retrieve the same photos from the iCloud shared album; it now fails with the above console message.
Toggling the 'iCloud Photo Sharing' switch fixes it still. Presumably the restore process somehow corrupts some cache. I've reported it as Bug 38290463 to Apple.
You are likely iterating through an array, and memory is not being freed in a timely manner. You can try the code below; make sure theData is marked __block.
@autoreleasepool {
    [imageManager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        NSLog(@"requestImageDataForAsset returned info(%@)", info);
        theData = [imageData copy];
    }];
}
Getting back to this after a long while, I have solved a big part of my problem. No mystery, just bad code:
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[urlMedia] options:nil];
PHAsset *asset = [result firstObject];
if (asset != nil) { // the fix
    PHImageManager *imageManager = [PHImageManager defaultManager];
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    ...
}
The most common cause for me was a problem with the media URL passed to fetchAssetsWithALAssetURLs, causing asset to be nil and requestImageDataForAsset to return a default info object.
The following code may help. I suspect the PHImageRequestOptions class has a bug, so I pass nil for the options instead. Note that the semaphore blocks the calling thread until the result handler fires, so don't run this on the main thread.
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
    assetModel.size = imageData.length;
    NSString *filename = [asset valueForKey:@"filename"];
    assetModel.fileName = filename;
    dispatch_semaphore_signal(sema);
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);

Memory leak when requesting photos using the Photos framework

I am using the following method to request a number of photos and add them to an array for later use:
-(void) fetchImages {
    self.assets = [[PHFetchResult alloc] init];
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    self.assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = self.assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
This works fine when the number of photos is less than 50. After that, memory jumps to 150-160 MB, I get the message Connection to assetsd was interrupted or assetsd died, and the app crashes.
How can I release the assets (PHFetchResult) from memory after I get the ones I want? (Do I need to?)
I would like to be able to add up to 150 photos.
Any ideas?
Thanks
You should not copy the results from a PHFetchResult into an array. The whole point of PHFetchResult is to reference many images from the Photos library without storing them all in RAM (I'm not sure exactly how it does this); just use the PHFetchResult object like an array and it handles the memory issues for you. For example, connect a collection view controller to the PHFetchResult object directly and use PHImageManager to request images only for the visible cells.
From Apple's documentation:
"Unlike an NSArray object, however, a PHFetchResult object dynamically loads its contents from the Photos library as needed, providing optimal performance even when handling a large number of results."
https://developer.apple.com/library/ios/documentation/Photos/Reference/PHFetchResult_Class/
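To make that concrete, here is a minimal sketch that keeps the PHFetchResult as the data source and uses PHCachingImageManager to preheat a window of thumbnails. self.cachingImageManager is an assumed property, while self.assets, firstPhotoIndex, lastPhotoIndex and the 640x480 size come from the question.

// Preheat thumbnails for the range you are about to display or process.
NSMutableArray<PHAsset *> *window = [NSMutableArray array];
for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++) {
    [window addObject:self.assets[i]];
}
[self.cachingImageManager startCachingImagesForAssets:window
                                           targetSize:CGSizeMake(640, 480)
                                          contentMode:PHImageContentModeAspectFit
                                              options:nil];
// Per visible cell, requestImageForAsset: now returns the cached thumbnail
// almost immediately; call stopCachingImagesForAllAssets when done.

Note that the array here holds lightweight PHAsset references, not decoded images, so it does not reintroduce the memory problem.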
Your code inside the fetchImages method needs some refactoring; take a look at this suggestion:
-(void) fetchImages {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
But the problem is memory consumption. Let's do some calculations.
A single image, using ARGB at 4 bytes per pixel:
640 x 480 x 4 = 1.2 MB
And you want to keep 150 images in RAM, so:
150 x 1.2 MB = 180 MB
For example, an iPhone 4 with 512 MB of RAM will crash if you use more than about 300 MB, and the limit can be lower if other apps are also consuming a lot of RAM.
I think you should consider storing the images in files instead of in RAM.
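A sketch of that file-backed approach: write each thumbnail to the temporary directory as it arrives and keep only the file URLs in memory. self.photoURLs is a hypothetical NSMutableArray<NSURL *> property replacing photosToVideofy; everything else matches the refactored method above.

[[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
    if (result) {
        NSString *name = [NSString stringWithFormat:@"%@.jpg", [[NSUUID UUID] UUIDString]];
        NSURL *url = [[NSURL fileURLWithPath:NSTemporaryDirectory()] URLByAppendingPathComponent:name];
        if ([UIImageJPEGRepresentation(result, 0.8f) writeToURL:url atomically:YES]) {
            [self.photoURLs addObject:url]; // hold the URL, not the bitmap
        }
    }
}];
// When building the video, load frames back one at a time:
// UIImage *frame = [UIImage imageWithContentsOfFile:url.path];

At roughly 100-200 KB per 640x480 JPEG on disk, 150 photos cost a few tens of megabytes of storage instead of 180 MB of RAM.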
This might be intentional (can't tell without looking at the rest of your code), but self.photosToVideofy is never released: since you're accessing it in a block, the object to which you pass the block ([PHImageManager defaultManager]) will always have a reference to the array.
Try explicitly clearing your array when you're done with the images. The array itself still won't be released, but the objects it contains will (or can be if they're not referenced anywhere else).
The best solution is to remove the array from the block. But, that would require changing the logic of your code.
You have to set
photoRequestOptions.synchronous = NO;
instead of
photoRequestOptions.synchronous = YES;
This worked for me on iOS 10.2.
