Very slow to load images from PHAsset - ios

I am using the following code to fetch images and AVAssets from PHAssets. There are two arrays in the code:
galleryArr : stores images for the collection view.
mutableDataArr : stores images (for image assets) and videos (for AVAssets) to upload to the server.
It is very slow to fetch all the images from the PHAssets array.
I googled this; most people say to remove the line [options setSynchronous:YES]; but if I remove that line, the completion handler is called twice and the array ends up with duplicate objects (objects are appended to the array inside the completion handler).
for (int i = 0; i < assets.count; i++) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
        options.resizeMode = PHImageRequestOptionsResizeModeExact;
        [options setNetworkAccessAllowed:YES];
        [options setSynchronous:YES];

        PHImageManager *manager = PHImageManager.defaultManager;

        PHVideoRequestOptions *videoOptions = [[PHVideoRequestOptions alloc] init];
        videoOptions.networkAccessAllowed = YES;

        __weak typeof(self) weakSelf = self;
        if (assets[i].mediaType == PHAssetMediaTypeVideo) {
            [manager requestAVAssetForVideo:[assets objectAtIndex:i] options:videoOptions resultHandler:^(AVAsset * _Nullable asset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
                if ([asset isKindOfClass:[AVURLAsset class]])
                {
                    [weakSelf.mutableDataArr addObject:asset];
                }
            }];
        }

        [manager requestImageForAsset:[assets objectAtIndex:i]
                           targetSize:CGSizeMake(1024, 1024) //PHImageManagerMaximumSize
                          contentMode:PHImageContentModeAspectFit
                              options:options
                        resultHandler:^(UIImage *image, NSDictionary *info) {
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (assets[i].mediaType != PHAssetMediaTypeVideo) {
                        [weakSelf.mutableDataArr addObject:image];
                    }
                    [galleryArr addObject:image];
                    if (i + 1 == assets.count) {
                        [SVProgressHUD dismiss];
                        [weakSelf.galleryCollectionView reloadData];
                    }
                });
            }
        }];
    });
}
Any suggestions, please?

Just one thought: it looks like you are loading all the images from the array before removing your progress HUD and displaying the gallery. As the number of images could be very large, and presuming you are using a collection view or similar, that's quite an overhead before anything is displayed.
I did something like this a while ago, and instead of looping through the array and loading everything up front, I let the cells request images as they needed them. This makes it very fast and efficient, as cells can display immediately with a loading icon and then flip to the image when it becomes available. The efficiency comes from only loading images the user is actually going to see.
To make things performant (and by performant I mean I could scroll as fast as I liked without the display freezing), each cell would first check an in-memory cache for the image and, if it wasn't there, trigger a request for the image on a background thread.
When the image was returned, the cell would add it to the in-memory cache and then, if the cell had not been reused for a different image (due to fast scrolling), display it.
Further, I used an NSCache for the in-memory cache, so that if the app started to use a lot of memory, images would be automatically dropped and reloaded the next time a cell wanted one.
The summary is to use a memory-aware cache, and only load what you actually need.
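A rough sketch of that cell-driven approach (the `imageCache` property, the `ThumbCell` class, and its `representedAssetIdentifier` property are hypothetical names, not part of the question's code):

```objectivec
// Hypothetical cell-driven loading: check an NSCache first, then request the
// thumbnail asynchronously and only apply it if the cell still wants it.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    ThumbCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"ThumbCell"
                                                                forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];
    cell.representedAssetIdentifier = asset.localIdentifier;

    UIImage *cached = [self.imageCache objectForKey:asset.localIdentifier];
    if (cached) {
        cell.imageView.image = cached;
        return cell;
    }

    cell.imageView.image = nil; // or a placeholder / loading indicator
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.networkAccessAllowed = YES; // asynchronous by default
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:CGSizeMake(200, 200)
                                              contentMode:PHImageContentModeAspectFill
                                                  options:options
                                            resultHandler:^(UIImage *image, NSDictionary *info) {
        if (!image) return;
        [self.imageCache setObject:image forKey:asset.localIdentifier];
        // Only display if the cell hasn't been reused for another asset.
        if ([cell.representedAssetIdentifier isEqualToString:asset.localIdentifier]) {
            dispatch_async(dispatch_get_main_queue(), ^{
                cell.imageView.image = image;
            });
        }
    }];
    return cell;
}
```

Here `self.imageCache` would be an NSCache created once, e.g. in viewDidLoad, so the system can evict thumbnails under memory pressure.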

Related

CIImage with CGImage memory grow up

I'm creating a photo slideshow app.
My app flow:
user selects photo assets (over 100+)
load images from the assets (at display size)
set the image view's image, or apply a CIFilter to the image, and slide
Question:
When I create a CIImage object from a CGImage, memory grows very fast, and if the image count goes over 100+ the app crashes.
Strangely, if I remove the creation code, the app works fine.
Can anybody help me?
More code:
- (void)loadDisplayImagesWithCompletion:(void (^)(NSArray *images))completion {
    dispatch_async(dispatch_queue_create("PhotosEditViewController_loadImageQueue", nil), ^{
        __block NSMutableArray *images = [NSMutableArray array];
        __block int handleCount = 0;
        __weak PhotosEditViewController *weakSelf = self;
        for (PHAsset *asset in self.photoAssets) {
            [KPPhotoManager requestImageForAsset:asset targetSize:self.contentView.frame.size completeBlock:^(UIImage *image) {
                [images addObject:image];
                @autoreleasepool {
                    CIImage *ciImage = [CIImage imageWithCGImage:[image CGImage]]; // memory growing up
                }
            }];
        }
    });
}
// KPPhotoManager
+ (void)requestImageForAsset:(PHAsset *)asset targetSize:(CGSize)size completeBlock:(void (^)(UIImage *image))completeBlock
{
    [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFill options:[self createImageRequestOptions] resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
        completeBlock(result);
    }];
}
I created a new project copying your code. I increased the number of calls to 10,000 and, inspecting the memory, all looks well. It jumps a bit but does not inflate.
I also tried to force the image load using:
for (int i = 0; i < 10000; i++) {
    @autoreleasepool {
        UIImage *image = [UIImage imageWithContentsOfFile:[[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"image2.png"]];
        [self createFilterImageWithImage:image];
    }
}
and it still does not inflate the memory.
How are you inspecting the memory leak? Are you looking at memory consumption? Is there anything specific in your project or file, such as ARC being disabled?

Assign objects to variables outside asynchronous blocks

I am curious whether it is possible to easily assign and retain __block objects inside asynchronous blocks under both MRC and ARC. I obtained the following code while rewriting a piece of code with minimal changes, and got stuck when trying to return the image from an asynchronous block. A concern is that the code will one day be converted to ARC; I do not want a hidden memory crash after the conversion. My options were:
Using a GCD object holder if such a thing exists
Using a homebrew object holder, an array or others
Directly adding the image to the array (which I am using)
Rewriting the code to another structure
Basically, the code loads multiple images on a background thread. Searching for // image deallocated :( will give you the location.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    ALAssetsLibrary *assetsLibrary = nil;
    for (int i = 0; i < sources.count; ++i) {
        __block UIImage *image = nil;
        id source = sources[i];
        if ([source isKindOfClass:[NSString class]]) {
            image = something
        } else if ([source isKindOfClass:[NSURL class]]) {
            NSURL *url = source;
            if ([url.scheme isEqualToString:@"assets-library"]) {
                dispatch_group_t group = dispatch_group_create();
                if (!assetsLibrary)
                    assetsLibrary = [[[ALAssetsLibrary alloc] init] autorelease];
                dispatch_group_enter(group);
                [assetsLibrary assetForURL:url resultBlock:^(ALAsset *asset) {
                    image = something
                    dispatch_group_leave(group);
                } failureBlock:^(NSError *error) {
                    dispatch_group_leave(group);
                }];
                dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
                // image deallocated :(
            } else if (url.isFileURL) {
                image = something
            }
        }
        // add image to an array
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        // notify
    });
});
I think moving the image variable declaration is the right first step to think about this. You can't expect a single image variable to serve multiple concurrent operations.
So, to get started, let's do this:
ALAssetsLibrary *assetsLibrary = nil;
for (int i = 0; i < sources.count; ++i) {
    __block UIImage *image = nil;
After that: you are currently keeping UIImage instances in an array? This is not recommended. Instead, find a way to write these images to the documents directory and later read them from there when needed; keeping UIImage instances in memory will lead to severe memory overuse issues.
You need to come up with a naming convention for writing your images into the documents directory. Whenever an image load completes, save it to your local folder and later read it from there.
You could also call the notify/completion part as soon as the first image is loaded and keep notifying about the others as you get them; I'm not sure how your design works.
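A minimal sketch of that save-then-reload pattern (the helper names and the filename scheme here are made up for illustration):

```objectivec
// Sketch: persist each loaded image under a predictable name, then reload it
// on demand instead of holding every UIImage in memory.
- (NSString *)pathForImageAtIndex:(NSInteger)index {
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                          NSUserDomainMask, YES) firstObject];
    return [docs stringByAppendingPathComponent:
            [NSString stringWithFormat:@"image_%ld.jpg", (long)index]];
}

- (void)saveImage:(UIImage *)image atIndex:(NSInteger)index {
    NSData *data = UIImageJPEGRepresentation(image, 0.9);
    [data writeToFile:[self pathForImageAtIndex:index] atomically:YES];
}

- (UIImage *)imageAtIndex:(NSInteger)index {
    // Returns nil if the image hasn't been saved yet.
    return [UIImage imageWithContentsOfFile:[self pathForImageAtIndex:index]];
}
```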
Also, I noticed you are creating multiple groups, one new group on each iteration of the for loop, here:
dispatch_group_t group = dispatch_group_create();
Moving it outside the for loop would be a good call.
Hope this helps.

memory leak when requesting photos using the Photos framework

I am using the following method to request a number of photos and add them to an array for later use:
- (void)fetchImages {
    self.assets = [[PHFetchResult alloc] init];
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    self.assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = self.assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
This works fine when the number of photos is less than 50. After that, memory jumps to 150-160 MB, I get the message Connection to assetsd was interrupted or assetsd died, and the app crashes.
How can I release the assets (the PHFetchResult) from memory after I get the ones I want? (Do I need to?)
I would like to be able to add up to 150 photos.
Any ideas?
Thanks
You should not put the results from a PHFetchResult into an array. The idea of PHFetchResult is to point to many images in the Photos library without storing them all in RAM (I'm not sure exactly how it does this). Just use the PHFetchResult object like an array and it handles the memory issues for you. For example, connect a collection view controller to the PHFetchResult object directly and use the PHImageManager to request images only for visible cells, etc.
From apple documentation:
"Unlike an NSArray object, however, a PHFetchResult object dynamically loads its contents from the Photos library as needed, providing optimal performance even when handling a large number of results."
https://developer.apple.com/library/ios/documentation/Photos/Reference/PHFetchResult_Class/
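For illustration, a sketch of indexing the PHFetchResult directly from a collection view data source. The `PhotoCell` class and the property names are hypothetical; `self.cachingManager` is assumed to be a `PHCachingImageManager` created once (it inherits the request API from `PHImageManager`):

```objectivec
// Sketch: the PHFetchResult backs the data source directly; no NSArray copy
// of UIImages is ever built.
- (NSInteger)collectionView:(UICollectionView *)collectionView
     numberOfItemsInSection:(NSInteger)section {
    return self.assets.count; // self.assets is the PHFetchResult
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    PhotoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell"
                                                                forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item]; // fetched lazily by Photos
    [self.cachingManager requestImageForAsset:asset
                                   targetSize:CGSizeMake(640, 480)
                                  contentMode:PHImageContentModeAspectFit
                                      options:nil
                                resultHandler:^(UIImage *result, NSDictionary *info) {
        // Reuse checks omitted for brevity; see the NSCache answer above for
        // the pattern of verifying the cell still represents this asset.
        cell.imageView.image = result;
    }];
    return cell;
}
```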
Your code inside the fetchImages method needs some refactoring; take a look at this suggestion:
- (void)fetchImages {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
But the problem is memory consumption. Let's do some calculations.
A single image, using ARGB at 4 bytes per pixel:
640 x 480 x 4 = 1.2 MB
And now you want to store 150 images in RAM, so:
150 x 1.2 MB = 180 MB
For example, an iPhone 4 with 512 MB will crash if you use more than about 300 MB, and it can be less if other apps are also consuming a lot of RAM.
I think you should consider storing the images in files instead of RAM.
This might be intentional (I can't tell without looking at the rest of your code), but self.photosToVideofy is never released: since you're accessing it in a block, the object to which you pass the block ([PHImageManager defaultManager]) will always have a reference to the array.
Try explicitly clearing the array when you're done with the images. The array itself still won't be released, but the objects it contains will (or can be, if they're not referenced anywhere else).
The best solution is to remove the array from the block, but that would require changing the logic of your code.
You have to set
photoRequestOptions.synchronous = NO;
instead of
photoRequestOptions.synchronous = YES;
This worked for me on iOS 10.2.
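For context, the Photos documentation notes that an asynchronous request with the default opportunistic delivery may call the result handler more than once (a degraded image first, then the full-quality one). If you switch to asynchronous and want exactly one callback, a sketch:

```objectivec
PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
photoRequestOptions.synchronous = NO;
// Ask for a single callback with the full-quality image instead of the
// opportunistic degraded-then-final pair.
photoRequestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
```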

Use ALAssetRepresentation getBytes:fromOffset:length:error: but with correct orientation

At a high level, here is what I am trying to do in my iOS app:
Allow the user to select an asset from their Photo Library
Share a URL which entities external to my app can use to request that same asset
My app is running a webserver (CocoaHTTPServer) and it is able to serve up that data to the requesting entity
I intentionally implemented this using ALAssetRepresentation getBytes:fromOffset:length:error:, thinking that I could avoid the following headaches:
writing a local copy of the image before serving it up (and then having to manage the local copies)
putting all of the image data into memory
potentially slow performance, waiting to have the whole image before serving any of it
At a high level it works, with one problem: the orientation of the image is wrong in some cases.
This is a well-known issue, but the solutions generally seem to require creating a CGImage or UIImage from the ALAssetRepresentation. As soon as I do that, I lose the ability to use the handy getBytes:fromOffset:length:error: method.
It would be great if there were a way to keep using this method but with corrected orientation. If not, I would appreciate any recommendations on the next best approach. Thanks!
Here are a couple of relevant methods:
- (id)initWithAssetURL:(NSURL *)assetURL forConnection:(MyHTTPConnection *)parent {
    if ((self = [super init]))
    {
        HTTPLogTrace();
        // Need this now so we can use it throughout the class
        self.connection = parent;
        self.assetURL = assetURL;
        offset = 0;
        // Grab a pointer to the ALAssetsLibrary which we can persistently use
        self.lib = [[ALAssetsLibrary alloc] init];
        // Really need to know the size of the asset, but we will also set a property for the ALAssetRepresentation to use later
        // Otherwise we would have to asynchronously look up the asset again by assetForURL
        [self.lib assetForURL:assetURL
                  resultBlock:^(ALAsset *asset) {
                      self.assetRepresentation = [asset defaultRepresentation];
                      self.assetSize = [self.assetRepresentation size];
                      // Even though we don't really have any data, this will enable the response headers to be sent
                      // It will call our delayResponseHeaders method, which will now return NO, since we've set self.repr
                      [self.connection responseHasAvailableData:self];
                  }
                 failureBlock:^(NSError *error) {
                     // recover from error, then
                     NSLog(@"error in the failureBlock: %@", [error localizedDescription]);
                 }];
    }
    return self;
}
- (NSData *)readDataOfLength:(NSUInteger)lengthParameter {
    HTTPLogTrace2(@"%@[%p]: readDataOfLength:%lu", THIS_FILE, self, (unsigned long)lengthParameter);
    NSUInteger remaining = self.assetSize - offset;
    NSUInteger length = lengthParameter < remaining ? lengthParameter : remaining;
    Byte *buffer = (Byte *)malloc(length);
    NSError *error = nil;
    NSUInteger buffered = [self.assetRepresentation getBytes:buffer fromOffset:offset length:length error:&error];
    if (error) {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
    NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    // now that we've read data of length, update the offset for the next invocation
    offset += length;
    return data;
}

iOS: TableView with multiple Images per Row (SDWebImage)

I am a bit desperate about my sectioned table view. I use a custom UITableViewCell with 4 images, like the one below:
I try to load the images via SDWebImage for each cell.
The loading procedures are all done in my custom UITableViewCell, not in the UITableViewController. From cellForRowAtIndexPath I just call [cell setup], which executes the following code in the current cell:
NSArray *imageViews = [NSArray arrayWithObjects:self.imageView1, self.imageView2, self.imageView3, self.imageView4, nil];
for (int i = 0; i < self.products.count; i++) {
    Product *currentProduct = [self.products objectAtIndex:i];
    UIImageView *currentImageView = [imageViews objectAtIndex:i];
    NSString *thumbURL = [[CommonCode getUnzippedDirectory] stringByAppendingPathComponent:currentProduct.collectionFolderName];
    thumbURL = [thumbURL stringByAppendingPathComponent:thumbFolder];
    thumbURL = [thumbURL stringByAppendingPathComponent:currentProduct.product_photo];
    [currentImageView setContentMode:UIViewContentModeScaleAspectFit];
    [currentImageView setImageWithURL:[NSURL fileURLWithPath:thumbURL]
                     placeholderImage:[UIImage imageNamed:@"placeholder.png"]];
}
The images are all stored in the documents directory and are no larger than about 500 KB each.
Problem:
My problem is that when I scroll through my table view it suddenly crashes, and I don't know why. Enabling a symbolic breakpoint for all exceptions shows that it crashes on one line in SDWebImage (where the image is allocated); unfortunately there isn't any debugger output:
UIImage *image = nil;
if ([imageOrData isKindOfClass:[NSData class]])
{
    image = [[UIImage alloc] initWithData:(NSData *)imageOrData];
}
I also tried to load the images via dispatch_async, with a similar result. Is it possible that it has something to do with concurrent file operations?
Additionally, I get memory warnings when I scroll very fast, so I have to clear SDWebImage's cache. At that point it stops at the SDWebImage code line mentioned above.
I have searched the web for two days now and haven't found anything useful. I would be glad for some hints to fix this problem. If somebody needs additional data, such as crash reports, just let me know and I will provide it quickly.
I have met the same problem. I solved it temporarily as below, but it causes memory problems, especially on the iPhone 4s:
NSArray *arrImgs = cellModel.thumbnails_qqnews;
if (arrImgs.count > 0) {
    [self.imgView.subviews enumerateObjectsUsingBlock:^(__kindof UIImageView * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        [obj sd_setImageWithURL:[NSURL URLWithString:arrImgs[idx]] placeholderImage:[UIImage imageNamed:@"placeholdImage"] completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
            if (image != nil) {
                obj.image = [UIImage cutImageWithTargetSize:[NEMulTiNewsCell getImgSize] image:image];
            }
        }];
    }];
}
I think downloading multiple images per cell like this is awkward. An alternative is to download the images, draw them into a single composite image, and display that instead, but that may take a large amount of code.
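One mitigation worth trying for the crash-on-scroll behavior is cancelling in-flight loads when a cell is reused, so callbacks for off-screen rows can't pile up or touch recycled views. A sketch, assuming the four image-view outlets from the question; newer SDWebImage versions expose `sd_cancelCurrentImageLoad` (older prefixless versions named it `cancelCurrentImageLoad`):

```objectivec
// Sketch: cancel any pending SDWebImage downloads when the cell is recycled.
- (void)prepareForReuse {
    [super prepareForReuse];
    for (UIImageView *iv in @[self.imageView1, self.imageView2,
                              self.imageView3, self.imageView4]) {
        [iv sd_cancelCurrentImageLoad];
        iv.image = nil;
    }
}
```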
