Preload next (currently invisible) cell UICollectionView - ios

I'm working with a UICollectionView that has full-screen cells, and when I swipe to the next cell the image takes a moment to appear (the images are high resolution). I'm already using PHCachingImageManager to cache the image beforehand, and it is cached before the cell loads, but simply assigning the image to the imageView still takes a noticeable flash of time. Is there a way to preload the invisible next cell BEFORE it reaches cellForItemAtIndexPath:?
Any thoughts?

It apparently takes some time to load an image even after it has been cached by PHCachingImageManager, which was the cause of my delay. I created a test function that saves the next image (and will add the previous one too) in a property while setting up the current cell, and my flicker is gone. I'll post the working example once I've cleaned it up...
Update:
So I created an _images NSMutableDictionary to store the already-loaded images, and I populate it using this function:
- (void)startCachingSurroundingImages:(NSInteger)index {
    CGSize targetSize = PHImageManagerMaximumSize;
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    [options setNetworkAccessAllowed:YES];
    [options setDeliveryMode:PHImageRequestOptionsDeliveryModeHighQualityFormat];
    if (index < _imagesArray.count - 1 && ![_images objectForKey:[NSIndexPath indexPathForItem:0 inSection:index + 1]]) {
        NSDictionary *assetDictionary = [_imagesArray objectAtIndex:index + 1];
        NSString *assetRefID = [assetDictionary objectForKey:@"asset_id"];
        PHFetchResult *assetFetchResult = [PHAsset fetchAssetsWithLocalIdentifiers:@[assetRefID] options:[PHFetchOptions new]];
        PHAsset *asset = [assetFetchResult firstObject];
        [photoManager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFit options:options resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [_images setObject:result forKey:[NSIndexPath indexPathForItem:0 inSection:index + 1]];
            }
        }];
    }
    if (index - 1 >= 0 && ![_images objectForKey:[NSIndexPath indexPathForItem:0 inSection:index - 1]]) {
        NSDictionary *leftAssetDictionary = [_imagesArray objectAtIndex:index - 1];
        NSString *leftAssetRefID = [leftAssetDictionary objectForKey:@"asset_id"];
        PHFetchResult *leftAssetFetchResult = [PHAsset fetchAssetsWithLocalIdentifiers:@[leftAssetRefID] options:[PHFetchOptions new]];
        PHAsset *leftAsset = [leftAssetFetchResult firstObject];
        [photoManager requestImageForAsset:leftAsset targetSize:targetSize contentMode:PHImageContentModeAspectFit options:options resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [_images setObject:result forKey:[NSIndexPath indexPathForItem:0 inSection:index - 1]];
            }
        }];
    }
}
Which I call in cellForItemAtIndexPath like so...
[self startCachingSurroundingImages:indexPath.section];
and still in cellForItemAtIndexPath, I load the image like so...
if ([_images objectForKey:indexPath]) {
    photoCell.imageView.image = [_images objectForKey:indexPath];
    [photoCell.activityIndicator setHidden:YES];
} else {
    photoCell.tag = (int)[photoManager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFit options:options resultHandler:^(UIImage *result, NSDictionary *info) {
        PhotoCell *photoCell = (PhotoCell *)[self.collectionView cellForItemAtIndexPath:indexPath];
        if (photoCell && result) {
            photoCell.imageView.image = result;
            [photoCell.activityIndicator setHidden:YES];
        } else {
            NSLog(@"BAD IMAGE!!! %@", info);
            [photoCell.activityIndicator setHidden:NO];
        }
    }];
}
Then in didReceiveMemoryWarning I clean up a bit...
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Iterate over a snapshot of the keys, since we mutate _images inside the loop
    NSArray *allKeys = [_images allKeys];
    for (NSIndexPath *path in allKeys) {
        if (path.section != _lastIndex && path.section != _lastIndex - 1 && path.section != _lastIndex + 1) {
            [_images removeObjectForKey:path];
        }
    }
    // Though one could just as easily do this
    // [_images removeAllObjects];
}
It's not pretty, but it works.
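For what it's worth, on iOS 10 and later UIKit has this idea built in: adopt UICollectionViewDataSourcePrefetching and the collection view will call you with index paths that are about to become visible. A minimal sketch, assuming the same `_images` dictionary and `startCachingSurroundingImages:` method from above:

```objc
// iOS 10+: set collectionView.prefetchDataSource = self; in viewDidLoad,
// then UIKit asks for upcoming items before cellForItemAtIndexPath: fires.
- (void)collectionView:(UICollectionView *)collectionView
    prefetchItemsAtIndexPaths:(NSArray<NSIndexPath *> *)indexPaths {
    for (NSIndexPath *indexPath in indexPaths) {
        // Only kick off a request if we haven't already cached this image
        if (![_images objectForKey:indexPath]) {
            [self startCachingSurroundingImages:indexPath.section];
        }
    }
}
```

This keeps the manual dictionary approach above intact but lets UIKit decide when to warm it.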

Related

The iCloud photo is all black

We use the PHAsset API to fetch thumbnails of iCloud photos ourselves. A customer reported seeing an all-black photo in the client. Have you encountered this issue before, and do you have any suggestions?
The code used to display the UIImage is as follows; we use a CALayer to display the picture.
- (void)setImageLayerWithImage:(UIImage *)image
{
    self.imageLayer.bounds = CGRectMake(0, 0, image.size.width, image.size.height);
    self.imageLayer.position = CGPointMake(image.size.width / 2, image.size.height / 2);
    self.imageLayer.contents = (id)image.CGImage;
}
The code that fetches the thumbnail is:
- (PHImageRequestID)imageInFullScreen:(MDPhoto *)photoItem
                                   Id:(NSString *)photoID
                           targetSize:(CGSize)targetSize
                    highQualityFormat:(BOOL)highQualityFormat
                            isPreLoad:(BOOL)ispreLoad
                             saveFile:(BOOL)theSaveFile
                      progressHandler:(PHAssetImageProgressHandler)progressHandler
                    completionHandler:(void (^)(NSDictionary *, NSDictionary *))completionHandler {
    CGFloat scale = [UIScreen mainScreen].scale;
    targetSize = CGSizeMake(targetSize.width * scale, targetSize.height * scale);
    if (targetSize.width < [MDDeviceInfo shareDevice].minImageWidth) {
        targetSize = CGSizeMake([MDDeviceInfo shareDevice].minImageWidth,
                                [MDDeviceInfo shareDevice].minImageWidth * targetSize.height / targetSize.width);
    }
    PHImageRequestOptions *options = [self fullScreenRequestOptions:targetSize];
    options.progressHandler = progressHandler;
    MDPhoto *dict = photoItem;
    PHAsset *asset = [(FBYPHAssetWrapper *)[photoItem objectForKey:kFBYAsset] entity];
    if (!asset || [dict[kFBYUploadNeedRefetchAsset] boolValue] || self.isInFullscreenBrowser) {
        FBYPHAssetWrapper *wraper = [FBYAssetManager getAssetWrapperByLocalIdentifier:photoItem[kFBYPHAssetLocalIdentifier]];
        if (wraper) {
            dict[kFBYUploadNeedRefetchAsset] = @NO;
            [dict setObject:wraper forKey:kFBYAsset];
            asset = [wraper entity];
        }
    }
    if (!asset.localIdentifier) {
        NSLog(@"");
    }
    __block UIImage *aImage = nil;
    if (asset.localIdentifier && [[WDPhotoBookManager manager].currentPhotoBook containsAssetIdentifier:[asset.localIdentifier MD5]]) {
        NSData *imageData = [[WDPhotoBookManager manager].currentPhotoBook loadImageDataFromIdentifier:[asset.localIdentifier MD5]];
        aImage = [UIImage imageWithData:imageData];
    }
    if (theSaveFile) {
        if (!self.isInFullscreenBrowser) {
            if (!aImage) {
                NSString *localIdentifier = photoItem[kFBYPHAssetLocalIdentifier];
                aImage = [FBYPHAssetHelper mediumSizedImageForAssetID:[FBYGlobal ImagePathWithLocalIdentifier:photoID targetSize:targetSize] pictureDir:[localIdentifier MD5]];
                if (aImage && !self.isInFullscreenBrowser) {
                    [self drawImage:aImage localIdentifier:[FBYGlobal ImagePathWithLocalIdentifier:photoID targetSize:targetSize]];
                }
            }
        } else {
            options.networkAccessAllowed = YES;
        }
    }
    if (ispreLoad) {
        return 0;
    }
    theSaveFile = YES;
    if (self.isInFullscreenBrowser) {
        aImage = nil;
        // targetSize = PHImageManagerMaximumSize;
        theSaveFile = NO;
        // WREN-1253
        // iOS: full-screen preview displays abnormally
        // Comments: the fetched image is corrupted when using PHImageManagerMaximumSize,
        // so I had to use an explicit target size.
        if (IS_SMALL_IPHONE) {
        } else {
            targetSize = CGSizeMake(asset.pixelWidth, asset.pixelHeight);
        }
    }
    if (self.isInEditPhotoMode) {
        aImage = nil;
        theSaveFile = NO;
    }
    BOOL assetHasBurstIdentifier = asset.burstIdentifier && [asset.burstIdentifier length] > 0;
    if (!aImage) {
        if (asset) {
            return [[self class] requestImageForAsset:asset
                                     withImageManager:nil
                                           targetSize:targetSize
                                              options:options
                                       fixOrientation:YES
                                      representsBurst:asset.representsBurst || assetHasBurstIdentifier
                                             saveFile:theSaveFile
                                       isInFullScreen:self.isInFullscreenBrowser
                                     isNeedTranparent:YES
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                if (!result) {
                    NSDictionary *context = [NSDictionary dictionaryWithObjectsAndKeys:photoID, @"id", nil];
                    NSMutableDictionary *dict = [NSMutableDictionary dictionaryWithDictionary:info];
                    [dict addEntriesFromDictionary:info];
                    if ([NSThread isMainThread]) {
                        completionHandler(context, dict);
                    } else {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completionHandler(context, dict);
                        });
                    }
                    return;
                }
                NSDictionary *context = [NSDictionary dictionaryWithObjectsAndKeys:result, @"image", photoID, @"id", nil];
                if ([NSThread isMainThread]) {
                    [self checkAndStoreThumbnailCache:result FullScreen:photoItem Id:photoID targetSize:targetSize];
                    completionHandler(context, info);
                } else {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self checkAndStoreThumbnailCache:result FullScreen:photoItem Id:photoID targetSize:targetSize];
                        completionHandler(context, info);
                    });
                }
                if (!self.isInFullscreenBrowser) {
                    [self drawImage:result localIdentifier:[FBYGlobal ImagePathWithLocalIdentifier:photoID targetSize:targetSize]];
                }
            }];
        } else {
            NSDictionary *context = [NSDictionary dictionaryWithObjectsAndKeys:photoID, @"id", nil];
            NSDictionary *info = [NSDictionary dictionaryWithObjectsAndKeys:photoID, @"id", @"assetMissing", @"error", nil];
            if ([NSThread isMainThread]) {
                completionHandler(context, info);
            } else {
                dispatch_async(dispatch_get_main_queue(), ^{
                    completionHandler(context, info);
                });
            }
        }
    } else {
        void (^workToDo)(void) = ^{
            [self checkAndStoreThumbnailCache:aImage FullScreen:photoItem Id:photoID targetSize:targetSize];
            NSDictionary *context = [NSDictionary dictionaryWithObjectsAndKeys:aImage, @"image", photoID, @"id", nil];
            completionHandler(context, nil);
        };
        if ([NSThread isMainThread]) {
            workToDo();
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                workToDo();
            });
        }
    }
    return 0;
}

UI is getting blocked when fetching video duration from AVURLAsset in dispatch_async

I have two view controllers, Home and Home Details. In Home I have a table view that shows the thumbnail and duration of a video. When I tap a particular row, its details are shown in Home Details. On returning, I update the selected row, so in the viewWillDisappear method of Home Details I have written the following code:
if ([self.delegate respondsToSelector:@selector(changeSelectedBucketData:)]) {
    [self.delegate changeSelectedBucketData:_videoId];
}
Now in the Home Controller I have defined that method as:
- (void)changeSelectedBucketData:(NSString *)videoId {
    NSString *dataStr = [NSString stringWithFormat:@"%@bucket_id=%@", kGetBucketById, videoId];
    [[WebServiceCall sharedInstance] sendGetRequestToWebWithData:dataStr success:^(NSDictionary *json) {
        if ([[json valueForKey:@"ResponseCode"] integerValue] == 0) {
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                [_arrayOfContent replaceObjectAtIndex:selectedIndex withObject:[json valueForKey:@"GetData"]];
                if (_arrayOfContent.count) {
                    TableViewCellHome *cell = [self.mTableView cellForRowAtIndexPath:[NSIndexPath indexPathForRow:selectedIndex inSection:0]];
                    [self fillDataForIndexPath:[NSIndexPath indexPathForRow:selectedIndex inSection:0] forCell:cell];
                }
            });
        }
    } failure:^(NSError *error) {
        dispatch_async(dispatch_get_main_queue(), ^{
        });
    }];
}

- (void)fillDataForIndexPath:(NSIndexPath *)indexPath forCell:(TableViewCellHome *)cell {
    NSDictionary *dict = [_arrayOfContent objectAtIndex:indexPath.row];
    NSURL *url = [NSURL URLWithString:[[_arrayOfContent objectAtIndex:indexPath.row] valueForKey:@"video_URL"]];
    [self downloadDurationAtURL:url cellTag:indexPath];
}
I use the following code to fetch the duration of a video:
- (NSUInteger)videoDuration:(NSURL *)videoURL {
    AVURLAsset *videoAVURLAsset = [AVURLAsset assetWithURL:videoURL];
    CMTime durationV = videoAVURLAsset.duration;
    return CMTimeGetSeconds(durationV);
}

- (NSString *)videoDurationTextDurationTotalSeconds:(NSUInteger)dTotalSeconds {
    NSUInteger dHours = dTotalSeconds / 3600;
    NSUInteger dMinutes = (dTotalSeconds % 3600) / 60;
    NSUInteger dSeconds = dTotalSeconds % 60;
    if (dHours > 0) {
        return [NSString stringWithFormat:@"%lu:%02lu:%02lu", (unsigned long)dHours, (unsigned long)dMinutes, (unsigned long)dSeconds];
    } else {
        return [NSString stringWithFormat:@"%02lu:%02lu", (unsigned long)dMinutes, (unsigned long)dSeconds];
    }
}
- (void)downloadDurationAtURL:(NSURL *)videoURL cellTag:(NSIndexPath *)indexPath {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // retrieve the duration on a global queue
        NSUInteger dTotalSeconds = [self videoDuration:videoURL];
        NSLog(@"dTotalSeconds %lu", (unsigned long)dTotalSeconds);
        if (dTotalSeconds > 0) {
            NSString *videoDurationText = [self videoDurationTextDurationTotalSeconds:dTotalSeconds];
            dispatch_async(dispatch_get_main_queue(), ^{
                TableViewCellHome *cell = [self.mTableView cellForRowAtIndexPath:[NSIndexPath indexPathForRow:indexPath.row inSection:0]];
                [[_arrayOfContent objectAtIndex:indexPath.row] setObject:videoDurationText forKey:@"duration"];
                cell.labelDuration.text = videoDurationText;
                cell.labelDuration.hidden = false;
            });
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                TableViewCellHome *cell = [self.mTableView cellForRowAtIndexPath:[NSIndexPath indexPathForRow:indexPath.row inSection:0]];
                [[_arrayOfContent objectAtIndex:indexPath.row] setObject:@"" forKey:@"duration"];
                cell.labelDuration.hidden = true;
                cell.labelDuration.text = @"";
            });
        }
    });
}
Now the problem is that the UI is blocked until the duration is updated in the cell; I am not able to select a row until the duration is displayed. It works fine when I display the Home controller for the first time after calling the API. It only gets blocked when I return from Home Details.
You need to load the duration asynchronously, like this:
- (void)videoDuration:(NSURL *)videoURL completion:(void (^)(CMTime))durationCallback {
    AVURLAsset *videoAVURLAsset = [AVURLAsset assetWithURL:videoURL];
    [videoAVURLAsset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
        NSError *error;
        if ([videoAVURLAsset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
            durationCallback(videoAVURLAsset.duration);
        } else {
            NSLog(@"error getting duration: %@", error);
            durationCallback(kCMTimeZero); // or something
        }
    }];
}
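Wiring this into the question's `downloadDurationAtURL:cellTag:` might look like the sketch below (the cell and array names are carried over from the question, and the threading is the point: the completion fires off the main thread, so hop back before touching UIKit):

```objc
- (void)downloadDurationAtURL:(NSURL *)videoURL cellTag:(NSIndexPath *)indexPath {
    [self videoDuration:videoURL completion:^(CMTime duration) {
        NSUInteger seconds = (NSUInteger)CMTimeGetSeconds(duration);
        NSString *text = seconds > 0 ? [self videoDurationTextDurationTotalSeconds:seconds] : @"";
        dispatch_async(dispatch_get_main_queue(), ^{
            // UIKit work stays on the main queue
            TableViewCellHome *cell = [self.mTableView cellForRowAtIndexPath:indexPath];
            [[_arrayOfContent objectAtIndex:indexPath.row] setObject:text forKey:@"duration"];
            cell.labelDuration.text = text;
            cell.labelDuration.hidden = (seconds == 0);
        });
    }];
}
```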

UI getting blocked while fetching image path using Photos Framework in iOS

- (void)updateFetchRequest {
    PHFetchOptions *options = [PHFetchOptions new];
    switch (self.imagePickerController.mediaType) {
        case QBImagePickerMediaTypeImages:
            options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %ld", PHAssetMediaTypeImage];
            break;
        case QBImagePickerMediaTypeVideos:
            options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %ld", PHAssetMediaTypeVideo];
            break;
        default:
            break;
    }
    self.fetchResult = [PHAsset fetchAssetsInAssetCollection:self.assetCollections[0] options:options];
    PHContentEditingInputRequestOptions *editoptions = [[PHContentEditingInputRequestOptions alloc] init];
    [editoptions setCanHandleAdjustmentData:^BOOL(PHAdjustmentData *adjustmentData) {
        return [adjustmentData.formatIdentifier isEqualToString:AdjustmentFormatIdentifier] && [adjustmentData.formatVersion isEqualToString:@"1.0"];
    }];
    NSLog(@"fetch result %@", self.fetchResult);
    PHImageRequestOptions *option = [PHImageRequestOptions new];
    option.networkAccessAllowed = YES;
    option.synchronous = YES;
    option.version = PHImageRequestOptionsVersionOriginal;
    PHAssetCollection *assetCollection = self.assetCollections[0];
    __weak __typeof(self) weakSelf = self;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSLog(@"Work Dispatched");
        PHAsset *asset;
        SyncAlbumNames = [NSString stringWithFormat:@"%@", assetCollection.localizedTitle];
        for (int i = 0; i < self.fetchResult.count; i++) {
            asset = weakSelf.fetchResult[i];
            NSLog(@"asset is %@", asset);
            [asset requestContentEditingInputWithOptions:editoptions
                                       completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
                NSURL *imageURL = contentEditingInput.fullSizeImageURL;
                NSLog(@"imageUrl %@", imageURL);
                // [weakSelf getLocalId:[NSString stringWithFormat:@"%@", imageURL]];
                [self performSelectorOnMainThread:@selector(getLocalId:)
                                       withObject:[NSString stringWithFormat:@"%@", imageURL]
                                    waitUntilDone:YES];
                if (uparray.count == 0) {
                    [arrayAssets addObject:asset];
                    [newArray addObject:asset];
                    if (arrayAssets.count != 0) {
                        [dict setValue:[NSArray arrayWithArray:arrayAssets]
                                forKey:assetCollection.localizedTitle];
                        NSLog(@"dict count is %lu", (unsigned long)dict.count);
                    }
                    [SyncAlbum getArraySubTypes];
                    NSLog(@"arrayAssets count %lu", (unsigned long)arrayAssets.count);
                    NSLog(@"fetchresult count %lu", (unsigned long)weakSelf.fetchResult.count);
                    __typeof(weakSelf) strongSelf = weakSelf;
                    if (strongSelf) {
                        if (arrayAssets.count + 2 + alcount == strongSelf.fetchResult.count) {
                            [strongSelf SyncAlbums];
                        }
                        // When finished, call back on the main thread:
                        dispatch_async(dispatch_get_main_queue(), ^{
                            // Return data and update on the main thread
                            // Task 3: deliver the data to a 3rd-party component (always on the main thread, especially UI).
                        });
                    }
                    // add count to array asset
                } else {
                    NSLog(@"asset already uploaded");
                    alcount++;
                    // asset-already-uploaded count here..
                }
            }];
        }
    });
}
My UI gets blocked while executing this method. I am using an SQLite database. Can anyone help me with this?
I believe the culprit is this line:
option.synchronous = YES;
If you cmd-click on synchronous, the header comment shows:
// return only a single result, blocking until available (or failure). Defaults to NO
Also, setting the PHImageRequestOptionsDeliveryMode to HighQualityFormat will force synchronous behavior.
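As a hedged sketch of the non-blocking alternative: leave synchronous at its default of NO and let the result handler deliver the image, hopping to the main queue only to touch the UI. This is a fragment, not the question's full method; `asset`, `targetSize`, and `imageView` stand in for whatever the caller has:

```objc
PHImageRequestOptions *option = [PHImageRequestOptions new];
option.networkAccessAllowed = YES;
option.synchronous = NO; // the default; the request no longer blocks the calling thread
option.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;

[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:targetSize
                                          contentMode:PHImageContentModeAspectFit
                                              options:option
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
    dispatch_async(dispatch_get_main_queue(), ^{
        // touch UIKit only on the main thread
        imageView.image = result;
    });
}];
```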

Difficulty getting media to replace placeholder following download using JSQMessage

I'm using JSQMessage and am having a little difficulty showing the placeholder for media until it has downloaded correctly, and then replacing it with the media. Everything works as far as adding the messages and media to the server; I just can't get the placeholders to be replaced.
Currently, I have a function that queries my database, pulls an array of message objects, and then loops through and calls this function for each object to add it to my message thread. I'm struggling to figure out why the section guarded by "messageToAdd.isMediaMessage" is not replacing the placeholders with the actual media after its download from the server. How should I handle this so the message is added with a placeholder that is replaced once the media has downloaded?
- (void)addMessage:(PFObject *)object
{
    id<JSQMessageMediaData> messageMedia = nil;
    PFObject *user = object[@"messageSender"];
    [users addObject:user];
    NSString *name = @"";
    if (user[@"profileFName"] && user[@"profileLName"])
        name = [NSString stringWithFormat:@"%@ %@", user[@"profileFName"], user[@"profileLName"]];
    else
        name = [NSString stringWithFormat:@"%@ %@", user[@"consultantFName"], user[@"consultantLName"]];
    if ([object[@"messageFileType"] isEqual:@"video"]) {
        JSQVideoMediaItem *messageMedia = [[JSQVideoMediaItem alloc] init];
        messageMedia.fileURL = nil;
        messageMedia.isReadyToPlay = NO;
        messageToAdd = [JSQMessage messageWithSenderId:user.objectId displayName:name media:messageMedia];
    } else if ([object[@"messageFileType"] isEqual:@"image"]) {
        JSQPhotoMediaItem *messageMedia = [[JSQPhotoMediaItem alloc] init];
        messageMedia.image = nil;
        messageToAdd = [JSQMessage messageWithSenderId:user.objectId displayName:name media:messageMedia];
    } else {
        messageToAdd = [[JSQMessage alloc] initWithSenderId:user.objectId senderDisplayName:name date:object[@"sendDate"] text:object[@"messageContent"]];
    }
    if (isLoadMore)
        [messages insertObject:messageToAdd atIndex:0];
    else
        [messages addObject:messageToAdd];
    // NOT TRIGGERING THESE AFTER MEDIA DOWNLOADED
    if (messageToAdd.isMediaMessage) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
            if ([object[@"messageFileType"] isEqual:@"image"]) {
                [object[@"messageMedia"] getDataInBackgroundWithBlock:^(NSData *imageData, NSError *error) {
                    if (!error) {
                        JSQPhotoMediaItem *photoItem = [[JSQPhotoMediaItem alloc] initWithImage:[UIImage imageWithData:imageData]];
                        ((JSQPhotoMediaItem *)messageMedia).image = [UIImage imageWithCGImage:photoItem.image.CGImage];
                        [self.collectionView reloadData];
                    }
                }];
            } else if ([object[@"messageFileType"] isEqual:@"video"]) {
                PFFile *videoFile = object[@"messageMedia"];
                NSURL *videoURL = [NSURL URLWithString:videoFile.url];
                ((JSQVideoMediaItem *)messageMedia).fileURL = videoURL;
                ((JSQVideoMediaItem *)messageMedia).isReadyToPlay = YES;
                [self.collectionView reloadData];
            } else {
                NSLog(@"%s error: unrecognized media item", __PRETTY_FUNCTION__);
            }
        });
    }
}
For others who come along with the same issue/question: I resolved it by studying the NotificationChat project here: https://github.com/relatedcode/NotificationChat/blob/master/NotificationChat/Classes/Chat/ChatView.m. It gives a really good overview of using the JSQMessage platform.
Here's my modified function so you can see the finished product.
- (void)addMessage:(PFObject *)object
{
    PFObject *user = object[@"messageSender"];
    [users addObject:user];
    PFFile *mediaMessage = object[@"messageMedia"];
    NSString *name = @"";
    if (user[@"profileFName"] && user[@"profileLName"])
        name = [NSString stringWithFormat:@"%@ %@", user[@"profileFName"], user[@"profileLName"]];
    else
        name = [NSString stringWithFormat:@"%@ %@", user[@"consultantFName"], user[@"consultantLName"]];
    if ([object[@"messageFileType"] isEqual:@"video"]) {
        JSQVideoMediaItem *mediaItem = [[JSQVideoMediaItem alloc] initWithFileURL:[NSURL URLWithString:mediaMessage.url] isReadyToPlay:YES];
        mediaItem.appliesMediaViewMaskAsOutgoing = [user.objectId isEqualToString:self.senderId];
        messageToAdd = [[JSQMessage alloc] initWithSenderId:user.objectId senderDisplayName:name date:object.createdAt media:mediaItem];
    } else if ([object[@"messageFileType"] isEqual:@"image"]) {
        JSQPhotoMediaItem *mediaItem = [[JSQPhotoMediaItem alloc] initWithImage:nil];
        mediaItem.appliesMediaViewMaskAsOutgoing = [user.objectId isEqualToString:self.senderId];
        messageToAdd = [[JSQMessage alloc] initWithSenderId:user.objectId senderDisplayName:name date:object.createdAt media:mediaItem];
        [mediaMessage getDataInBackgroundWithBlock:^(NSData *imageData, NSError *error) {
            if (error == nil) {
                mediaItem.image = [UIImage imageWithData:imageData];
                [self.collectionView reloadData];
            }
        }];
    } else {
        messageToAdd = [[JSQMessage alloc] initWithSenderId:user.objectId senderDisplayName:name date:object[@"sendDate"] text:object[@"messageContent"]];
    }
    if (isLoadMore)
        [messages insertObject:messageToAdd atIndex:0];
    else
        [messages addObject:messageToAdd];
}
Based on the code, I think one possible reason is that you need to call reloadData on the main (UI) thread after the data has downloaded successfully and asynchronously on the background thread.
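As a minimal sketch of that pattern (the `getDataInBackgroundWithBlock:` callback and the `mediaItem` name come from the question; the dispatch to the main queue is the part being suggested):

```objc
[mediaMessage getDataInBackgroundWithBlock:^(NSData *imageData, NSError *error) {
    if (error == nil) {
        // decoding the image off the main thread is fine
        UIImage *image = [UIImage imageWithData:imageData];
        dispatch_async(dispatch_get_main_queue(), ^{
            // UIKit calls such as reloadData must run on the main thread
            mediaItem.image = image;
            [self.collectionView reloadData];
        });
    }
}];
```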

ios - possible memory leaks with nsmutablearray not deleted?

I'm new to iOS development, and in my app I'm seeing some strange memory-usage behavior.
I'm getting objects from the server in this setupDataForPage: method:
- (void)setupDataForPage:(int)page actionType:(NSString *)type success:(void (^)())callback
{
    __weak MyTableViewController *weakSelf = self;
    // clearing image cache because feed contains a lot of images
    [[SDImageCache sharedImageCache] clearMemory];
    [[SDImageCache sharedImageCache] clearDisk];
    MyHTTPClient *API = [MyHTTPClient new];
    [API feedFor:page success:^(AFHTTPRequestOperation *operation, id data) {
        // don't shadow the `data` block parameter; the shadowed name would be
        // uninitialized inside its own initializer
        NSArray *items = [data objectForKey:@"data"];
        if ([items count] > 0) {
            // remove all objects to refresh with new ones
            if ([type isEqualToString:@"pullToRefresh"]) {
                [weakSelf.models removeAllObjects];
            }
            // populate data
            NSMutableArray *result = [NSMutableArray new];
            for (NSDictionary *modelData in items) {
                MyModel *model = [[MyModel alloc] initWithDictionary:modelData];
                [result addObject:model];
            }
            [weakSelf.models addObjectsFromArray:result];
            [weakSelf.tableView reloadData];
        }
        callback();
    } failure:nil];
}
It is used in viewDidLoad for the initial request, and also for pull-to-refresh and infinite scrolling:
- (void)viewDidLoad {
    [super viewDidLoad];
    __block int page = 1;
    __weak MyTableViewController *weakSelf = self;
    // initial load
    [self setupDataForPage:page actionType:@"initial" success:^{ page += 1; }];
    // pull to refresh
    [self.tableView addPullToRefreshWithActionHandler:^{
        [weakSelf setupDataForPage:1 actionType:@"pullToRefresh" success:^{
            [weakSelf.tableView.pullToRefreshView stopAnimating];
        }];
    }];
    // infinite scrolling
    [self.tableView addInfiniteScrollingWithActionHandler:^{
        [weakSelf setupDataForPage:page actionType:@"infiniteScroll" success:^{
            page += 1;
            [weakSelf.tableView.infiniteScrollingView stopAnimating];
        }];
    }];
}
I noticed that even after a pull-to-refresh action that returns the same data (I just remove all models and add them again), my app's memory usage grows from roughly 19 MB to 24 MB.
I'd like someone more experienced to look at this piece of code and determine whether it contains possible memory leaks. Should I somehow delete the NSMutableArray *result variable after assigning it to the models array?
Thanks!
First of all, use @autoreleasepool here:
@autoreleasepool {
    NSArray *items = [data objectForKey:@"data"];
    if ([items count] > 0) {
        // remove all objects to refresh with new ones
        if ([type isEqualToString:@"pullToRefresh"]) {
            [weakSelf.models removeAllObjects];
        }
        // populate data
        NSMutableArray *result = [NSMutableArray new];
        for (NSDictionary *modelData in items) {
            MyModel *model = [[MyModel alloc] initWithDictionary:modelData];
            [result addObject:model];
        }
        [weakSelf.models addObjectsFromArray:result];
        [weakSelf.tableView reloadData];
    }
}
@autoreleasepool drains any autoreleased objects created in that scope IMMEDIATELY, instead of leaving them for the run loop's pool to release later.
This is a perfect situation in which to use it ;)
