iOS - DrawRect performance issue

I am using -drawRect for the first time in an attempt to speed up a UITableView. However, the drawRect method seems to be slowing the table down quite significantly.
Can you tell me how I can improve the drawRect method below in order to speed up the table?
Edit:
In the drawRect method, I draw two NSStrings and two UIImages into the cell's view, with a drop shadow on both of the strings and on one of the images.
One of those images is downloaded asynchronously, and setNeedsDisplay is then called to draw that UIImage to the screen. I suspect this could be part of the reason for the lag.
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    [[UIColor clearColor] set];
    CGContextFillRect(context, rect);
    CGContextSaveGState(context);
    CGContextSetShadow(context, CGSizeMake(1, 1), 1);
    //draw text here
    if (shouldDrawImage == YES) {
        CGContextDrawImage(context, CGRectMake(10, 10, 40, 40), self.image.CGImage);
    }
    CGContextDrawImage(context, CGRectMake(self.frame.size.width - 16, 0, 16, self.frame.size.height), [UIImage imageNamed:@"right_bar_including_holes"].CGImage);
    NSString *authorName = [[self.info objectForKey:@"user"] objectForKey:@"full_name"];
    [RGB(219, 240, 73) set];
    CGSize maximumLabelSize = CGSizeMake(self.frame.size.width - 10 - 55 - 16, 9999);
    CGSize authorsize = [authorName sizeWithFont:[UIFont boldSystemFontOfSize:15]
                             constrainedToSize:maximumLabelSize
                                 lineBreakMode:UILineBreakModeWordWrap];
    [authorName drawInRect:CGRectMake(60, 10, self.frame.size.width - 60, authorsize.height) withFont:[UIFont boldSystemFontOfSize:15]];
    [RGB(249, 249, 249) set];
    NSString *description = [self.info objectForKey:@"description"];
    CGSize descriptionSize = [description sizeWithFont:[UIFont systemFontOfSize:14] constrainedToSize:maximumLabelSize lineBreakMode:UILineBreakModeWordWrap];
    [description drawInRect:CGRectMake(60, authorsize.height + 15, descriptionSize.width, descriptionSize.height) withFont:[UIFont systemFontOfSize:14]];
    CGContextRestoreGState(context);
}
- (NSString *)reuseIdentifier {
    return NSStringFromClass([SlideCell class]);
}

- (void)updateCellInfo:(NSDictionary *)_info {
    [self setInfo:_info];
    UIImageView *iv = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 1, 1)];
    [iv setImageWithURLRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:[[self.info objectForKey:@"user"] objectForKey:@"avatar"]]] placeholderImage:nil success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *_image) {
        dispatch_async(dispatch_get_main_queue(), ^{
            shouldDrawImage = YES;
            self.image = [self roundedImage:_image];
            [iv release];
            [self setNeedsDisplay];
        });
    } failure:nil];
    [self setNeedsDisplay];
    [self setSelectionStyle:UITableViewCellSelectionStyleNone];
}

Yes - run Instruments' Time Profiler on your app to see exactly how much time is spent, and where. It will tell you whether it is the image, the strings, the shadow, or something else.

You should profile that code first and see whether the image is the problem.
I'm not sure how the AFNetworking library (which you are using) behaves when downloading images asynchronously.
If the image turns out to be the problem, I suspect it is being rescaled by the UIImageView when it's set. You would need to rescale the UIImage to the UIImageView's frame yourself so no automatic rescaling is triggered; that rescaling is costly while scrolling.
You receive the image and immediately dispatch code to the main thread, so any rescaling happens 'under the hood' on the main thread. I would change that method to:
UIImageView *iv = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, IMGSIZEX, IMGSIZEY)];
[iv setImageWithURLRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:[[self.info objectForKey:@"user"] objectForKey:@"avatar"]]] placeholderImage:nil success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *_image) {
    //Note you have to implement rescaleToFitSize
    UIImage *rescaled = [_image rescaleToFitSize:iv.frame.size];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.image = rescaled;
        [iv release];
        [self setNeedsDisplay];
    });
} failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error) {
    //Handle failure! Otherwise you create a memory leak on failure
    [iv release];
}];
As a side note, your original code doesn't handle failures in the image download; you definitely should.
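The rescaleToFitSize method referenced above is not part of UIKit, so you would have to add it yourself. A minimal sketch of it as a UIImage category, assuming an aspect-fit scale is what you want:

// UIImage+Rescale — the method name comes from the snippet above;
// this particular implementation is only an assumption.
@implementation UIImage (Rescale)

- (UIImage *)rescaleToFitSize:(CGSize)targetSize
{
    // Use the smaller ratio so the whole image fits inside targetSize.
    CGFloat ratio = MIN(targetSize.width / self.size.width,
                        targetSize.height / self.size.height);
    CGSize newSize = CGSizeMake(self.size.width * ratio, self.size.height * ratio);

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0); // 0.0 = screen scale
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

@end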

Stock UITableView is as efficient as it gets, provided you use it properly:
1. Recycle cells (dequeue them rather than allocating new ones).
2. Avoid transparency whenever possible (alpha blending DOES slow things down).
3. Keep the configuration of reused cells cheap (see the sketch below).
I don't think anyone can significantly improve on Apple's code by overriding drawRect...
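As an illustration of those points (a minimal sketch; SlideCell and updateCellInfo: come from the question above, while self.items is a hypothetical data-source array):

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    static NSString *identifier = @"SlideCell";
    SlideCell *cell = [tableView dequeueReusableCellWithIdentifier:identifier];
    if (cell == nil) {
        // Allocate a cell only when there is nothing to recycle
        cell = [[SlideCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:identifier];
        cell.opaque = YES; // opaque cells avoid alpha blending during scrolling
    }
    // Configuring a recycled cell should stay cheap: just hand it the new data
    [cell updateCellInfo:[self.items objectAtIndex:indexPath.row]];
    return cell;
}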

Related

How to get rid of massive Memory leak with blurred image in uitableview cell

Somehow, when I scroll to the bottom of my table (96 items only), I get 1 GB of memory usage (it increases for every cell that gets created). I have a table where each cell shows an image with a blurred image in front of it (made from a cropped version of the original image), with text on top of that. Pretty simple. I'm using the Apple-provided sample code for blurring images, available here: https://github.com/iGriever/TWSReleaseNotesView/blob/master/TWSReleaseNotesView/UIImage%2BImageEffects.m
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    static NSString *CellIdentifier = @"itemCell";
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier forIndexPath:indexPath];
    NSDictionary *foodItem = [self.foodItems objectAtIndex:indexPath.row];
    // Set up the image view
    UIImageView *imageView = (UIImageView *)[cell viewWithTag:1];
    UIImage *foodImage = [UIImage imageNamed:[foodItem objectForKey:FOOD_IMAGE_FILE_KEY]];
    [imageView setImage:foodImage];
    [imageView setContentMode:UIViewContentModeScaleAspectFill];
    // Set up the label
    UILabel *labelView = (UILabel *)[cell viewWithTag:2];
    [labelView setFont:[UIFont flatFontOfSize:20.0]];
    labelView.text = @"Blah";
    // Set up the blurred image view
    UIImageView *blurredView = (UIImageView *)[cell viewWithTag:3];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *blurredImage = [[self cropForBlur:foodImage] applyBlurWithRadius:4
                                                                        tintColor:[UIColor colorWithWhite:1.0 alpha:0.2]
                                                            saturationDeltaFactor:1.2
                                                                        maskImage:nil];
        dispatch_sync(dispatch_get_main_queue(), ^{
            blurredView.image = blurredImage;
        });
    });
    return cell;
}
*Note: I know it is most likely the blur (as opposed to my cropping), as it only happens when I do the blur. Also, it's nothing to do with the async dispatch stuff, as it still happens if I don't do that.
Yes I'm using ARC. Yes I'm using storyboards.
Here's cropForBlur:
- (UIImage *)cropForBlur:(UIImage *)originalImage
{
    CGSize size = [originalImage size];
    int startCroppingPosition = 100;
    if (size.height > size.width) {
        startCroppingPosition = size.height / 2 + ((size.width / 320) * 45);
    } else {
        startCroppingPosition = size.height / 2 + ((size.width / 320) * 45);
    }
    // WTF: Don't forget that CGImageCreateWithImageInRect believes that
    // the image is rotated 180 degrees, so x and y are inverted, same for height and width.
    CGRect cropRect = CGRectMake(0, startCroppingPosition, size.width, ((size.width / 320) * 35));
    CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], cropRect);
    UIImage *newImage = [UIImage imageWithCGImage:imageRef scale:(size.width / 160) orientation:originalImage.imageOrientation];
    CGImageRelease(imageRef);
    return newImage;
}
I've also tried looking in Instruments, but it shows that I'm using heaps of memory in total while the big chunks of memory don't show up in the breakdown. Weird.
Here's the bit in the left bar that says I am using heaps of memory.
Here's the allocations section in Instruments. I don't see how they could match up. I haven't got zombies on or anything (unless there's somewhere other than the edit scheme to change that).
Here's the Leaks instrument view after scrolling down a bit. It doesn't seem to show any real leaks :S So confused.
See the solution from this link: the problem is that the image has a different scale from your screen scale, so the final output image can be very big.
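One way to act on that (a sketch under the assumption that the oversized bitmaps come from the scale mismatch): redraw the cropped image at the screen scale and at the size it will actually be displayed, before handing it to the blur. The helper name below is hypothetical.

// Hypothetical helper: normalize the cropped image to its on-screen size and
// the screen scale so applyBlurWithRadius:... works on a small bitmap.
- (UIImage *)imageForBlur:(UIImage *)croppedImage displaySize:(CGSize)displaySize
{
    UIGraphicsBeginImageContextWithOptions(displaySize, YES, [UIScreen mainScreen].scale);
    [croppedImage drawInRect:CGRectMake(0, 0, displaySize.width, displaySize.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}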

I think I get a deadlock but not quite understand why

I am trying to place images on screen one after another using GCD, like the following:
- (void)setUpImages {
    NSArray *images = @[[UIImage imageNamed:@"blogger-icon.png"],
                        [UIImage imageNamed:@"gplus-icon.png"],
                        [UIImage imageNamed:@"facebok-icon.png"]];
    [images enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
        dispatch_sync(dispatch_get_main_queue(), ^{
            UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(80, idx * ((UIImage *)obj).size.height + idx * 30 + 10, ((UIImage *)obj).size.width, ((UIImage *)obj).size.height)];
            NSLog(@"index is %@", NSStringFromCGRect(imageView.frame));
            [imageView setImage:(UIImage *)obj];
            [self.view.layer addSublayer:imageView.layer];
            sleep(1);
        });
    }];
}
I am using dispatch_sync because I want it to wait until its block is done (the first image is placed on the screen) before the second image is placed, and then the third. Everything is happening on the main thread now.
However, it seems I am getting a deadlock somewhere, and my logic is wrong at some point.
I need help understanding this situation. Please help.
From the documentation of dispatch_sync:
Calling this function and targeting the current queue results in deadlock.
That said, I think you have a design problem and you can achieve the desired result without involving GCD.
Since it looks like you want to update the UI every k seconds, avoid using sleep(1) and instead invoke a method recursively with a delay, using performSelector:withObject:afterDelay:.
Something like
- (void)updateImageViewWithImageAtIndex:(NSNumber *)i {
    NSUInteger idx = i.unsignedIntegerValue;
    UIImage *image = self.images[idx];
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(80, idx * image.size.height + idx * 30 + 10, image.size.width, image.size.height)];
    NSLog(@"index is %@", NSStringFromCGRect(imageView.frame));
    [imageView setImage:image];
    [self.view.layer addSublayer:imageView.layer];
    if (idx + 1 < self.images.count) {
        [self performSelector:@selector(updateImageViewWithImageAtIndex:) withObject:@(idx + 1) afterDelay:1];
    }
}

- (void)setUpImages {
    // Assuming you have declared images as a property
    self.images = @[[UIImage imageNamed:@"blogger-icon.png"],
                    [UIImage imageNamed:@"gplus-icon.png"],
                    [UIImage imageNamed:@"facebok-icon.png"]];
    [self updateImageViewWithImageAtIndex:@0];
}
Now, I may be completely wrong here, but it looks like all you're trying to do is change the image in an image view once a second, in which case your approach is incorrect. UIImageView has methods for this built in. For example:
NSArray *images = @[[UIImage imageNamed:@"blogger-icon.png"],
                    [UIImage imageNamed:@"gplus-icon.png"],
                    [UIImage imageNamed:@"facebok-icon.png"]];
UIImage *firstImage = [images objectAtIndex:0];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(80, 10, firstImage.size.width, firstImage.size.height)];
[imageView setAnimationDuration:3];
[imageView setAnimationImages:images];
[imageView startAnimating];
[self.view addSubview:imageView];
I think Gabriele nailed the deadlock above: don't call dispatch_sync targeting the main queue from code that is already running on the main queue.
Another possible problem: I suspect that manipulating a view's layers off the main thread is not thread-safe. It shouldn't cause a deadlock, but it could cause undesirable consequences.
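If you do want to keep a GCD-based flavour, a sketch of staggering the work with dispatch_after on the main queue (no dispatch_sync, no sleep), reusing the images array from the question, could look like this:

[images enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
    UIImage *image = (UIImage *)obj;
    // Schedule each image one second after the previous one, on the main queue.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(idx * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(80, idx * image.size.height + idx * 30 + 10, image.size.width, image.size.height)];
        imageView.image = image;
        [self.view addSubview:imageView];
    });
}];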

UICollectionView scroll performance using AFNetworking to load images

I have read quite a few of the UICollectionView posts about poor scrolling, but none seem to directly apply or they are still unanswered.
I'm using AFNetworking to asynchronously load the images (95 px square) into each cell, and when the images are scrolled into view again the image is restored from cache (as verified by the response code being 0 instead of 200).
Here's what I've tried:
Commented out weakCell.photoView.image = image; so the images aren't drawn on screen, and the scrolling was smoother (it still stuttered a little during the HTTP GET)
Removed all of the AFNetworking code from the cellForItemAtIndexPath: method, and the scrolling was much smoother (even with the custom cell shadows, etc. still being drawn on screen)
When I draw only the cell view (with the shadows) on screen, scrolling is very smooth for 100 cells. As soon as I start drawing the images on screen, scrolling is very poor on my device and it's even noticeable on the simulator. Instagram has very smooth scrolling for hundreds of cells on their profile view, so I'm trying to get close to their performance.
Are there any ways that I can improve any of my code below in order to improve scrolling performance?
Here is my cell code:
#import "PhotoGalleryCell.h"
#implementation PhotoGalleryCell
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self)
{
// Setup the background color, shadow, and border
self.backgroundColor = [UIColor colorWithWhite:0.25f alpha:1.0f];
self.layer.borderColor = [UIColor blackColor].CGColor;
self.layer.borderWidth = 0.5f;
self.layer.shadowColor = [UIColor blackColor].CGColor;
self.layer.shadowRadius = 3.0f;
self.layer.shadowOffset = CGSizeMake(0.0f, 2.0f);
self.layer.shadowOpacity = 0.5f;
// Make sure we rasterize for retina
self.layer.rasterizationScale = [UIScreen mainScreen].scale;
self.layer.shouldRasterize = YES;
// Add to the content view
self.photoView = [[UIImageView alloc] initWithFrame:self.bounds];
[self.contentView addSubview:self.photoView];
}
return self;
}
- (void)prepareForReuse
{
[super prepareForReuse];
self.photoView.image = nil;
self.largeImageURL = nil;
}
And here is my UICollectionView code:
#pragma mark - Collection View Delegates

- (NSInteger)numberOfSectionsInCollectionView:(UICollectionView *)collectionView
{
    return 1;
}

- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section
{
    return [zePhotos count];
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    PhotoGalleryCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:kPGPhotoCellIdentifier forIndexPath:indexPath];
    // Get a reference to the image dictionary
    NSDictionary *photoDict = [[zePhotos objectAtIndex:indexPath.row] objectForKey:@"image"];
    // Asynchronously set the thumbnail view
    __weak PhotoGalleryCell *weakCell = cell;
    NSString *thumbnailURL = [[photoDict objectForKey:@"thumbnail"] objectForKey:@"url"];
    NSURLRequest *photoRequest = [NSURLRequest requestWithURL:[NSURL URLWithString:thumbnailURL]];
    [cell.photoView setImageWithURLRequest:photoRequest
                          placeholderImage:nil
                                   success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
                                       weakCell.photoView.image = image;
                                   }
                                   failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error) {
                                       NSLog(@"Error retrieving thumbnail... %@", [error localizedDescription]);
                                   }];
    // Cache the large image URL in case they tap on this cell later
    cell.largeImageURL = [[photoDict objectForKey:@"large"] objectForKey:@"url"];
    return cell;
}

- (void)collectionView:(UICollectionView *)collectionView didSelectItemAtIndexPath:(NSIndexPath *)indexPath
{
    [self performSegueWithIdentifier:@"showPhotoDetail" sender:self];
}
You could try adding a shadowPath in your cell's init; it should improve performance. Here is the code I used in one of my projects to add a rounded shadowPath (see the UIBezierPath methods for more options):
self.layer.shadowPath = [UIBezierPath bezierPathWithRoundedRect:self.bounds
                                               byRoundingCorners:UIRectCornerAllCorners
                                                     cornerRadii:CGSizeMake(10, 10)].CGPath;
Moreover, if I remember correctly, AFNetworking doesn't resize the image returned from the server, so it can have an impact on the quality of your image (despite the scaling applied by the UIImageView). I recommend dispatching the returned image to a background queue and resizing it there, like so:
CGSize targetSize = cell.photoView.bounds.size;
[cell.photoView setImageWithURLRequest:photoRequest
                      placeholderImage:nil
                               success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
                                   dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                                       // Scale the downloaded image to the cell's width off the main thread
                                       CGFloat imageHeight = image.size.height;
                                       CGFloat imageWidth = image.size.width;
                                       CGFloat scaleFactor = targetSize.width / imageWidth;
                                       CGSize newSize = CGSizeMake(targetSize.width, imageHeight * scaleFactor);
                                       UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
                                       [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
                                       UIImage *small = UIGraphicsGetImageFromCurrentImageContext();
                                       UIGraphicsEndImageContext();
                                       dispatch_async(dispatch_get_main_queue(), ^{
                                           weakCell.photoView.image = small;
                                       });
                                   });
                               }
                               failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error) {
                                   NSLog(@"Error retrieving thumbnail... %@", [error localizedDescription]);
                               }];
Code inspection looks good, though I bet it is the compositing of the shadow that is adding a good deal to the lag. The way to figure out exactly what is causing the delay is the Time Profiler tool in Instruments. Here are the docs from Apple.
The problem is that when you scroll quickly you're starting hundreds of network requests at the same time. If you have the image cached, display it immediately. If you don't, only start the download when the scroll view slows down.
You can use something like this:
//Properties or instance variables
NSDate *scrollDateBuffer;
CGPoint scrollOffsetBuffer;

- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    NSTimeInterval secondsSinceLastScroll = [[NSDate date] timeIntervalSinceDate:scrollDateBuffer];
    CGFloat distanceSinceLastScroll = fabsf(scrollView.contentOffset.y - scrollOffsetBuffer.y);
    BOOL slow = (secondsSinceLastScroll > 0 && secondsSinceLastScroll < 0.02);
    BOOL small = (distanceSinceLastScroll > 0 && distanceSinceLastScroll < 1);
    if (slow && small) {
        [self loadImagesForOnscreenRows];
    }
    scrollDateBuffer = [NSDate date];
    scrollOffsetBuffer = scrollView.contentOffset;
}
You will want to call loadImagesForOnscreenRows in other methods, like when new data comes in, viewWillAppear, and scrollViewDidScrollToTop.
Here's an example implementation of loadImagesForOnscreenRows:
- (void)loadImagesForOnscreenRows
{
    @try {
        for (UITableViewCell *cell in self.tableView.visibleCells) {
            // load your images
            NSURLRequest *photoRequest = …;
            if (photoRequest) {
                [cell.photoView setImageWithURLRequest:…];
            }
        }
    }
    @catch (NSException *exception) {
        NSLog(@"Exception when loading table cells: %@", exception);
    }
}
I have this in a try/catch block because, in my experience, [UITableView -visibleCells] isn't reliable - it occasionally returns deallocated cells or cells without a superview. If you make sure this method is only called when the table is not scrolling quickly, it shouldn't impact scroll performance too much.
Also, note that the AFNetworking UIImageView category doesn't expose its cache object. You'll need to modify it slightly to check whether you already have an image cached; this answer should point you in the right direction.
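Since that cache is private, one workaround (a sketch, not AFNetworking's own API) is to keep your own NSCache keyed by URL, created once in viewDidLoad, and consult it before issuing a request. The imageCache property and the scrollingFast flag below are hypothetical names you would maintain yourself.

// Assumes: @property (nonatomic, strong) NSCache *imageCache; created in viewDidLoad,
// and a hypothetical self.scrollingFast flag set by the scroll-view delegate.
UIImage *cached = [self.imageCache objectForKey:thumbnailURL];
if (cached) {
    cell.photoView.image = cached;   // display immediately, no network request
} else if (!self.scrollingFast) {
    __weak PhotoGalleryCell *weakCell = cell;
    [cell.photoView setImageWithURLRequest:photoRequest
                          placeholderImage:nil
                                   success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
                                       [self.imageCache setObject:image forKey:thumbnailURL];
                                       weakCell.photoView.image = image;
                                   }
                                   failure:nil];
}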

Reduce Memory Usage in iOS App without leaks

My iOS app has high memory usage but no memory leaks. How do I reduce the memory usage?
Using Instruments, I discovered that my app maxes out at 90 MB before a memory warning occurs and other memory is deallocated; it then stays around 55-65 MB for the rest of its usage.
I feel that 55-65 MB is too high, right?
Since Instruments did not catch any leaks, what can I do to reduce this memory usage?
I went through this year's WWDC video, but of the stuff I understood (this is my first iOS app), it mostly covered dealing with leaks.
Some possibly useful information:
VM: ImageIO_GIF_Data - 30.35 MB live bytes | 115 living | 300 transient | 136.12 MB overall bytes
VM: MappedFile - 36.04 MB live bytes | 16 living | 11 transient | 36.09 MB overall bytes
All the other stuff is under 1 MB.
My app downloads around 30 GIF files from the internet. I use SDWebImage and just save the URLs of the images; SDWebImage does the rest. :P
Thanks in advance,
From an iOS memory-management first-timer
Thanks once again for your help
You say you are using a table view. Although cells are reused automatically, it is still easy to make mistakes and create too many objects.
One common error is allocating objects (e.g. a UIImageView) in the cellForRowAtIndexPath: method, because every time a cell is reused a new UIImageView is added to it while the old ones are kept around. So double-check what is going on in your cellForRowAtIndexPath: method (see the sketch below).
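A sketch of the pattern that avoids this; the MyImageCell class, its thumbnailView property, and someImage are illustrative names, not from the question:

// Create the subview once, in the cell subclass (or in the storyboard):
@interface MyImageCell : UITableViewCell
@property (nonatomic, strong) UIImageView *thumbnailView;
@end

@implementation MyImageCell
- (instancetype)initWithStyle:(UITableViewCellStyle)style reuseIdentifier:(NSString *)reuseIdentifier
{
    self = [super initWithStyle:style reuseIdentifier:reuseIdentifier];
    if (self) {
        _thumbnailView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 80, 80)];
        [self.contentView addSubview:_thumbnailView];
    }
    return self;
}
@end

// cellForRowAtIndexPath: then only configures the recycled cell and never
// adds subviews, so nothing piles up on reuse:
MyImageCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell" forIndexPath:indexPath];
cell.thumbnailView.image = someImage;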
I decided to add full code for memory saving. If you are using GIF files, modify the UIImage scaling method below (I found it here on Stack Overflow). As GangstaGraham said, SDWebImage already has the method sd_animatedImageByScalingAndCroppingToSize:.
@interface UIImage (Scaling)
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize;
- (UIImage *)croppedImageWithRect:(CGRect)rect;
@end

@implementation UIImage (Scaling)

- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize {
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        if ([[UIScreen mainScreen] scale] == 2.0) {
            targetSize.height *= 2.0f;
            targetSize.width *= 2.0f;
        }
    }
    NSUInteger width = targetSize.width;
    NSUInteger height = targetSize.height;
    UIImage *newImage = [self resizedImageWithMinimumSize:CGSizeMake(width, height)];
    return [newImage croppedImageWithRect:CGRectMake((newImage.size.width - width) / 2, (newImage.size.height - height) / 2, width, height)];
}

- (CGImageRef)CGImageWithCorrectOrientation
{
    if (self.imageOrientation == UIImageOrientationDown) {
        //retaining because caller expects to own the reference
        CGImageRetain([self CGImage]);
        return [self CGImage];
    }
    UIGraphicsBeginImageContext(self.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    if (self.imageOrientation == UIImageOrientationRight) {
        CGContextRotateCTM(context, 90 * M_PI / 180);
    } else if (self.imageOrientation == UIImageOrientationLeft) {
        CGContextRotateCTM(context, -90 * M_PI / 180);
    } else if (self.imageOrientation == UIImageOrientationUp) {
        CGContextRotateCTM(context, 180 * M_PI / 180);
    }
    [self drawAtPoint:CGPointMake(0, 0)];
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIGraphicsEndImageContext();
    return cgImage;
}

- (UIImage *)resizedImageWithMinimumSize:(CGSize)size
{
    CGImageRef imgRef = [self CGImageWithCorrectOrientation];
    CGFloat original_width = CGImageGetWidth(imgRef);
    CGFloat original_height = CGImageGetHeight(imgRef);
    CGFloat width_ratio = size.width / original_width;
    CGFloat height_ratio = size.height / original_height;
    CGFloat scale_ratio = width_ratio > height_ratio ? width_ratio : height_ratio;
    CGImageRelease(imgRef);
    return [self drawImageInBounds:CGRectMake(0, 0, round(original_width * scale_ratio), round(original_height * scale_ratio))];
}

- (UIImage *)drawImageInBounds:(CGRect)bounds
{
    UIGraphicsBeginImageContext(bounds.size);
    [self drawInRect:bounds];
    UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resizedImage;
}

- (UIImage *)croppedImageWithRect:(CGRect)rect
{
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGRect drawRect = CGRectMake(-rect.origin.x, -rect.origin.y, self.size.width, self.size.height);
    CGContextClipToRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));
    [self drawInRect:drawRect];
    UIImage *subImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return subImage;
}

- (UIImage *)resizableImageWithCapInsets2:(UIEdgeInsets)inset
{
    if ([self respondsToSelector:@selector(resizableImageWithCapInsets:resizingMode:)])
    {
        return [self resizableImageWithCapInsets:inset resizingMode:UIImageResizingModeStretch];
    }
    else
    {
        float left = (self.size.width - 2) / 2; //The middle points rarely vary anyway
        float top = (self.size.height - 2) / 2;
        return [self stretchableImageWithLeftCapWidth:left topCapHeight:top];
    }
}

@end
And UIImageView:
#import <SDWebImage/SDImageCache.h>

@implementation UIImageView (Scaling)

- (void)setImageWithURL:(NSURL *)url scaleToSize:(BOOL)scale
{
    if (url.absoluteString.length < 10) return;
    if (!scale) {
        [self setImageWithURL:url];
        return;
    }
    __block UIImageView *selfimg = self;
    __block NSString *prevKey = SPRINTF(@"%@_%ix%i", url.absoluteString, (int)self.frame.size.width, (int)self.frame.size.height);
    __block UIImage *prevImage = nil;
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(queue, ^{
        prevImage = [[SDImageCache sharedImageCache] imageFromDiskCacheForKey:prevKey];
        if (prevImage) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setImage:prevImage];
            });
        } else {
            [[SDWebImageDownloader sharedDownloader] downloadImageWithURL:url options:SDWebImageDownloaderFILOQueueMode progress:nil completed:^(UIImage *image, NSData *data, NSError *error, BOOL finished) {
                if (error) {
                    [selfimg setImageWithURL:url scaleToSize:scale];
                } else {
                    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
                    dispatch_async(queue, ^{
                        prevImage = [image imageByScalingProportionallyToSize:self.frame.size];
                        if (finished)
                            [[SDImageCache sharedImageCache] storeImage:prevImage forKey:prevKey];
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [self setImage:prevImage];
                        });
                    });
                }
            }];
        }
    });
    return;
}

- (void)setImageWithURL:(NSURL *)url placeholderImage:(UIImage *)placeholder scaleToSize:(BOOL)scale
{
    [self setImage:placeholder];
    [self setImageWithURL:url scaleToSize:scale];
}

@end
I would suggest that you use Instruments and Heapshot Analysis; bbum wrote an article about it on his blog.
Here is a quick overview:
1. Start your app in Instruments and select the Allocations template.
2. Wait some time after your app starts for it to settle down.
3. In Allocations, press Mark Heap; this is your baseline.
4. Use your app and return to the same screen as in step 2. Press Mark Heap again.
5. Repeat that for some time.
If you see steady memory growth, you can drill down into the heapshots and see all the objects allocated. That should give you a good starting point for reducing your memory footprint.
SDWebImage doesn't do the rest for you. You need to keep as few images in memory as you can:
1. Clear the UIImageView when it's not shown.
2. Use the reusable-objects pattern.
3. And of course clear images that are not visible (but cached in memory) when you get a memory warning; for that, just use self.urImage = nil; (see the sketch below).
So, good luck with your app's memory saving ;)
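A sketch of the memory-warning part, assuming SDWebImage's shared cache (clearMemory is part of SDImageCache's public API; urImage is the property name used above):

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Drop images that are only held by SDWebImage's in-memory cache.
    [[SDImageCache sharedImageCache] clearMemory];
    // Release any image this controller holds directly that is not on screen.
    self.urImage = nil;
}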

Loading UIImage from NSData causing jerky UITableView scrolling

I am having trouble getting my table view to scroll smoothly when loading in images from NSData.
Some background: I'm storing image NSData in Core Data (using the 'allows external storage' option, so I don't think storing the URLs or something like that is the solution here). Further, I am storing two versions of the data: one high-res with UIImagePNGRepresentation, and one low-res with UIImageJPEGRepresentation. I am loading the low-res images for my table view. The image property on the entry object is not stored in Core Data; it's set after the fact. Here is how I am doing it:
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    DMLogTableViewCell *cell = (DMLogTableViewCell *)[self.tableView dequeueReusableCellWithIdentifier:@"LogCell" forIndexPath:indexPath];
    [cell setGradients:@[[UIColor whiteColor], [UIColor lightTextColor]]];
    Entry *entry = [self.fetchedResultsController objectAtIndexPath:indexPath];
    UIImage *entryImage = entry.image;
    if (entryImage)
        cell.rightImageView.image = entryImage;
    else {
        NSOperationQueue *oQueue = [[NSOperationQueue alloc] init];
        [oQueue addOperationWithBlock:^(void) {
            UIImage *image = [UIImage imageWithData:entry.photo.lowResImageData];
            if (!image) image = [UIImage imageNamed:@"defaultImage.png"];
            entry.image = image;
        }];
        [oQueue addOperationWithBlock:^(void) {
            [[NSOperationQueue mainQueue] addOperationWithBlock:^(void) {
                DMLogTableViewCell *cell = (DMLogTableViewCell *)[self.tableView cellForRowAtIndexPath:indexPath];
                cell.rightImageView.image = entry.image;
            }];
        }];
    }
    return cell;
}
Any ideas on how to get this to run more smoothly? Any help is greatly appreciated!
Thanks
UPDATE
This has gotten me much closer, but there is still a little bit of jerkiness . . .
(within cellForRowAtIndexPath)
UIImage *entryImage = entry.image;
if (entryImage)
    cell.rightImageView.image = entryImage;
else {
    [self.imageQueue addOperationWithBlock:^(void) {
        UIImage *image = [UIImage imageWithData:entry.photo.lowResImageData];
        if (!image) image = [UIImage imageNamed:@"bills_moduleIcon.png"];
        entry.image = image;
        [entry.image preload];
        [[NSOperationQueue mainQueue] addOperationWithBlock:^(void) {
            DMLogTableViewCell *cell = (DMLogTableViewCell *)[self.tableView cellForRowAtIndexPath:indexPath];
            cell.rightImageView.image = entry.image;
        }];
    }];
}
That preload method is in a category on UIImage (it reads self.CGImage) and looks like this:
- (void)preload
{
    CGImageRef ref = self.CGImage;
    size_t width = CGImageGetWidth(ref);
    size_t height = CGImageGetHeight(ref);
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4, space, kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(space);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), ref);
    CGContextRelease(context);
}
I think my next plan of attack is to experiment with the size of the stored images (as mentioned in the comments), or perhaps to avoid updating the image views while the table is scrolling. Any thoughts on those two approaches are much appreciated! I'll check back in when I've tried them.
UPDATE 2
Many, many thanks to everyone who helped me out with this! Here is the final solution (in cellForRowAtIndexPath:):
if (entryImage)
    cell.rightImageView.image = entryImage;
else {
    [self.imageQueue addOperationWithBlock:^(void) {
        UIImage *image = [UIImage imageWithData:entry.photo.lowResImageData];
        if (!image) image = [UIImage imageNamed:@"bills_moduleIcon.png"];
        entry.image = [image scaleToSize:cell.rightImageView.frame.size];
        [[NSOperationQueue mainQueue] addOperationWithBlock:^(void) {
            DMLogTableViewCell *cell = (DMLogTableViewCell *)[self.tableView cellForRowAtIndexPath:indexPath];
            cell.rightImageView.image = entry.image;
        }];
    }];
}
adding a new category method to UIImage:
- (UIImage *)scaleToSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    [self drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
Scrolling is now perfectly smooth. Awesome
A couple of issues:
I'd want to see how much time these two lines are taking:
Entry *entry = [self.fetchedResultsController objectAtIndexPath:indexPath];
UIImage *entryImage = entry.image;
While I gather that entry.image is a cached image, I wouldn't be so confident that objectAtIndexPath: isn't actually retrieving the NSData from Core Data (which is a good portion of the performance cost of storing images in a local database rather than the file system). I'd be inclined to do this on the background queue.
I don't understand why you're adding two operations to oQueue (they could run concurrently!). It should just be one block, as follows, so you don't dispatch to the main queue until the image has been retrieved:
[oQueue addOperationWithBlock:^(void) {
    UIImage *image = [UIImage imageWithData:entry.photo.lowResImageData];
    if (!image) image = [UIImage imageNamed:@"defaultImage.png"];
    entry.image = image;
    [[NSOperationQueue mainQueue] addOperationWithBlock:^(void) {
        DMLogTableViewCell *cell = (DMLogTableViewCell *)[self.tableView cellForRowAtIndexPath:indexPath];
        cell.rightImageView.image = entry.image;
    }];
}];
It's probably not material, but you really shouldn't create a new NSOperationQueue for each cell. Define a class property of type NSOperationQueue, initialize it in viewDidLoad, and use that queue in cellForRowAtIndexPath:. It's not worth the overhead of creating a queue for every row of your table. (And if you ever move to server-based image retrieval, a single queue is very important so you can control maxConcurrentOperationCount.)
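A sketch of that setup (the property name imageQueue matches the asker's update above; the concurrency limit is an illustrative choice):

@property (nonatomic, strong) NSOperationQueue *imageQueue;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // One shared queue for all image-decoding work, reused by every cell.
    self.imageQueue = [[NSOperationQueue alloc] init];
    self.imageQueue.maxConcurrentOperationCount = 4; // illustrative limit
}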
UIImage won't actually decode until it needs to draw the image, so the decode actually happens on the main thread when you assign the image to the image view.
I have a UIImage category that forces the image to be decoded (which you should run on a background thread) here: https://github.com/tonymillion/TMDiskCache
