Memory leaks when loading images into a scroller - iOS

I have a scrollView into which I load images from the net. I sometimes get memory warnings, which I assume are because I am doing something wrong with the image loading.
I am trying to fix small things, and I wanted to show the code here and hear whether there is more I can do to get rid of these warnings.
At any given time the scroller (iPad) holds only 4 or 5 images, in the range current page-3 to current page+3.
This is how I load the images (every image also gets a blur effect using Apple's image-effect classes):
(Should I allocate the image views every time? Can I improve anything here?)
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    // Download the image data off the main thread.
    NSData *imdata2 = [NSData dataWithContentsOfURL:url];
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImage *theImage = [UIImage imageWithData:imdata2 scale:1];
        UIImage *lightImage = [theImage applyLightEffect];
        UIImage *scaledImage = [resizer resizeImageToWidth:[Globals sharedGlobals].imagesWidth WithImage:theImage];

        CGRect viewSizeBack = CGRectMake(scroller.bounds.size.width * toPage, 0, scroller.bounds.size.width, scroller.bounds.size.height);
        int x = [Globals sharedGlobals].pageMargins;
        int y = ([UIScreen mainScreen].bounds.size.height - scaledImage.size.height) / 2;
        CGRect viewSizeFront = CGRectMake(x, y, scaledImage.size.width, scaledImage.size.height);

        UIImageView *backImageView = [[UIImageView alloc] initWithFrame:viewSizeBack];
        UIImageView *frontImageView = [[UIImageView alloc] initWithFrame:viewSizeFront];

        backImageView.layer.cornerRadius = 0.0;
        backImageView.layer.masksToBounds = YES;
        backImageView.image = lightImage;

        frontImageView.layer.cornerRadius = 0.0;
        frontImageView.layer.masksToBounds = YES;
        frontImageView.image = scaledImage;
        frontImageView.layer.borderWidth = 1.0;
        // UIColor components are in the 0.0-1.0 range, not 0-255.
        frontImageView.layer.borderColor = [UIColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:1.0].CGColor;

        [backImageView addSubview:frontImageView];
        backImageView.tag = toPage;
        frontImageView.tag = toPage;
        [scroller addSubview:backImageView];
    });
});

You should only ever have three images loaded at a maximum: the previous page (if it exists), the current page and the next page.
Any other images you have loaded beyond that are wasteful, because you can't see them and they're just taking up memory for no good reason. If the images aren't too big you can keep them in memory and purge them when you get a warning, but for large images this will still generally cause you issues.
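A minimal sketch of that purging idea, assuming the pages are the tagged backImageView subviews from the question and that some other method reloads any missing neighbouring pages (the helper names here are assumptions, not code from the question):

// Hypothetical helper: keep only pages currentPage-1 ... currentPage+1 alive.
- (void)purgeOffscreenPagesAroundPage:(NSInteger)currentPage
{
    for (UIView *subview in [scroller.subviews copy]) {
        if (![subview isKindOfClass:[UIImageView class]]) {
            continue;
        }
        // The question tags each page's backImageView with its page index.
        if (ABS(subview.tag - currentPage) > 1) {
            [subview removeFromSuperview]; // the UIImage goes with it
        }
    }
}

// Call it whenever the visible page changes, e.g. from the scroll view delegate.
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    NSInteger currentPage = (NSInteger)(scrollView.contentOffset.x / scrollView.bounds.size.width);
    [self purgeOffscreenPagesAroundPage:currentPage];
    // ...then (re)load currentPage-1, currentPage and currentPage+1 if they are missing.
}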

If you don't use ARC then add this:
[backImageView autorelease];
[frontImageView autorelease];

Related

Receiving memory warnings when combining two UIImages

I am trying to combine two UIImages with the following code:
- (void)combineImage:(UIImage *)image WithFrame:(CGRect)frame Completion:(ImageProcessorCompletionBlock)block {
    __weak typeof(self) wSelf = self;
    dispatch_async(_queue, ^{
        if (wSelf) {
            typeof(wSelf) sSelf = wSelf;
            UIGraphicsBeginImageContextWithOptions(sSelf.originalImage.size, NO, 0.0);
            [sSelf.originalImage drawInRect:CGRectMake(0, 0, sSelf.originalImage.size.width, sSelf.originalImage.size.height)];
            [image drawInRect:frame];
            UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            dispatch_async(dispatch_get_main_queue(), ^{
                if (block) {
                    block(result);
                }
            });
        }
    });
}
That works, but when I checked the memory usage it scared me: every time I run the method the memory rises and is never released. Sometimes I receive a memory warning. Can anyone tell me why, and give me a solution? Thanks a lot!
Finally I figured out the problem.
UIGraphicsBeginImageContextWithOptions(sSelf.originalImage.size, NO, 0.0);
The first parameter is the size of the image and the last one is the scale factor. I had already set the context size to match the original image, but I also passed a scale of 0.0, which means the scale factor of the device's main screen is used. So the resulting image is enlarged.
If I run the code several times, feeding the result back in as the original, the image gets bigger and bigger each pass until it finally uses up the memory and I receive the warning.
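A minimal sketch of the fix along those lines: pass an explicit scale (the original image's scale, or 1.0) instead of 0.0, so the bitmap keeps the intended pixel dimensions.

// Sketch: keep the context at the original image's scale so the output
// does not grow by the screen's scale factor on every pass.
UIGraphicsBeginImageContextWithOptions(sSelf.originalImage.size,
                                       NO,
                                       sSelf.originalImage.scale); // or 1.0 for a 1:1 pixel mapping
[sSelf.originalImage drawInRect:CGRectMake(0, 0,
                                           sSelf.originalImage.size.width,
                                           sSelf.originalImage.size.height)];
[image drawInRect:frame];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();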

UIImageView stops displaying images after a specific number of loop iterations

My iOS app uses a loop to cycle through the images in a folder.
The application is supposed to loop through a total of 2031 images (sized 1200x900). The images were taken at 8 fps, and each one is displayed as the loop continues, to simulate a video clip. After the 696th picture, the images cease to be displayed in the UIImageView, although the app keeps looping.
To test whether the problem was a missing picture, I started the loop at picture 200, but after picture 896 the UIImageView again stopped displaying the pictures.
The Code:
imgName = [NSString stringWithFormat:@"subject_basline_mat k (%d).png", jojo];
jojo++;
imageToCrop.image = [UIImage imageNamed:imgName]; // imageToCrop is the UIImageView; its image is set to the file here
imageToCrop.image = [self imageWithImage:imageToCrop.image convertToSize:self.imageToCrop.frame.size]; // the image is resized to fit the view's bounds, 320x240 in the simulator
The code loops via a timer that fires roughly once every 0.8 seconds.
I ran the code with Instruments to see if there was a memory problem. Instruments is very heavy on my computer, so the application ran quite slowly, and this time the pictures kept displaying past the 696th one. It was almost as if the app running too quickly caused the pictures to stop displaying, which I don't really understand.
The only memory-heavy part of the image switching seems to be the size-conversion step, called by this line:
imageToCrop.image = [self imageWithImage:imageToCrop.image convertToSize:self.imageToCrop.frame.size];
The method "imageWithImage" is here:
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
    @autoreleasepool {
        UIGraphicsBeginImageContext(size);
        [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
        UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return destImage;
    }
}
The line [image drawInRect:CGRectMake(0, 0, size.width, size.height)]; uses up the most memory of all the image handling in the app.
Any ideas as to why my app will only display a certain number of images?
Try loading the full-size images from the app bundle by URL. For example:
@autoreleasepool {
    NSString *imgName = [NSString stringWithFormat:@"subject_basline_mat k (%d)", jojo];
    NSURL *imageURL = [[NSBundle mainBundle] URLForResource:imgName withExtension:@"png"];
    UIImage *image = [UIImage imageWithContentsOfFile:[imageURL path]];
    imageToCrop.image = [self imageWithImage:image convertToSize:self.imageToCrop.frame.size];
}
Almost for sure your problem is [UIImage imageNamed:imgName]. There are hundreds of posts here on the pitfalls of using it. The issue is that it caches the images; its real purpose is handling a small number of images in your bundle.
If you have oodles of images, get the path to the image and then load it through a URL or file pointer, so that it is not cached. Note that when you do this you lose the automatic retina-image lookup, so you will need to grab the appropriately sized image yourself depending on whether the device is Retina or not.
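A minimal sketch of that Retina handling, assuming the usual name.png / name@2x.png file-naming convention (the file name follows the question; the selection logic itself is an assumption):

// Sketch: pick the @2x file by hand when loading by path, since imageNamed:'s
// automatic Retina lookup is lost. Assumes "name.png" / "name@2x.png" files.
CGFloat screenScale = [UIScreen mainScreen].scale;
NSString *baseName = [NSString stringWithFormat:@"subject_basline_mat k (%d)", jojo];
NSString *resourceName = baseName;
CGFloat imageScale = 1.0;
if (screenScale >= 2.0 &&
    [[NSBundle mainBundle] pathForResource:[baseName stringByAppendingString:@"@2x"] ofType:@"png"] != nil) {
    resourceName = [baseName stringByAppendingString:@"@2x"];
    imageScale = 2.0;
}
NSString *path = [[NSBundle mainBundle] pathForResource:resourceName ofType:@"png"];
NSData *data = [NSData dataWithContentsOfFile:path];
// imageWithData:scale: lets the UIImage report the correct point size.
UIImage *image = [UIImage imageWithData:data scale:imageScale];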

Image caching on iOS

I'm having some trouble with memory in my app. I checked Instruments to get more clues about the issue and found that 79% of my memory was being used by image data.
I searched around and some people said this is image caching, which keeps all my images in memory. Maybe it comes from my allocations?
Here is how I load my images:
info = [InfoModel getInfo:[NSString stringWithFormat:@"%d", self.idEnigme]];
myImage = [UIImage imageNamed:[NSString stringWithFormat:@"res/img/%@", [info objectForKey:@"path1"]]];
myImageView = [[UIImageView alloc] initWithImage:myImage];
myImageView.frame = CGRectMake(0, 200, [[UIScreen mainScreen] bounds].size.width, 400);
[self.scrollView addSubview:myImageView];
Info is a class in which I parse a JSON file containing the paths to my images.
Thanks for helping, this is driving me crazy.
iOS automatically caches your image for future use when you call imageNamed:, as discussed in a few places, including here: Does UIImageView cache images?
You can get around this caching, if you know you are only going to create the image once, by using
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:fileName ofType:nil]]
instead.
I wouldn't worry about that too much; the UIImage cache is cleared when the app receives a low-memory warning. It's all handled automatically, so any images that are no longer in use will be flushed from memory at that point.
So if your app is crashing from running out of memory, it is unlikely to be because the OS is caching images that are no longer in use.
You can handle your own caching of images by using initWithData instead of imageNamed, but I doubt this will help you.
I'm still very worried: when I launch Instruments in Allocations mode, I see every image in my app being added to "Living". I really don't understand how that could be possible.
I found the problem and solved it. I'll explain what it was; maybe it will help someone.
The real problem was how I initialised my UIImage. Initially I allocated the UIImage inside a method, like this:
NSString *fileName = [NSString stringWithFormat:@"%@%@%@", name, number, ext];
UIImage *myImageHeader = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:fileName ofType:nil]];
UIImageView *myImageViewHeader = [[UIImageView alloc] initWithImage:myImageHeader];
myImageViewHeader.frame = CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, 277);
[self.scrollView addSubview:myImageViewHeader];
Finally I decided to declare a UIImageView property in my .h and set its image in my method, like this:
// .h
@property (weak, nonatomic) IBOutlet UIImageView *headerImage;
// .m
[self.headerImage setImage:[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:fileName ofType:nil inDirectory:nil]]];
This way I saved a lot of memory, but I still had leaks, so I decided to clear the image in the UIImageView after use:
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^(void){
    self.headerImage.image = nil;
});
Now my app only uses about 8 MB of memory and manages memory correctly.
Thanks for helping!

Memory increases when merging multiple high-resolution images into a single image, iOS

I have to merge multiple images into a single one (all of them high resolution), which consumes a lot of memory. I save the original images to a local directory and set resized copies on image views placed at different locations on the main image. When it is time to save the final merged image, I read the original images back from the local directory; at that point memory usage climbs, which causes a crash when the number of images is high.
Here is the code for retrieving an original image from the local directory:
UIImage *originalImage = [UIImage imageWithContentsOfFile:[self getOriginalImagePath:imageview.tag]];
Is there any other way to get images from a local directory without loading them entirely into memory?
Thanks in advance.
There is no way to load an image without it going into memory. With some image formats you could, in theory, implement your own reader that scales the image down while reading the file, so that the full size never ends up in memory, but that would require a lot of work for little gain.
Overall you would be better off saving the different sizes of image as separate files and loading only the size you need (you seem to be scaling them based on the screen size, so there are not that many different versions required); a sketch of that approach follows.
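A minimal sketch of that save-a-smaller-copy idea, reusing the question's getOriginalImagePath: and scaleImageToSize:imageWithImage: helpers; the thumbnail path naming and the wrapper methods themselves are assumptions:

// Sketch: write a screen-sized copy next to the original once, then load the
// small file later instead of the full-resolution one.
- (NSString *)cachedThumbPathForTag:(NSInteger)tag
{
    NSString *original = [self getOriginalImagePath:tag];
    return [[original stringByDeletingPathExtension] stringByAppendingString:@"_thumb.png"];
}

- (UIImage *)thumbForTag:(NSInteger)tag screenSize:(CGSize)screenSize
{
    NSString *thumbPath = [self cachedThumbPathForTag:tag];
    if (![[NSFileManager defaultManager] fileExistsAtPath:thumbPath]) {
        @autoreleasepool {
            // The full-size image only lives inside this pool.
            UIImage *original = [UIImage imageWithContentsOfFile:[self getOriginalImagePath:tag]];
            UIImage *thumb = [self scaleImageToSize:screenSize imageWithImage:original];
            [UIImagePNGRepresentation(thumb) writeToFile:thumbPath atomically:YES];
        }
    }
    return [UIImage imageWithContentsOfFile:thumbPath];
}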
If you do keep resizing them on the fly, try to ensure that you get rid of the original versions as soon as possible, i.e., don't keep any image reference that is no longer required, and perhaps wrap the whole thing in @autoreleasepool (assuming ARC is being used):
@autoreleasepool {
    UIImage *originalImage = [UIImage imageWithContentsOfFile:[self getOriginalImagePath:imageView.tag]];
    UIImage *pThumbImage = [self scaleImageToSize:CGSizeMake(AppScreenBound.size.width, AppScreenBound.size.height) imageWithImage:originalImage];
    originalImage = nil;
    imageView.image = pThumbImage;
    pThumbImage = nil;
    // … ?
}
Similarly, treat any other image handling that creates intermediate versions the same way, i.e., get rid of references that are no longer required as soon as possible (by assigning nil or letting them fall out of scope), and put @autoreleasepool { … } around subsections that may generate temporary objects.
I found a solution; posting it as an answer to my own question in case it helps other people. The reference is the Image I/O Programming Guide.
As an alternative to imageWithContentsOfFile:, you can use an image source.
Here is how I use it:
UIImage *originalWMImage = [self createCGImageFromFile:your-image-path];
The createCGImageFromFile: method gets the image content without loading it all into memory.
- (UIImage *)createCGImageFromFile:(NSString *)path
{
    // Get the URL for the pathname passed to the function.
    NSURL *url = [NSURL fileURLWithPath:path];
    CGImageRef myImage = NULL;
    CGImageSourceRef myImageSource;
    CFDictionaryRef myOptions = NULL;
    CFStringRef myKeys[2];
    CFTypeRef myValues[2];

    // Set up options if you want them. The options here are for
    // caching the image in a decoded form and for using floating-point
    // values if the image format supports them.
    myKeys[0] = kCGImageSourceShouldCache;
    myValues[0] = (CFTypeRef)kCFBooleanTrue;
    myKeys[1] = kCGImageSourceShouldAllowFloat;
    myValues[1] = (CFTypeRef)kCFBooleanTrue;

    // Create the options dictionary.
    myOptions = CFDictionaryCreate(NULL, (const void **)myKeys,
                                   (const void **)myValues, 2,
                                   &kCFTypeDictionaryKeyCallBacks,
                                   &kCFTypeDictionaryValueCallBacks);

    // Create an image source from the URL.
    myImageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)url, myOptions);
    CFRelease(myOptions);

    // Make sure the image source exists before continuing.
    if (myImageSource == NULL) {
        fprintf(stderr, "Image source is NULL.");
        return nil;
    }

    // Create an image from the first item in the image source.
    myImage = CGImageSourceCreateImageAtIndex(myImageSource, 0, NULL);
    CFRelease(myImageSource);

    // Make sure the image exists before continuing.
    if (myImage == NULL) {
        fprintf(stderr, "Image not created from image source.");
        return nil;
    }

    UIImage *result = [UIImage imageWithCGImage:myImage];
    // Release the CGImage now that the UIImage retains it, otherwise it leaks.
    CGImageRelease(myImage);
    return result;
}
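As a side note, if the goal is purely to keep memory down, Image I/O can also downsample at decode time with CGImageSourceCreateThumbnailAtIndex, so the full-resolution bitmap never has to be decoded. A minimal sketch (not from the original answer; the maxPixelSize value is whatever limit you need, and <ImageIO/ImageIO.h> must be imported):

// Sketch: decode a downsampled version directly, so the full-size bitmap is
// never held in memory. Requires #import <ImageIO/ImageIO.h>.
- (UIImage *)downsampledImageFromFile:(NSString *)path maxPixelSize:(CGFloat)maxPixelSize
{
    NSURL *url = [NSURL fileURLWithPath:path];
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    if (source == NULL) {
        return nil;
    }

    NSDictionary *options = @{
        (__bridge id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (__bridge id)kCGImageSourceCreateThumbnailWithTransform : @YES,   // respect EXIF orientation
        (__bridge id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize)
    };
    CGImageRef scaled = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (scaled == NULL) {
        return nil;
    }

    UIImage *result = [UIImage imageWithCGImage:scaled];
    CGImageRelease(scaled);
    return result;
}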
Here is the code where the resized image is assigned to the image view; I then perform scaling and rotation on the image view:
UIImage *pThumbImage = [self scaleImageToSize:CGSizeMake(AppScreenBound.size.width, AppScreenBound.size.height) imageWithImage:originalImage];
[imageView setImage:pThumbImage];
And here is the saving code; it runs inside a for loop, once per image to be merged onto the main image:
// Get the size of the background (canvas) image.
CGFloat backgroundWidth = canvasSize.width;
CGFloat backgroundHeight = canvasSize.height;

// Image view to be merged.
UIImageView *imageView = [[UIImageView alloc] initWithImage:stampImage];
[imageView setFrame:CGRectMake(0, 0, stampFrameSize.size.width, stampFrameSize.size.height)];

// Rotate the image view.
CGAffineTransform currentTransform = imageView.transform;
CGAffineTransform newTransform = CGAffineTransformRotate(currentTransform, radian);
[imageView setTransform:newTransform];

// Scale the image view.
CGRect imageFrame = [imageView frame];

// Create the final stamp view.
UIView *finalStamp = [[UIView alloc] initWithFrame:CGRectMake(0, 0, imageFrame.size.width, imageFrame.size.height)];

// Centre the stamp image and add it to the stamp view.
[imageView setCenter:CGPointMake(imageFrame.size.width / 2, imageFrame.size.height / 2)];
[finalStamp addSubview:imageView];

// Create an image from the stamp view.
UIGraphicsBeginImageContext(finalStamp.frame.size);
[finalStamp.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Create the final image with the stamp drawn onto the canvas.
UIGraphicsBeginImageContext(CGSizeMake(backgroundWidth, backgroundHeight));
[canvasImage drawInRect:CGRectMake(0, 0, backgroundWidth, backgroundHeight)];
[viewImage drawInRect:CGRectMake(stampFrameSize.origin.x, stampFrameSize.origin.y, stampFrameSize.size.width, stampFrameSize.size.height) blendMode:kCGBlendModeNormal alpha:fAlphaValue];
UIImage *pfinalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
Everything is fine up to that point; the problem occurs while saving, i.e., while generating the merged image.
This is an old question, but I had to deal with something similar recently, so here is my answer.
I had to merge a lot of images into one and hit the same problem: memory kept increasing until the app crashed. The functions I had written returned UIImage objects, and that was the problem; ARC was not releasing them in time, so I changed them to return CGImageRef and released those explicitly at the proper time.
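A minimal sketch of that pattern (this is a reconstruction, not the answerer's actual code): the merge step drains its temporaries in an autorelease pool and hands back a retained CGImageRef that the caller releases as soon as the result has been consumed.

// Sketch: return a CGImageRef the caller owns, so the big bitmap can be
// released deterministically instead of waiting for ARC/autorelease.
CGImageRef CreateMergedImage(UIImage *canvas, UIImage *stamp, CGRect stampRect)
{
    CGImageRef merged = NULL;
    @autoreleasepool {
        UIGraphicsBeginImageContext(canvas.size);
        [canvas drawInRect:CGRectMake(0, 0, canvas.size.width, canvas.size.height)];
        [stamp drawInRect:stampRect];
        // Retain the CGImage so it outlives the autoreleased UIImage wrapper.
        merged = CGImageRetain(UIGraphicsGetImageFromCurrentImageContext().CGImage);
        UIGraphicsEndImageContext();
    }
    return merged; // the caller must call CGImageRelease() when done
}

// Usage: release as soon as the data has been written out, e.g.
//   CGImageRef merged = CreateMergedImage(canvasImage, stampImage, stampRect);
//   NSData *png = UIImagePNGRepresentation([UIImage imageWithCGImage:merged]);
//   CGImageRelease(merged);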

Memory leak with a lot of big images in iPad

I'm trying to store UIImage data in an NSArray: 60 images, each around 300 KB. Then I animate those images in a UIImageView.
My code:
NSMutableArray *arr = [[NSMutableArray alloc] init];
for (int i = 0; i < 61; ++i) {
    [arr addObject:[UIImage imageNamed:[NSString stringWithFormat:@"%i.png", i]]];
}
img.animationImages = arr;
img.animationDuration = 1.5;
img.contentMode = UIViewContentModeBottomLeft;
[img startAnimating];
When I test it in the iPad 4.3 simulator, it works fine. When I test it on my device (iPad 1), the application crashes.
Note: the app does not crash if I comment out this line: [img startAnimating];
1. What could be the problem? I think it is a memory problem...?!
2. Can I store lots of UIImages in an NSArray?
Your PNG files may be 300 KB each, but PNG is a compressed format.
To get an idea of the size of the image itself, multiply the width, the height and the bytes per pixel.
E.g. if an image is 1024 x 1024 and uses RGBA, the image itself takes about 4 MB in memory. And that is just one image; with 60 of them, as in your case, you are already around 240 MB.
Note: this is only a rough rule of thumb, but it gives you an idea.
So you should keep path names in the array and load each image only when it is needed, and thumbnails should be resized and stored on disk as files. Do not just scale the UIImageView.
Here's a great article with code about resizing.
I solved the crash a different way, using a simple NSTimer: on each tick I set the n-th image on the UIImageView.
I have a UIImageView in the xib file, connected to the img outlet.
NSTimer *timer = [NSTimer scheduledTimerWithTimeInterval:.03
                                                  target:self
                                                selector:@selector(tiktak)
                                                userInfo:nil
                                                 repeats:YES];
int n = 0;

- (void)tiktak {
    n++;
    img.image = [UIImage imageNamed:[NSString stringWithFormat:@"Blue Menu_%i.png", n]];
    if (n == 60) n = 0;
}
The animation looks great. I've also checked it in Activity Monitor; CPU usage stays at 6% or less.
I would not advise storing that many UIImages in an NSArray; it's far too heavy and inefficient. Where are these images from (the bundle, the internet, the device's gallery)?
Edit: I don't know exactly how you want it to look, but you can build the array with the names of the images, then load each UIImage and show it only when needed. Like this:
for (int i = 0; i < 60; i++) {
    NSString *nameOfPicture = [NSString stringWithFormat:@"picture_%d", i + 1];
    [array addObject:nameOfPicture];
}
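And, as a sketch of the "load and show it when needed" half (currentIndex and the .png naming are assumptions, not from the answer):

// Sketch: load just the current frame by name; imageWithContentsOfFile:
// avoids the imageNamed: cache growing with every frame.
NSString *name = array[currentIndex];
NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
img.image = [UIImage imageWithContentsOfFile:path];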
