UIImageJPEGRepresentation triggers memory warning - iOS

I receive a memory warning when using UIImageJPEGRepresentation; is there any way to avoid this? It doesn't crash the app, but I'd like to avoid it if possible. It also intermittently fails to run the [[UIApplication sharedApplication] openURL:url]; call.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
    NSData *imageToUpload = UIImageJPEGRepresentation(image, 1.0);
    // code that sends the image to a web service (omitted)
    // on success from the service
    // this sometimes does not get run; I assume it has to do with the memory warning?
    [[UIApplication sharedApplication] openURL:url];
}

Using UIImageJPEGRepresentation (in which you are round-tripping the asset through a UIImage) can be problematic, because with a compressionQuality of 1.0, the resulting NSData can actually be considerably larger than the original file. (Plus, you're holding a second copy of the image in the UIImage.)
For example, I just picked a random image from my iPhone's photo library: the original asset was 1.5 MB, but the NSData produced by UIImageJPEGRepresentation with a compressionQuality of 1.0 required 6.2 MB. And holding the image in the UIImage itself might take even more memory (because, if uncompressed, it can require four bytes per pixel).
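For instance, a quick check along these lines (a sketch; it assumes you already have the ALAsset and its UIImage in hand) makes the difference easy to see:

ALAssetRepresentation *rep = [asset defaultRepresentation];
NSLog(@"original asset: %lld bytes", rep.size);                          // e.g. ~1.5 MB
NSData *reencoded = UIImageJPEGRepresentation(image, 1.0);
NSLog(@"re-encoded JPEG: %lu bytes", (unsigned long)reencoded.length);   // e.g. ~6.2 MB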
Instead, you can get the original asset using the getBytes method:
static NSInteger kBufferSize = 1024 * 10;

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *url = info[UIImagePickerControllerReferenceURL];
    [self.library assetForURL:url resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        long long remaining = representation.size;
        NSString *filename = representation.filename;
        long long representationOffset = 0ll;
        NSError *error;
        NSMutableData *data = [NSMutableData data];
        uint8_t buffer[kBufferSize];
        while (remaining > 0ll) {
            NSInteger bytesRetrieved = [representation getBytes:buffer fromOffset:representationOffset length:sizeof(buffer) error:&error];
            if (bytesRetrieved <= 0) {
                NSLog(@"failed getBytes: %@", error);
                return;
            } else {
                remaining -= bytesRetrieved;
                representationOffset += bytesRetrieved;
                [data appendBytes:buffer length:bytesRetrieved];
            }
        }
        // you can now use the `NSData`
    } failureBlock:^(NSError *error) {
        NSLog(@"assetForURL error = %@", error);
    }];
}
This avoids staging the image in a UIImage, and the resulting NSData can be (for photos, anyway) considerably smaller. Note that this also has the advantage of preserving the metadata associated with the image.
By the way, while the above represents a significant memory improvement, there is a more dramatic opportunity: rather than loading the entire asset into an NSData at one time, you can stream the asset (subclass NSInputStream to use this getBytes routine to fetch bytes as they're needed, rather than loading the whole thing into memory at once). There are some annoyances involved with this process (see BJ Homer's article on the topic), and there are a couple of approaches (BJ's NSInputStream subclass, staging the asset in a file and streaming from that, etc.), but the key is that streaming can dramatically reduce your memory footprint.
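For illustration, here is a minimal sketch of the staging-file variant (not BJ Homer's NSInputStream subclass): write the asset out in kBufferSize chunks so only one buffer's worth of image data is ever in memory, then upload from that file.

NSString *tempPath = [NSTemporaryDirectory() stringByAppendingPathComponent:representation.filename];
[[NSFileManager defaultManager] createFileAtPath:tempPath contents:nil attributes:nil];
NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:tempPath];
long long offset = 0ll;
long long remaining = representation.size;
NSError *error;
uint8_t buffer[kBufferSize];
while (remaining > 0ll) {
    NSInteger bytesRetrieved = [representation getBytes:buffer fromOffset:offset length:sizeof(buffer) error:&error];
    if (bytesRetrieved <= 0) break;
    // wrap the stack buffer without copying; the file handle copies as it writes
    [handle writeData:[NSData dataWithBytesNoCopy:buffer length:bytesRetrieved freeWhenDone:NO]];
    remaining -= bytesRetrieved;
    offset += bytesRetrieved;
}
[handle closeFile];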
But by avoiding UIImage and UIImageJPEGRepresentation (which avoids both the memory taken up by the image and the larger NSData that UIImageJPEGRepresentation yields), you might make considerable headway. Also, make sure you don't have redundant copies of this image data in memory at one time (e.g. don't load the image data into an NSData and then build a second NSData for the HTTPBody; see if you can do it in one fell swoop, as sketched below). And if worst comes to worst, you can pursue streaming approaches.
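For example, once the asset is staged in a file (as in the sketch above), an upload task can stream it from disk so the image bytes never sit in an NSData at all. The endpoint here is hypothetical, and the openURL: call from the original question is dispatched back to the main thread:

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]]; // hypothetical endpoint
request.HTTPMethod = @"POST";
NSURLSessionUploadTask *task =
    [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                               fromFile:[NSURL fileURLWithPath:tempPath]
                                      completionHandler:^(NSData *responseData, NSURLResponse *response, NSError *error) {
        if (!error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [[UIApplication sharedApplication] openURL:url]; // UIKit calls belong on the main thread
            });
        }
    }];
[task resume];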

In ARC: just put your code inside a small @autoreleasepool block:
@autoreleasepool {
    NSData *data = UIImageJPEGRepresentation(img, 0.5);
    // do something with data
}

Presented as an answer for formatting and images.
Use Instruments to check for leaks and for memory loss due to retained but not leaked memory. The latter is unused memory that is still pointed to. Use Mark Generation (Heapshot) in the Allocations instrument.
For how to use Heapshot to find memory creep, see: bbum's blog
Basically the method is to run the Allocations instrument, take a heapshot, run an iteration of your code, and take another heapshot, repeating 3 or 4 times. This will indicate memory that is allocated and not released during the iterations.
To examine the results, disclose each generation to see the individual allocations.
If you need to see where retains, releases and autoreleases occur for an object, use Instruments: run the app in Instruments and, in Allocations, turn "Record reference counts" on (for Xcode 5 and lower you have to stop recording to set the option). Let the app run, stop recording, and drill down; you will be able to see where all retains, releases and autoreleases occurred.

Related

<Error>: ImageIO: CGImageReadGetBytesAtOffset in UIImagePNGRepresentation

The only other information I could find on this error was here, which wasn't helpful.
I get the following error when I try to save images. This seems to only happen when I have several images (~6) at once. It also seems to be completely random as to when it occurs: sometimes everything is fine, sometimes it fails on 1 image, sometimes 3, and sometimes the app crashes outright with an EXC_BAD_ACCESS error.
Error: ImageIO: CGImageReadGetBytesAtOffset : ^^^ ERROR ^^^ CGImageSource was created with data size: 1144891 - current size is only: 1003855
Here is the code that saves the image:
- (void)saveWithImage:(UIImage *)anImage andFileName:(NSString *)aFileName {
    NSString *subDirectory = @"Images";
    NSString *fileName = [aFileName stringByAppendingString:@".png"];
    NSString *documentsPath = [[CMAStorageManager sharedManager] documentsSubDirectory:subDirectory].path;
    NSString *imagePath = [subDirectory stringByAppendingPathComponent:fileName];
    __block NSString *path = [documentsPath stringByAppendingPathComponent:fileName];
    __block NSData *data = UIImagePNGRepresentation(anImage);
    self.image = anImage;
    self.tableCellImage = anImage;
    self.galleryCellImage = anImage;
    self.imagePath = imagePath; // stored path has to be relative, not absolute (iOS 8 changes the app UUID every run)
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
        if (![data writeToFile:path atomically:YES])
            NSLog(@"Error saving image to path: %@", path);
        dispatch_async(dispatch_get_main_queue(), ^{
        });
    });
}
I get that error, and as a result my images aren't saved (or only half of them are saved), which completely messes up the UI display and any subsequent app launches. I've narrowed it down to the UIImagePNGRepresentation call.
On a related note, that code locks up the UI, I think because of the UIImagePNGRepresentation call; however, as far as I know UIImagePNGRepresentation is not thread safe, so I can't do it in the background. Does anyone know a way around this?
Thanks!
In case anyone comes across a similar issue, this is what fixed it for me.
iPhone iOS saving data obtained from UIImageJPEGRepresentation() fails second time: ImageIO: CGImageRead_mapData 'open' failed
and this is the solution I used to save UIImages in a background thread:
Convert UIImage to NSData without using UIImagePngrepresentation or UIImageJpegRepresentation
I was getting this error because the API I called was throwing a 500 and returning an HTML error page in the tmp file (as NSData) that I was trying to convert to PNG.
You may wish to check that the file you're trying to open is an image at all before conversion.
If you're downloading the file with a downloadTask, check that the statusCode is 200 before moving the tmp file into /Documents/*.png, as sketched below.
Here's my answer in another SO post.
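A rough sketch of that check (session, imageURL, and documentsURL are assumptions):

[[session downloadTaskWithURL:imageURL
            completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
    NSInteger status = [(NSHTTPURLResponse *)response statusCode];
    if (error || status != 200) {
        NSLog(@"download failed: status %ld, error %@", (long)status, error);
        return; // don't move an HTML error page into Documents as a .png
    }
    NSURL *destination = [documentsURL URLByAppendingPathComponent:@"image.png"];
    [[NSFileManager defaultManager] moveItemAtURL:location toURL:destination error:NULL];
}] resume];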

Play Motion JPG stream in iOS out of memory

I am trying to play video that is coming from an IP camera in iOS, but the methods I have tried so far all fill up the memory of my iOS device really fast. I am using ARC for this project.
My IP camera uses Videostream.cgi (Foscam), which is a well-known way for IP cameras to stream 'video' through the browser.
So, I tried 3 ways, which all end up crashing my iOS app with an out-of-memory exception.
1. Putting a UIWebView on my UIViewController and calling the CGI directly using an NSURLRequest.
NSString* url = [NSString stringWithFormat:@"http://%@:%@/videostream.cgi?user=%@&pwd=%@&rate=0&resolution=%ld", camera.ip, camera.port, camera.username, camera.password, (long)_resolution];
NSURLRequest* request = [NSURLRequest requestWithURL:[NSURL URLWithString:url]];
webView = [[UIWebView alloc] init];
[webView loadRequest:request];
2. Putting a UIWebView on my UIViewController and creating a piece of HTML (in code) which includes an <img> tag whose source is the CGI mentioned before. (see: IP camera stream with UIWebview works on IOS 5 but not on IOS 6)
NSString* imgHtml = [NSString stringWithFormat:@"<img src='%@'>", url];
webView = [[UIWebView alloc] init];
[webView loadHTMLString:imgHtml baseURL:nil];
3. Using a custom control, based on a UIImageView, which fetches data continuously. https://github.com/mateagar/Motion-JPEG-Image-View-for-iOS
All of these things burn through memory. Even when I try to remove them and re-add them after a certain period of time, the memory won't be released and the iPad crashes.
UPDATE:
I am currently modifying option 3 of the solutions I tried. It is based on an NSURLConnection and the data it retrieves.
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    if (!_receivedData) {
        _receivedData = [NSMutableData new];
    }
    [_receivedData appendData:data];

    NSRange endRange = [_receivedData rangeOfData:_endMarkerData
                                          options:0
                                            range:NSMakeRange(0, _receivedData.length)];

    // (note: no NSNotFound check here; that is addressed in the rewrite further down)
    NSUInteger endLocation = endRange.location + endRange.length;
    if (_receivedData.length >= endLocation) {
        NSData *imageData = [_receivedData subdataWithRange:NSMakeRange(0, endLocation)];
        UIImage *receivedImage = [UIImage imageWithData:imageData];
        if (receivedImage) {
            NSLog(@"_receivedData length: %lu", (unsigned long)[_receivedData length]);
            self.image = receivedImage;
            _receivedData = nil;
            _receivedData = [NSMutableData new];
        }
    }

    if (_shouldStop) {
        [connection cancel];
    }
}
_receivedData is an NSMutableData object which I try to "empty" once an image is retrieved from the stream. The code inside if (receivedImage) is called when it is supposed to be called. The length of the _receivedData object is also not increasing; it stays around the same size (~14 KB), so that part seems to work.
But somehow, with every call to didReceiveData the memory my app is using increases, even when I disable the line self.image = receivedImage.
UPDATE
As iosengineer suggested, I have been playing with autorelease pools, but this does not solve the problem.
Using Instruments I found out that most of the allocations are done by CFNetwork, in the method HTTPBodyData::appendBytes(unsigned char const*, long). (This allocates 64 KB at a time and keeps the blocks alive.)
The next step I'd take would be to analyse the request/response patterns using Charles, step through the source using Xcode, and probably write my own solution using NSURLSession and NSURLRequest.
Streams don't just create themselves - something is pulling in data from the responses and not getting rid of it fast enough.
Here's my guess on what is possibly happening:
When you download something using an NSURLRequest, you create an instance of NSMutableData to collect the responses in chunks until you are ready to save it to disk. In this case, the stream never ends, so the store grows massive and the app bails.
A custom solution to this would have to know when it's safe to ditch the store, based on the end of a frame (for example). Good luck! Instruments is your friend.
P.S. Beware of autoreleased memory; use autorelease pools wisely.
In your revised question, the code sample shows a few objects that are created using autoreleased memory. The appropriate use of autorelease pools should fix this. Profiling the app with Instruments (the Allocations tool) should make it fairly straightforward to see which object is causing the most problems and whether your problems have been solved.
Of particular interest, the [UIImage imageWithData:] call should definitely be wrapped, as it creates a new image object every time.
Also, subdataWithRange: creates a new object which is only released once the pool is drained.
I never use the "new" syntax for creation, so I can't recall how it actually works; I always use alloc/init. (For what it's worth, [Class new] is equivalent to [[Class alloc] init].)
Wrap MOST of this whole routine with this:
@autoreleasepool
{
    ROUTINE
}
That way, each time a chunk of data is received, the pool is drained and any autoreleased objects are mopped up.
I rewrote the MotionJpegImageView thing, which was causing all my problems:
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    if (!_receivedData) {
        _receivedData = [NSMutableData new];
    }
    [_receivedData appendData:data];

    NSRange endRange = [_receivedData rangeOfData:_endMarkerData
                                          options:0
                                            range:NSMakeRange(0, _receivedData.length)];
    if (endRange.location == NSNotFound) {
        return;
    }

    @autoreleasepool {
        UIImage *receivedImage = [UIImage imageWithData:_receivedData];
        if (receivedImage) {
            self.image = receivedImage;
        }
        else {
            DDLogVerbose(@"Invalid image data");
        }
    }

    [_receivedData setLength:0];

    if (_shouldStop) {
        [connection cancel];
        DDLogVerbose(@"Should stop connection");
    }
}
It also turned out that my connections were being opened multiple times because I wasn't correctly canceling the old ones. A pretty stupid mistake, but for the people wanting to know how it works, the rewritten code is above.
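For anyone fixing the same mistake, a minimal sketch of what "correctly canceling" means (the _connection and _request ivars are assumptions):

- (void)restartStream {
    [_connection cancel];   // stop the old stream before opening a new one,
    _connection = nil;      // otherwise it keeps pumping data into memory
    _connection = [[NSURLConnection alloc] initWithRequest:_request delegate:self];
}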

Most memory efficient way to save a photo to disk on iPhone?

From profiling with Instruments I have learned that the way I am saving images to disk results in memory spikes to ~60 MB. This results in the app emitting low memory warnings, which (inconsistently) leads to crashes on the iPhone 4S running iOS 7.
I need the most efficient way to save an image to disk.
I am currently using this code:
+ (void)saveImage:(UIImage *)image withName:(NSString *)name {
    NSData *data = UIImageJPEGRepresentation(image, 1.0);
    DLog(@"*** SIZE *** : Saving file of size %lu", (unsigned long)[data length]);
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *fullPath = [documentsDirectory stringByAppendingPathComponent:name];
    [fileManager createFileAtPath:fullPath contents:data attributes:nil];
}
Notes:
Reducing the value of the compressionQuality argument in UIImageJPEGRepresentation does not reduce the memory spike significantly enough.
e.g.
compressionQuality = 0.8 reduced the memory spike by 3 MB on average over 100 writes.
However, it does reduce the size of the data on disk (obviously), but this does not help me.
UIImagePNGRepresentation in place of UIImageJPEGRepresentation is worse for this: it is slower and results in higher spikes.
Is it possible that this approach with ImageIO would be more efficient? If so, why?
If anyone has any suggestions, that would be great. Thanks.
Edit:
Notes on some of the points outlined in the questions below.
a) Although I was saving multiple images, I was not saving them in a loop. I did a bit of reading around and testing and found that an autorelease pool wouldn't help me.
b) The photos were not 60 MB in size each. They were photos taken on the iPhone 4S.
With this in mind, I went back to trying to overcome what I thought the problem was: the line NSData *data = UIImageJPEGRepresentation(image, 1.0);.
The memory spikes that were causing the crash could be seen in Instruments (screenshot omitted here); they corresponded to when UIImageJPEGRepresentation was called. I also ran Time Profiler and System Usage, which pointed me in the same direction.
Long story short, I moved over to AVFoundation and took the photo image data using
photoData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
which returns an NSData object; I then used this as the data to write with NSFileManager.
This removes the spikes in memory completely.
i.e.
[self saveImageWithData:photoData withName:name];
where
+ (void)saveImageWithData:(NSData *)imageData withName:(NSString *)name {
    NSData *data = imageData;
    DLog(@"*** SIZE *** : Saving file of size %lu", (unsigned long)[data length]);
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *fullPath = [documentsDirectory stringByAppendingPathComponent:name];
    [fileManager createFileAtPath:fullPath contents:data attributes:nil];
}
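For context, a rough sketch of where imageSampleBuffer comes from in the AVFoundation pipeline (the stillImageOutput setup and the name argument are assumptions):

AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                              completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (imageSampleBuffer != NULL) {
        // JPEG bytes straight from the capture pipeline; no UIImage round-trip, no re-encode
        NSData *photoData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        [self saveImageWithData:photoData withName:name];
    }
}];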
PS: I have not put this as an answer to the question in case people feel it does not answer the title "Most memory efficient way to save a photo to disk on iPhone?". However, if the consensus is that it should be, I can update it.
Thanks.
Using UIImageJPEGRepresentation requires that you have the original and final image in memory at the same time. It may also cache the fully rendered image for a while, which would use a lot of memory.
You could try using a CGImageDestination. I do not know how memory efficient it is, but it has the potential to stream the image directly to disk.
+ (void)writeImage:(UIImage *)inImage toURL:(NSURL *)inURL withQuality:(double)inQuality {
    // requires ImageIO.framework, plus MobileCoreServices for kUTTypeJPEG
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)inURL, kUTTypeJPEG, 1, NULL);
    if (destination == NULL) return;
    CFDictionaryRef properties = (__bridge CFDictionaryRef)[NSDictionary dictionaryWithObject:[NSNumber numberWithDouble:inQuality] forKey:(__bridge NSString *)kCGImageDestinationLossyCompressionQuality];
    CGImageDestinationAddImage(destination, [inImage CGImage], properties);
    CGImageDestinationFinalize(destination);
    CFRelease(destination);
}
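Called in place of the original save, usage might look like this (the class name and fileURL are hypothetical):

NSURL *fileURL = [NSURL fileURLWithPath:fullPath];
[ImageSaver writeImage:image toURL:fileURL withQuality:0.8];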
Are your images actually 60 MB compressed, each? If they are, there's not a lot you can do if you want to save them as a single JPEG file. You can try rendering them down to smaller images, or tiling them and saving the tiles to separate files.
I don't expect your ImageIO code snippet to improve anything. If there were a two-line fix, then UIImageJPEGRepresentation would be using it internally.
But I'm betting that you don't get 60 MB from a single image. I'm betting you get 60 MB from multiple images saved in a loop. And if that's the case, then there is likely something you can do: put an @autoreleasepool {} inside your loop, as sketched below. It is quite possible that you're accumulating autoreleased objects, and that's leading to the spike. Adding a pool inside your loop allows it to drain.
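A sketch of what that looks like (images and the pathForImage: helper are assumptions):

for (UIImage *image in images) {
    @autoreleasepool {
        // the autoreleased NSData is drained at the end of every iteration
        NSData *data = UIImageJPEGRepresentation(image, 0.8);
        [data writeToFile:[self pathForImage:image] atomically:YES];
    }
}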
Try using an NSAutoreleasePool and drain the pool once you finish writing the data.

Importing multiple images of original resolution : low memory warning issue

I am using ChuteSDK to import multiple images from the photo library, something like this:
- (void)doneSelected {
    NSMutableArray *returnArray = [NSMutableArray array];
    [self showHUD];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^(void){
        for (id object in [self selectedAssets]) {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            if ([object isKindOfClass:[GCAsset class]]) {
                ALAsset *asset = [object alAsset];
                NSMutableDictionary *temp = [NSMutableDictionary dictionary];
                [temp setObject:[[asset defaultRepresentation] UTI] forKey:UIImagePickerControllerMediaType];
                [temp setObject:[UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage] scale:1 orientation:(UIImageOrientation)[[asset defaultRepresentation] orientation]] forKey:UIImagePickerControllerOriginalImage];
                [temp setObject:[[asset defaultRepresentation] url] forKey:UIImagePickerControllerReferenceURL];
                [returnArray addObject:temp];
            }
            [pool release];
        }
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            if (delegate && [delegate respondsToSelector:@selector(PhotoPickerPlusController:didFinishPickingArrayOfMediaWithInfo:)])
                [delegate PhotoPickerPlusController:[self P3] didFinishPickingArrayOfMediaWithInfo:returnArray];
            [self hideHUD];
        });
    });
}
But fullScreenImage gives me a scaled-down version of the original image, and if I use fullResolutionImage it causes low-memory warnings which crash the app.
How can I get the image at its original resolution without causing memory problems?
P.S: I'm not using ARC in my project.
The returnArray variable you declared is outside the autorelease pool block.
You are adding your image to the temp dictionary, which is inside the autorelease pool, but ultimately you add temp to returnArray, and hence its retain count is increased, which is what is actually causing the leak.
Also keep in mind one more thing while working with images: an image in memory doesn't take the amount of memory its file size suggests (i.e., something less than 3 MB for a 2048 x 1536 JPEG), as many would expect. Instead it is loaded in raw form, taking memory based on the following calculation:
width x height x n bytes, where n is the number of bytes used to represent the color of each pixel (usually 4).
So the same 2048 x 1536 image is going to take 12 MB.
So check the original resolution of the image you are talking about, calculate how many MB it is going to take, and change your code accordingly.
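The arithmetic for the 2048 x 1536 example:

// 2048 x 1536 pixels at 4 bytes per pixel:
size_t rawBytes = 2048 * 1536 * 4;                               // 12,582,912 bytes
NSLog(@"~%.0f MB uncompressed", rawBytes / (1024.0 * 1024.0));   // ~12 MB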
You are decompressing all the images at once. Wait until you absolutely need them.
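A sketch of the lazy variant of the loop above: keep only the lightweight reference URL while picking, and decompress a single image only when it is actually displayed (the library and cell variables are assumptions):

// during picking: store just the reference, not the decoded image
[temp setObject:[[asset defaultRepresentation] url] forKey:UIImagePickerControllerReferenceURL];

// later, when one image is actually needed on screen:
[library assetForURL:[temp objectForKey:UIImagePickerControllerReferenceURL]
         resultBlock:^(ALAsset *asset) {
             cell.imageView.image = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];
         }
        failureBlock:nil];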

Cause of big memory allocation (memory leak?) when reading UIImage from camera roll

I am trying to modify FGallery (https://github.com/gdavis/FGallery-iPhone).
I need it to read images from the camera roll, but I get a memory leak.
Old code (path is a file location):
@autoreleasepool {
    NSString *path = [NSString stringWithFormat:@"%@/%@", [[NSBundle mainBundle] bundlePath], _thumbUrl];
    _thumbnail = [UIImage imageWithContentsOfFile:path];
    _hasThumbLoaded = YES;
    _isThumbLoading = NO;
    [self performSelectorOnMainThread:@selector(didLoadThumbnail) withObject:nil waitUntilDone:YES];
}
My code (path is an asset library URL):
@autoreleasepool {
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref) {
            _thumbnail = [UIImage imageWithCGImage:iref];
            _hasThumbLoaded = YES;
            _isThumbLoading = NO;
            [self performSelectorOnMainThread:@selector(didLoadThumbnail) withObject:nil waitUntilDone:YES];
        }
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror) {
        NSLog(@"booya, can't get image - %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:_thumbUrl];
    [assetslibrary assetForURL:asseturl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
For the same images, I get a big memory allocation (-didReceiveMemoryWarning) that crashes the program with my code, but not with the original code.
Any ideas why?
P.S. I use ARC, and did the automatic transition for FGallery. It works fine for local app images but, as said, I can't make it work for camera roll images.
Edit 1: the program crashes.
I think I got it.
The ALAssetsLibraryAssetForURLResultBlock runs on a different thread, so the @autoreleasepool does not apply to it (each thread has its own autorelease pool). Hence, the memory footprint is much higher, due to lots of "autoreleased" allocations (images).
Adding @autoreleasepool inside the block stopped the crashes and the big memory allocations.
In short:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
    @autoreleasepool {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref) {
            _thumbnail = [UIImage imageWithCGImage:iref];
            _hasThumbLoaded = YES;
            _isThumbLoading = NO;
            [self performSelectorOnMainThread:@selector(didLoadThumbnail) withObject:nil waitUntilDone:YES];
        }
    }
};
Thanks to all who replied.
Unless there is some need for the full resolution image, you would likely be better off using:
CGImageRef iref = [rep fullScreenImage];
This call returns a CGImage of the representation that is appropriate for displaying full screen, rather than the biggest, best representation available, unadjusted in any way.
Doing so will save loads of memory.
Getting a memory warning (-didReceiveMemoryWarning) is not the same as having a memory leak. It just means you have a lot of memory allocated, and it's putting pressure on the system to the point where the OS interprets this as a potential problem that may occur soon.
A memory leak happens when you have unreferenced objects that did not get released. You can use the compiler's static analysis tool to see where potential leaks are. That won't find them all, so you can use Instruments to see where any others may be happening. But until you have checked with those tools, you can't say for sure that you have a leak that isn't obvious from looking at the code.
You didn't mention if your code is crashing, but if it is, it is not necessarily due to a memory leak. That could happen when the OS decides something has to be removed to reduce memory pressure.
UPDATE
Show the code for the class using ALAssetRepresentation. You may not be releasing something in there.
As user Picciano mentioned in his post, if possible you should use the [rep fullScreenImage]; call instead of requesting the full-size image. This will save a lot of space.
However, in my case this wasn't possible, because I need to send a higher-res image to an external server later.
What you can do is use the scale to resize it as much as possible:
// maxSize and orientation are defined elsewhere by the poster
CGFloat originalRatio = assetRepresentation.dimensions.width / assetRepresentation.dimensions.height;
CGFloat wantedRatio = maxSize.width / maxSize.height;
CGFloat scale = 1;
if (originalRatio < wantedRatio) {
    scale = maxSize.height / assetRepresentation.dimensions.height;
}
else {
    scale = maxSize.width / assetRepresentation.dimensions.width;
}
CGImageRef ref = [assetRepresentation fullResolutionImage];
UIImage *image = [UIImage imageWithCGImage:ref scale:scale orientation:orientation];
What this basically does is determine how much we can scale the image (bounded by maxSize). This saved us enough to prevent the memory problems.
