I am trying to upload image data to the server. The uploads succeed, but I get a memory spike on each image upload, and when uploading more than 20 images I receive memory warnings and the app shuts down.
How can I resolve this issue?
Edit:
I am using NSURLConnection.
image = [params valueForKey:key];
// Re-encode the image as JPEG at 0.8 quality and append it as a multipart part
partData = UIImageJPEGRepresentation([ImageUtility getQualityFilterImage:[params valueForKey:key]], 0.8f);
if ([partData isKindOfClass:[NSString class]])
    partData = [((NSString *)partData) dataUsingEncoding:NSUTF8StringEncoding];
[body appendData:partData];
// Closing multipart boundary
[body appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", boundary] dataUsingEncoding:NSUTF8StringEncoding]];
[urlRequest setHTTPBody:body];
A couple of thoughts:
If uploading multiple images, you might want to constrain how many you perform concurrently.
For example, this is my memory when I issued 20 uploads of roughly 5 MB per image at once: memory spiked up to 113 MB.
If, however, I constrained this to no more than 4 at a time (using operation queues; see the sketch below), memory usage improved, maxing out at 55 MB (roughly 38 MB over baseline).
Others have suggested that you might consider not doing more than one upload at a time, which reduces peak memory usage even further (in my example, peak usage was 27 MB, only 10 MB over baseline), but recognize that you pay a serious performance penalty for that.
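To illustrate the constraint, here is a minimal sketch of the operation-queue approach; uploadOperationForFileURL: is a hypothetical factory returning an NSOperation that performs a single upload, so substitute your own operation class:

// Hedged sketch: cap concurrent uploads with an operation queue.
NSOperationQueue *uploadQueue = [[NSOperationQueue alloc] init];
uploadQueue.maxConcurrentOperationCount = 4;  // at most 4 uploads in flight

for (NSURL *fileURL in imageFileURLs) {
    // uploadOperationForFileURL: is hypothetical; wrap one upload per operation
    [uploadQueue addOperation:[self uploadOperationForFileURL:fileURL]];
}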
If using NSURLSession, and you use an upload task with the fromFile option rather than loading the asset into an NSData, then even when doing 4 concurrently it used dramatically less memory, less than 1 MB over baseline.
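For reference, a minimal sketch of that upload-task-from-file approach (uploadURL and fileURL are placeholders for your endpoint and a local copy of the asset):

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"POST";
NSURLSession *session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]];
// fromFile: streams the body from disk, so the image bytes never sit in an NSData
NSURLSessionUploadTask *task = [session uploadTaskWithRequest:request fromFile:fileURL completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    // handle the server response or error here
}];
[task resume];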
I notice that you're using the Xcode gauges to analyze memory usage. I'd suggest using Instruments' Allocations tool instead. The reason I suggest this is not only that I believe it to be more accurate, but more importantly, you seem to have a pattern of memory not falling completely back down to your baseline after performing some actions. This suggests that you might have some leaks, and only Instruments will help you identify them.
I'd suggest watching the WWDC 2013 video Fixing Memory Issues or the WWDC 2012 video iOS App Performance: Memory, which illustrate (amongst other things) how to use Instruments to identify memory problems.
I notice that you're extracting the NSData from the UIImage objects. I don't know how you're getting the images, but if they're from your ALAssetsLibrary (or if you downloaded them from some other resource), you might want to grab the original asset rather than loading it into a UIImage and then creating an NSData from that. If you use UIImageJPEGRepresentation, you invariably either (a) make the NSData larger than the original asset (making the memory issue worse); or (b) reduce the quality unnecessarily in an effort to make the NSData smaller. Plus you strip out much of the metadata.
If you have to use UIImageJPEGRepresentation (e.g. it's a programmatically created UIImage), then so be it, do what you have to do. But if you have access to the original digital asset (e.g. see https://stackoverflow.com/a/27709329/1271826 for an ALAssetsLibrary example, sketched below), then use that: quality is maximized without making the asset larger than it needs to be.
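For illustration, a hedged sketch along the lines of that linked answer; assetURL is a placeholder (e.g. obtained from UIImagePickerControllerReferenceURL). Note that this still materializes an NSData, so writing the bytes to a temporary file and uploading from that file is better still:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    // Read the original, unmodified bytes of the asset
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    uint8_t *buffer = (uint8_t *)malloc((size_t)rep.size);
    NSError *error = nil;
    NSUInteger bytesRead = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:&error];
    NSData *data = [NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:YES];
    // upload `data`, or better, write it to a temp file and upload from there
} failureBlock:^(NSError *error) {
    NSLog(@"Asset lookup failed: %@", error);
}];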
Bottom line, you first want to make sure that you don't have any memory leaks, and then you want to minimize how much you hold in memory at any given time (not using any NSData if you can).
Memory peaks are harmless. You are going to get them when you perform heavy uploads and downloads. But memory should go back to its normal level later on, which is the case in the image you have posted. So I feel it is harmless and expected in your case.
You should only worry when your application uses memory and doesn't release it later.
If you are performing heavy uploads, have a look at the answer in this thread.
Can you show your code? Have you tried NSURLSession? I don't have problems with memory when I upload photos with NSURLSession.
NSURL *url = [NSURL URLWithString:@"SERVER"];
NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration defaultSessionConfiguration];
NSURLSession *session = [NSURLSession sessionWithConfiguration:configuration];
for (UIImage *image in self.arrayImages) {
    NSData *data = UIImagePNGRepresentation(image);
    NSURLSessionUploadTask *task = [session uploadTaskWithRequest:[NSURLRequest requestWithURL:url] fromData:data completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        // Handle the server response or error here
    }];
    [task resume];
}
Related
WHAT I'M DOING I am trying to get an audio file (could be up to an hour long, e.g. a podcast) that I've recorded with AVAudioRecorder uploaded to our backend. In addition to being uploaded to the server, the upload needs to be able to be "paused" and "resumed" if the user chooses. Because of this, I believe I need to use dataWithBytesNoCopy:buffer on the NSData class to achieve this.
WHERE I'M AT I know for a fact I can get the data using the passed self.mediaURL property:
if (self.mediaURL) {
    NSData *audioData = [NSData dataWithContentsOfURL:self.mediaURL];
    if (audioData) {
        [payloadDic setObject:audioData forKey:@"audioData"];
    }
}
However, this will not give me the desired functionality. I am trying to keep track of the bytes uploaded so that I can resume if the user pauses.
QUESTION How can I use the provided self.mediaURL to retrieve the file and calculate the byte length, as in this example?
Byte *buffer = (Byte *)malloc((long)audioFile.size);
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(long)rep.size error:nil];
NSMutableData *body = [NSMutableData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
Instead of making things more complicated for yourself by trying to reinvent the wheel, use what the system gives you. NSURLSession lets you do a background upload. You hand the task to the session (created using the background session configuration) and just walk away. The upload takes place in pieces, when it can. No "pause" or "resume" needed; the system takes care of everything. Your app doesn't even have to be running. If authentication is needed, your app will be woken up in the background as required. This architecture is just made for the situation you describe.
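A rough sketch of that architecture, assuming your endpoint is at uploadURL and self.mediaURL points at the recorded file; background sessions require file-based upload tasks, and results arrive via delegate methods rather than completion handlers:

NSURLSessionConfiguration *config = [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"com.example.audioUpload"];
NSURLSession *session = [NSURLSession sessionWithConfiguration:config delegate:self delegateQueue:nil]; // self adopts NSURLSessionTaskDelegate
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"POST";
// The system streams the file in pieces and continues even if the app is suspended
NSURLSessionUploadTask *task = [session uploadTaskWithRequest:request fromFile:self.mediaURL];
[task resume];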
If the problem is that you want random access to file data without having to read the whole thing into a massive NSData, use NSFileHandle.
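For example, a minimal sketch of reading one chunk at a given offset with NSFileHandle (resumeOffset and the chunk size are placeholders):

NSFileHandle *handle = [NSFileHandle fileHandleForReadingFromURL:self.mediaURL error:nil];
[handle seekToFileOffset:resumeOffset];               // skip the bytes already sent
NSData *chunk = [handle readDataOfLength:256 * 1024]; // read only the next 256 KB
[handle closeFile];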
Currently I'm using the following code to upload files to the server
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST" URLString:[[entity uploadUrl] absoluteString] parameters:entity.params constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    // Get file url
    [UploadModel getAassetUrl:entity.asset resultHandler:^(NSURL *fileUrl) {
        NSError *fileappenderror;
        // Append the file at fileUrl as a multipart part named "data"
        [formData appendPartWithFileURL:fileUrl name:@"data" error:&fileappenderror];
        if (fileappenderror) {
            [Sys MyLog:[fileappenderror localizedDescription]];
        }
    }];
} error:&urlRequestError];
/* getAassetUrl */
+ (void)getAassetUrl:(PHAsset *)mPhasset resultHandler:(void (^)(NSURL *imageUrl))dataResponse {
    PHImageRequestOptions *requestOption = [[PHImageRequestOptions alloc] init];
    requestOption.synchronous = YES;
    requestOption.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
    [[PHImageManager defaultManager] requestImageDataForAsset:mPhasset options:requestOption resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        dataResponse([info objectForKey:@"PHImageFileURLKey"]);
    }];
}
This approach works in the simulator but fails on a real device: empty files are uploaded to the server, most likely due to a failure to read from local storage.
Log shows the notice
Notice: Sandbox: MyApp(213) deny file-read-data
/private/var/mobile/Media/DCIM/101APPLE/IMG_1570.PNG
I believe this notice means the app can't access the file at the specified path.
I've also tried an alternative approach: uploading the file by appending the NSData returned from the PHAsset data request. But this approach is unusable for large media files, since the entire file is loaded into memory.
Any thoughts?
You shouldn't use requestImageDataForAsset(_:options:resultHandler:) for large files, because loading the entire media file into memory will quickly exhaust memory and crash the app. This typically means you shouldn't use it for large images or pretty much any video.
In my experience, attempting to upload directly from a PHAsset resource url will fail. Apple doesn't seem to grant us the permissions required to upload direct from PHAsset source files. See forum post here. This is a pain because it forces us to use a ton of extra disk space if we want to upload a video.
In order to get a local file url for a video file that you intend to upload, you'll want to use either:
requestExportSessionForVideo(_:options:exportPreset:resultHandler:)
or
requestAVAssetForVideo(_:options:resultHandler:)
You will use these methods to export a copy of the video file to a location on disk that you control, and upload from that file (see the sketch below). Bonus feature: both of these methods will download the file from iCloud if necessary.
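A hedged sketch of that export-then-upload flow using requestExportSessionForVideo; the preset and output path are illustrative, and the Photos and AVFoundation frameworks are assumed:

PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.networkAccessAllowed = YES; // fetch from iCloud if necessary

[[PHImageManager defaultManager] requestExportSessionForVideo:asset options:options exportPreset:AVAssetExportPresetPassthrough resultHandler:^(AVAssetExportSession *exportSession, NSDictionary *info) {
    // Export to a temp file we control, then upload from that URL
    exportSession.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.mov"]];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // upload from exportSession.outputURL (e.g. with an NSURLSession upload task)
        }
    }];
}];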
Check out the VimeoUpload library for details on all things related to video upload. Disclaimer: I'm one of the authors of the library.
Even if you're not uploading to Vimeo servers, you can use the PHAssetExportSessionOperation and ExportOperation classes included in VimeoUpload to do exactly what you're looking to do. See the repo README for details on obtaining a file url for a PHAsset. It also includes tools for obtaining a file url for an ALAsset.
If you're not interested in using PHAssetExportSessionOperation or ExportOperation, check out their implementations for details on how to use the Apple classes under the hood.
The NSData object returned by requestImageDataForAsset is memory-mapped, so the entire file is not loaded into memory. This method will therefore work without issues for images.
For videos, you should use the appropriate methods, requestExportSessionForVideo or requestAVAssetForVideo.
If you can limit your deployment target to iOS 9, you should also take a look at the methods of PHAssetResourceManager, sketched below.
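For completeness, a hedged sketch of the PHAssetResourceManager route (iOS 9+), which streams the original resource to a file you control without loading it into memory:

PHAssetResource *resource = [[PHAssetResource resourcesForAsset:asset] firstObject];
NSURL *destinationURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:resource.originalFilename]];
[[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource toFile:destinationURL options:nil completionHandler:^(NSError *error) {
    if (!error) {
        // upload from destinationURL
    }
}];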
I am using UIImageView+AFNetworking in order to download and display an image from a URL in a UITableView. Everything is working great, and my question is about the caching behaviour and allocations that come with this call.
Using Instruments to track memory, I see that as I scroll, my largest memory allocation quickly becomes this VM: Foundation, which appears to be caching the images I am downloading in some way.
This is great for the user when they view the same image again, but I never see the memory released (I have not been able to trigger a memory warning yet). I just want to make sure that this allocation will be released when needed. Below is the stack trace of those VM: Foundation allocations.
Is there any need for me to monitor this and release the memory for the cache when needed to ensure smooth scrolling? Or is there anything else I should be watching to ensure this is handled properly?
Here is how I am requesting the images in cellForRowAtIndexPath, calling setImageWithURLRequest. Any advice or improvements would be appreciated.
// NOTE: the original snippet formatted imageURL into itself; "imageName" here
// is a hypothetical stand-in for the per-row image identifier.
NSString *imageURL = [NSString stringWithFormat:@"https://IMAGE-URL/%@", imageName];
__weak MainTableViewCell *weakCell = cell;
NSURLRequest *urlRequest = [[NSURLRequest alloc] initWithURL:[NSURL URLWithString:imageURL]];
[cell.postImage setImageWithURLRequest:urlRequest placeholderImage:nil success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
    MainTableViewCell *strongCell = weakCell;
    strongCell.postImage.image = image;
    [UIView animateWithDuration:.15 animations:^{
        strongCell.postImage.alpha = 1;
    }];
} failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error) {
    // Handle the image download failure here
}];
You really shouldn't worry about clearing the image cache, as AFNetworking handles this for you automatically.
It stores images internally in an NSCache which, as the docs say:
incorporates various auto-removal policies, which ensure that it does not use too much of the system’s memory. The system automatically carries out these policies if memory is needed by other applications. When invoked, these policies remove some items from the cache, minimizing its memory footprint.
However, during AFNetworking's development there were some troubles with this automatic removal behaviour, so manual cache clearing was eventually added on receipt of UIApplicationDidReceiveMemoryWarningNotification. But this all happens internally, and you shouldn't do it yourself.
I just updated to AFNetworking 2.0, and I am rewriting my code to download data and insert it into Core Data.
I download JSON data files (anywhere from 10-200 MB each), write them to disk, then pass them off to background threads to process the data. Below is the code that downloads the JSON and writes it to disk. If I just let this run (without even processing the data), the app uses up memory until it is killed.
I assume that as the data comes in, it is stored in memory, but once I save it to disk, why would it stay in memory? Shouldn't the autorelease pool take care of this? I also set responseData and downloadData to nil. Is there something blatantly obvious that I am doing wrong here?
@autoreleasepool
{
    for (int i = 1; i <= totalPages; i++)
    {
        NSString *path = ....
        NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:path]];
        AFHTTPRequestOperation *op = [[AFHTTPRequestOperation alloc] initWithRequest:request];
        op.responseSerializer = [AFJSONResponseSerializer serializer];
        [op setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject)
        {
            // convert dictionary to data
            NSData *downloadData = [NSKeyedArchiver archivedDataWithRootObject:responseObject];
            // save to disk
            NSError *saveError = nil;
            if (![fileManager fileExistsAtPath:targetPath isDirectory:NULL])
            {
                [downloadData writeToFile:targetPath options:NSDataWritingAtomic error:&saveError];
                if (saveError != nil)
                {
                    NSLog(@"Download save failed! Error: %@", [saveError description]);
                }
            }
            responseObject = nil;
            downloadData = nil;
        } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
            DLog(@"Error: %@", error);
        }];
        [mutableOperations addObject:op];
    }
}
NSArray *operations = [AFURLConnectionOperation batchOfRequestOperations:mutableOperations progressBlock:^(NSUInteger numberOfFinishedOperations, NSUInteger totalNumberOfOperations) {
    DLog(@"%lu of %lu complete", (unsigned long)numberOfFinishedOperations, (unsigned long)totalNumberOfOperations);
} completionBlock:^(NSArray *operations) {
    DLog(@"All operations in batch complete");
}];
mutableOperations = nil;
[manager.operationQueue addOperations:operations waitUntilFinished:NO];
Thanks!
EDIT #1
Adding an @autoreleasepool within my completion block seemed to slow the memory usage a bit, but it still builds up and eventually crashes the app.
If your JSON files are really 10-200 MB each, this would definitely cause memory problems, because this sort of request loads the response into memory (rather than streaming it to persistent storage). Worse, because you're using JSON, the problem is roughly twice as bad: you're going to load the response into a dictionary/array, which also takes up memory. So if you have four 100 MB downloads going on, your peak memory usage could be on the order of 800 MB (100 MB for the NSData plus ~100 MB for the array/dictionary (possibly much larger), times four for the four concurrent requests). You could quickly run out of memory.
So, a couple of reactions:
When dealing with this volume of data, you'd want to pursue a streaming interface (an NSURLConnection or NSURLSessionDataTask where you write the data as it comes in rather than holding it in memory, or an NSURLSessionDownloadTask, which does this for you), one that writes the data directly to persistent storage rather than trying to hold it in an NSData in RAM as it's being downloaded.
If you use NSURLSessionDownloadTask, this is really simple. If you need to support iOS versions prior to 7.0, I'm not sure if AFNetworking supports streaming of the responses directly to persistent storage. I'd wager you could write your own response serializer that does that, but I haven't tried it. I've always written my own NSURLConnectionDataDelegate methods that download directly to persistent storage (e.g. something like this).
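For the NSURLSessionDownloadTask route, a minimal sketch (jsonURL and targetPath are placeholders); the response is streamed to a temporary file, so the 10-200 MB payload never sits in RAM:

NSURLSessionDownloadTask *task = [[NSURLSession sharedSession] downloadTaskWithURL:jsonURL completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
    if (!error) {
        // `location` is deleted when this handler returns, so move the file now
        [[NSFileManager defaultManager] moveItemAtURL:location toURL:[NSURL fileURLWithPath:targetPath] error:NULL];
    }
}];
[task resume];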
You might not want to use JSON for this (because NSJSONSerialization will load the whole resource into memory, and then parse it to a NSArray/NSDictionary, also in memory), but rather use a format that lends itself to streamed parsing of the response (e.g. XML) and write a parser that stores the data to your data store (Core Data or SQLite) as it's being parsed, rather than trying to load the whole thing in RAM.
Note, even NSXMLParser is surprisingly memory inefficient (see this question). In the XMLPerformance sample, Apple demonstrates how you can use the more cumbersome LibXML2 to minimize the memory footprint of your XML parser.
By the way, I don't know if your JSON includes any binary data that you have encoded (e.g. base 64 or the like), but if so, you might want to consider a binary transfer format that doesn't have to do this conversion. Using base-64 or uuencode or whatever can increase your bandwidth and memory requirements. (If you're not dealing with binary data that has been encoded, then ignore this point.)
As an aside, you might want to use Reachability to confirm the user's connection type (Wifi vs cellular), because it is considered bad form to download that much data over cellular (at least not without the user's permission), not only because of speed issues, but also the risk of using up an excessive portion of their carrier's monthly data plan. I've even heard that Apple historically rejected apps that tried to download too much data over cellular.
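If it helps, a minimal sketch of that check, assuming Apple's Reachability sample class is in the project:

Reachability *reachability = [Reachability reachabilityForInternetConnection];
if ([reachability currentReachabilityStatus] == ReachableViaWWAN) {
    // On cellular: ask the user before downloading, or defer until on Wi-Fi
}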
So here's the deal. I recently started using AFNetworking to download a few files on start using the following code:
NSMutableURLRequest *rq = [api requestWithMethod:@"GET" path:@"YOUR/URL/TO/FILE" parameters:nil];
AFHTTPRequestOperation *operation = [[[AFHTTPRequestOperation alloc] initWithRequest:rq] autorelease];
NSString *path = [@"/PATH/TO/APP" stringByAppendingPathComponent:imageNameToDisk];
// Stream the response straight to disk instead of holding it in memory
operation.outputStream = [NSOutputStream outputStreamToFileAtPath:path append:NO];
[operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    NSLog(@"SUCCESSFUL IMG RETRIEVE to %@!", path);
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    // Deal with failure
}];
With my path actually plugged into the path variable (sorry, I'm not at the right computer right now to copy-paste the exact text, but it's exactly the same as the above with different path values).
And everything is working great! I'm getting the file successfully downloaded. My current issue is that I'm trying to get caching to work, but I'm having a lot of difficulty. Basically, I'm not sure what I actually have to do client-side as of AFNetworking 2.0. Do I still need to set up the NSURLCache? Do I need to set the caching-type header on the request operation differently? I thought that maybe it was just entirely built in, but I'm receiving a status of 200 every time the code runs, even with no changes in the file. If I do have to use NSURLCache, do I have to manually save the ETag from the success block's request operation myself and then feed that back in? Any help on how to progress would be much appreciated. Thanks guys!
AFNetworking uses NSURLCache for caching by default. From the FAQ:
AFNetworking takes advantage of the caching functionality already provided by NSURLCache and any of its subclasses. So long as your NSURLRequest objects have the correct cache policy, and your server response contains a valid Cache-Control header, responses will be automatically cached for subsequent requests.
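For reference, a minimal sketch of the client-side setup this implies; the capacities are arbitrary examples, and this is commonly done early in application:didFinishLaunchingWithOptions::

NSURLCache *cache = [[NSURLCache alloc] initWithMemoryCapacity:4 * 1024 * 1024   // 4 MB in RAM
                                                  diskCapacity:64 * 1024 * 1024  // 64 MB on disk
                                                      diskPath:nil];
[NSURLCache setSharedURLCache:cache];

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:fileURL]; // fileURL: your download URL
request.cachePolicy = NSURLRequestUseProtocolCachePolicy; // honor Cache-Control / ETag headers

The rest (storing the response and revalidating with the ETag) happens inside NSURLCache, provided the server actually sends those headers.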
Note that this mechanism caches NSData, so every time you retrieve from this cache you need to perform a somewhat expensive NSData-to-UIImage operation. This is not performant enough for rapid display, for example if you're showing images in a UITableView or UICollectionView.
If this is the case, look at UIImageView+AFNetworking, which adds downloads and caching of UIImage objects to UIImageView. For some applications you can just use the out-of-the-box implementation, but it is very basic. You may want to look at the source code for this class (it's not very long) and use it as a starting point for your own caching mechanism.