Caching on AFNetworking 2.0 - iOS

So here's the deal. I recently started using AFNetworking to download a few files at startup using the following code:
NSMutableURLRequest *rq = [api requestWithMethod:@"GET" path:@"YOUR/URL/TO/FILE" parameters:nil];
AFHTTPRequestOperation *operation = [[[AFHTTPRequestOperation alloc] initWithRequest:rq] autorelease];
NSString *path = [@"/PATH/TO/APP" stringByAppendingPathComponent:imageNameToDisk];
operation.outputStream = [NSOutputStream outputStreamToFileAtPath:path append:NO];
[operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    NSLog(@"Successful image retrieve to %@!", path);
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    // Deal with failure
}];
With my actual path plugged into the path variable. (Sorry, I'm not at the right computer right now to copy-paste the exact text, but it's the same as the above with different path values.)
And everything is working great! The file downloads successfully. My current issue is that I'm trying to get caching to work, but I'm having a lot of difficulty. Basically, I'm not sure what I actually have to do client-side as of AFNetworking 2.0. Do I still need to set up the NSURLCache? Do I need to set the caching header on the request operation differently? I thought that maybe it was entirely built in, but I'm receiving a status of 200 every time the code runs, even with no changes in the file. If I do have to use NSURLCache, do I have to manually save the ETag from the success block's request operation myself and then feed that back in? Any help on how to progress would be much appreciated. Thanks!

AFNetworking uses NSURLCache for caching by default. From the FAQ:
AFNetworking takes advantage of the caching functionality already provided by NSURLCache and any of its subclasses. So long as your NSURLRequest objects have the correct cache policy, and your server response contains a valid Cache-Control header, responses will be automatically cached for subsequent requests.
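For example, here is a minimal sketch that sets up both halves of that: it configures a shared NSURLCache at launch and gives the request a cache-aware policy. The capacities and disk path are illustrative, and it reuses the `api` client from the question's code:

// Configure a shared URL cache once, e.g. in application:didFinishLaunchingWithOptions:.
NSURLCache *cache = [[NSURLCache alloc] initWithMemoryCapacity:4 * 1024 * 1024   // 4 MB in memory
                                                  diskCapacity:32 * 1024 * 1024  // 32 MB on disk
                                                      diskPath:@"af_url_cache"];
[NSURLCache setSharedURLCache:cache];

// Give the request a policy that honors the server's Cache-Control / ETag headers.
NSMutableURLRequest *rq = [api requestWithMethod:@"GET" path:@"YOUR/URL/TO/FILE" parameters:nil];
rq.cachePolicy = NSURLRequestUseProtocolCachePolicy;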
Note that this mechanism caches NSData, so every time you retrieve from this cache you need to perform a somewhat expensive NSData-to-UIImage operation. This is not performant enough for rapid display, for example if you're showing images in a UITableView or UICollectionView.
If this is the case, look at UIImageView+AFNetworking, which adds downloads and caching of UIImage objects to UIImageView. For some applications you can just use the out-of-the-box implementation, but it is very basic. You may want to look at the source code for this class (it's not very long) and use it as a starting point for your own caching mechanism.
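The out-of-the-box usage is essentially a one-liner; a minimal sketch (the URL, placeholder name, and `cell` are illustrative):

#import "UIImageView+AFNetworking.h"

// Downloads the image (or serves it from the category's in-memory UIImage cache) and sets it on the view.
[cell.imageView setImageWithURL:[NSURL URLWithString:@"http://example.com/image.png"]
               placeholderImage:[UIImage imageNamed:@"placeholder"]];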

Related

PHAsset + AFNetworking. Unable to upload files to the server on a real device

Currently I'm using the following code to upload files to the server
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST" URLString:[[entity uploadUrl] absoluteString] parameters:entity.params constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    // Get file url
    [UploadModel getAassetUrl:entity.asset resultHandler:^(NSURL *fileUrl) {
        NSError *fileappenderror;
        // Append
        [formData appendPartWithFileURL:fileUrl name:@"data" error:&fileappenderror];
        if (fileappenderror) {
            [Sys MyLog:[fileappenderror localizedDescription]];
        }
    }];
} error:&urlRequestError];
/* getAassetUrl */
+ (void)getAassetUrl:(PHAsset *)mPhasset resultHandler:(void (^)(NSURL *imageUrl))dataResponse {
    PHImageRequestOptions *requestOption = [[PHImageRequestOptions alloc] init];
    requestOption.synchronous = YES;
    requestOption.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
    [[PHImageManager defaultManager] requestImageDataForAsset:mPhasset options:requestOption resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        dataResponse([info objectForKey:@"PHImageFileURLKey"]);
    }];
}
This approach works on a simulator but fails on a real device: empty files are uploaded to the server, most likely due to a failure to read from local storage.
Log shows the notice
Notice: Sandbox: MyApp(213) deny file-read-data
/private/var/mobile/Media/DCIM/101APPLE/IMG_1570.PNG
I believe this notice means the app can't access the file at the specified path.
I've also tried an alternative approach: uploading the file by appending the NSData returned from the PHAsset data request. But this approach is unusable for large media files, since the entire file is loaded into memory.
Any thoughts?
You shouldn't use requestImageDataForAsset(_:options:resultHandler:) for large files. The reason is that you don't want to load the entire media file into memory; you will quickly run out of memory and the app will crash. This effectively means you shouldn't use it for large images or pretty much any video.
In my experience, attempting to upload directly from a PHAsset resource url will fail. Apple doesn't seem to grant us the permissions required to upload directly from PHAsset source files. See forum post here. This is a pain because it forces us to use a ton of extra disk space if we want to upload a video.
In order to get a local file url for a video file that you intend to upload, you'll want to use either:
requestExportSessionForVideo(_:options:exportPreset:resultHandler:)
or
requestAVAssetForVideo(_:options:resultHandler:)
You will use these methods to export a copy of the video file to a location on disk that you control, and upload from that file. Bonus feature: both of these methods will download the file from iCloud if necessary.
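A minimal sketch of the export-session route (the output path, file type, and preset are illustrative; `asset` is assumed to be a video PHAsset):

#import <Photos/Photos.h>
#import <AVFoundation/AVFoundation.h>

PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.networkAccessAllowed = YES; // pull the asset from iCloud if necessary

[[PHImageManager defaultManager] requestExportSessionForVideo:asset
                                                      options:options
                                                 exportPreset:AVAssetExportPresetPassthrough
                                                resultHandler:^(AVAssetExportSession *exportSession, NSDictionary *info) {
    // Export to a file we control, then upload from that file.
    exportSession.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.mov"]];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // Kick off the multipart upload from exportSession.outputURL here.
        }
    }];
}];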
Check out the VimeoUpload library for details on all things related to video upload. Disclaimer: I'm one of the authors of the library.
Even if you're not uploading to Vimeo servers, you can use the PHAssetExportSessionOperation and ExportOperation classes included in VimeoUpload to do exactly what you're looking to do. See the repo README for details on obtaining a file url for a PHAsset. It also includes tools for obtaining a file url for an ALAsset.
If you're not interested in using PHAssetExportSessionOperation or ExportOperation, check out their implementations for details on how to use the Apple classes under the hood.
The NSData object returned by requestImageDataForAsset is memory-mapped, so the entire file is not loaded into memory. This method will therefore work without any issues for images.
For videos you should use the appropriate methods requestExportSessionForVideo or requestAVAssetForVideo
If you can limit your deployment target to iOS 9, you should also take a look at the methods of PHAssetResourceManager.
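A minimal sketch of that route (iOS 9+; `asset` is assumed to be a PHAsset), which writes the original resource to a file without loading it into memory:

#import <Photos/Photos.h>

PHAssetResource *resource = [[PHAssetResource assetResourcesForAsset:asset] firstObject];
NSURL *fileURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:resource.originalFilename]];
PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
options.networkAccessAllowed = YES;

[[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource
                                                            toFile:fileURL
                                                           options:options
                                                 completionHandler:^(NSError *error) {
    if (!error) {
        // Upload from fileURL here.
    }
}];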

Uploading UIImage data to the server getting memory peaks?

I am trying to upload image data to the server. The images upload successfully, but I get memory peaks on each image upload. Also, when uploading more than 20 images, the app shuts down after receiving memory warnings.
How to resolve this issue?
Edit:
I am using NSURLConnection.
image = [params valueForKey:key];
partData = UIImageJPEGRepresentation([ImageUtility getQualityFilterImage:[params valueForKey:key]], 0.8f);
if ([partData isKindOfClass:[NSString class]])
    partData = [((NSString *)partData) dataUsingEncoding:NSUTF8StringEncoding];
[body appendData:partData];
[body appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", boundary] dataUsingEncoding:NSUTF8StringEncoding]];
[urlRequest setHTTPBody:body];
A couple of thoughts:
If uploading multiple images, you might want to constrain how many you perform concurrently.
For example, when I issued 20 uploads of roughly 5 MB per image, memory spiked up to 113 MB.
If, however, I constrained this to no more than four at a time (using operation queues), memory usage improved, maxing out at 55 MB (roughly 38 MB over baseline).
Others have suggested not doing more than one upload at a time, which reduces peak memory usage further (in my example, peak memory usage was 27 MB, only 10 MB over baseline), but recognize that you pay a serious performance penalty for that.
If you use NSURLSession with an upload task created from a file (the fromFile variant) rather than loading the asset into an NSData, even doing four uploads concurrently used dramatically less memory: less than 1 MB over baseline.
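A minimal sketch of that approach (the URL is illustrative; `fileURL` is assumed to point at the image already on disk):

NSURLSession *session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
request.HTTPMethod = @"POST";

// Uploading from a file streams the body from disk instead of holding it in RAM.
NSURLSessionUploadTask *task = [session uploadTaskWithRequest:request
                                                     fromFile:fileURL
                                            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    // Handle the server response here.
}];
[task resume];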
I notice that you're using the Xcode gauges to analyze memory usage. I'd suggest using Instruments' Allocations tool instead. Not only do I believe it to be more accurate, but more importantly, you seem to have a pattern of memory not falling completely back down to your baseline after performing some actions. This suggests that you might have some leaks, and only Instruments will help you identify them.
I'd suggest watching WWDC 2013 video Fixing Memory Issues or WWDC 2012 video iOS App Performance: Memory which illustrate (amongst other things) how to use Instruments to identify memory problems.
I notice that you're extracting the NSData from the UIImage objects. I don't know how you're getting the images, but if they're from your ALAssetsLibrary (or if you downloaded them from another resource), you might want to grab the original asset rather than loading it into a UIImage and then creating an NSData from that. If you use UIImageJPEGRepresentation, you invariably either (a) make the NSData larger than the original asset (making the memory issue worse); or (b) reduce the quality unnecessarily in an effort to make the NSData smaller. Plus you strip out much of the metadata.
If you have to use UIImageJPEGRepresentation (e.g. it's a programmatically created UIImage), then so be it, do what you have to do. But if you have access to the original digital asset (e.g. see https://stackoverflow.com/a/27709329/1271826 for ALAssetsLibrary example), then do that. Quality may be maximized without making the asset larger than it needs to be.
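A minimal sketch of grabbing the original bytes via ALAssetsLibrary, along the lines of the linked answer (`assetURL` is assumed to be the asset's library URL):

#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    uint8_t *buffer = malloc((size_t)rep.size);
    NSError *error = nil;
    NSUInteger length = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:&error];
    // Keeps the original encoding and metadata; no UIImageJPEGRepresentation pass.
    NSData *data = [NSData dataWithBytesNoCopy:buffer length:length freeWhenDone:YES];
    // Upload `data` (for very large assets, prefer writing the bytes to a file in
    // chunks and uploading from that file, per the memory discussion above).
} failureBlock:^(NSError *error) {
    NSLog(@"Asset lookup failed: %@", error);
}];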
Bottom line, you first want to make sure that you don't have any memory leaks, and then you want to minimize how much you hold in memory at any given time (not using any NSData if you can).
Memory peaks are harmless. You are going to get them when you are performing heavy uploads and downloads, but memory should go back to its normal level later on, which is the case in the image you posted. So I feel it is harmless and expected in your case.
You should only worry when your application uses memory and doesn't release it later.
If you are performing heavy uploads, have a look at the answer in this thread.
Can you show your code? Have you tried NSURLSession? I don't have memory problems when I upload photos with NSURLSession.
NSURL *url = [NSURL URLWithString:@"SERVER"];
NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration defaultSessionConfiguration];
NSURLSession *session = [NSURLSession sessionWithConfiguration:configuration];
for (UIImage *image in self.arrayImages) {
    NSData *data = UIImagePNGRepresentation(image);
    NSURLSessionUploadTask *task = [session uploadTaskWithRequest:[NSURLRequest requestWithURL:url] fromData:data completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        // Handle the server response here.
    }];
    [task resume];
}

Queue of AFHTTPRequestOperations creating Memory Buildup

I just updated to AFNetworking 2.0 and I am re-writing my code to download data & insert it into Core Data.
I download JSON data files (anywhere from 10-200 MB each), write them to disk, then pass them off to background threads to process the data. Below is the code that downloads the JSON and writes it to disk. If I just let this run (without even processing the data), the app uses up memory until it is killed.
I assume that as the data comes in, it is stored in memory, but once I save it to disk why would it stay in memory? Shouldn't the autorelease pool take care of this? I also set responseData and downloadData to nil. Is there something blatantly obvious that I am doing wrong here?
@autoreleasepool
{
    for (int i = 1; i <= totalPages; i++)
    {
        NSString *path = ....
        NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:path]];
        AFHTTPRequestOperation *op = [[AFHTTPRequestOperation alloc] initWithRequest:request];
        op.responseSerializer = [AFJSONResponseSerializer serializer];
        [op setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject)
        {
            // convert dictionary to data
            NSData *downloadData = [NSKeyedArchiver archivedDataWithRootObject:responseObject];
            // save to disk
            NSError *saveError = nil;
            if (![fileManager fileExistsAtPath:targetPath isDirectory:NULL])
            {
                [downloadData writeToFile:targetPath options:NSDataWritingAtomic error:&saveError];
                if (saveError != nil)
                {
                    NSLog(@"Download save failed! Error: %@", [saveError description]);
                }
            }
            responseObject = nil;
            downloadData = nil;
        } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
            DLog(@"Error: %@", error);
        }];
        [mutableOperations addObject:op];
    }
}
NSArray *operations = [AFURLConnectionOperation batchOfRequestOperations:mutableOperations progressBlock:^(NSUInteger numberOfFinishedOperations, NSUInteger totalNumberOfOperations) {
    DLog(@"%lu of %lu complete", (unsigned long)numberOfFinishedOperations, (unsigned long)totalNumberOfOperations);
} completionBlock:^(NSArray *operations) {
    DLog(@"All operations in batch complete");
}];
mutableOperations = nil;
[manager.operationQueue addOperations:operations waitUntilFinished:NO];
Thanks!
EDIT #1
Adding an @autoreleasepool within my completion block seemed to slow the memory usage a bit, but it still builds up and eventually crashes the app.
If your JSON files are really 10-200 MB each, this would definitely cause memory problems, because this sort of request loads the responses into memory (rather than streaming them to persistent storage). Worse, because you're using JSON, I think the problem is twice as bad: you're going to load this into a dictionary/array, which also takes up memory. So, if you have four 100 MB downloads going on, your peak memory usage could be on the order of 800 MB (100 MB for the NSData plus roughly 100 MB for the array/dictionary (possibly much larger), times four for the four concurrent requests). You could quickly run out of memory.
So, a couple of reactions:
When dealing with this volume of data, you'd want to pursue a streaming interface (an NSURLConnection or NSURLSessionDataTask where you write the data as it comes in, rather than holding it in memory; or an NSURLSessionDownloadTask, which does this for you) that writes the data directly to persistent storage, rather than trying to hold it in an NSData in RAM as it's being downloaded.
If you use NSURLSessionDownloadTask, this is really simple. If you need to support iOS versions prior to 7.0, I'm not sure if AFNetworking supports streaming of the responses directly to persistent storage. I'd wager you could write your own response serializer that does that, but I haven't tried it. I've always written my own NSURLConnectionDataDelegate methods that download directly to persistent storage (e.g. something like this).
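A minimal sketch of the download-task route (`path` is the URL string from your loop; `targetURL` is illustrative):

NSURLSession *session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]];
NSURLSessionDownloadTask *task = [session downloadTaskWithURL:[NSURL URLWithString:path]
                                            completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
    if (!error) {
        // The payload was streamed to a temporary file; move it somewhere
        // permanent before this block returns.
        [[NSFileManager defaultManager] moveItemAtURL:location toURL:targetURL error:NULL];
    }
}];
[task resume];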
You might not want to use JSON for this (because NSJSONSerialization will load the whole resource into memory, and then parse it to a NSArray/NSDictionary, also in memory), but rather use a format that lends itself to streamed parsing of the response (e.g. XML) and write a parser that stores the data to your data store (Core Data or SQLite) as it's being parsed, rather than trying to load the whole thing in RAM.
Note, even NSXMLParser is surprisingly memory inefficient (see this question). In the XMLPerformance sample, Apple demonstrates how you can use the more cumbersome LibXML2 to minimize the memory footprint of your XML parser.
By the way, I don't know if your JSON includes any binary data that you have encoded (e.g. base 64 or the like), but if so, you might want to consider a binary transfer format that doesn't have to do this conversion. Using base-64 or uuencode or whatever can increase your bandwidth and memory requirements. (If you're not dealing with binary data that has been encoded, then ignore this point.)
As an aside, you might want to use Reachability to confirm the user's connection type (WiFi vs. cellular), because it is considered bad form to download that much data over cellular (at least without the user's permission), not only because of speed issues, but also because of the risk of using up an excessive portion of their carrier's monthly data plan. I've even heard that Apple has historically rejected apps that tried to download too much data over cellular.

Download Image with AFHTTP and save it to ALAsset

Does anyone know how to save an image to the Asset Library? I know that by saving it to the Asset Library, the image will also be available in the Gallery of the iPad.
I know how to get the file:
AFHTTPRequestOperation *requestOperation = [[AFHTTPRequestOperation alloc] initWithRequest:downloadRequest];
// How to make the filepath?
requestOperation.outputStream = [NSOutputStream outputStreamToFileAtPath:filePath append:NO];
[requestOperation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    [operation.responseData writeToFile:filePath atomically:YES];
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    // Deal with failure
}];
In order to write an image into the assets library, you can use any of the three methods below (defined in class ALAssetsLibrary):
– writeImageToSavedPhotosAlbum:orientation:completionBlock:
– writeImageDataToSavedPhotosAlbum:metadata:completionBlock:
– writeImageToSavedPhotosAlbum:metadata:completionBlock:
This requires that your image is represented as a UIImage, a CGImageRef, or an NSData object.
For example, you might save your image using an NSOutputStream associated with a temporary file. Then create and initialize a UIImage with that file and write it into the assets library using one of the methods above.
Alternatively, load the image as an NSData object (if it fits nicely into memory) and again use the appropriate method above, along with meta information.
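A minimal sketch of that flow, assuming the download already finished and `filePath` points at the temporary file:

#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
UIImage *image = [UIImage imageWithContentsOfFile:filePath];
[library writeImageToSavedPhotosAlbum:image.CGImage
                          orientation:(ALAssetOrientation)image.imageOrientation
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (!error) {
        NSLog(@"Saved to the asset library at %@", assetURL);
    }
}];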
See also: ALAssetsLibrary Class Reference
Edit:
If you want to get more information about how to setup the metadata parameter, you may find this blog post valuable: Adding metadata to iOS images the easy way.
And if you want to add Geolocations to your image metadata on SO: Saving Geotag info with photo on iOS4.1

Empty the cache programmatically in iOS

Does anyone by any chance know how I can empty the cache memory of the iOS app that I am developing at the moment it goes to the background (applicationDidEnterBackground)? I have investigated NSCache, but I still don't understand how to retrieve the cache in order to remove/free it.
Is this what you're talking about?
[[NSURLCache sharedURLCache] removeAllCachedResponses];
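Since you want this to happen when the app is backgrounded, a minimal sketch in the app delegate:

- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Flush everything NSURLCache has stored for this app.
    [[NSURLCache sharedURLCache] removeAllCachedResponses];
}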
You can also modify the cache behavior of your requests to selectively cache responses. If you're using AFNetworking, by chance, you can use setCacheResponseBlock. E.g., in one project I set it to return nil for all large video and audio files, but allowed it to cache smaller image files.
[streamingOperation setCacheResponseBlock:^NSCachedURLResponse *(NSURLConnection *connection, NSCachedURLResponse *cachedResponse) {
    return nil; // Ensures we are not unnecessarily caching asset data to Cache.db
}];