Download Image with AFHTTP and save it to ALAsset - ios

Does anyone know how to save an image to the asset library? I know that by saving it to the asset library, the image will also show up in the iPad's Photos gallery.
I know how to get the file:
AFHTTPRequestOperation *requestOperation = [[AFHTTPRequestOperation alloc] initWithRequest:downloadRequest];
// How do I make the file path?
// Stream the response straight to disk (set this before the operation starts)...
requestOperation.outputStream = [NSOutputStream outputStreamToFileAtPath:filePath append:NO];
[requestOperation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    // ...or, without an output stream, write the in-memory response here:
    // [operation.responseData writeToFile:filePath atomically:YES];
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    // Handle the failed download
}];
[requestOperation start];

In order to write an image into the assets library, you can use any of the three methods below (defined in class ALAssetsLibrary):
– writeImageToSavedPhotosAlbum:orientation:completionBlock:
– writeImageDataToSavedPhotosAlbum:metadata:completionBlock:
– writeImageToSavedPhotosAlbum:metadata:completionBlock:
This requires that your image is represented as a UIImage, a CGImageRef, or an NSData object.
For example, you might save your image using an NSOutputStream associated with a temporary file. Then create and initialize a UIImage from that file and write it to the assets library using the first method above.
Alternatively, load the image as an NSData object (if it fits comfortably into memory) and again use the appropriate method above, along with any metadata.
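A minimal sketch of the NSData route, assuming downloadRequest is the NSURLRequest from the question and the response body is raw JPEG or PNG data:
AFHTTPRequestOperation *requestOperation = [[AFHTTPRequestOperation alloc] initWithRequest:downloadRequest];
[requestOperation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Pass the raw image bytes straight through; metadata may be nil.
    [library writeImageDataToSavedPhotosAlbum:operation.responseData
                                     metadata:nil
                              completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Saving to the assets library failed: %@", error);
        } else {
            NSLog(@"Saved to %@", assetURL);
        }
    }];
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    NSLog(@"Download failed: %@", error);
}];
[requestOperation start];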
See also: ALAssetsLibrary Class Reference
Edit:
If you want to get more information about how to setup the metadata parameter, you may find this blog post valuable: Adding metadata to iOS images the easy way.
And if you want to add geolocation to your image metadata, see this question on SO: Saving Geotag info with photo on iOS4.1

Related

PHAsset + AFNetworking. Unable to upload files to the server on a real device

Currently I'm using the following code to upload files to the server
NSError *urlRequestError;
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST" URLString:[[entity uploadUrl] absoluteString] parameters:entity.params constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    // Get the file URL
    [UploadModel getAassetUrl:entity.asset resultHandler:^(NSURL *fileUrl) {
        NSError *fileappenderror;
        // Append the file to the form data
        [formData appendPartWithFileURL:fileUrl name:@"data" error:&fileappenderror];
        if (fileappenderror) {
            [Sys MyLog:[fileappenderror localizedDescription]];
        }
    }];
} error:&urlRequestError];
/* getAassetUrl */
+ (void)getAassetUrl:(PHAsset *)mPhasset resultHandler:(void (^)(NSURL *imageUrl))dataResponse {
    PHImageRequestOptions *requestOption = [[PHImageRequestOptions alloc] init];
    requestOption.synchronous = YES;
    requestOption.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
    [[PHImageManager defaultManager] requestImageDataForAsset:mPhasset options:requestOption resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        dataResponse([info objectForKey:@"PHImageFileURLKey"]);
    }];
}
This approach works on the simulator but fails on a real device: empty files are uploaded to the server, most likely because the file can't be read from local storage.
The log shows this notice:
Notice: Sandbox: MyApp(213) deny file-read-data
/private/var/mobile/Media/DCIM/101APPLE/IMG_1570.PNG
I believe this means the app can't access the file at the specified path.
I've also tried an alternative approach: appending the NSData returned from the PHAsset data request. But that approach is unusable for large media files, since the entire file is loaded into memory.
Any thoughts?
You shouldn't use requestImageDataForAsset(_:options:resultHandler:) for large files. The reason is that you don't want to load the entire media file into memory; you'll quickly run out of memory and the app will crash. In practice this means you shouldn't use it for large images or pretty much any video.
In my experience, attempting to upload directly from a PHAsset resource url will fail. Apple doesn't seem to grant us the permissions required to upload direct from PHAsset source files. See forum post here. This is a pain because it forces us to use a ton of extra disk space if we want to upload a video.
In order to get a local file url for a video file that you intend to upload, you'll want to use either:
requestExportSessionForVideo(_:options:exportPreset:resultHandler:)
or
requestAVAssetForVideo(_:options:resultHandler:)
You will use these methods to export a copy of the video file to a location on disk that you control. And upload from that file. Bonus feature: both of these methods will download the file from iCloud if necessary.
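A rough sketch of the export-session route, assuming asset is the video's PHAsset and outputURL is a file URL in a directory your app controls (e.g. tmp):
// Requires the Photos and AVFoundation frameworks.
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.networkAccessAllowed = YES; // download from iCloud if necessary
[[PHImageManager defaultManager] requestExportSessionForVideo:asset
                                                      options:options
                                                 exportPreset:AVAssetExportPresetPassthrough
                                                resultHandler:^(AVAssetExportSession *exportSession, NSDictionary *info) {
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // Upload from outputURL; the app owns this file and can read it.
        }
    }];
}];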
Check out the VimeoUpload library for details on all things related to video upload. Disclaimer: I'm one of the authors of the library.
Even if you're not uploading to Vimeo servers, you can use the PHAssetExportSessionOperation and ExportOperation classes included in VimeoUpload to do exactly what you're looking to do. See the repo README for details on obtaining a file url for a PHAsset. It also includes tools for obtaining a file url for an ALAsset.
If you're not interested in using PHAssetExportSessionOperation or ExportOperation, check out their implementations for details on how to use the Apple classes under the hood.
The NSData object returned by requestImageDataForAsset is memory-mapped, so the entire file is not loaded into memory. This method will therefore work without any issues for images.
For videos you should use the appropriate methods requestExportSessionForVideo or requestAVAssetForVideo
If you can limit your deployment target to iOS 9, you should also take a look at the methods of PHAssetResourceManager
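For completeness, a hedged sketch of the iOS 9+ PHAssetResourceManager route, which streams the original resource to a file you control without loading it into memory (asset is a placeholder PHAsset):
PHAssetResource *resource = [[PHAssetResource assetResourcesForAsset:asset] firstObject];
PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
options.networkAccessAllowed = YES; // fetch from iCloud if necessary
NSURL *fileURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:resource.originalFilename]];
[[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource
                                                            toFile:fileURL
                                                           options:options
                                                 completionHandler:^(NSError *error) {
    if (!error) {
        // Upload from fileURL.
    }
}];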

PhotoKit iOS8 - Retrieve image using the "PHImageFileURLKey"

Is there any way I can use the path returned under "PHImageFileURLKey" to go into the photo library and retrieve the image?
The path returned is in this format:
"file:///var/mobile/Media/DCIM/102APPLE/IMG_2607.JPG"
My plan is to store this path in the database and use it to fetch the image when I need to get it back.
Any help is appreciated. Thank you!
I think your approach of retrieving the PhotoKit asset from that URL is wrong.
Here is what I would do (supposing you have access to PHAsset):
Store the localIdentifier:
PHAsset *asset = /*Your asset */
NSString *localIdentifier = asset.localIdentifier;
//Now save this local identifier, or an array of them
When it is time to retrieve them you simply do:
PHFetchResult *savedAssets = [PHAsset fetchAssetsWithLocalIdentifiers:savedLocalIdentifiers options:nil];
[savedAssets enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    // This gets called for every asset whose localIdentifier you saved
}];
If you only have access to “PHImageFileURLKey” then disregard this answer.
This isn't documented, so I'd strongly advise against using that URL for anything more than a prototype app. That said, this does appear to work:
dispatch_queue_t queue = dispatch_queue_create("photoLoadQueue", 0);
dispatch_async(queue, ^{
    NSURL *privateUrl = [NSURL URLWithString:@"file:///var/mobile/Media/DCIM/102APPLE/IMG_2607.JPG"];
    NSData *imageData = [NSData dataWithContentsOfURL:privateUrl];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = [UIImage imageWithData:imageData];
    });
});
Naturally you'll need to replace the string used to initiate the url with one which is valid for your phone.
There are probably a load of issues with doing this - it's just not how the framework is meant to be used. Here are some off the top of my head:
When running in the simulator, the root path changes regularly between launches of the app, so if you store absolute URLs like this, your database will quickly fill with dead URLs. That will be inconvenient to say the least.
Worse, the URLs for the images may change on a real device too. You have no control over this, and once they change, it's your app that looks broken when the user has to reselect the images.
You're not going to ever find out about changes to the PHAsset which the photo came from.
This may be circumventing user permission for photo access - what happens if your app's permission to access photos is revoked? This is probably an issue with lots of approaches to storing photos for later use, however.
You don't control the file - what if the user deletes it?
If I were you, I would retrieve the image properly from the Photos framework, using PHImageManager requestImageForAsset:targetSize:contentMode:options:resultHandler:, and store it in a file within your app's directory, at a sensible resolution for whatever you're doing with it. This still doesn't give you asset changes, but it is a pretty good solution.
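A minimal sketch of that approach, assuming asset is the PHAsset; the target size and file name are placeholders:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat; // one callback with the final image
[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:CGSizeMake(1024, 1024)
                                          contentMode:PHImageContentModeAspectFit
                                              options:options
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSString *path = [documents stringByAppendingPathComponent:@"cached-image.jpg"];
    // Persist a JPEG copy the app owns; re-read it later with +imageWithContentsOfFile:.
    [UIImageJPEGRepresentation(result, 0.9) writeToFile:path atomically:YES];
}];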
If you want to store the assets themselves and only request the images when you actually need them, it might be worth investigating transient asset collections, though I've not used them so that might not work for what you need.

Caching on AFNetworking 2.0

So here's the deal. I recently started using AFNetworking to download a few files on start using the following code:
NSMutableURLRequest *rq = [api requestWithMethod:@"GET" path:@"YOUR/URL/TO/FILE" parameters:nil];
AFHTTPRequestOperation *operation = [[[AFHTTPRequestOperation alloc] initWithRequest:rq] autorelease];
NSString *path = [@"/PATH/TO/APP" stringByAppendingPathComponent:imageNameToDisk];
operation.outputStream = [NSOutputStream outputStreamToFileAtPath:path append:NO];
[operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
    NSLog(@"SUCCESSFUL IMG RETRIEVE to %@!", path);
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    // Deal with failure
}];
With my actual path plugged into the path variable (sorry, I'm not on the right computer right now to copy-paste the exact text, but it's exactly the same as the above with different paths).
And everything is working great! I'm getting the file successfully downloaded and everything. My current issue is that I'm trying to get caching to work, and I'm having a lot of difficulty. Basically, I'm not sure what I actually have to do client-side as of AFNetworking 2.0. Do I still need to set up an NSURLCache? Do I need to set the caching-type header on the request operation differently? I thought that maybe it was just entirely built in, but I'm receiving a status of 200 every time the code runs, even with no changes to the file. If I do have to use NSURLCache, do I have to manually save the ETag from the success block's request operation myself and then feed that back in? Any help on how to proceed would be much appreciated. Thanks, guys!
AFNetworking uses NSURLCache for caching by default. From the FAQ:
AFNetworking takes advantage of the caching functionality already provided by NSURLCache and any of its subclasses. So long as your NSURLRequest objects have the correct cache policy, and your server response contains a valid Cache-Control header, responses will be automatically cached for subsequent requests.
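For example (a hedged sketch; the capacities and url variable are placeholders, and the server must still send a valid Cache-Control, ETag, or Last-Modified header):
// Configure a shared cache once, e.g. in application:didFinishLaunchingWithOptions:
NSURLCache *cache = [[NSURLCache alloc] initWithMemoryCapacity:4 * 1024 * 1024
                                                  diskCapacity:32 * 1024 * 1024
                                                      diskPath:nil];
[NSURLCache setSharedURLCache:cache];
// Then make sure the request's cache policy allows cached responses:
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
request.cachePolicy = NSURLRequestUseProtocolCachePolicy;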
Note that this mechanism caches NSData, so every time you retrieve from this cache you need to perform a somewhat expensive NSData-to-UIImage operation. This is not performant enough for rapid display, for example if you're showing images in a UITableView or UICollectionView.
If this is the case, look at UIImageView+AFNetworking, which adds downloads and caching of UIImage objects to UIImageView. For some applications you can just use the out-of-the-box implementation, but it is very basic. You may want to look at the source code for this class (it's not very long) and use it as a starting point for your own caching mechanism.
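Typical use of the category in a table view data source (a sketch; the cell layout and the imageURLStrings data source are placeholders):
#import "UIImageView+AFNetworking.h"

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell" forIndexPath:indexPath];
    NSURL *url = [NSURL URLWithString:self.imageURLStrings[indexPath.row]];
    // Downloads on a background queue, caches the decoded UIImage, and sets it when ready.
    [cell.imageView setImageWithURL:url placeholderImage:[UIImage imageNamed:@"placeholder"]];
    return cell;
}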

Cannot download images from remote server using SDWebImageManager on iOS

I want to build an image gallery with UICollectionView. I have a set of image URLs in an array, around 20-30 of them. I'm using SDWebImageManager to download the images, cache them, and display them in the collection view.
See my code below:
for (int i = 0; i < [imagePath count]; i++) {
    [manager downloadWithURL:[NSURL URLWithString:[imagePath objectAtIndex:i]]
                     options:0
                    progress:^(NSUInteger receivedSize, long long expectedSize) {
                    }
                   completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished) {
        if (image) {
            NSLog(@"SUCCESS");
            NSString *localKey = [NSString stringWithFormat:@"Item-%d", i];
            NSLog(@"%@", localKey);
            [[SDImageCache sharedImageCache] storeImage:image forKey:localKey];
        }
    }];
}
All it does is display the first image, the one whose URL is at index 0 in my imagePath array; the rest of the cells remain blank. The log prints SUCCESS and Item-0 just once, so I guess it's not getting any further: it downloads just one image (the first URL in the array). Please help me with this; I've been banging my head against it for a long time and I'm not sure I'm on the right track. Or please suggest other ways of building an image gallery from multiple URLs stored in an array.
I have to wonder what your cellForItemAtIndexPath is doing. If it's trying to retrieve the images from the cache directly, you might see precisely the behavior you describe (where the above code is running asynchronously and thus the images are not yet downloaded by the time cellForItemAtIndexPath tries to retrieve them from the cache).
Personally, I'd suggest that you do not even bother trying to populate your cache in advance. Just have cellForItemAtIndexPath just use the UIImageView+WebCache category method setImageWithURL and remove the for loop in your question altogether. It will automatically fill the cache for you and the problem of missing images will probably go away.
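A sketch of that simpler approach (assuming a custom cell class with an imageView property; GalleryCell and the reuse identifier are placeholders):
#import "UIImageView+WebCache.h"

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    GalleryCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"GalleryCell" forIndexPath:indexPath];
    NSURL *url = [NSURL URLWithString:self.imagePath[indexPath.item]];
    // SDWebImage downloads, caches (memory + disk), and sets the image for you.
    [cell.imageView setImageWithURL:url placeholderImage:[UIImage imageNamed:@"placeholder"]];
    return cell;
}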
If you're having trouble retrieving your images, you should use one of the SDWebImage methods that passes an NSError object back to the block. For example:
[cell.imageView setImageWithURL:url completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType) {
    if (error) {
        NSLog(@"%s: setImageWithURL error: %@", __FUNCTION__, error);
    }
}];
If the URL address is not valid, you may see a 404 error.
If you're determined to use your for loop, a couple of other thoughts:
BTW, you are logging only when image is non-nil. You might want an else clause for when the if fails (the error object can probably tell you something interesting). If an API provides an error object, you should avail yourself of it.
It's a little curious that you are taking a downloaded image and adding it to the cache under another key; now you have two copies of the image in your cache, which is a wasteful use of space. Once you've downloaded an image, it's already in the cache, and if you try to retrieve it again using the same URL, it will be pulled from the cache. There's no point in creating another entry under your own key.
If you're not going to use the progress block, you can just pass nil for that parameter.

iOS: Preserve JFIF/EXIF data when saving to camera roll

Original Problem
I am trying to save an image into the camera roll while preserving all the original EXIF/JFIF data. I'd like the image as saved to the camera roll to be identical to the original file byte-for-byte. When I save the image via [ALAssetsLibrary writeImageDataToSavedPhotosAlbum:metadata:completionBlock:], the original EXIF data is preserved, but JFIF is stripped out.
ALAssetsLibrary *library = [[[ALAssetsLibrary alloc] init] autorelease];
[library writeImageDataToSavedPhotosAlbum:imageData metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
    [self imageDidFinishSavingToCameraRollWithError:error];
}];
I tried using the iphone-exif project to parse out the JFIF data and explicitly pass it in via the metadata parameter:
EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData:imageData];
EXFJFIF *jfif = [jpegScanner jfif];
NSMutableDictionary *jfifMetadata = [[NSMutableDictionary alloc] init];
[jfifMetadata setObject:[jfif version] forKey:(NSString *)kCGImagePropertyJFIFVersion];
...
NSMutableDictionary *metadata = [[NSMutableDictionary alloc] init];
[metadata setObject:jfifMetadata forKey:(NSString*)kCGImagePropertyJFIFDictionary];
But doing so results in the same file, byte-for-byte, as passing a nil metadata dictionary, meaning the JFIF data is still being stripped by iOS.
According to Wikipedia:
Formally, the Exif and JFIF standards are incompatible. This is because both specify that their particular application segment (APP0 for JFIF, APP1 for Exif) must be the first in the image file. In practice, many programs and digital cameras produce files with both application segments included. This will not affect the image decoding for most decoders, but poorly designed JFIF or Exif parsers may not recognise the file properly.
Is there any way to save the original file to the camera roll byte-for-byte, or will iOS always ignore JFIF data if EXIF is present?
Update: 2/26/13
I've filed <rdar://13291591> with Apple. I also created a sample project demonstrating the issue: http://spolet.to/3J0l1u3w0R2e
The sample app has three buttons: one for a JPEG with JFIF data, one for a JPEG without JFIF data, and one for a PNG. When the button is tapped, the corresponding image will be hashed and saved to the photo library. The resulting ALAsset will then be hashed as well.
Results:
The ALAsset for the saved PNG has the same hash as the original file.
The ALAssets for the saved JPEGs do not have the same hashes as their original counterparts.
Digging into this, it appears that the 'Brightness Value', 'Components Configuration', 'Thumbnail Length' and 'Thumbnail Image' attributes within the ALAsset EXIF data have been modified by the OS. Further, the original JFIF data (for the image that had it) has been stripped.
Expected Results:
All three files should have the same hash when saved into the photo library.
