Memory problems while reading iPhone camera roll with ALAssetsLibrary - iOS

I'm trying to get the last picture from the iPhone camera roll. I use the following code:
UIImage* __block image = [[UIImage alloc] init];
ALAssetsLibrary* library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup* group, BOOL* stop) {
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];
    if ([group numberOfAssets] > 0) {
        [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:[group numberOfAssets]-1] options:NSEnumerationConcurrent usingBlock:^(ALAsset* alAsset, NSUInteger index, BOOL* innerStop) {
            if (alAsset) {
                ALAssetRepresentation* rawImage = [alAsset defaultRepresentation];
                image = [UIImage imageWithCGImage:[rawImage fullScreenImage]];
                [self doTheJobWithImage:image];
                // Inside doTheJobWithImage: a segue is also performed at the end
            }
        }];
    }
} failureBlock:^(NSError* error) {
    NSLog(@"Error: %@", [error localizedDescription]);
}];
It works, but with flaws (I'm using Instruments and Zombies to debug it):
It seems to read every picture inside the camera roll. So, first question: shouldn't enumerateAssetsAtIndexes: only retrieve the images at the specified indexes? What's wrong with my code?
On top of that, memory becomes a problem when there are a lot of pictures in the camera roll. My app crashes if there are too many, or, if it survives, the retrieved images are NEVER deallocated, so the second or third time I call this code it crashes anyway due to the memory build-up. So, second question: how do I force my code to deallocate everything after calling doTheJobWithImage:?
I'm using Xcode 4.6, iOS 6 and ARC. Thanks in advance.

After a few hours, I figured out that nothing is wrong with the code here. It is the right code to extract the last picture in the camera roll (if somebody ever needs it). The problem was inside doTheJobWithImage:. I solved it by removing
[self doTheJobWithImage:image];
from the enumeration block, storing the resulting image somewhere else, and THEN calling doTheJobWithImage: once the picture enumeration was done. Saying this just in case somebody else is interested.
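For illustration, a minimal sketch of that rearrangement, assuming a lastPhoto property to hold the image and relying on the enumeration block being invoked one last time with a nil asset when it finishes (as with enumerateAssetsUsingBlock:); the property name and the dispatch back to the main queue are my own additions, not part of the original post:
// Assumed property: @property (nonatomic, strong) UIImage *lastPhoto;
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:[group numberOfAssets]-1] options:0 usingBlock:^(ALAsset* alAsset, NSUInteger index, BOOL* innerStop) {
    if (alAsset) {
        ALAssetRepresentation* rawImage = [alAsset defaultRepresentation];
        // Only store the image here; no segue from inside the enumeration block
        self.lastPhoto = [UIImage imageWithCGImage:[rawImage fullScreenImage]];
    } else {
        // alAsset is nil once the enumeration has finished; now it is safe to do the heavy work
        dispatch_async(dispatch_get_main_queue(), ^{
            [self doTheJobWithImage:self.lastPhoto];
        });
    }
}];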

Related

iOS ALAsset is not refreshed after image is modified in photo library

My app can pick an image from the photo library. The first time the app is opened, it can get the latest image.
But if I open the app, send it to the background, open the photo library and modify an image, and then come back to the app, the photo the user chooses is not the latest (modified) image.
When I log the image URL in the assets, it shows the same one before and after modifying the image. Is it supposed to be the same? Or different after the image is modified?
Asset = ALAsset - Type:Photo,
URLs:assets-library://asset/asset.PNG?id=4E226E36-2D9C-449C-92AE-5036938603A9&ext=PNG
- (void)loadAssetsForGroup:(ALAssetsGroup*)group withCompletionHandler:(void (^)(NSArray*))completionHandler {
    __strong NSMutableArray* assets = [NSMutableArray array];
    [group enumerateAssetsUsingBlock:^(ALAsset* result, NSUInteger index, BOOL* stop) {
        if (!result) {
            completionHandler(assets);
            return;
        }
        [assets addObject:result];
    }];
}
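For context, a minimal call-site sketch; the library property, the assets array and the collection view reload are hypothetical consumers, not from the original question:
[self.library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup* group, BOOL* stop) {
    if (group) {
        [self loadAssetsForGroup:group withCompletionHandler:^(NSArray* assets) {
            // Keep the freshly enumerated assets and refresh the UI
            self.assets = assets;
            [self.collectionView reloadData];
        }];
    }
} failureBlock:^(NSError* error) {
    NSLog(@"Error enumerating groups: %@", error);
}];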

No slow motion effect in video exported from camera roll on iPhone 6/6+

I'm developing an app that works with video. It makes short movies from videos recorded in the app or exported from the camera roll. I need help with some unexpected behavior.
When I export a video recorded with Apple's slow motion effect, that effect is lost in the video inside my app.
This is reproduced on iPhone 6 and 6+, and I assume on iPhone 5s too. In the iPhone 5s/6/6+ Simulator everything is OK. To export the video I use the iOS SDK ALAssetsLibrary API, code:
NSMutableArray* allVideos = [[NSMutableArray alloc] init];
self.assetLibrary = [[ALAssetsLibrary alloc] init];
[self.assetLibrary enumerateGroupsWithTypes:ALAssetsGroupAll
                                 usingBlock:^(ALAssetsGroup* group, BOOL* stop1) {
    if (group) {
        [group setAssetsFilter:[ALAssetsFilter allVideos]];
        [group enumerateAssetsUsingBlock:^(ALAsset* asset, NSUInteger index, BOOL* stop2) {
            if (asset) {
                [allVideos addObject:asset];
            }
        }];
    }
    else {
        // sort by the most recently shot video
        self.view.videoAssetRepresentations = [allVideos sortedArrayUsingComparator:^NSComparisonResult(ALAsset* obj1, ALAsset* obj2) {
            return [[obj1 valueForProperty:ALAssetPropertyDate] timeIntervalSince1970] < [[obj2 valueForProperty:ALAssetPropertyDate] timeIntervalSince1970];
        }];
    }
}
                               failureBlock:^(NSError* error) {
    DbgLog(@"error enumerating AssetLibrary groups %@\n", error);
}];
To play the exported video I use an AVPlayer instance.
Please help me: how can I solve my problem?
PS: the Instagram app can do this, tested on iPhone 6. The exported video keeps its slow motion effect inside the Instagram app.
See: https://devforums.apple.com/message/1025773#1025773
It seems that you cannot do this with ALAssetsLibrary. However, with the new Photos framework on iOS 8+ you can use PHAssetMediaSubtypeVideoHighFrameRate.
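A minimal sketch of that approach, assuming iOS 8+ and the Photos framework; the mediaSubtypes check and the AVPlayerItem hookup are illustrative, not taken from the linked thread:
#import <Photos/Photos.h>
#import <AVFoundation/AVFoundation.h>

PHFetchOptions* options = [[PHFetchOptions alloc] init];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult* videos = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:options];

[videos enumerateObjectsUsingBlock:^(PHAsset* asset, NSUInteger idx, BOOL* stop) {
    // Slow-motion clips carry the high-frame-rate media subtype
    if (asset.mediaSubtypes & PHAssetMediaSubtypeVideoHighFrameRate) {
        PHVideoRequestOptions* videoOptions = [[PHVideoRequestOptions alloc] init];
        // The "current" version keeps the slow-motion edit applied to the asset
        videoOptions.version = PHVideoRequestOptionsVersionCurrent;
        [[PHImageManager defaultManager] requestAVAssetForVideo:asset options:videoOptions resultHandler:^(AVAsset* avAsset, AVAudioMix* audioMix, NSDictionary* info) {
            // For slow-motion clips this is typically an AVComposition that already contains
            // the variable-speed time mapping, so AVPlayer plays it back slowed down
            AVPlayerItem* item = [AVPlayerItem playerItemWithAsset:avAsset];
            // ... hand the item to an AVPlayer on the main queue
        }];
    }
}];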

iOS Photos framework unable to get image data / UTI

When requesting images from the Photos Framework I manage to get all but the last 64 correctly. The last ones always return nil for the dataUTI and imageData in the following code. Whilst attempting to figure out what was going on I found that the PHAsset knows exactly what the UTI is, but is reporting it to me as nil.
Anyone else seen this?
You can see I've made my code fall back to the asset's UTI when it's reported as nil, so that my app can determine whether it's a GIF or not, but this isn't an advisable way of doing it, and I never get the imageData anyway, so it's not a huge amount of help!
PHFetchOptions* fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult* allPhotosResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];
[allPhotosResult enumerateObjectsUsingBlock:^(PHAsset* asset, NSUInteger idx, BOOL* stop) {
    PHImageRequestOptions* options = [[PHImageRequestOptions alloc] init];
    options.synchronous = NO;
    options.networkAccessAllowed = YES;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData* imageData, NSString* dataUTI, UIImageOrientation orientation, NSDictionary* info) {
        NSString* val = [asset valueForKey:@"uniformTypeIdentifier"];
        if (!dataUTI) {
            dataUTI = val;
        }
    }];
}];
EDIT:
I forgot to mention that the missing images' creation dates aren't the most recent ones and seem spread out. Actually, even the Photos app doesn't seem to show them: looking at the neighbouring images of where their creation dates would place them, there doesn't seem to be anything there.
Not much of an answer here, so happy for someone else to take a bash at explaining it!
Looking at the creation dates of the missing assets, I managed to track one down in the Photos app that was missing from my app. It had a thumbnail, but when I selected it, it showed the circular download indicator to pull down the data; then, trying to open it in my app's Action Extension (which just lets you preview the GIF's animation in the Photos app or elsewhere), a popup appeared saying there was an error preparing it. I've not seen that before, but clearly something was going wonky with iCloud.
Previously I was requesting PHImageRequestOptionsVersionUnadjusted in my app, but switching it to PHImageRequestOptionsVersionOriginal seems to have fixed it...?
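In code terms the only change was the version flag on the request options, roughly like this (a hedged sketch, not the full original request code):
PHImageRequestOptions* options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;   // still allow iCloud downloads
// Requesting the original rather than the unadjusted version made dataUTI/imageData come back non-nil here
options.version = PHImageRequestOptionsVersionOriginal;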

How can I get the path of the most recently taken image on iOS?

In apps such as Tweetbot there is a "choose most recent picture" function. I would like to know what the path is for the most recent image in the camera roll.
I know that when I pick an image from camera roll manually, the code to retrieve said image looks like this:
UIImage* image = [info valueForKey:@"UIImagePickerControllerOriginalImage"];
What I do not know is what to pass to valueForKey: to get the newest picture. I googled extensively and found nada. I know the rules: I am supposed to show what I have tried, but I am at a loss as to what to try, as I am finding zilch in my research and am not familiar with file-path-related programming.
I read the docs and found nothing of use. Thank you!
In this example, the most recent photo is stored in the UIImage named latestPhoto. Hope this helps!
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];
    // Enumerate in reverse so the newest photo comes first
    [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
        if (alAsset) {
            ALAssetRepresentation *representation = [alAsset defaultRepresentation];
            UIImage *latestPhoto = [UIImage imageWithCGImage:[representation fullScreenImage]];
            // Stop after the first (most recent) asset so the rest of the roll is not enumerated
            *innerStop = YES;
            *stop = YES;
        }
    }];
} failureBlock:^(NSError *error) {
    // an error has occurred
}];
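Since the question asks for a path, note that ALAsset exposes an assets-library:// URL rather than a filesystem path; a hedged addition inside the same inner block could look like this:
// Inside the inner enumeration block, after obtaining alAsset:
NSURL *assetURL = [alAsset valueForProperty:ALAssetPropertyAssetURL];
// assetURL is an assets-library:// URL, not a file path; pass it back to
// ALAssetsLibrary (assetForURL:resultBlock:failureBlock:) to reload the image later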

How to modify image metadata ( EXIF ) in iOS without duplicating?

Currently the code I am using can write the updated metadata but creates a duplicate image. Here is the code :
if ([self.textView.text length] != 0 && ![self.userComments isEqualToString:self.textView.text])
{
    // This code works but creates a duplicate image
    NSMutableDictionary *userCommentDictionary = [NSMutableDictionary dictionary];
    [userCommentDictionary setValue:self.textView.text forKey:(NSString *)kCGImagePropertyExifUserComment];
    NSMutableDictionary *dict = [NSMutableDictionary dictionary];
    [dict setValue:userCommentDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

    ALAssetsLibrary *al = [[ALAssetsLibrary alloc] init];
    [al writeImageToSavedPhotosAlbum:[self.imageView.image CGImage]
                            metadata:dict
                     completionBlock:^(NSURL *assetURL, NSError *error) {
                         if (error == nil) {
                             NSLog(@"Image saved.");
                             self.userComments = self.textView.text;
                         } else {
                             NSLog(@"Error saving image.");
                         }
                     }];
}
Is there any way to avoid the duplication?
Thanks for your time.
As noted in the comment, I don't believe this is possible.
AssetsLibrary doesn't allow modifying the original asset at all; everything is saved as a new asset with a reference to the original.
With the new PhotoKit library in iOS 8 they do allow modifying the asset, but I don't see anything there that lets you modify the metadata either.
Taking a glance at ImageIO, there are methods to modify metadata, but again, nothing to save it back to the photo library.
With this, however, you could probably replace a file on disk with another one that has modified EXIF data.
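For illustration, a minimal ImageIO sketch that rewrites a file on disk with an updated EXIF user comment; srcFileURL and dstFileURL are assumed local file URLs, and this does not touch the photo library itself:
#import <ImageIO/ImageIO.h>

CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)srcFileURL, NULL);
NSMutableDictionary *metadata =
    [(__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL) mutableCopy];
NSMutableDictionary *exif =
    [metadata[(NSString *)kCGImagePropertyExifDictionary] mutableCopy] ?: [NSMutableDictionary dictionary];
exif[(NSString *)kCGImagePropertyExifUserComment] = @"Updated comment";
metadata[(NSString *)kCGImagePropertyExifDictionary] = exif;

CGImageDestinationRef destination =
    CGImageDestinationCreateWithURL((__bridge CFURLRef)dstFileURL, CGImageSourceGetType(source), 1, NULL);
// Copies the image data across and writes the merged properties as the new metadata
CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)metadata);
CGImageDestinationFinalize(destination);
CFRelease(source);
CFRelease(destination);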
Edit to elaborate:
According to the answers here, ALAssets provide a URL that does not point to a file on disk. I believe that means you have no way of getting the actual URL of the image in the photo library in order to overwrite it; the file-replacement approach only works outside the photo library.
I would suggest you file this as an enhancement request with Apple if it's that important; if many people ask for the same thing, they might add it in the future. It does seem that they don't want people messing with this stuff, though.
