Photos framework requestImageDataForAsset occasionally fails - iOS

I'm using the Photos framework on iOS 8.1 and requesting the image data for an asset using requestImageDataForAsset... Most of the time it works and I get the image data plus a dictionary containing what you see below. But sometimes the call completes, yet the data is nil and the dictionary contains only three generic-looking entries.
The calls are performed sequentially and on the same thread. It is not specific to any particular image. The error will happen on images I've successfully opened in the past. Has anyone encountered this?
+ (NSData *)retrieveAssetDataPhotosFramework:(NSURL *)urlMedia resolution:(CGFloat)resolution imageOrientation:(ALAssetOrientation *)imageOrientation {
    __block NSData *iData = nil;
    PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[urlMedia] options:nil];
    PHAsset *asset = [result firstObject];
    PHImageManager *imageManager = [PHImageManager defaultManager];
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.synchronous = YES;
    options.version = PHImageRequestOptionsVersionCurrent;
    @autoreleasepool {
        [imageManager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
            iData = [imageData copy];
            NSLog(@"requestImageDataForAsset returned info(%@)", info);
            *imageOrientation = (ALAssetOrientation)orientation;
        }];
    }
    assert(iData.length != 0);
    return iData;
}
This is the desired result, where I get the image data and the dictionary of metadata:
requestImageDataForAsset returned info({
PHImageFileDataKey = <PLXPCShMemData: 0x1702214a0> bufferLength=1753088 dataLength=1749524;
PHImageFileOrientationKey = 1;
PHImageFileSandboxExtensionTokenKey = "6e14948c4d0019fbb4d14cc5e021199f724f0323;00000000;00000000;000000000000001a;com.apple.app-sandbox.read;00000001;01000003;000000000009da80;/private/var/mobile/Media/DCIM/107APPLE/IMG_7258.JPG";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/107APPLE/IMG_7258.JPG";
PHImageFileUTIKey = "public.jpeg";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultWantedImageFormatKey = 9999;
})
Here's what I get occasionally: the image data is nil and the dictionary contains much less.
requestImageDataForAsset returned info({
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultWantedImageFormatKey = 9999;
})

I had a problem with similar symptoms where requestImageDataForAsset returned nil image data but was also accompanied by a console error message like this:
[Generic] Failed to load image data for asset <PHAsset: 0x13d041940> 87CCAFDC-A0E3-4AC9-AD1C-3F57B897A52E/L0/001 mediaType=1/0, sourceType=2, (113x124), creationDate=2015-06-29 04:56:34 +0000, location=0, hidden=0, favorite=0 with format 9999
In my case, the problem suddenly started happening on a specific device, only with assets in iCloud shared albums, after upgrading from iOS 10.x to 11.0.3, and it has continued through 11.2.5. Thinking that maybe requestImageDataForAsset was trying to use files locally cached in /var/mobile/Media/PhotoData/PhotoCloudSharingData/ (from the info dictionary's PHImageFileURLKey key) and that the cache might be corrupt, I looked for a way to clear that cache.
Toggling the 'iCloud Photo Sharing' switch in iOS' Settings -> Accounts & Passwords -> iCloud -> Photos seems to have done the trick. requestImageDataForAsset is now working for those previously failing assets.
Update 9th March 2018
I can reproduce this problem now. It seems to occur after restoring a backup from iTunes:
1. Use the iOS app and retrieve photos from an iCloud shared album.
2. Back up the iOS device using iTunes.
3. Restore the backup using iTunes.
4. Using the app again to retrieve the same photos from the iCloud shared album now fails with the above console message.
Toggling the 'iCloud Photo Sharing' switch still fixes it. Presumably the restore process somehow corrupts some cache. I've reported it to Apple as bug 38290463.

You are likely iterating through an array and memory is not being freed in time; you can try the code below. Make sure theData is declared with __block.
@autoreleasepool {
    [imageManager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        NSLog(@"requestImageDataForAsset returned info(%@)", info);
        theData = [imageData copy];
    }];
}
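For context, the kind of loop I mean would look roughly like this; fetchResult and what you do with theData are placeholders, not code from the question:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.synchronous = YES;

for (PHAsset *asset in fetchResult) {              // fetchResult: your PHFetchResult of PHAssets
    __block NSData *theData = nil;                 // __block so the result handler can assign to it
    @autoreleasepool {                             // drain each iteration's temporaries before the next request
        [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                           options:options
                                                     resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
            theData = [imageData copy];
        }];
    }
    // use theData here (e.g. write it to disk) before the next iteration
}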

Getting back to this after a long while, I have solved a big part of my problem. No mystery, just bad code:
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[urlMedia] options:nil];
PHAsset *asset = [result firstObject];
if (asset != nil) { // the fix
    PHImageManager *imageManager = [PHImageManager defaultManager];
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    ...
}
The most common cause for me was a problem with the media URL passed to fetchAssetsWithALAssetURLs, which left asset nil and made requestImageDataForAsset return a default info object.

The following code may help. I think the PHImageRequestOptions class has a bug, so I pass nil for the options, which worked around the problem for me.
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
    assetModel.size = imageData.length;
    NSString *filename = [asset valueForKey:@"filename"];
    assetModel.fileName = filename;
    dispatch_semaphore_signal(sema);
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);

Related

iOS PhotoKit - PHAsset pixelWidth and pixelHeight do not match high-resolution image

My company is having a big problem getting correct size metadata from fetched PHAssets.
We have developed an iOS application that lets customers choose pictures from the library, gets the size (in pixels) of each of them, calculates coordinates for adjusting them to the gadgets we sell, then uploads a high-quality version of each picture to our server to print the gadgets.
For some of our customers, the problem is that the pixel size of some of the high-quality versions sent does not match the pixelWidth and pixelHeight given by the PHAsset object.
To make an example, we have a picture that:
is reported to be 2096x3724 by PHAsset object
but, when the full-size image is requested, a 1536x2730 picture is generated
The picture is not in iCloud, and it was sent from an iPhone 6 SE running iOS 10.2.
This is the code to get full size image version:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageForAsset:imageAsset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:imgOpts resultHandler:^(UIImage *result, NSDictionary *info) {
    NSData *imageData = UIImageJPEGRepresentation(result, 0.92f);
    // UPLOAD OF imageData TO SERVER HERE
}];
Also tried with requestImageDataForAsset method, but with no luck:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageDataForAsset:imageAsset options:imgOpts resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // UPLOAD OF imageData TO SERVER HERE
}];
Getting the exact size from the high-resolution version of every picture before uploading is not an option for us, because it would degrade performance a lot when a large number of assets are selected from the library.
Are we missing or doing something wrong?
Is there a way to get asset size in pixel without loading full-resolution image into memory?
Thanks for helping
This is due to a bug in the Photos framework. Details about the bug can be found here.
Sometimes, after a photo is edited, a smaller version is created. This only occurs for some larger photos.
Calling either requestImageForAsset: (with PHImageManagerMaximumSize) or requestImageDataForAsset: (with PHImageRequestOptionsDeliveryModeHighQualityFormat) will read the data from the smaller file version, when trying to retrieve the edited version (PHImageRequestOptionsVersionCurrent).
The info dictionary in the callback of the above methods will contain the path to the image. As an example:
PHImageFileURLKey = "file:///[...]DCIM/100APPLE/IMG_0006/Adjustments/IMG_0006.JPG";
Inspecting that folder, I was able to find another image, FullSizeRender.jpg - this one has the full size and contains the latest edits. Thus, one way would be to try and read from the FullSizeRender.jpg, when such a file is present.
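A rough sketch of that fallback, run inside the requestImageDataForAsset result handler, could look like the following. Keep in mind that both the "PHImageFileURLKey" string and the Adjustments folder layout are private, undocumented details that can change between iOS versions, so treat this only as a last resort and keep the normal code path as the default:
// Assumption: `info` is the dictionary passed to the result handler.
NSURL *renderedURL = info[@"PHImageFileURLKey"];                          // undocumented key
NSURL *fullSizeURL = [[renderedURL URLByDeletingLastPathComponent]
                      URLByAppendingPathComponent:@"FullSizeRender.jpg"]; // undocumented layout
NSData *fullSizeData = nil;
if (fullSizeURL && [[NSFileManager defaultManager] fileExistsAtPath:fullSizeURL.path]) {
    fullSizeData = [NSData dataWithContentsOfURL:fullSizeURL];
}
// Fall back to the imageData delivered by the Photos framework when fullSizeData is nil.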
Starting with iOS 9, it's also possible to fetch the latest edit, at highest resolution, using the PHAssetResourceManager:
// if (@available(iOS 9.0, *)) {
// check if a high quality edit is available
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:_asset];
PHAssetResource *hqResource = nil;
for (PHAssetResource *res in resources) {
    if (res.type == PHAssetResourceTypeFullSizePhoto) {
        // from my tests so far, this is only present for edited photos
        hqResource = res;
        break;
    }
}
if (hqResource) {
    PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
    long long fileSize = [[hqResource valueForKey:@"fileSize"] longLongValue];
    NSMutableData *fullData = [[NSMutableData alloc] initWithCapacity:fileSize];
    [[PHAssetResourceManager defaultManager] requestDataForAssetResource:hqResource options:options dataReceivedHandler:^(NSData * _Nonnull data) {
        // append the data that we're receiving
        [fullData appendData:data];
    } completionHandler:^(NSError * _Nullable error) {
        // handle completion, using `fullData` or `error`
        // uti == hqResource.uniformTypeIdentifier
        // orientation == UIImageOrientationUp
    }];
}
else {
    // use `requestImageDataForAsset:`, `requestImageForAsset:` or `requestDataForAssetResource:` with a different `PHAssetResource`
}
Can you try this to fetch Camera Roll pics:
__weak __typeof(self) weakSelf = self;
PHFetchResult<PHAssetCollection *> *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumSelfPortraits options:nil];
[albums enumerateObjectsUsingBlock:^(PHAssetCollection * _Nonnull album, NSUInteger idx, BOOL * _Nonnull stop) {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.wantsIncrementalChangeDetails = YES;
    options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    PHFetchResult<PHAsset *> *assets = [PHAsset fetchAssetsInAssetCollection:album options:options];
    if (assets.count > 0) {
        [assets enumerateObjectsUsingBlock:^(PHAsset * _Nonnull asset, NSUInteger idx, BOOL * _Nonnull stop) {
            if (asset != nil) {
                [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [weakSelf addlocalNotificationForFilters:result];
                        // [weakSelf.buttonGalery setImage:result forState:UIControlStateNormal];
                    });
                }];
                *stop = YES;
            }
            else {
                [weakSelf getlatestAferSelfie];
            }
        }];
    }
}];

Getting modified image metadata from iCloud Photo Library

I'm currently retrieving images (and associated metadata) from the iCloud Photo Library using the Photos framework (PhotoKit) and the following code:
PHAsset *asset = ...;
PHImageRequestOptions *options = [PHImageRequestOptions new];
options.networkAccessAllowed = YES;
options.version = PHImageRequestOptionsVersionCurrent;
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat; // ignored by requestImageDataForAsset
self.requestID = [self.imageManager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    if (imageData) {
        NSDictionary *metadata = nil;
        CGImageSourceRef sourceRef = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
        if (sourceRef) {
            CFDictionaryRef dictionaryRef = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
            if (dictionaryRef) {
                metadata = (__bridge NSDictionary *)dictionaryRef;
                CFRelease(dictionaryRef);
            }
            CFRelease(sourceRef);
        }
        NSLog(@"metadata = %@", metadata);
        self.requestID = PHInvalidAssetResourceDataRequestID;
    }
}];
The problem I'm having is that the metadata that's returned is from the original file. It does not contain any modifications for title, description, or keywords done with the Photos app (these items are stored in the {IPTC} category.)
The header documentation for -requestImageDataForAsset:options:resultHandler: states that PHImageRequestOptionsVersionCurrent can be used to retrieve the adjusted, rendered image, but this doesn't seem to apply to adjustments to the metadata.
Is there another way to get the modified metadata from the Photos framework? Customers who use the Photos app on the Mac to add titles, descriptions, and keywords to their images would like to see the results on iOS.

Generating video thumbnails of all videos from gallery gives memory issues

I am working on an application where I need to show thumbnails of all the videos in my gallery (viewed as a collection view). I am using AVAssetImageGenerator to generate the thumbnails from the videos in the gallery, but I am getting memory issues. Here is the code I am using:
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:nil];
PHImageManager *imageManager = [PHImageManager new];
for (NSInteger i = 0; i < fetchResult.count; i++) {
    __weak SAVideosViewController *weakSelf = self;
    [imageManager requestAVAssetForVideo:fetchResult[i] options:nil resultHandler:^(AVAsset * _Nullable asset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
        [collectionViewData addObject:asset];
        // my method to generate video thumbnail...
        [self generateThumbnailForAsset:asset];
        if (i == fetchResult.count - 1) {
            collectionViewDataFilled = YES;
            dispatch_async(dispatch_get_main_queue(), ^{
                [weakSelf.myCollectionView reloadData];
            });
        }
    }];
}
Here is the method called above:
- (void)generateThumbnailForAsset:(AVAsset *)asset_ {
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset_];
    CMTime time = CMTimeMakeWithSeconds(1, 1);
    CMTimeShow(time);
    CGImageRef img = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    if (img != nil) {
        NSLog(@"image");
        [thumbnails addObject:[UIImage imageWithCGImage:img]];
    }
    CGImageRelease(img);
}
I want to know why I'm getting memory issues here and how I can resolve them.
Seems like no one knows the correct answer, so here I am answering my own question after some research.
NOTE: To do such a thing, you do not need to use AVAssetImageGenerator. Use the following methods from the Photos framework instead.
PHCachingImageManager *cachingImageManager = [[PHCachingImageManager alloc] init];
[cachingImageManager startCachingImagesForAssets:collectionViewData targetSize:cellSize contentMode:PHImageContentModeAspectFill options:nil];
The class used here is PHCachingImageManager. Read about it in the Apple docs.
Then use this second method to retrieve images from the cache:
[cachingImageManager requestImageForAsset:collectionViewData[indexPath.row] targetSize:AssetGridThumbnailSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
    cell.myImageView.image = result;
}];
The callback handler gives an image that you can use in your collection view cells. Use this in the cellForItemAtIndexPath: method.
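A minimal sketch of that cell configuration could look like this; the cell class, reuse identifier, assets array, and AssetGridThumbnailSize are placeholders:
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    MyThumbnailCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"ThumbnailCell"
                                                                      forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];   // PHAsset array backing the collection view
    [self.cachingImageManager requestImageForAsset:asset
                                        targetSize:AssetGridThumbnailSize
                                       contentMode:PHImageContentModeAspectFill
                                           options:nil
                                     resultHandler:^(UIImage *result, NSDictionary *info) {
        // May fire twice: once with a degraded image, then with the final one.
        cell.myImageView.image = result;
    }];
    return cell;
}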
For complete code and a reference, refer to this sample by Apple:
https://developer.apple.com/library/ios/samplecode/UsingPhotosFramework/Introduction/Intro.html

How to generate thumbnails of images on iCloud Drive?

This appears easy, but the lack of documentation makes this question impossible to guess.
I have pictures and videos in my app's iCloud Drive folder and I want to create thumbnails of these assets. I am talking about assets on iCloud Drive, not the iCloud photo stream inside the Camera Roll. I am talking about the real iCloud Drive folder.
Creating thumbnails from videos is "easy" compared to images. You just need two weeks to figure out how it works, given the poor documentation Apple wrote, but thumbnails from images seem impossible.
What I have now is an array of NSMetadataItems each one describing one item on the iCloud folder.
These are the methods I have tried so far that don't work:
METHOD 1
[fileURL startAccessingSecurityScopedResource];
NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
__block NSError *error;
[coordinator coordinateReadingItemAtURL:fileURL
                                options:NSFileCoordinatorReadingImmediatelyAvailableMetadataOnly
                                  error:&error
                             byAccessor:^(NSURL *newURL) {
    NSDictionary *thumb;
    BOOL success = [newURL getResourceValue:&thumb forKey:NSURLThumbnailDictionaryKey error:&error];
    UIImage *thumbnail = thumb[NSThumbnail1024x1024SizeKey];
}];
[fileURL stopAccessingSecurityScopedResource];
The results of this method are fantastic. Ready for that? Here we go: success = YES, error = nil and thumbnail = nil.
ANOTHER METHOD
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:fileURL options:nil];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES;
CMTime time = CMTimeMake(0, 60); // time at which you want the frame
NSValue *timeValue = [NSValue valueWithCMTime:time];
[imageGenerator generateCGImagesAsynchronouslyForTimes:@[timeValue] completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    thumbnail = [[UIImage alloc] initWithCGImage:image];
}];
error = The requested URL was not found on this server. and thumbnail = nil
This method appears to be just for videos. I was trying this just in case. Is there any equivalent of this method for images?
PRIMITIVE METHOD
NSData *tempData = [NSData dataWithContentsOfURL:tempURL];
NOPE - data = nil
METHOD 4
The fourth possible method would be using ALAsset, but this was deprecated in iOS 9.
I think that all these methods fail because they just work (bug or not) if the resource is local. Any ideas on how to download the image so I can get the thumbnail?
Any other ideas?
thanks
EDIT: after several tests I see that Method 1 is the only one that seems to be in the right direction. This method works poorly, sometimes grabbing the icon but most of the time not working.
Another point is this: whatever people suggest, they always talk about downloading the whole image to get the thumbnail. I don't think this is the way to go. Just see how getting thumbnails of videos works: you don't download the whole video to get its thumbnail.
So this question remains open.
The Photos framework or AssetsLibrary will not work here, as you would first have to import your iCloud Drive photos into the photo library to use any methods of these two frameworks.
What you should look at is ImageIO:
Get the content of the iCloud Drive Photo as NSData and then proceed like this:
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)(imageData), NULL);
NSDictionary *thumbOpts = [NSDictionary dictionaryWithObjectsAndKeys:
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
                           [NSNumber numberWithInt:160], (id)kCGImageSourceThumbnailMaxPixelSize,
                           nil];
CGImageRef thumbImageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)thumbOpts);
UIImage *thumbnail = [[UIImage alloc] initWithCGImage:thumbImageRef];
// Release the Core Graphics objects once the UIImage has been created.
CGImageRelease(thumbImageRef);
CFRelease(source);
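One way to obtain that imageData when the file still lives in iCloud could be the following sketch, assuming fileURL comes from the NSMetadataItem (NSMetadataItemURLKey). My understanding is that a coordinated read waits for a pending iCloud download to finish, but that is worth verifying on a device:
// Kick off the download explicitly in case the item is not local yet.
NSError *downloadError = nil;
[[NSFileManager defaultManager] startDownloadingUbiquitousItemAtURL:fileURL error:&downloadError];

__block NSData *imageData = nil;
NSError *coordinationError = nil;
NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
[coordinator coordinateReadingItemAtURL:fileURL
                                options:NSFileCoordinatorReadingWithoutChanges
                                  error:&coordinationError
                             byAccessor:^(NSURL *newURL) {
    imageData = [NSData dataWithContentsOfURL:newURL];   // run this off the main thread
}];
// Then feed imageData to the CGImageSource code above.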
After testing several solutions, the one that seems to work better is this one:
[fileURL startAccessingSecurityScopedResource];
NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
__block NSError *error;
[coordinator coordinateReadingItemAtURL:fileURL
                                options:NSFileCoordinatorReadingImmediatelyAvailableMetadataOnly
                                  error:&error
                             byAccessor:^(NSURL *newURL) {
    NSDictionary *thumb;
    BOOL success = [newURL getResourceValue:&thumb forKey:NSURLThumbnailDictionaryKey error:&error];
    UIImage *thumbnail = thumb[NSThumbnail1024x1024SizeKey];
}];
[fileURL stopAccessingSecurityScopedResource];
This solution is not perfect. It will sometimes fail to produce the thumbnails, but I was not able to find any other solution that works 100%. The others are worse than this one.
This works for me. It is a little bit different:
func genereatePreviewForOnce(at size: CGSize, completionHandler: @escaping (UIImage?) -> Void) {
    _ = fileURL.startAccessingSecurityScopedResource()
    let fileCoordinator = NSFileCoordinator.init()
    fileCoordinator.coordinate(readingItemAt: fileURL, options: .immediatelyAvailableMetadataOnly, error: nil) { (url) in
        if let res = try? url.resourceValues(forKeys: [.thumbnailDictionaryKey]),
            let dict = res.thumbnailDictionary {
            let image = dict[.NSThumbnail1024x1024SizeKey]
            completionHandler(image)
        } else {
            fileURL.removeCachedResourceValue(forKey: .thumbnailDictionaryKey)
            completionHandler(nil)
        }
        fileURL.stopAccessingSecurityScopedResource()
    }
}
It looks like you are generating your thumbnail after the fact. If this is your document and you are using UIDocument, override fileAttributesToWriteToURL:forSaveOperation:error: to insert the thumbnail when the document is saved.
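A rough sketch of that override in a UIDocument subclass; thumbnailForCurrentState is a hypothetical helper that renders a small UIImage of the document's content:
- (NSDictionary *)fileAttributesToWriteToURL:(NSURL *)url
                            forSaveOperation:(UIDocumentSaveOperation)saveOperation
                                       error:(NSError **)outError {
    NSMutableDictionary *attributes =
        [[super fileAttributesToWriteToURL:url forSaveOperation:saveOperation error:outError] mutableCopy]
        ?: [NSMutableDictionary dictionary];
    UIImage *thumbnail = [self thumbnailForCurrentState];   // hypothetical helper
    if (thumbnail) {
        // The thumbnail is stored as a resource value on the saved file.
        attributes[NSURLThumbnailDictionaryKey] = @{ NSThumbnail1024x1024SizeKey : thumbnail };
    }
    return attributes;
}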

How to know if a PHAsset has been modified?

More specifically, how can you know whether a PHAsset's current version of the underlying asset is different from the original?
My user should only need to choose between the current or original asset when necessary. And then I need their answer for PHImageRequestOptions.version.
As of iOS 16, PHAsset has a hasAdjustments property which indicates if the asset has been edited.
For previous releases, you can get an array of data resources for a given asset via PHAssetResource API - it will have an adjustment data resource if that asset has been edited.
let isEdited = PHAssetResource.assetResources(for: asset).contains(where: { $0.type == .adjustmentData })
Note that if you want to actually work with a resource file, you have to fetch its data using a PHAssetResourceManager API. Also note that this method returns right away - there's no waiting for an async network request, unlike other answers here.
I have found two ways of checking PHAsset for modifications.
- (void)tb_checkForModificationsWithEditingInputMethodCompletion:(void (^)(BOOL))completion {
    PHContentEditingInputRequestOptions *options = [PHContentEditingInputRequestOptions new];
    options.canHandleAdjustmentData = ^BOOL(PHAdjustmentData *adjustmentData) { return YES; };
    [self requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if (completion) completion(contentEditingInput.adjustmentData != nil);
    }];
}

- (void)tb_checkForModificationsWithAssetPathMethodCompletion:(void (^)(BOOL))completion {
    PHVideoRequestOptions *options = [PHVideoRequestOptions new];
    options.deliveryMode = PHVideoRequestOptionsDeliveryModeFastFormat;
    [[PHImageManager defaultManager] requestAVAssetForVideo:self options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
        if (completion) completion([[asset description] containsString:@"/Mutations/"]);
    }];
}
EDIT: I was at the point where I needed the same functionality for PHAsset with an image. I used this:
- (void)tb_checkForModificationsWithAssetPathMethodCompletion:(void (^)(BOOL))completion {
    [self requestContentEditingInputWithOptions:nil completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        NSString *path = (contentEditingInput.avAsset) ? [contentEditingInput.avAsset description] : contentEditingInput.fullSizeImageURL.path;
        completion([path containsString:@"/Mutations/"]);
    }];
}
Take a look at PHImageRequestOptionsVersion:

PHImageRequestOptionsVersionCurrent
Request the most recent version of the image asset (the one that reflects all edits). The resulting image is the rendered output from all previously made adjustments.

PHImageRequestOptionsVersionUnadjusted
Request a version of the image asset without adjustments. If the asset has been edited, the resulting image reflects the state of the asset before any edits were performed.

PHImageRequestOptionsVersionOriginal
Request the original, highest-fidelity version of the image asset. The resulting image is the originally captured or imported version of the asset, regardless of any edits made.
If you ask the user before retrieving assets, you know which version the user specified. If you get a PHAsset from elsewhere, you can do a revertAssetContentToOriginal to get the original asset. PHAsset also has modificationDate and creationDate properties; you can compare these to tell whether a PHAsset has been modified.
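A tiny sketch of that date comparison; treat it only as a hint, since modificationDate can also change for reasons other than content edits:
// Heuristic only: a differing modificationDate does not guarantee a content edit.
BOOL possiblyEdited = (asset.modificationDate != nil) &&
                      ![asset.modificationDate isEqualToDate:asset.creationDate];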
I found this code to be the only one working for now, and it handles most of the edge cases. It may not be the fastest, but it works well for most image types. It requests the smallest possible original and modified images and compares their data content.
@implementation PHAsset (Utilities)

- (void)checkEditingHistoryCompletion:(void (^)(BOOL edited))completion
{
    PHImageManager *manager = [PHImageManager defaultManager];
    CGSize compareSize = CGSizeMake(64, 48);
    PHImageRequestOptions *requestOptions = [PHImageRequestOptions new];
    requestOptions.synchronous = YES;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
    requestOptions.version = PHImageRequestOptionsVersionOriginal;
    [manager requestImageForAsset:self
                       targetSize:compareSize
                      contentMode:PHImageContentModeAspectFit
                          options:requestOptions
                    resultHandler:^(UIImage *originalResult, NSDictionary *info) {
        UIImage *currentImage = originalResult;
        requestOptions.version = PHImageRequestOptionsVersionCurrent;
        [manager requestImageForAsset:self
                           targetSize:currentImage.size
                          contentMode:PHImageContentModeAspectFit
                              options:requestOptions
                        resultHandler:^(UIImage *currentResult, NSDictionary *info) {
            NSData *currData = UIImageJPEGRepresentation(currentResult, 0.1);
            NSData *orgData = UIImageJPEGRepresentation(currentImage, 0.1);
            if (completion) {
                // handle the case when neither image can be retrieved; it also means no edits
                if ((currData == nil) && (orgData == nil)) {
                    completion(NO);
                    return;
                }
                completion(([currData isEqualToData:orgData] == NO));
            }
        }];
    }];
}

@end
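For completeness, a minimal call site for the category above; the completion runs after both low-resolution renditions have been fetched and compared:
[asset checkEditingHistoryCompletion:^(BOOL edited) {
    if (edited) {
        // Ask the user whether they want the current (edited) or the original version,
        // then set PHImageRequestOptions.version accordingly before requesting the image.
    } else {
        // No edits: PHImageRequestOptionsVersionCurrent and ...VersionOriginal are equivalent here.
    }
}];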
