When requesting images from the Photos framework I manage to get all but the last 64 correctly. The last ones always return nil for dataUTI and imageData in the following code. While trying to figure out what was going on, I found that the PHAsset knows exactly what the UTI is, yet it reports it to me as nil.
Anyone else seen this?
You can see I've made my code read the asset's UTI directly when it's reported as nil, so that my app can determine whether it's a GIF, but that isn't an advisable way of doing it, and I never get the imageData anyway, so it's not much help!
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];

PHFetchResult *allPhotosResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];
[allPhotosResult enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.synchronous = NO;
    options.networkAccessAllowed = YES;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;

    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // Fall back to the asset's (private) uniformTypeIdentifier when dataUTI comes back nil.
        NSString *val = [asset valueForKey:@"uniformTypeIdentifier"];
        if (!dataUTI)
        {
            dataUTI = val;
        }
    }];
}];
EDIT:
I forgot to mention that the missing images' creation dates aren't the most recent and seem spread out. Actually, even the Photos app doesn't seem to show them based on their creation dates; but looking at the neighbouring images, there doesn't seem to be anything missing at the positions where their creation dates would place them.
Not much of an answer here, so I'm happy for someone else to take a bash at explaining it!
Looking at the creation dates of the missing assets, I managed to track one down in the Photos app that was missing from my app. It had a thumbnail, and when I selected it the circular download indicator appeared to pull down the data, but when I then tried to open it in my app's Action Extension (which just lets you preview the GIF's animation in the Photos app or elsewhere), a popup appeared saying there was an error preparing it. I've not seen that before, but clearly something was going wonky with iCloud.
Previously I was requesting PHImageRequestOptionsVersionUnadjusted in my app, but switching it to PHImageRequestOptionsVersionOriginal seems to have fixed it...?
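For reference, a minimal sketch of the change, assuming the same request setup as in the code above (the surrounding enumeration is unchanged):

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.synchronous = NO;
options.networkAccessAllowed = YES;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
// Requesting the original rather than the unadjusted version is what appears to avoid the nil results.
options.version = PHImageRequestOptionsVersionOriginal; // was PHImageRequestOptionsVersionUnadjusted

[[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // With the original version, dataUTI and imageData no longer came back nil for the affected iCloud assets.
}];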
Related
I am using the Photos framework to select photos from the camera roll. After selecting the assets from the grid, I use PHImageManager to access each of the selected images and then store them in an array to show in a collection view of mine.
I am using this piece of code to achieve that:
-(void)extractFullSizeImagesFromAssets{
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.version = PHImageRequestOptionsVersionCurrent;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    options.resizeMode = PHImageRequestOptionsResizeModeExact;
    options.networkAccessAllowed = YES;

    for (int i = 0; i < self.assets.count; i++) {
        PHAsset *asset = [self.assets objectAtIndex:i];
        // Cast to CGFloat so the aspect ratio isn't truncated by integer division.
        CGSize fullSizeImage = CGSizeMake(1000, ((CGFloat)asset.pixelHeight / (CGFloat)asset.pixelWidth) * 1000);
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                    targetSize:fullSizeImage
                                                   contentMode:PHImageContentModeAspectFit
                                                       options:options
                                                 resultHandler:^(UIImage *image, NSDictionary *info) {
            // [self.arr_images addObject:image];
            [_arr_fullSizeImages addObject:image];
        }];
    }
}
Now my array "arr_fullSizeImages" contains the extracted images in a different, seemingly random order from the one in which I selected the assets. For example, if I select 5 images from the camera roll, then sometimes the image that was at index 3 in the camera roll ends up at index 5 in arr_fullSizeImages.
I am not able to track down the reason for this behaviour. Please help me identify the source of the mistake and how to solve it.
Thanks.
This is the expected behaviour, as requestImageForAsset executes asynchronously by default.
If you want synchronous behaviour (and no random order), just set
options.synchronous = YES;
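For context, a minimal sketch of how that fits the loop from the question (the target size here is just a placeholder; the rest mirrors the question's code):

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES;
// Synchronous requests return before the loop moves on, so results keep the selection order.
options.synchronous = YES;

for (PHAsset *asset in self.assets) {
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(1000, 1000)
                                               contentMode:PHImageContentModeAspectFit
                                                   options:options
                                             resultHandler:^(UIImage *image, NSDictionary *info) {
        if (image) {
            [_arr_fullSizeImages addObject:image];
        }
    }];
}

Note that synchronous requests block the calling thread, so it's best to run this loop off the main thread.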
This appears easy, but the lack of documentation makes it impossible to guess.
I have pictures and videos in my app's iCloud Drive and I want to create thumbnails of these assets. I am talking about assets on iCloud Drive, not the iCloud Photo Stream inside the camera roll. I am talking about the real iCloud Drive folder.
Creating thumbnails from videos is "easy" compared to images. You just need two weeks to figure out how it works, given the poor documentation Apple wrote, but thumbnails from images seem impossible.
What I have now is an array of NSMetadataItems each one describing one item on the iCloud folder.
These are the methods I have tried so far that don't work:
METHOD 1
[fileURL startAccessingSecurityScopedResource];

NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
__block NSError *error;
[coordinator coordinateReadingItemAtURL:fileURL
                                options:NSFileCoordinatorReadingImmediatelyAvailableMetadataOnly
                                  error:&error
                             byAccessor:^(NSURL *newURL) {
    NSDictionary *thumb;
    BOOL success = [newURL getResourceValue:&thumb forKey:NSURLThumbnailDictionaryKey error:&error];
    UIImage *thumbnail = thumb[NSThumbnail1024x1024SizeKey];
}];

[fileURL stopAccessingSecurityScopedResource];
The results of this method are fantastic. Ready for that? Here we go: success = YES, error = nil and thumbnail = nil.
ANOTHER METHOD
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:fileURL options:nil];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES;

CMTime time = CMTimeMake(0, 60); // time at which you want the thumbnail
NSValue *timeValue = [NSValue valueWithCMTime:time];
[imageGenerator generateCGImagesAsynchronouslyForTimes:@[timeValue] completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    thumbnail = [[UIImage alloc] initWithCGImage:image];
}];
The result: error = "The requested URL was not found on this server." and thumbnail = nil.
This method appears to be just for videos; I was trying it just in case. Is there any equivalent of this method for images?
PRIMITIVE METHOD
NSData *tempData = [NSData dataWithContentsOfURL:tempURL];
NOPE - data = nil
METHOD 4
The fourth possible method would be using ALAsset, but that was deprecated in iOS 9.
I think all these methods fail because they only work (bug or not) if the resource is local. Any ideas on how to download the image so I can get the thumbnail?
Any other ideas?
thanks
EDIT: after several tests, I see that Method 1 is the only one that seems to be in the right direction. It works poorly though, sometimes grabbing the icon but most of the time not working.
Another point: whatever people suggest, it always involves downloading the whole image to get the thumbnail. I don't think this is the way to go. Just look at how getting thumbnails of videos works: you don't download the whole video to get its thumbnail.
So this question remains open.
The Photos framework or AssetsLibrary will not work here, as you would have to import your iCloud Drive photos into the photo library first to use any methods of those two frameworks.
What you should look at is ImageIO:
Get the content of the iCloud Drive Photo as NSData and then proceed like this:
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);

NSDictionary *thumbOpts = [NSDictionary dictionaryWithObjectsAndKeys:
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                           (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
                           [NSNumber numberWithInt:160], (id)kCGImageSourceThumbnailMaxPixelSize,
                           nil];

CGImageRef thumbImageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)thumbOpts);
UIImage *thumbnail = [[UIImage alloc] initWithCGImage:thumbImageRef];

// Core Foundation objects are not managed by ARC, so release them explicitly.
CGImageRelease(thumbImageRef);
CFRelease(source);
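A sketch of how this could be wired up for an iCloud Drive item; the helper name and the coordinated read are my own additions, and it does read the full file data, consistent with the NSData approach above:

// Hypothetical helper: coordinate a read of the iCloud Drive file and build a thumbnail with ImageIO.
- (UIImage *)thumbnailForFileAtURL:(NSURL *)fileURL {
    __block UIImage *thumbnail = nil;
    NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
    NSError *coordinationError = nil;

    [fileURL startAccessingSecurityScopedResource];
    [coordinator coordinateReadingItemAtURL:fileURL
                                    options:0
                                      error:&coordinationError
                                 byAccessor:^(NSURL *newURL) {
        NSData *imageData = [NSData dataWithContentsOfURL:newURL];
        if (!imageData) {
            return; // The file may not have been downloaded yet.
        }
        CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
        if (!source) {
            return;
        }
        NSDictionary *thumbOpts = @{ (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
                                     (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                                     (id)kCGImageSourceThumbnailMaxPixelSize : @160 };
        CGImageRef thumbImageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)thumbOpts);
        if (thumbImageRef) {
            thumbnail = [[UIImage alloc] initWithCGImage:thumbImageRef];
            CGImageRelease(thumbImageRef);
        }
        CFRelease(source);
    }];
    [fileURL stopAccessingSecurityScopedResource];

    return thumbnail;
}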
After testing several solutions, the one that seems to work best is this one:
[fileURL startAccessingSecurityScopedResource];

NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
__block NSError *error;
[coordinator coordinateReadingItemAtURL:fileURL
                                options:NSFileCoordinatorReadingImmediatelyAvailableMetadataOnly
                                  error:&error
                             byAccessor:^(NSURL *newURL) {
    NSDictionary *thumb;
    BOOL success = [newURL getResourceValue:&thumb forKey:NSURLThumbnailDictionaryKey error:&error];
    UIImage *thumbnail = thumb[NSThumbnail1024x1024SizeKey];
}];

[fileURL stopAccessingSecurityScopedResource];
This solution is not perfect. It will sometimes fail to return the thumbnails, but I was not able to find any other solution that works 100%. The others are worse.
This works for me. It is a little bit different:
func generatePreviewForOnce(at size: CGSize, completionHandler: @escaping (UIImage?) -> Void) {
    _ = fileURL.startAccessingSecurityScopedResource()
    let fileCoordinator = NSFileCoordinator()
    fileCoordinator.coordinate(readingItemAt: fileURL, options: .immediatelyAvailableMetadataOnly, error: nil) { (url) in
        if let res = try? url.resourceValues(forKeys: [.thumbnailDictionaryKey]),
           let dict = res.thumbnailDictionary {
            let image = dict[.NSThumbnail1024x1024SizeKey]
            completionHandler(image)
        } else {
            fileURL.removeCachedResourceValue(forKey: .thumbnailDictionaryKey)
            completionHandler(nil)
        }
        fileURL.stopAccessingSecurityScopedResource()
    }
}
It looks like you are generating your thumbnail after the fact. If this is your document and you are using UIDocument, override fileAttributesToWriteToURL:forSaveOperation:error: to insert the thumbnail when the document is saved.
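A minimal sketch of that override, assuming a hypothetical UIDocument subclass with a thumbnailImage property holding a preview of the content:

- (NSDictionary *)fileAttributesToWriteToURL:(NSURL *)url
                            forSaveOperation:(UIDocumentSaveOperation)saveOperation
                                       error:(NSError **)outError {
    NSMutableDictionary *attributes = [[super fileAttributesToWriteToURL:url
                                                         forSaveOperation:saveOperation
                                                                    error:outError] mutableCopy];
    if (!attributes) {
        attributes = [NSMutableDictionary dictionary];
    }
    if (self.thumbnailImage) {
        // The thumbnail is stored with the document, so NSURLThumbnailDictionaryKey can later be read
        // via getResourceValue:forKey:error: without downloading the full file.
        attributes[NSURLThumbnailDictionaryKey] = @{ NSThumbnail1024x1024SizeKey : self.thumbnailImage };
    }
    return attributes;
}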
I've been stuck with this for a couple of days. I've been trying to get the image within the callback, but I always get nil. These are the options I used:
let options = PHImageRequestOptions()
options.deliveryMode = .HighQualityFormat
options.resizeMode = .None
I also tried with options set to nil, without any luck. This is the data I got in the info dictionary passed to the block.
[PHImageResultIsInCloudKey: 0,
PHImageResultDeliveredImageFormatKey: 9999,
PHImageFileURLKey: file:///var/mobile/Media/DCIM/100APPLE/IMG_0052.JPG,
PHImageResultRequestIDKey: 84,
PHImageResultIsDegradedKey: 0,
PHImageResultWantedImageFormatKey: 9999,
PHImageResultIsPlaceholderKey: 0,
PHImageFileSandboxExtensionTokenKey:
64b47b046511a340c57aa1e3e6e07994c1a13853;00000000;00000000;0000001a;com.apple.app-sandbox.read;;00000000;00000000;0000000000000000;/private/var/mobile/Media/DCIM/100APPLE/IMG_0051.JPG]
I also tried using requestImageForAsset and got the same result. I thought that by using requestImageDataForAsset I'd get more control over the data.
Also, I thought the file might only exist in the cloud, but it doesn't, as the PHImageResultIsInCloudKey value is set to 0; otherwise, I'd have downloaded it.
By the way, I am able to get a smaller version of the image at a predefined size of 200x200 inside another view; however, when I try to get the larger version, I get nil. I know that the image exists on the phone at a higher resolution (I can see it in the Photos app).
Any help will be appreciated.
I had a similar issue and tried setting the networkAccessAllowed option on the PHImageRequestOptions object to YES; that seemed to fix it. For me the issue was definitely that the images were in the cloud, as images in the camera roll worked fine but those in the cloud did not.
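For reference, an Objective-C sketch of the options that worked in my case (delivery and resize modes mirror the question's settings; treat the exact combination as an assumption):

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.resizeMode = PHImageRequestOptionsResizeModeNone;
// Allow Photos to download the full-size data from iCloud when it is not on the device.
options.networkAccessAllowed = YES;

[[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // imageData should no longer be nil for assets whose full-size data lives only in iCloud.
}];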
I had the same problem. I think there is a bug in the PHImageRequestOptions class, so I pass nil in the code below; it worked for me.
dispatch_semaphore_t sema = dispatch_semaphore_create(0);
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
    assetModel.size = imageData.length;
    NSString *filename = [asset valueForKey:@"filename"];
    assetModel.fileName = filename;
    dispatch_semaphore_signal(sema);
}];
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
I am using the following method to request a number of photos and add them to an array for later use:
-(void) fetchImages{
    self.assets = [[PHFetchResult alloc] init];
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    self.assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];

    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);

    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;

    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = self.assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
This works fine when the number of photos is less than 50. After that, memory jumps to 150-160 MB, I get the message Connection to assetsd was interrupted or assetsd died, and the app crashes.
How can I release the assets (PHFetchResult) from memory after I get the ones I want? (Do I need to?)
I would like to be able to add up to 150 photos.
Any ideas?
Thanks
You should not put the results from a PHFetchResult into an array. The idea of PHFetchResult is to point to many images in the Photos library without storing them all in RAM (I'm not sure exactly how it does this). Just use the PHFetchResult object like an array and it handles the memory issues for you. For example, connect a collection view controller directly to the PHFetchResult object and use PHImageManager to request images only for visible cells, etc.
From apple documentation:
"Unlike an NSArray object, however, a PHFetchResult object dynamically loads its contents from the Photos library as needed, providing optimal performance even when handling a large number of results."
https://developer.apple.com/library/ios/documentation/Photos/Reference/PHFetchResult_Class/
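A rough sketch of that pattern (the controller, cell class, and target size are placeholders of mine, not part of the documentation):

#import <UIKit/UIKit.h>
#import <Photos/Photos.h>

@interface PhotoCell : UICollectionViewCell
@property (nonatomic, strong) UIImageView *imageView;
@end

@implementation PhotoCell
@end

// Hypothetical grid controller backed directly by the PHFetchResult, with no intermediate array.
@interface PhotosGridViewController : UICollectionViewController
@property (nonatomic, strong) PHFetchResult *assets;
@end

@implementation PhotosGridViewController

- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section {
    return self.assets.count; // PHFetchResult answers count without loading any image data.
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    PhotoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell" forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];
    // Only request what the visible cell needs; Photos keeps the rest out of RAM.
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(640, 480)
                                               contentMode:PHImageContentModeAspectFit
                                                   options:nil
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        cell.imageView.image = result;
    }];
    return cell;
}

@end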
Your code inside the fetchImages method needs some refactoring; take a look at this suggestion:
-(void) fetchImages {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];

    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);

    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;

    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
But the problem is memory consumption. Let's do some calculations.
A single image, using ARGB with 4 bytes per pixel:
640 x 480 x 4 = 1.2 MB
And now you want to store 150 images in RAM, so:
150 x 1.2 MB = 180 MB
For example, an iPhone 4 with 512 MB of RAM will crash if you use more than about 300 MB, but it can be less if other apps are also consuming a lot of memory.
I think you should consider storing the images in files instead of keeping them in RAM.
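If you go that route, a possible sketch is to write each delivered image to a temporary JPEG and keep only the file URLs in memory (the file naming and compression quality below are my own choices):

// Write the requested image to a temporary JPEG file and remember only its URL.
- (NSURL *)writeImageToTemporaryFile:(UIImage *)image index:(NSInteger)index {
    NSString *fileName = [NSString stringWithFormat:@"videofy-%ld.jpg", (long)index];
    NSURL *fileURL = [[NSURL fileURLWithPath:NSTemporaryDirectory()] URLByAppendingPathComponent:fileName];
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.8); // a fraction of the 1.2 MB an uncompressed bitmap needs in RAM
    [jpegData writeToURL:fileURL atomically:YES];
    return fileURL;
}

Inside the resultHandler you would then store the URL returned by this helper in an array and reload each image with imageWithContentsOfFile: only when the video writer actually needs it.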
This might be intentional (can't tell without looking at the rest of your code), but self.photosToVideofy is never released: since you're accessing it in a block, the object to which you pass the block ([PHImageManager defaultManager]) will always have a reference to the array.
Try explicitly clearing your array when you're done with the images. The array itself still won't be released, but the objects it contains will be (or can be, if they're not referenced anywhere else).
The best solution is to remove the array from the block. But, that would require changing the logic of your code.
You have to set
photoRequestOptions.synchronous = NO;
instead of
photoRequestOptions.synchronous = YES;
Worked for me, iOS 10.2
Currently the code I am using can write the updated metadata but creates a duplicate image. Here is the code :
if ([self.textView.text length] != 0 && ![self.userComments isEqualToString:self.textView.text])
{
    // This code works but creates a duplicate image
    NSMutableDictionary *userCommentDictionary = [NSMutableDictionary dictionary];
    [userCommentDictionary setValue:self.textView.text forKey:(NSString *)kCGImagePropertyExifUserComment];

    NSMutableDictionary *dict = [NSMutableDictionary dictionary];
    [dict setValue:userCommentDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

    ALAssetsLibrary *al = [[ALAssetsLibrary alloc] init];
    [al writeImageToSavedPhotosAlbum:[self.imageView.image CGImage]
                            metadata:dict
                     completionBlock:^(NSURL *assetURL, NSError *error) {
                         if (error == nil) {
                             NSLog(@"Image saved.");
                             self.userComments = self.textView.text;
                         } else {
                             NSLog(@"Error saving image.");
                         }
                     }];
}
Is there any way to avoid the duplication?
Thanks for your time
As noted in the comment, I don't believe this is possible.
AssetsLibrary doesn't allow modifying of the original asset at all, everything is saved as a new asset with a reference to the original.
With the new PhotoKit library in iOS 8 they do allow modifying of the asset, but I do not see anything there that allows you to modify the metadata either.
Taking a glance at ImageIO there are methods to modify metadata, but again, nothing to save it to the photo library.
With this, however, you could probably replace a file on disk with another one with modified exif data.
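For the file-on-disk case, a sketch of what that could look like with ImageIO (the function and URLs are my own; this does not touch the photo library):

#import <ImageIO/ImageIO.h>

// Copy an image file to a new URL, overriding its EXIF user comment along the way.
static BOOL WriteCopyWithUserComment(NSURL *sourceURL, NSURL *destinationURL, NSString *comment) {
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)sourceURL, NULL);
    if (!source) {
        return NO;
    }

    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)destinationURL,
                                                                        CGImageSourceGetType(source), 1, NULL);
    if (!destination) {
        CFRelease(source);
        return NO;
    }

    // Properties passed here override the corresponding properties of the source image; the pixels are copied as-is.
    NSDictionary *metadata = @{ (NSString *)kCGImagePropertyExifDictionary :
                                    @{ (NSString *)kCGImagePropertyExifUserComment : comment } };
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)metadata);

    BOOL success = CGImageDestinationFinalize(destination);
    CFRelease(destination);
    CFRelease(source);
    return success;
}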
edit to elaborate:
According to the answers here, it seems that ALAssets provide a URL that does not point to the file on disk. I believe that means you have no way of getting the actual URL of the image in order to overwrite it, at least not within the photo library.
I would suggest you file this as an enhancement request with Apple if it's that important to you; if many people request the same thing, they might add it in the future! It does seem that they don't want people messing with this stuff, though.