Wanted to get some feedback to see what I might be missing here. Basically I use a UIImagePickerController to take a photo. When I am done I retrieve the image like this:
UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
After I have taken the photo, at a later time, I need to be able to load all the images in my camera roll and highlight the photo that I just took (I display all the images from the camera roll). Because these photos are different objects in memory (different view controllers), the only way I see to compare them is by comparing the actual data that represents the images, i.e.:
    NSData *alreadySelectedPhotoData = UIImageJPEGRepresentation(alreadySelectedPhoto.photoImage, 0.0);
    NSData *cameraRollPhotoData = UIImageJPEGRepresentation(cameraRollPhoto.photoImage, 0.0);
    if ([cameraRollPhotoData isEqualToData:alreadySelectedPhotoData]) {
        // do something here if they are equal (draw a border, etc.)
    }
However, the photos never actually compare equal this way, despite the fact that the images displayed are identical.
So I went back to the original code, did some digging and did a data and visual test:
    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
        __block UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
        __block PHObjectPlaceholder *placeholderAsset = nil;
        // save our new photo to the camera roll album (successfully)
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:photoTaken];
            changeRequest.creationDate = [NSDate date];
            placeholderAsset = changeRequest.placeholderForCreatedAsset;
        } completionHandler:^(BOOL success, NSError *error) {
            PHImageManager *manager = [PHImageManager defaultManager];
            PHImageRequestOptions *requestOptions = [PHImageRequestOptions new];
            requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
            requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
            CGFloat dimension = [UIScreen mainScreen].bounds.size.width / 3 * [UIScreen mainScreen].scale;
            CGSize targetSize = CGSizeMake(dimension, dimension);
            PHFetchResult *savedAssets = [PHAsset fetchAssetsWithLocalIdentifiers:@[placeholderAsset.localIdentifier] options:nil];
            [manager requestImageForAsset:savedAssets.firstObject targetSize:targetSize contentMode:PHImageContentModeAspectFill options:requestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
                // images are the 'same' but their NSData representations appear not to be. NSLog statement never executes.
                NSData *alreadySelectedPhotoData = UIImageJPEGRepresentation(photoTaken, 0.0);
                NSData *cameraRollPhotoData = UIImageJPEGRepresentation(result, 0.0);
                if ([cameraRollPhotoData isEqualToData:alreadySelectedPhotoData]) {
                    NSLog(@"images are equal");
                }
            }];
        }];
    }
So to summarize:
store the image that comes back from the info object in the picker delegate method
use this image and store it in the camera roll by using 'creationRequestForAssetFromImage'
retrieve the image we just stored by fetching its asset ('fetchAssetsWithLocalIdentifiers')
convert that asset back into an image (PHImageManager - requestImageForAsset)
convert both the original UIImage that was returned via the picker delegate and the image that came back from the camera roll to NSData objects.
Result: they are not equal, even though the images on the screen are exactly the same.
Conclusion: it seems to me that this call:
PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:photoTaken];
saves to the camera roll successfully, and I can verify that the image it displays is exactly the image I got from the UIImagePickerController delegate method (it looks the same), yet when both images are converted to NSData objects the comparison fails.
Does anyone have any idea what's going on here? Did I miss something, or is this a bug?
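Worth noting here: UIImageJPEGRepresentation re-encodes the bitmap on every call, and the photo library may itself re-encode on import, so two JPEG round trips of the "same" picture are not guaranteed to produce identical bytes. A minimal sketch of an identifier-based comparison that avoids encoding entirely (assuming the placeholder's localIdentifier is kept around in a hypothetical selectedLocalIdentifier variable):

    // Sketch only: compare assets by identity, not by pixel data.
    NSString *selectedLocalIdentifier = placeholderAsset.localIdentifier; // captured at creation time
    PHFetchResult *allPhotos = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    [allPhotos enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        if ([asset.localIdentifier isEqualToString:selectedLocalIdentifier]) {
            // this is the photo that was just taken -- highlight it
        }
    }];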
Related
I am working on an iOS app in which I have to fetch all the images from the iPhone gallery and then read their EXIF values. To get the EXIF values I need to get each image's data.
I am using following code for this:
    - (void)getExifDataFromImage:(NSInteger)index {
        PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
        options.synchronous = YES;
        options.resizeMode = PHImageRequestOptionsResizeModeFast;
        // 'asset' is assumed to be the PHAsset for 'index', fetched elsewhere
        [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                          options:options
                                                    resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
            CIImage *ciImage = [CIImage imageWithData:imageData];
            NSLog(@"Metadata : %@", ciImage.properties);
            exifCount++;
            NSDictionary *pro = ciImage.properties;
            NSDictionary *exifDic = [pro objectForKey:@"{Exif}"];
            NSDictionary *tiffDic = [pro objectForKey:@"{TIFF}"];
        }];
    }
All works fine for around 1200 images, but if we go beyond that we get memory issues and the application crashes.
I also tried getting the image with the code below and then finding the EXIF value of the resized image.
    [self.imageManager requestImageForAsset:asset targetSize:CGSizeMake(360, 360) contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self doSomething];
            dispatch_async(dispatch_get_main_queue(), ^{
                imageView.image = result;
            });
        });
    }];
But I am unable to get the exif value for the resized images.
Error : libBacktraceRecording.dylib: allocate_free_list_pages() -- virtual memory exhausted!
Please suggest how to handle this memory crash. Also, please suggest if there is another way to get the EXIF values.
You're loading autoreleased images with CIImage* ciImage = [CIImage imageWithData:imageData];. That will use up a lot of memory very quickly because the autorelease pool is not drained.
You can wrap the image creation and dictionary access in @autoreleasepool { /* your code */ }. Or just use CIImage* ciImage = [[CIImage alloc] initWithData:imageData]; to let ARC take care of freeing the memory.
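As a rough sketch of that suggestion applied to the loop body above (same variables as the question's code; exifCount is assumed to be an ivar):

    @autoreleasepool {
        // Everything created in here is released when the pool drains,
        // once per image instead of once per run-loop iteration.
        CIImage *ciImage = [[CIImage alloc] initWithData:imageData];
        NSDictionary *pro = ciImage.properties;
        NSDictionary *exifDic = [pro objectForKey:@"{Exif}"];
        NSDictionary *tiffDic = [pro objectForKey:@"{TIFF}"];
        exifCount++;
        // read what you need from exifDic / tiffDic before the pool drains
    }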
I am writing an app which, at the moment, should just be able to keep images in an album dedicated to the app. The user is able to add pictures to the album from inside the app, either by choosing an existing image or by taking a new one with the camera. This functionality is however very slow, so I am wondering if I am doing it correctly.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
calls this method with the chosen UIImage:
    - (void)addPictureAndUpdate:(UIImage *)image {
        PHFetchOptions *albumFetchOptions = [PHFetchOptions new];
        albumFetchOptions.predicate = [NSPredicate predicateWithFormat:@"%K like %@", @"title", self.albumName];
        PHFetchResult *album = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:albumFetchOptions];
        PHAssetCollection *assetCollection = album.firstObject;
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
            PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
            [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
        } completionHandler:^(BOOL success, NSError *error) {
            if (!success) {
                NSLog(@"Error creating asset: %@", error);
            } else {
                // Add PHAsset to datasource
                // Update view
            }
        }];
    }
This (the completion of the change request) is very slow: 20-30 seconds, and even more at times.
What am I doing wrong? I am quite new to iOS development, and obviously to the new Photos framework, and I really want to learn how to do this.
Would it be smarter to separate the two things, showing the image and moving it? At the moment, I am storing a PHAsset for each image, which is then loaded at the requested size when needed (a thumbnail size for showing in the view and the original size when it is shown fullscreen). Would it be smarter to always just store a UIImage, and then change the size of that? That way, I would be able to make the request to add the asset to the album and immediately show it, as I would just add the UIImage to the datasource.
My main concerns about this are memory and image scaling. Would storing UIImages for an entire album be too memory-heavy for an app? And is it easy to resize a UIImage for display?
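For what it's worth on the last point, resizing a UIImage for display is only a few lines; here is a rough sketch using a bitmap context (ResizedImage is a hypothetical helper, not a Photos framework call):

    // Sketch: downscale a UIImage for thumbnail display.
    static UIImage *ResizedImage(UIImage *image, CGSize targetSize) {
        UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0); // 0.0 = use screen scale
        [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
        UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return resized;
    }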
Thank you
This appears easy, but the lack of documentation makes the answer impossible to guess.
I have pictures and videos on my app's iCloud Drive and I want to create thumbnails of these assets. I am talking about assets on iCloud Drive, not the iCloud photo stream inside the camera roll; I mean the real iCloud Drive folder.
Creating thumbnails from videos is "easy" compared to images. You just need two weeks to figure out how it works, given the poor documentation Apple wrote, but thumbnails from images seem impossible.
What I have now is an array of NSMetadataItems each one describing one item on the iCloud folder.
These are the methods I have tried so far that don't work:
METHOD 1
    [fileURL startAccessingSecurityScopedResource];
    NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
    __block NSError *error;
    [coordinator coordinateReadingItemAtURL:fileURL
                                    options:NSFileCoordinatorReadingImmediatelyAvailableMetadataOnly
                                      error:&error
                                 byAccessor:^(NSURL *newURL) {
        NSDictionary *thumb;
        BOOL success = [newURL getResourceValue:&thumb forKey:NSURLThumbnailDictionaryKey error:&error];
        UIImage *thumbnail = thumb[NSThumbnail1024x1024SizeKey];
    }];
    [fileURL stopAccessingSecurityScopedResource];
The results of this method are fantastic. Ready for that? Here we go: success = YES, error = nil and thumbnail = nil.
ANOTHER METHOD
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:fileURL options:nil];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    CMTime time = CMTimeMake(0, 60); // the time at which you want the frame
    NSValue *timeValue = [NSValue valueWithCMTime:time];
    [imageGenerator generateCGImagesAsynchronouslyForTimes:@[timeValue] completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        thumbnail = [[UIImage alloc] initWithCGImage:image];
    }];
Result: error = "The requested URL was not found on this server." and thumbnail = nil.
This method appears to be just for videos; I was trying it just in case. Is there any equivalent of this method for images?
PRIMITIVE METHOD
    NSData *tempData = [NSData dataWithContentsOfURL:tempURL];
NOPE - data = nil
METHOD 4
The fourth possible method would be using ALAsset, but that was deprecated in iOS 9.
I think all these methods fail because they only work (bug or not) if the resource is local. Any ideas on how to download the image so I can get the thumbnail?
Any other ideas?
thanks
EDIT: after several tests I see that Method 1 is the only one that seems to be in the right direction. It works poorly, though, sometimes grabbing the icon but most of the time not working.
Another point: whatever people suggest, it always involves downloading the whole image to get the thumbnail. I don't think this is the way to go. Just look at how getting thumbnails of videos works: you don't download the whole video to get its thumbnail.
So this question remains open.
The Photos framework or AssetsLibrary will not work here, as you would first have to import your iCloud Drive photos into the photo library to use any methods of these two frameworks.
What you should look at is ImageIO:
Get the content of the iCloud Drive Photo as NSData and then proceed like this:
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    NSDictionary *thumbOpts = @{
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @160
    };
    CGImageRef thumbImageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)thumbOpts);
    UIImage *thumbnail = [[UIImage alloc] initWithCGImage:thumbImageRef];
    // Core Foundation objects are not managed by ARC; release them explicitly.
    CGImageRelease(thumbImageRef);
    CFRelease(source);
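The "get the content of the iCloud Drive photo as NSData" step deserves a sketch of its own, since a non-local file has to be downloaded first. Assuming fileURL is the security-scoped URL taken from the NSMetadataItem (the waiting logic is elided; in real code you would watch NSMetadataQuery updates until the download finishes):

    // Ask the system to materialize the file if it only exists in the cloud.
    NSError *downloadError = nil;
    [[NSFileManager defaultManager] startDownloadingUbiquitousItemAtURL:fileURL
                                                                  error:&downloadError];
    // Once the item has finished downloading:
    [fileURL startAccessingSecurityScopedResource];
    NSData *imageData = [NSData dataWithContentsOfURL:fileURL];
    [fileURL stopAccessingSecurityScopedResource];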
After testing several solutions, the one that seems to work best is this one:
    [fileURL startAccessingSecurityScopedResource];
    NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] init];
    __block NSError *error;
    [coordinator coordinateReadingItemAtURL:fileURL
                                    options:NSFileCoordinatorReadingImmediatelyAvailableMetadataOnly
                                      error:&error
                                 byAccessor:^(NSURL *newURL) {
        NSDictionary *thumb;
        BOOL success = [newURL getResourceValue:&thumb forKey:NSURLThumbnailDictionaryKey error:&error];
        UIImage *thumbnail = thumb[NSThumbnail1024x1024SizeKey];
    }];
    [fileURL stopAccessingSecurityScopedResource];
This solution is not perfect; it will sometimes fail to produce the thumbnail, but I was not able to find any other solution that works 100% of the time. The others are worse.
This works for me. It is a little bit different:
    func generatePreviewForOnce(at size: CGSize, completionHandler: @escaping (UIImage?) -> Void) {
        _ = fileURL.startAccessingSecurityScopedResource()
        let fileCoordinator = NSFileCoordinator()
        fileCoordinator.coordinate(readingItemAt: fileURL, options: .immediatelyAvailableMetadataOnly, error: nil) { (url) in
            if let res = try? url.resourceValues(forKeys: [.thumbnailDictionaryKey]),
               let dict = res.thumbnailDictionary {
                let image = dict[.NSThumbnail1024x1024SizeKey]
                completionHandler(image)
            } else {
                fileURL.removeCachedResourceValue(forKey: .thumbnailDictionaryKey)
                completionHandler(nil)
            }
            fileURL.stopAccessingSecurityScopedResource()
        }
    }
It looks like you are generating your thumbnail after the fact. If this is your document and you are using UIDocument, override fileAttributesToWriteToURL:forSaveOperation:error: to insert the thumbnail when the document is saved.
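A rough sketch of that override, where thumbnailForCurrentState is a hypothetical helper that renders the preview image:

    // Sketch: attach a thumbnail as a file attribute at save time.
    - (NSDictionary *)fileAttributesToWriteToURL:(NSURL *)url
                                forSaveOperation:(UIDocumentSaveOperation)saveOperation
                                           error:(NSError **)outError {
        UIImage *thumbnail = [self thumbnailForCurrentState]; // hypothetical helper
        return @{
            NSURLThumbnailDictionaryKey : @{ NSThumbnail1024x1024SizeKey : thumbnail }
        };
    }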
My scenario is that I select one of the photos fetched from the system album with PhotoKit and presented in my UICollectionView, pass the selected photo (UIImage) to my next view, and send it to a remote server in the form of NSData.
But when I put a breakpoint before sending to inspect the NSData, I found that the UIImage had data with allocated memory while the NSData did not.
Here is the code that fetches the UIImage (please note that I didn't specify any PHImageRequestOptions object):
    NSMutableArray *images = @[].mutableCopy;
    PHImageManager *manager = [PHImageManager defaultManager];
    PHAsset *asset = _photoPickerCollectionView.pickedAsset;
    CGSize targetSize = CGSizeMake(asset.pixelWidth * 0.5, asset.pixelHeight * 0.5);
    [manager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
        _nextView.pickedOne = result;
    }];
    [self.navigationController dismissViewControllerAnimated:YES completion:nil];
I convert the UIImage to NSData like this:
    UIImage *image = _pickedOne;
    NSData *imgData = UIImageJPEGRepresentation(image, imageQuality);
When I tracked the variables, the UIImage contained data, but the NSData was nil.
What was odd is that if I specified a PHImageRequestOptions object for the PHImageManager request, the NSData wouldn't be nil. I'm not sure what changes with or without the PHImageRequestOptions, or why it makes such a difference.
UPDATE:
What I found is that if you leave PHImageRequestOptions as nil, the default options fetch the photo asynchronously, which, in my opinion, can be unstable for the NSData conversion; when I specify options.synchronous = YES; it works.
But in this case, would it cause any retain cycles, or would some PhotoKit objects not get released?
The method is asynchronous by default. You need to set options to explicitly make the image fetching synchronous. My strong suggestion for this kind of conversion is to use this API:
- (PHImageRequestID)requestImageDataForAsset:(PHAsset *)asset options:(nullable PHImageRequestOptions *)options resultHandler:(void(^)(NSData *__nullable imageData, NSString *__nullable dataUTI, UIImageOrientation orientation, NSDictionary *__nullable info))resultHandler;
This returns the data directly, but you will have to make this call synchronous as well. It WON'T cause any retain cycles for any objects.
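For illustration, a minimal sketch of that call with a synchronous request (uploadData: is a hypothetical stand-in for the sending code):

    // Run this off the main thread, since a synchronous request blocks.
    PHImageRequestOptions *options = [PHImageRequestOptions new];
    options.synchronous = YES; // resultHandler runs before the call returns
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                      options:options
                                                resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // imageData already holds the encoded bytes; no UIImageJPEGRepresentation pass is needed
        [self uploadData:imageData]; // hypothetical upload method
    }];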
Hope this helps. :)
I'm developing an iOS app for iPad, and I'm using a repository called Grabkit in order to get images from different services like Instagram and Flickr, in addition to images from the camera roll. The problem is that when the user selects a picture from the roll, I get a URL such as this: assets-library://asset/asset.JPG?id=DCFB9E49-93AA-49E3-89C8-2EE64AE2C4C6&ext=JPG
I've tried some code to get the image from this kind of path, but none of it has worked, for example the following:
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Ask for the "Asset" for the URL. An asset is a representation of an image in the Photos application.
    [library assetForURL:originalImage.URL
             resultBlock:^(ALAsset *asset) {
                 // Here we have the asset; let's retrieve the image from it
                 CGImageRef imgRef = [[asset defaultRepresentation] fullResolutionImage];
                 /* Instead of the full-res image, you can ask for an image that fits the screen:
                 CGImageRef imgRef = [[asset defaultRepresentation] fullScreenImage];
                 */
                 // From the CGImage, let's build a UIImage
                 imatgetemporal = [UIImage imageWithCGImage:imgRef];
             } failureBlock:^(NSError *error) {
                 // Something went wrong.
             }];
Is something in my code wrong? Should I try a different approach?
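One thing worth noting about the snippet above: assetForURL:resultBlock:failureBlock: is asynchronous, so imatgetemporal will not be set yet when the call returns. A sketch that surfaces the image through a completion block instead (the method name and completion parameter are illustrative, not part of the original code):

    // Sketch: wrap the asynchronous lookup in a completion-based helper.
    - (void)imageForAssetURL:(NSURL *)assetURL completion:(void (^)(UIImage *image))completion {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
            CGImageRef imgRef = [[asset defaultRepresentation] fullScreenImage];
            completion(imgRef ? [UIImage imageWithCGImage:imgRef] : nil);
        } failureBlock:^(NSError *error) {
            // e.g. the user denied photo library access
            completion(nil);
        }];
    }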