I am writing an app which, at the moment, should simply keep images in an album dedicated to the app. The user can add pictures to that album from inside the app, either by choosing an existing image or by taking a new one with the camera. This functionality is very slow, however, so I am wondering if I am doing it correctly.
The delegate method imagePickerController:didFinishPickingMediaWithInfo: calls this method with the chosen UIImage:
- (void)addPictureAndUpdate:(UIImage *)image {
    PHFetchOptions *albumFetchOptions = [PHFetchOptions new];
    albumFetchOptions.predicate = [NSPredicate predicateWithFormat:@"%K like %@", @"title", self.albumName];
    PHFetchResult *album = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:albumFetchOptions];
    PHAssetCollection *assetCollection = album.firstObject;

    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"Error creating asset: %@", error);
        } else {
            // Add PHAsset to data source
            // Update view
        }
    }];
}
This (the completion of the change request) is very slow: 20-30 seconds, and sometimes even more.
What am I doing wrong? I am quite new to iOS development, and obviously to the new Photos framework, and I really want to learn how to do this properly.
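For completeness, the delegate side is nothing special; it just grabs the picked image, dismisses the picker, and forwards the image (a minimal sketch, assuming the image comes from UIImagePickerControllerOriginalImage and the picker is dismissed right away):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Grab the picked or freshly taken image and hand it to the album code above.
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
    [self addPictureAndUpdate:image];
}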
Would it be smarter to separate the two things, showing the image and moving it? At the moment I am storing a PHAsset for each image, which is then loaded at the requested size when needed (a thumbnail size for showing in the view and the original size when it is shown fullscreen). Would it be smarter to always store a UIImage and resize that instead? That way I could make the request to add the asset to the album and immediately show it, since I would just add the UIImage to the data source.
My main concerns with this are memory usage and image scaling. Would storing UIImages for an entire album be too memory-heavy for an app? And is it easy to resize a UIImage for display?
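For context, the per-thumbnail request I am doing looks roughly like this (a sketch; thumbnailSize, asset and cell are placeholders for the real values in my view code):

PHImageRequestOptions *options = [PHImageRequestOptions new];
options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;

CGFloat side = 120.0 * [UIScreen mainScreen].scale; // placeholder thumbnail dimension
CGSize thumbnailSize = CGSizeMake(side, side);

[[PHImageManager defaultManager] requestImageForAsset:asset
                                            targetSize:thumbnailSize
                                           contentMode:PHImageContentModeAspectFill
                                               options:options
                                         resultHandler:^(UIImage *result, NSDictionary *info) {
    // May be called twice: once with a fast degraded image, then with the full-quality one.
    cell.imageView.image = result;
}];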
Thank you
Related
In iOS 10 I started replacing the AssetsLibrary framework with PhotoKit to manage the image picker, but there is an issue.
The specific flow is: the app presents the system camera, the user takes a photo, and after that the delegate method imagePickerController:didFinishPickingMediaWithInfo: is called.
Then, here is my code:
UIImage *pickerImage = [[info objectForKey:UIImagePickerControllerOriginalImage] fixOrientation];
PHPhotoLibrary *photoLibrary = [PHPhotoLibrary sharedPhotoLibrary];
__block NSString *localId;

[photoLibrary performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:pickerImage];
    localId = [[assetChangeRequest placeholderForCreatedAsset] localIdentifier];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        // to get localIdentifier and reload collectionView
    }
}];
(On the first line, fixOrientation is a custom UIImage category method that returns a new UIImage instance.)
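For completeness, the category does nothing exotic; this is roughly what it does, assuming it only redraws the image so that imageOrientation ends up as UIImageOrientationUp:

@implementation UIImage (FixOrientation)

- (UIImage *)fixOrientation {
    // Already upright, nothing to do.
    if (self.imageOrientation == UIImageOrientationUp) {
        return self;
    }
    // Redraw the image into a new context so the pixel data itself is rotated upright.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0.0, 0.0, self.size.width, self.size.height)];
    UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalizedImage;
}

@end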
Calling only the [PHAssetChangeRequest creationRequestForAssetFromImage:] method performs fine. However, when I try to fetch the newly created photo using [assetChangeRequest placeholderForCreatedAsset], I see a memory leak and the app crashes.
In short: is there a way to fetch the created photo right after it is saved, or some other approach to solve this with the Photos framework?
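For reference, the fetch I am attempting looks roughly like this (a sketch; self.collectionView and the data-source handling are placeholders for my real code):

__block NSString *localId;
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:pickerImage];
    localId = [[assetChangeRequest placeholderForCreatedAsset] localIdentifier];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (!success) {
        return;
    }
    // The completion handler does not run on the main queue, so hop back before touching UI.
    dispatch_async(dispatch_get_main_queue(), ^{
        PHFetchResult *fetchResult = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil];
        PHAsset *createdAsset = fetchResult.firstObject;
        // add createdAsset to the data source, then reload
        [self.collectionView reloadData];
    });
}];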
Wanted to get some feedback to see what I might be missing here. Basically I use a UIImagePickerController to take a photo. When I am done I retrieve the image like this:
UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
After I have taken the photo, at a later time, I need to load all the images in my camera roll and highlight the photo I just took (I display all the images from the camera roll). Because these photos are different objects in memory (different view controllers), the only way to compare them is by comparing the actual data that represents the images, i.e.:
NSData *alreadySelectedPhotoData = UIImageJPEGRepresentation(alreadySelectedPhoto.photoImage, 0.0);
NSData *cameralRollPhotoData = UIImageJPEGRepresentation(cameraRollPhoto.photoImage, 0.0);

if ([cameralRollPhotoData isEqualToData:alreadySelectedPhotoData]) {
    // do something here if they are equal (draw a border, etc.)
}
However, the photos never compare as equal this way, despite the fact that the displayed images are identical.
So I went back to the original code, did some digging and did a data and visual test:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    __block UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    __block PHObjectPlaceholder *placeholderAsset = nil;

    // save our new photo to the camera roll album (successfully)
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:photoTaken];
        changeRequest.creationDate = creationTimeStamp = [NSDate date];
        placeholderAsset = changeRequest.placeholderForCreatedAsset;
    } completionHandler:^(BOOL success, NSError *error) {
        PHImageManager *manager = [PHImageManager defaultManager];
        PHImageRequestOptions *requestOptions = [PHImageRequestOptions new];
        requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
        requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;

        CGFloat dimension = [UIScreen mainScreen].bounds.size.width / 3 * [UIScreen mainScreen].scale;
        CGSize targetSize = CGSizeMake(dimension, dimension);

        PHFetchResult *savedAssets = [PHAsset fetchAssetsWithLocalIdentifiers:@[placeholderAsset.localIdentifier] options:nil];
        [manager requestImageForAsset:savedAssets.firstObject targetSize:targetSize contentMode:PHImageContentModeAspectFill options:requestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            // images are the 'same' but their NSData representations appear not to be. NSLog statement never executes.
            NSData *alreadySelectedPhotoData = UIImageJPEGRepresentation(photoTaken, 0.0);
            NSData *cameralRollPhotoData = UIImageJPEGRepresentation(result, 0.0);
            if ([cameralRollPhotoData isEqualToData:alreadySelectedPhotoData]) {
                NSLog(@"images are equal");
            }
        }];
    }];
}
So to summarize:
1. Store the image that comes back from the info object in the picker delegate method.
2. Use this image and store it in the camera roll with creationRequestForAssetFromImage:.
3. Retrieve the asset we just stored by fetching it (fetchAssetsWithLocalIdentifiers:).
4. Convert that asset back into an image (PHImageManager requestImageForAsset:).
5. Convert both the original UIImage returned via the picker delegate and the image fetched back from the camera roll into NSData objects.
Result: they are not equal, even though the images on screen are exactly the same.
Conclusion: it seems to me that the line below:
PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:photoTaken];
saves to the camera roll successfully; I can verify that the image it displays is exactly the image I got from the UIImagePickerController delegate method (it looks visually identical), yet when both images are converted to NSData objects the comparison fails.
Does anyone have any idea what's going on here? Did I miss something, or is this a bug?
I am developing a camera application which can be used to take pictures and save them in a separate album. I use the Photos framework to save the images, and now I need to save GPS data (the location where the picture was taken) with each picture, perhaps in the metadata. I searched for a way to do this with the Photos framework but couldn't find anything related. Any help would be highly appreciated.
This is the piece of code I use to save pictures:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    placeholder = [assetRequest placeholderForCreatedAsset];
    photosAsset = [PHAsset fetchAssetsInAssetCollection:Album options:nil];
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:Album assets:photosAsset];
    [albumChangeRequest addAssets:@[placeholder]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success)
    {
    }
    else
    {
    }
}];
I assume you want the latitude and longitude. Have you tried the location property of the PHAsset class?
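For example, it can be set on the creation request inside the change block (a sketch; currentLocation stands for a CLLocation you have already obtained, e.g. from a CLLocationManager):

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    // The location is stored with the asset and can be read back later via PHAsset's location property.
    assetRequest.location = currentLocation;
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error saving asset with location: %@", error);
    }
}];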
I am using the following code to attempt to save a new image to a PHAssetCollection, specifically, the Camera Roll (aka User Library) :
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    PHFetchResult *fetchResult = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary options:nil];
    PHAssetCollection *assetCollection = fetchResult[0];
    if (assetCollection) {
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    }
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error creating asset: %@", error);
    }
}];
I always get an error.
All of the objects in the perform block look fine:
(lldb) po image
<UIImage: 0x174289ec0>, {1080, 1466}
(lldb) po assetCollection
<PHAssetCollection: 0x1741d5540> F6705124-D49B-4FDC-9191-7E84CFCCD148/L0/040 Camera Roll assetCollectionType=2/209
(lldb) po assetCollectionChangeRequest
<PHAssetCollectionChangeRequest: 0x170264640> title=(null) hasAssetChanges=1
And the error message is pretty useless:
The operation couldn’t be completed. (Cocoa error -1.)
How can I successfully save my new image to the user's library? Thanks.
In general you're doing things in the wrong order; you should not be doing any fetching inside a performChanges block. And you don't have to, in any case. Do not fetch the collection at all. Just create the photo, plain and simple, exactly as in your first line - except that you don't even need to keep a reference to the change request:
[PHAssetChangeRequest creationRequestForAssetFromImage:image];
...and stop. At that point the photo has been added to the camera roll.
I just tried this and it works perfectly.
(Of course I'm assuming you have already obtained the necessary permissions from the user...!)
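In other words, the whole change block reduces to something like this (a sketch using the image variable from your question):

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // No fetching and no placeholder needed: this alone adds the photo to the camera roll.
    [PHAssetChangeRequest creationRequestForAssetFromImage:image];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error creating asset: %@", error);
    }
}];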
When a picture is taken (iOS 8.1) and didFinishPickingMediaWithInfo is called, I'm trying to use PhotoKit to add the GPS data back into the metadata.
This is the code I'm using:
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    newAssetRequest.creationDate = [NSDate date];
    newAssetRequest.location = [self deviceLocation];
    NSLog(@"didFinishPickingMediaWithInfo: location: %@", newAssetRequest.location);

    PHObjectPlaceholder *placeholderAsset = newAssetRequest.placeholderForCreatedAsset;
    PHAssetCollectionChangeRequest *addAssetRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:self.myResults[0]];
    [addAssetRequest addAssets:@[placeholderAsset]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        NSLog(@"didFinishPickingMediaWithInfo: Success saving picture to album");
    } else {
        NSLog(@"didFinishPickingMediaWithInfo: error saving picture to album %@", error);
    }
}];

[picker dismissViewControllerAnimated:YES completion:nil];
The code takes the image and moves it to an album. This part works. The photo ends up in the correct album.
The problem is that the GPS data is not present in the metadata, even though the location property is set. I know the location data is valid.
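For what it's worth, this is how I check whether the location actually made it onto the saved asset (a sketch; I capture the placeholder's localIdentifier in the change block and fetch the asset back in the completion handler):

__block NSString *newAssetId = nil;
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    newAssetRequest.location = [self deviceLocation];
    newAssetId = newAssetRequest.placeholderForCreatedAsset.localIdentifier;
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        PHAsset *savedAsset = [PHAsset fetchAssetsWithLocalIdentifiers:@[newAssetId] options:nil].firstObject;
        // A non-nil value here means PhotoKit recorded the location on the asset itself.
        NSLog(@"saved asset location: %@", savedAsset.location);
    }
}];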
Shouldn't this work? Is there an alternate approach to get the desired result? I don't really care about all the other metadata, just the GPS coordinates.