Save UIImage to PHAssetCollection - iOS

I am using the following code to attempt to save a new image to a PHAssetCollection, specifically, the Camera Roll (aka User Library):
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    PHFetchResult *fetchResult = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary options:nil];
    PHAssetCollection *assetCollection = fetchResult[0];
    if (assetCollection) {
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    }
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error creating asset: %@", error);
    }
}];
I always get an error.
All of the objects in the perform block look fine:
(lldb) po image
<UIImage: 0x174289ec0>, {1080, 1466}
(lldb) po assetCollection
<PHAssetCollection: 0x1741d5540> F6705124-D49B-4FDC-9191-7E84CFCCD148/L0/040 Camera Roll assetCollectionType=2/209
(lldb) po assetCollectionChangeRequest
<PHAssetCollectionChangeRequest: 0x170264640> title=(null) hasAssetChanges=1
And the error message is pretty useless:
The operation couldn’t be completed. (Cocoa error -1.)
How can I successfully save my new image to the user's library? Thanks.

In general you're doing things in the wrong order; you should not be doing any fetching inside a performChanges block. And you don't have to, in any case. Do not fetch the collection at all. Just create the photo, plain and simple, exactly as in your first line - except that you don't even need to keep a reference to the change request:
[PHAssetChangeRequest creationRequestForAssetFromImage:image];
...and stop. At that point the photo has been added to the camera roll.
I just tried this and it works perfectly.
(Of course I'm assuming you have already obtained the necessary permissions from the user...!)
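For reference, a minimal sketch of the simplified save described above; the authorization check and the completion handler are illustrative additions, not part of the original answer:
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    if (status != PHAuthorizationStatusAuthorized) { return; }
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // No fetching inside the change block; just create the asset.
        [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"Error creating asset: %@", error);
        }
    }];
}];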

Related

Memory leak when using placeholderForCreatedAsset property in PHAssetChangeRequest

In iOS 10 I started replacing the Assets Library with PhotoKit to manage the image picker, but there is an issue.
The specific flow is that the app presents the system camera; after the user takes a photo, the delegate method imagePickerController:didFinishPickingMediaWithInfo: is called.
Then, here is my code:
UIImage *pickerImage = [[info objectForKey:UIImagePickerControllerOriginalImage] fixOrientation];
PHPhotoLibrary *photoLibrary = [PHPhotoLibrary sharedPhotoLibrary];
__block NSString *localId;
[photoLibrary performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:pickerImage];
    localId = [[assetChangeRequest placeholderForCreatedAsset] localIdentifier];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        // to get localIdentifier and reload collectionView
    }
}];
In line 1, fixOrientation is a custom category method that returns a UIImage instance.
Calling only [PHAssetChangeRequest creationRequestForAssetFromImage:] works fine. However, when I try to fetch the newly created photo by using [assetChangeRequest placeholderForCreatedAsset], it shows a memory leak and the app crashes.
In short, is there a way to fetch the created photo right after creation, or another approach to this using the Photos framework?
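For context, the usual pattern is to keep the placeholder's localIdentifier (as in the code above) and fetch the asset in the completion handler, once the change has been committed. A rough sketch, where the main-queue dispatch and variable names are illustrative and this does not by itself explain the crash:
__block NSString *localId;
[photoLibrary performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:pickerImage];
    localId = assetChangeRequest.placeholderForCreatedAsset.localIdentifier;
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        // Fetch the newly created asset by its local identifier.
        PHFetchResult<PHAsset *> *result = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil];
        PHAsset *createdAsset = result.firstObject;
        dispatch_async(dispatch_get_main_queue(), ^{
            // Use createdAsset to reload the collection view.
        });
    }
}];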

How to move/copy an image to another album in objective-c?

I am writing an app which at the moment should just be able to have images in an album, dedicated to the app. The user is able to add pictures to the folder from inside the app, either by choosing an existing image or taking a new one with the camera. This functionality is however very slow, so I am wondering if I am doing it correctly.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
Calls this method with the chosen UIImage:
- (void)addPictureAndUpdate:(UIImage *)image {
    PHFetchOptions *albumFetchOptions = [PHFetchOptions new];
    albumFetchOptions.predicate = [NSPredicate predicateWithFormat:@"%K like %@", @"title", self.albumName];
    PHFetchResult *album = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:albumFetchOptions];
    PHAssetCollection *assetCollection = album.firstObject;
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"Error creating asset: %@", error);
        } else {
            // Add PHAsset to datasource
            // Update view
        }
    }];
}
This (the completion of the change request) is very slow, around 20-30 seconds and sometimes even more.
What am I doing wrong? I am quite new to iOS development, and obviously to the new Photos framework, and I really want to learn how to do this.
Would it be smarter to separate the two things, showing the image and moving it? At the moment, I am storing a PHAsset for each image, which is then loaded at the requested size when needed (a thumbnail size for showing in the view and the original size when it is shown in fullscreen). Would it be smarter to always just store a UIImage and then change the size of that? That way, I would be able to make the request to add the asset to the album and immediately show it, since I would just add the UIImage to the datasource.
My main concerns about this are memory problems and image scaling problems. Would storing UIImages for an entire album be too memory heavy for an app? And is it easy to resize a UIImage for display?
Thank you
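For reference, a minimal sketch of loading a thumbnail-sized image from a stored PHAsset on demand; the target size, options, and the cell.imageView destination are illustrative assumptions:
PHImageRequestOptions *options = [PHImageRequestOptions new];
options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
options.networkAccessAllowed = YES;

[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:CGSizeMake(200, 200)
                                          contentMode:PHImageContentModeAspectFill
                                              options:options
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
    // The result may be delivered more than once (a low-quality image first, then full quality).
    cell.imageView.image = result;
}];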

How to save GPS data with a taken photo in iOS

I am developing a camera application which can be used to take pictures and save them in a separate album. I used the Photos framework to save images, and now I need to save GPS data (the location where the picture was taken) with the picture (perhaps in its metadata). I searched for a way to do this with the Photos framework but couldn't find anything related. Any help would be highly appreciated.
This is the piece of code I use to save pictures:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    placeholder = [assetRequest placeholderForCreatedAsset];
    photosAsset = [PHAsset fetchAssetsInAssetCollection:Album options:nil];
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:Album assets:photosAsset];
    [albumChangeRequest addAssets:@[placeholder]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success)
    {
    }
    else
    {
    }
}];
I assume you want the latitude & longitude. Have you tried the location property of the PHAsset class?
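For illustration only: when creating the asset, the writable counterpart is the location property on PHAssetChangeRequest, which can be set inside the change block. A minimal sketch, where currentLocation is assumed to come from a CLLocationManager you already run:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    // currentLocation is an assumed CLLocation obtained elsewhere in the app.
    assetRequest.location = currentLocation;
    assetRequest.creationDate = [NSDate date];
} completionHandler:^(BOOL success, NSError *error) {
    // ...
}];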

PHPhotoLibrary error while saving image at url

I create an image at the URL provided by PHContentEditingOutput. When I load the data into a UIImage and save it like this, it works:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    NSData *data = [NSData dataWithContentsOfURL:contentEditingOutput.renderedContentURL];
    UIImage *image = [UIImage imageWithData:data];
    [PHAssetChangeRequest creationRequestForAssetFromImage:image];
} completionHandler:^(BOOL success, NSError *error) {
    ...
}];
But when I try the approach with the URL, it fails:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromImageAtFileURL:contentEditingOutput.renderedContentURL];
} completionHandler:^(BOOL success, NSError *error) {
    ...
}];
Error:
Error Domain=NSCocoaErrorDomain Code=-1 "The operation couldn’t be completed. (Cocoa error -1.)"
UPDATE:
I get the same error when I try to save a modification.
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *request = [PHAssetChangeRequest changeRequestForAsset:asset];
    request.contentEditingOutput = contentEditingOutput;
} completionHandler:^(BOOL success, NSError *error) {
    ...
}];
The method works for a video (creationRequestForAssetFromVideoAtFileURL:), but not for an image. What went wrong?
The problem is the file format. I was trying to edit a PNG screenshot, but renderedContentURL always pointed to tmp/filename.JPG. At first I thought it was a bug, but according to the documentation this is correct behaviour.
renderedContentURL
Read this property to find a URL for writing edited asset content. Then, if editing a photo asset, write the altered photo image to a file in JPEG format at this URL. If editing a video asset, export the video to a QuickTime (.mov) file at this URL.
The solution is to convert the image with the function:
NSData *UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
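A minimal sketch of that fix, assuming editedImage is whatever you rendered (the compression quality is arbitrary):
// Re-encode the edited image as JPEG before writing it to renderedContentURL.
NSData *jpegData = UIImageJPEGRepresentation(editedImage, 0.9);
[jpegData writeToURL:contentEditingOutput.renderedContentURL atomically:YES];

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *request = [PHAssetChangeRequest changeRequestForAsset:asset];
    request.contentEditingOutput = contentEditingOutput;
} completionHandler:^(BOOL success, NSError *error) {
    // ...
}];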
When passing the metadata, I also hit this issue consistently whenever the orientation (inside the image metadata) was anything other than kCGImagePropertyOrientationUp.
This is also stated inside the renderedContentURL documentation:
Edited asset content must incorporate (or “bake in”) the intended orientation of the asset. That is, the orientation metadata (if any) that you write in the output image or video file must declare the “up” orientation, and the image or video data must appear right-side up when presented without orientation metadata.
For images, the following metadata keys need to be updated (while the image data itself is also rotated); a sketch follows the list:
• kCGImagePropertyTIFFDictionary \ kCGImagePropertyTIFFOrientation
• kCGImagePropertyOrientation
• possibly, kCGImagePropertyIPTCImageOrientation
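A rough sketch of baking in the orientation and rewriting those keys with ImageIO; image and originalMetadata are assumed to hold the edited image and its existing metadata dictionary:
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeJPEG

// Re-render so the pixel data itself is upright; drawing honors imageOrientation.
UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
[image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
UIImage *uprightImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Declare the "up" orientation in the metadata that gets written out.
NSMutableDictionary *metadata = [originalMetadata mutableCopy];
metadata[(NSString *)kCGImagePropertyOrientation] = @1; // 1 == up
NSMutableDictionary *tiff = [metadata[(NSString *)kCGImagePropertyTIFFDictionary] mutableCopy] ?: [NSMutableDictionary dictionary];
tiff[(NSString *)kCGImagePropertyTIFFOrientation] = @1;
metadata[(NSString *)kCGImagePropertyTIFFDictionary] = tiff;
// If present, the IPTC image orientation entry would be handled the same way.

// Write the upright pixels plus the corrected metadata as JPEG to the output URL.
CGImageDestinationRef destination = CGImageDestinationCreateWithURL(
    (__bridge CFURLRef)contentEditingOutput.renderedContentURL, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(destination, uprightImage.CGImage, (__bridge CFDictionaryRef)metadata);
CGImageDestinationFinalize(destination);
CFRelease(destination);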

Setting the location property of PHAssetChangeRequest doesn't create a GPS section in image metadata

When a picture is taken (iOS 8.1) and didFinishPickingMediaWithInfo is called, I'm trying to use PhotoKit to add the GPS data back into the metadata.
This is the code I'm using:
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    newAssetRequest.creationDate = [NSDate date];
    newAssetRequest.location = [self deviceLocation];
    NSLog(@"didFinishPickingMediaWithInfo: location:%@", newAssetRequest.location);
    PHObjectPlaceholder *placeholderAsset = newAssetRequest.placeholderForCreatedAsset;
    PHAssetCollectionChangeRequest *addAssetRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:self.myResults[0]];
    [addAssetRequest addAssets:@[placeholderAsset]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        NSLog(@"didFinishPickingMediaWithInfo: Success saving picture to album");
    } else {
        NSLog(@"didFinishPickingMediaWithInfo: error saving picture to album %@", error);
    }
}];
[picker dismissViewControllerAnimated:YES completion:nil];
The code takes the image and moves it to an album. This part works. The photo ends up in the correct album.
The problem is the GPS data is not present in the metadata even though the location property is properly set. I know the location data is valid.
Shouldn't this work? Is there an alternate approach to get the desired result? I don't really care about all the other metadata, just the GPS coordinates.
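One hedged observation: the location set this way appears to live on the PHAsset record in the Photos database rather than being written into the file's EXIF/GPS dictionary. You can at least verify that it round-trips by reading it back from the created asset, roughly like this:
__block NSString *localId;
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    newAssetRequest.location = [self deviceLocation];
    localId = newAssetRequest.placeholderForCreatedAsset.localIdentifier;
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        // Fetch the saved asset and check its location property.
        PHAsset *saved = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil].firstObject;
        NSLog(@"Saved asset location: %@", saved.location);
    }
}];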
