I am developing a camera application that can take pictures and save them in a separate album. I use the Photos framework to save images, and now I need to save GPS data (the location where the picture was taken) with the picture, perhaps in its metadata. I searched for a way to do this with the Photos framework but couldn't find anything related. Any help would be highly appreciated.
This is the piece of code I use to save pictures:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    placeholder = [assetRequest placeholderForCreatedAsset];
    photosAsset = [PHAsset fetchAssetsInAssetCollection:Album options:nil];
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:Album assets:photosAsset];
    [albumChangeRequest addAssets:@[placeholder]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success)
    {
    }
    else
    {
    }
}];
I assume you want the latitude and longitude. Have you tried the location property of the PHAsset class?
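For setting the coordinates at save time, the PHAssetChangeRequest used to create the asset also exposes a writable location property. Below is a minimal sketch based on the code in the question; currentLocation is assumed to be a CLLocation you already have (for example from your own CLLocationManager code), not something the Photos framework provides:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    // Attach the GPS coordinate (and a creation date) to the new asset.
    assetRequest.location = currentLocation;   // assumed CLLocation from your own location code
    assetRequest.creationDate = [NSDate date];
    PHObjectPlaceholder *placeholder = [assetRequest placeholderForCreatedAsset];
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:Album];
    [albumChangeRequest addAssets:@[placeholder]];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error saving asset with location: %@", error);
    }
}];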
Related
In iOS 10 I started replacing the Assets Library framework with PhotoKit to manage the image picker, but I've run into an issue.
Specifically, the app opens the system camera, the user takes a photo, and then the delegate method imagePickerController:didFinishPickingMediaWithInfo: is called.
Here is my code:
UIImage *pickerImage = [[info objectForKey:UIImagePickerControllerOriginalImage] fixOrientation];
PHPhotoLibrary *photoLibrary = [PHPhotoLibrary sharedPhotoLibrary];
__block NSString *localId;
[photoLibrary performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:pickerImage];
    localId = [[assetChangeRequest placeholderForCreatedAsset] localIdentifier];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        // to get localIdentifier and reload collectionView
    }
}];
In line 1, fixOrientation is a custom category method that returns a UIImage instance.
Calling only the [PHAssetChangeRequest creationRequestForAssetFromImage:] method works fine. However, when I try to fetch the newly created photo using [assetChangeRequest placeholderForCreatedAsset], I get a memory leak and the app crashes.
In short, is there a way to fetch the newly created photo right after it is saved, or another way to solve this with the Photos framework?
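One pattern that may help is to touch the change request only inside the change block, keep just the placeholder's localIdentifier (as the code above already does), and fetch the PHAsset in the completion handler. A sketch under those assumptions, reusing the photoLibrary, pickerImage, and localId variables from the question:
[photoLibrary performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:pickerImage];
    localId = [[assetChangeRequest placeholderForCreatedAsset] localIdentifier];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        // Fetch the newly created asset by its identifier, outside the change block.
        PHAsset *createdAsset = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil].firstObject;
        dispatch_async(dispatch_get_main_queue(), ^{
            // Hand createdAsset to the data source and reload the collection view here.
        });
    }
}];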
After looking into the Photos framework (PHPhotoLibrary), I've been able to successfully create and add new image assets and collections, but the issue I've run into is not being able to create a new Asset Collection AND add a new Asset to it within the same change block.
If I create an album as an Asset Collection in one change block and then, on completion, create an image as an Asset in another change block, it works as expected. Additionally, if I already have an album and query it, I can add an image as an Asset to that album successfully.
The PHAssetCollectionChangeRequest Class Documentation states:
To add assets to the newly created asset collection or change its title, use the methods listed in Modifying Asset Collections. To reference the newly created asset collection later in the same change block or after the change block completes, use the placeholderForCreatedAssetCollection property to retrieve a placeholder object.
Either I've misread it, or it can't actually do what it says: add assets to a newly created asset collection.
The following code completes "successfully" in the completion handler, but when going into the iOS Photos app, only the album is created, with no image added (though the image is added to the camera roll as expected).
The thing causing the issue is that a PHObjectPlaceholder can't be used as a PHAssetCollection, so the "reference" the documentation mentions can't be used this way; that's the underlying problem I've failed to understand:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Create the Asset Creation Request to save the photo to the user's photos - This will add it to the camera roll at the very least
    PHAssetChangeRequest *imageCreationRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    // Create the Asset Collection Creation Request to create the new album - This will create the album at the very least
    PHAssetCollectionChangeRequest *creationRequest = [PHAssetCollectionChangeRequest creationRequestForAssetCollectionWithTitle:@"New Album"];
    PHObjectPlaceholder *collectionPlaceholder = creationRequest.placeholderForCreatedAssetCollection; // Get the placeholder Asset Collection
    // Create the Asset Collection Change Request to add the new image Asset to the new album Asset Collection
    // Warns about PHObjectPlaceholder* != PHAssetCollection*
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:collectionPlaceholder];
    [albumChangeRequest addAssets:@[imageCreationRequest.placeholderForCreatedAsset]];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        NSLog(@"Saved to iOS Photos after creating album");
    }
}];
If it helps, this is the code which works using two change blocks:
__block PHObjectPlaceholder *collectionPlaceholder;
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Create the Asset Collection Creation Request to create the new album - This will create the album at the very least
    PHAssetCollectionChangeRequest *creationRequest = [PHAssetCollectionChangeRequest creationRequestForAssetCollectionWithTitle:@"New Album"];
    collectionPlaceholder = creationRequest.placeholderForCreatedAssetCollection; // Get the placeholder Asset Collection
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    if (success) {
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            // Create the Asset Creation Request to save the photo to the user's photos - This will add it to the camera roll at the very least
            PHAssetChangeRequest *imageCreationRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
            // Get the album collection using the placeholder identifier from the first change block
            PHAssetCollection *collection = [PHAssetCollection fetchAssetCollectionsWithLocalIdentifiers:@[collectionPlaceholder.localIdentifier] options:nil].firstObject;
            // Create the Asset Collection Change Request to add the new image Asset to the new album Asset Collection
            PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:collection];
            [albumChangeRequest addAssets:@[imageCreationRequest.placeholderForCreatedAsset]];
        } completionHandler:^(BOOL success, NSError * _Nullable error) {
            if (success) {
                NSLog(@"Saved to iOS Photos after creating album");
            } else {
                NSLog(@"Couldn't save to iOS Photos after creating album (%@)", error.description);
            }
        }];
    }
}];
Have you tried converting the album's placeholder into an album inside the one change request?
PHAssetCollection *collection = [PHAssetCollection fetchAssetCollectionsWithLocalIdentifiers:@[collectionPlaceholder.localIdentifier] options:nil].firstObject;
PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:collection];
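Put together, a sketch of the single change block with that suggestion applied might look like this (untested here; it simply folds the two lines above into the question's first code block):
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *imageCreationRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    PHAssetCollectionChangeRequest *creationRequest = [PHAssetCollectionChangeRequest creationRequestForAssetCollectionWithTitle:@"New Album"];
    PHObjectPlaceholder *collectionPlaceholder = creationRequest.placeholderForCreatedAssetCollection;
    // Resolve the placeholder into a real collection inside the same change block...
    PHAssetCollection *collection = [PHAssetCollection fetchAssetCollectionsWithLocalIdentifiers:@[collectionPlaceholder.localIdentifier] options:nil].firstObject;
    // ...and add the new asset's placeholder to it.
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:collection];
    [albumChangeRequest addAssets:@[imageCreationRequest.placeholderForCreatedAsset]];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    NSLog(@"Finished adding to the new album: %d (%@)", success, error);
}];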
I am writing an app which, at the moment, should just be able to keep images in an album dedicated to the app. The user can add pictures to the album from inside the app, either by choosing an existing image or by taking a new one with the camera. This functionality is, however, very slow, so I am wondering if I am doing it correctly.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
Calls this method with the chosen UIImage:
- (void)addPictureAndUpdate:(UIImage *)image {
    PHFetchOptions *albumFetchOptions = [PHFetchOptions new];
    albumFetchOptions.predicate = [NSPredicate predicateWithFormat:@"%K like %@", @"title", self.albumName];
    PHFetchResult *album = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:albumFetchOptions];
    PHAssetCollection *assetCollection = album.firstObject;
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"Error creating asset: %@", error);
        } else {
            // Add PHAsset to datasource
            // Update view
        }
    }];
}
This (the completion of the change request) is very slow, around 20-30 seconds and sometimes even more.
What am I doing wrong? I am quite new to iOS development and to the new Photos framework, and I really want to learn how to do this properly.
Would it be smarter to separate the two things, showing the image and moving it? At the moment, I store a PHAsset for each image, which is then loaded at the requested size when needed (a thumbnail size for showing in the view, and the original size when it is shown fullscreen). Would it be smarter to always store a UIImage and resize that instead? That way, I could make the request to add the asset to the album and immediately show it, since I would just add the UIImage to the data source.
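For reference, loading a stored PHAsset at a requested size goes through PHImageManager. A rough sketch of the kind of thumbnail request described above, where asset and imageView are assumed names rather than anything from the code in this question:
PHImageRequestOptions *options = [PHImageRequestOptions new];
options.resizeMode = PHImageRequestOptionsResizeModeFast;
[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:CGSizeMake(150.0, 150.0)
                                          contentMode:PHImageContentModeAspectFill
                                              options:options
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
    // With the default opportunistic delivery mode the handler may fire more than
    // once: first with a low-quality image, then with the final one.
    imageView.image = result;
}];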
My main concerns about the UIImage approach are memory problems and image-scaling problems. Would storing UIImages for an entire album be too memory-heavy for an app? And is it easy to resize a UIImage for display?
Thank you
I am using the following code to attempt to save a new image to a PHAssetCollection, specifically the Camera Roll (aka User Library):
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    PHFetchResult *fetchResult = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary options:nil];
    PHAssetCollection *assetCollection = fetchResult[0];
    if (assetCollection) {
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    }
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error creating asset: %@", error);
    }
}];
I always get an error.
All of the objects in the perform block look fine:
(lldb) po image
<UIImage: 0x174289ec0>, {1080, 1466}
(lldb) po assetCollection
<PHAssetCollection: 0x1741d5540> F6705124-D49B-4FDC-9191-7E84CFCCD148/L0/040 Camera Roll assetCollectionType=2/209
(lldb) po assetCollectionChangeRequest
<PHAssetCollectionChangeRequest: 0x170264640> title=(null) hasAssetChanges=1
And the error message is pretty useless:
The operation couldn’t be completed. (Cocoa error -1.)
How can I successfully save my new image to the user's library? Thanks.
In general you're doing things in the wrong order; you should not be doing any fetching inside a performChanges block. And you don't have to, in any case. Do not fetch the collection at all. Just create the photo, plain and simple, exactly as in your first line - except that you don't even need to keep a reference to the change request:
[PHAssetChangeRequest creationRequestForAssetFromImage:image];
...and stop. At that point the photo has been added to the camera roll.
I just tried this and it works perfectly.
(Of course I'm assuming you have already obtained the necessary permissions from the user...!)
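In other words, the whole save can be as small as this sketch (assuming photo library authorization has already been granted):
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Creating the asset is enough; it ends up in the user's library (camera roll).
    [PHAssetChangeRequest creationRequestForAssetFromImage:image];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error creating asset: %@", error);
    }
}];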
When a picture is taken (iOS 8.1) and didFinishPickingMediaWithInfo is called, I'm trying to use PhotoKit to add the GPS data back into the metadata.
This is the code I'm using:
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    newAssetRequest.creationDate = [NSDate date];
    newAssetRequest.location = [self deviceLocation];
    NSLog(@"didFinishPickingMediaWithInfo: location:%@", newAssetRequest.location);
    PHObjectPlaceholder *placeholderAsset = newAssetRequest.placeholderForCreatedAsset;
    PHAssetCollectionChangeRequest *addAssetRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:self.myResults[0]];
    [addAssetRequest addAssets:@[placeholderAsset]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        NSLog(@"didFinishPickingMediaWithInfo: Success saving picture to album");
    } else {
        NSLog(@"didFinishPickingMediaWithInfo: error saving picture to album %@", error);
    }
}];
[picker dismissViewControllerAnimated:YES completion:nil];
The code takes the image and moves it to an album. This part works. The photo ends up in the correct album.
The problem is the GPS data is not present in the metadata even though the location property is properly set. I know the location data is valid.
Shouldn't this work? Is there an alternate approach to get the desired result? I don't really care about all the other metadata, just the GPS coordinates.
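One way to narrow this down is to check whether the location at least reaches the saved PHAsset itself (as opposed to the EXIF embedded in the image file). A sketch under that assumption, capturing the placeholder's identifier in a __block variable so the asset can be fetched afterwards:
__block NSString *localId;
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *newAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    newAssetRequest.location = [self deviceLocation];
    localId = newAssetRequest.placeholderForCreatedAsset.localIdentifier;
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        PHAsset *saved = [PHAsset fetchAssetsWithLocalIdentifiers:@[localId] options:nil].firstObject;
        // If this logs a valid coordinate, Photos holds the location on the asset,
        // even if it is not written into the original file's EXIF data.
        NSLog(@"Saved asset location: %@", saved.location);
    }
}];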