App to remove duplicate photos from the Photos app - iOS

I am creating an application that fetches images from the phone's photo library and removes duplicate images from it. I searched a lot but did not find any way to delete an image from the photo library.

I created a demo for this; here is the code.
I used UIImageJPEGRepresentation to convert each image into data and compared the two data objects, which gives me a result. Is there any other image property that we can compare?
UIImage *img1 = [UIImage imageNamed:@"1.jpg"];
UIImage *img2 = [UIImage imageNamed:@"2.jpg"];
NSData *data1 = UIImageJPEGRepresentation(img1, 1.0);
NSLog(@"%@", data1);
NSData *data2 = UIImageJPEGRepresentation(img2, 1.0);
NSLog(@"%@", data2);
if ([data1 isEqualToData:data2])
{
    NSLog(@"yes");
}
else
{
    NSLog(@"no");
}
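
Comparing the full JPEG data of every pair of photos works, but it gets expensive quickly. As a rough sketch (not from the original question, and assuming CommonCrypto is available), you could compare cheap properties such as the pixel size first and only then compare an MD5 hash of the encoded data:

#import <CommonCrypto/CommonDigest.h>

// Hash the encoded image data so duplicates can be detected by comparing short strings.
static NSString *MD5HashOfImage(UIImage *image)
{
    NSData *data = UIImageJPEGRepresentation(image, 1.0);
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hash = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hash appendFormat:@"%02x", digest[i]];
    }
    return hash;
}

// Compare the pixel size first; only hash when the sizes match.
static BOOL ImagesLookIdentical(UIImage *img1, UIImage *img2)
{
    if (!CGSizeEqualToSize(img1.size, img2.size)) {
        return NO;
    }
    return [MD5HashOfImage(img1) isEqualToString:MD5HashOfImage(img2)];
}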

As you know, we don't have access to modify anything outside the sandbox, so before iOS 8 it was not possible to delete photos from the photo library. iOS 8 and later do support deleting photos from the library, but before removing them the system asks the user to confirm the deletion. If the user allows it, the photos are deleted.
Here is the code I have used in my app to delete photos from the photo library:
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 8.0)
{
    PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
    [library performChanges:^{
        PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithALAssetURLs:delet options:nil];
        [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
    } completionHandler:^(BOOL success, NSError *error) {
        // handle the error (or success) here
    }];
}
Here delet is the array of asset URLs for the images, which you get from the library with the help of the AssetsLibrary framework.
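
The answer does not show how that delet array is built, so here is a hedged sketch using the AssetsLibrary framework; note that the enumeration is asynchronous, so the array is only complete once the enumeration block has been called with a nil group:

// Sketch: collect the asset URL of every photo in the Camera Roll so the
// array can be passed to fetchAssetsWithALAssetURLs:options: above.
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
NSMutableArray *delet = [NSMutableArray array];
[assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                             usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
        if (asset != nil) {
            [delet addObject:[asset valueForProperty:ALAssetPropertyAssetURL]];
        }
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"Could not enumerate the photo library: %@", error);
}];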

PHFetchResult *moments = [PHAssetCollection fetchMomentsWithOptions:nil];
for (PHAssetCollection *moment in moments)
{
    PHFetchResult *assetsFetchResults = [PHAsset fetchAssetsInAssetCollection:moment options:nil];
    for (PHAsset *asset in assetsFetchResults)
    {
        // Do something with the assets, for example add them to an array.
    }
}
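
To connect this back to the duplicate-detection question, one possible approach (a sketch, not part of the original answers) is to request the image data for each asset from the loop above via PHImageManager and compare hashes of that data; assets whose data matches are the candidates for the delete request shown earlier:

// Sketch: fetch the raw image data for a PHAsset (the loop variable above) so
// it can be hashed and compared against the data of other assets.
PHImageRequestOptions *options = [PHImageRequestOptions new];
options.synchronous = YES; // keeps the sketch simple; avoid this on the main thread
[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                  options:options
                                            resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    if (imageData != nil) {
        // Hash imageData and remember which asset produced it; identical
        // hashes indicate duplicate photos that could be deleted.
    }
}];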

Related

PhotoKit - didFinishPickingMediaWithInfo vs creationRequestForAssetFromImage (data not equal??)

I wanted to get some feedback to see what I might be missing here. Basically, I use a UIImagePickerController to take a photo. When I am done, I retrieve this image like this:
UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
After I have taken the photo, at a later time, I need to be able to load all the images in my camera roll and highlight the photo that I just took (I display all the images from the camera roll). Because these photos are different objects in memory (different view controllers), the only way to compare them is by comparing the actual data that represents the images, i.e.:
NSData *alreadySelectedPhotoData = UIImageJPEGRepresentation(alreadySelectedPhoto.photoImage, 0.0);
NSData *cameralRollPhotoData = UIImageJPEGRepresentation(cameraRollPhoto.photoImage, 0.0);
if ([cameralRollPhotoData isEqualToData:alreadySelectedPhotoData]) {
    // do something here if they are equal (draw a border, etc.)
}
However, the photos were never actually equal based on this comparison, despite the fact that the images displayed are identical.
So I went back to the original code, did some digging, and ran a data and visual test:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    __block UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    __block PHObjectPlaceholder *placeholderAsset = nil;
    // save our new photo to the camera roll album (successfully)
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:photoTaken];
        changeRequest.creationDate = creationTimeStamp = [NSDate date];
        placeholderAsset = changeRequest.placeholderForCreatedAsset;
    } completionHandler:^(BOOL success, NSError *error) {
        PHImageManager *manager = [PHImageManager defaultManager];
        PHImageRequestOptions *requestOptions = [PHImageRequestOptions new];
        requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
        requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
        CGFloat dimension = [UIScreen mainScreen].bounds.size.width / 3 * [UIScreen mainScreen].scale;
        CGSize targetSize = CGSizeMake(dimension, dimension);
        PHFetchResult *savedAssets = [PHAsset fetchAssetsWithLocalIdentifiers:@[placeholderAsset.localIdentifier] options:nil];
        [manager requestImageForAsset:savedAssets.firstObject targetSize:targetSize contentMode:PHImageContentModeAspectFill options:requestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            // images are the 'same' but their NSData representations appear not to be. NSLog statement never executes.
            NSData *alreadySelectedPhotoData = UIImageJPEGRepresentation(photoTaken, 0.0);
            NSData *cameralRollPhotoData = UIImageJPEGRepresentation(result, 0.0);
            if ([cameralRollPhotoData isEqualToData:alreadySelectedPhotoData]) {
                NSLog(@"images are equal");
            }
        }];
    }];
}
So to summarize:
1. Store the image that comes back from the info object in the picker delegate method.
2. Use this image and store it in the camera roll by using creationRequestForAssetFromImage.
3. Retrieve the image we just stored by fetching its asset (fetchAssetsWithLocalIdentifiers).
4. Convert that asset back into an image (PHImageManager requestImageForAsset).
5. Convert both the original UIImage returned via the picker delegate and the image retrieved from the camera roll to NSData objects.
Result: they are not equal, even though the images on the screen are exactly the same.
Conclusion: It seems to me that this below:
PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:photoTaken];
saves to the camera roll successfully; I can verify that the image it displays is exactly the image that I got from the UIImagePickerController delegate method (it visually looks the same), yet when converting both images to NSData objects the comparison fails.
Does anyone have any idea what's going on here? Did I miss something, or is this a bug?
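
One likely explanation, offered here as an observation rather than part of the original question: creationRequestForAssetFromImage re-encodes the image when it is saved, and UIImageJPEGRepresentation is itself lossy, so two visually identical photos will almost never produce byte-identical NSData. A more robust sketch is to remember the localIdentifier of the created asset and compare identifiers instead of re-encoded bytes:

// Sketch: keep the identifier of the asset created in the performChanges: block...
NSString *savedLocalIdentifier = placeholderAsset.localIdentifier;

// ...and later, when walking the camera roll, compare identifiers instead of image data.
PHFetchResult *allPhotos = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
[allPhotos enumerateObjectsUsingBlock:^(PHAsset *photoAsset, NSUInteger idx, BOOL *stop) {
    if ([photoAsset.localIdentifier isEqualToString:savedLocalIdentifier]) {
        NSLog(@"this is the photo that was just taken");
        *stop = YES;
    }
}];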

How to move/copy an image to another album in objective-c?

I am writing an app which, at the moment, should just be able to keep images in an album dedicated to the app. The user is able to add pictures to the album from inside the app, either by choosing an existing image or by taking a new one with the camera. This functionality is, however, very slow, so I am wondering if I am doing it correctly.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
Calls this method with the chosen UIImage:
- (void)addPictureAndUpdate:(UIImage *)image {
    PHFetchOptions *albumFetchOptions = [PHFetchOptions new];
    albumFetchOptions.predicate = [NSPredicate predicateWithFormat:@"%K like %@", @"title", self.albumName];
    PHFetchResult *album = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:albumFetchOptions];
    PHAssetCollection *assetCollection = album.firstObject;
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *assetChangeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
        PHAssetCollectionChangeRequest *assetCollectionChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:assetCollection];
        [assetCollectionChangeRequest addAssets:@[[assetChangeRequest placeholderForCreatedAsset]]];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"Error creating asset: %@", error);
        } else {
            // Add the PHAsset to the data source
            // Update the view
        }
    }];
}
This (the finishing of the change request) is very slow, around 20-30 seconds and sometimes even more.
What am I doing wrong? I am quite new to iOS development, and obviously to the new Photos framework, and I really want to learn how to do this.
Would it be smarter to separate the two things, showing the image and moving it? At the moment, I am storing a PHAsset for each image, which is then loaded at the requested size when needed (a thumbnail size for showing in the view and the original size when it is shown fullscreen). Would it be smarter to always just store a UIImage, and then change the size of that? That way, I would be able to make the request to add the asset to the album and immediately show it, as I would just add the UIImage to the data source.
My main concerns about this are memory problems and image scaling problems. Would storing UIImages for an entire album be too memory-heavy for an app? And is it easy to resize a UIImage for display?
Thank you
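
Not part of the original thread, but as a sketch of the PHAsset-plus-thumbnail approach described above (asset and cell are placeholders for whatever your data source and view provide), requesting only the pixel size you need keeps memory low:

// Sketch: keep PHAsset objects in the data source and request a thumbnail-sized
// image for display instead of holding full-resolution UIImages in memory.
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
CGFloat side = 150.0 * [UIScreen mainScreen].scale;
[imageManager requestImageForAsset:asset
                        targetSize:CGSizeMake(side, side)
                       contentMode:PHImageContentModeAspectFill
                           options:nil
                     resultHandler:^(UIImage *result, NSDictionary *info) {
    // Show the thumbnail; request the full size only when the photo goes fullscreen.
    cell.imageView.image = result;
}];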

How to save GPS data with a taken photo in iOS

I am developing a camera application which can be used to take pictures and save them in a separate album. I used the Photos framework to save images, and now I need to save GPS data (the location where the picture was taken) with the picture (maybe in its metadata). I searched for a way to do this using the Photos framework but failed; I couldn't find anything related. Any help would be highly appreciated.
This is the piece of code I used to save pictures:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    placeholder = [assetRequest placeholderForCreatedAsset];
    photosAsset = [PHAsset fetchAssetsInAssetCollection:Album options:nil];
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:Album assets:photosAsset];
    [albumChangeRequest addAssets:@[placeholder]];
} completionHandler:^(BOOL success, NSError *error) {
    if (success)
    {
    }
    else
    {
    }
}];
I assume you want the latitude and longitude. Have you tried the location property of the PHAsset class?
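
Building on that, PHAssetChangeRequest also has a writable location property, so the coordinates can be attached while the asset is being created. A minimal sketch, assuming currentLocation is a CLLocation you already obtained from CLLocationManager:

// Sketch: attach the capture location at creation time. currentLocation is
// assumed to come from a CLLocationManager elsewhere in the app.
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *assetRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:capturedImage];
    assetRequest.location = currentLocation;
    assetRequest.creationDate = [NSDate date];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Could not save the photo with its location: %@", error);
    }
}];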

iOS - Is it possible to rename image file name before saving it into iPhone device gallery from app?

Hey, I'm new to iPhone development and I have been trying to make a gallery kind of app. Basically, what I want to do is save all the captured images into a specific folder, like a new album "My_App Images" named after our app, in the iPhone device gallery. That part is working for me, but I am having trouble changing the image file name, and I don't know whether it is possible to specify a file name. Currently I am getting file names like "IMG_0094.jpg" in iPhoto; can we change them to some other name, like an "Anyfilename.png" format, programmatically?
here is my code for saving images to the specific album :
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo
{
    [self.library saveImage:image toAlbum:@"My_App Images" withCompletionBlock:^(NSError *error) {
        if (error != nil) {
            NSLog(@"Image saving error: %@", [error description]);
        }
    }];
    [picker dismissViewControllerAnimated:NO completion:nil];
}
Any source or link for reference is appreciated. Thanks for the help!
There is a way to kind of do that, by setting the image's IPTC metadata field "Object Name". If you later import the image to iPhoto, this name will be used as its title.
See details (and code) at http://ootips.org/yonat/how-to-set-the-image-name-when-saving-to-the-camera-roll/ .
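
A rough sketch of what that linked approach can look like with ImageIO metadata keys and the AssetsLibrary save call (the exact metadata layout here is an assumption; the linked post has the authoritative details):

// Sketch: embed an IPTC "Object Name" in the metadata before saving, so iPhoto
// can pick it up as the photo's title on import.
NSData *jpegData = UIImageJPEGRepresentation(image, 1.0);
NSDictionary *iptc = @{ (NSString *)kCGImagePropertyIPTCObjectName : @"Anyfilename" };
NSDictionary *metadata = @{ (NSString *)kCGImagePropertyIPTCDictionary : iptc };

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:jpegData
                                 metadata:metadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error != nil) {
        NSLog(@"Saving with metadata failed: %@", error);
    }
}];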
Do you mean something like this?
// Build NSData in memory from the image...
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
// Save to the default Apple (Camera Roll) folder.
[imageData writeToFile:@"/private/var/mobile/Media/DCIM/100APPLE/customImageFilename.jpg" atomically:NO];
Now adjust the folder path according to your folder name...
Sorry to disappoint you, but it seems that you cannot change the name of the photos, before or after saving, in the photo album, custom or not. Here is a post that explains it:
iOS rename/delete albums of photos
Edit
So, to clarify my comment, use the following override:
Download the NSMutableDictionary category for image metadata here.
Also download the sample project CustomAlbumDemo from here, and modify the NSMutableDictionary+ImageMetadata.m file in the CustomAlbumDemo project as follows:
- (void)saveImage:(UIImage *)image toAlbum:(NSString *)albumName withCompletionBlock:(SaveImageCompletion)completionBlock
{
    // write the image data to the assets library (camera roll)
    NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
    NSMutableDictionary *metadata = [[NSMutableDictionary alloc] init];
    [metadata setDescription:@"This is my special image"];
    [self writeImageDataToSavedPhotosAlbum:imageData metadata:metadata completionBlock:^(NSURL *assetURL, NSError *error) {
        // error handling
        if (error != nil) {
            completionBlock(error);
            return;
        }
        // add the asset to the custom photo album
        [self addAssetURL:assetURL
                  toAlbum:albumName
      withCompletionBlock:completionBlock];
    }];
}

Get UIImage from URL like assets-library:/

I'm developing an iOS app for iPad and I'm using a library called Grabkit in order to get images from different services like Instagram and Flickr, in addition to images from the Camera Roll. The problem is that when the user selects a picture from the roll, I get a URL such as this: assets-library://asset/asset.JPG?id=DCFB9E49-93AA-49E3-89C8-2EE64AE2C4C6&ext=JPG
I've tried some code to get the image from this kind of path, but none of it has worked, such as the following:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Ask for the "Asset" for the URL. An asset is a representation of an image in the Photos application.
[library assetForURL:originalImage.URL
         resultBlock:^(ALAsset *asset) {
             // Here we have the asset; let's retrieve the image from it
             CGImageRef imgRef = [[asset defaultRepresentation] fullResolutionImage];
             /* Instead of the full-resolution image, you can ask for an image that fits the screen:
             CGImageRef imgRef = [[asset defaultRepresentation] fullScreenImage];
             */
             // From the CGImage, build a UIImage
             imatgetemporal = [UIImage imageWithCGImage:imgRef];
         } failureBlock:^(NSError *error) {
             // Something went wrong.
         }];
Is something wrong in my code? Should I try a different approach?
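
One thing to note, as an observation rather than a fix from the original thread: assetForURL:resultBlock:failureBlock: is asynchronous, so imatgetemporal is still nil if you read it right after this call returns. A sketch that hands the image back through a completion block instead (the method name is hypothetical):

// Sketch: treat the asset lookup as asynchronous and deliver the UIImage
// through a completion block once the result block has actually run.
- (void)imageForAssetURL:(NSURL *)assetURL completion:(void (^)(UIImage *image))completion
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 CGImageRef imgRef = [[asset defaultRepresentation] fullScreenImage];
                 UIImage *image = imgRef ? [UIImage imageWithCGImage:imgRef] : nil;
                 dispatch_async(dispatch_get_main_queue(), ^{
                     completion(image);
                 });
             } failureBlock:^(NSError *error) {
                 dispatch_async(dispatch_get_main_queue(), ^{
                     completion(nil);
                 });
             }];
}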
