I want to directly open the 'Camera Roll' album using the imagePickerController instead of showing all 3 albums (camera roll, photo library, last import).
Is there any way to do that?
Use
imagePickerController.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
It will take you directly to the Camera Roll.
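For context, a minimal sketch of presenting a picker with that source type (the property and delegate names here are just placeholders):

// Minimal sketch: present a picker that opens straight into the Camera Roll.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
picker.delegate = self; // self conforms to UIImagePickerControllerDelegate and UINavigationControllerDelegate
[self presentViewController:picker animated:YES completion:nil];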
It is very simple to do. For example, in a button action you can set:
imgPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
This forces the controller to use the camera. Then present it:
[self presentModalViewController:imgPicker animated:YES];
Here imgPicker is the name of my UIImagePickerController instance.
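A slightly fuller sketch, assuming imgPicker is a UIImagePickerController property and that you want to fall back gracefully when no camera is available:

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    self.imgPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
} else {
    // No camera (e.g. the Simulator): fall back to the Camera Roll.
    self.imgPicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
}
self.imgPicker.delegate = self;
[self presentViewController:self.imgPicker animated:YES completion:nil];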
To achieve this you really have only one way: create a custom image picker controller and display all the Camera Roll images in it yourself, for example in a collection view. Alternatively, have a look at this demo project:
https://github.com/rahulmane91/CustomAlbumDemo
Hope this is useful to you.
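If you go the custom route, a rough sketch of fetching the Camera Roll assets with the Photos framework (the album subtype comes from Apple's API; the variable names are placeholders):

#import <Photos/Photos.h>

// Fetch the "Camera Roll" smart album (the user's library) and its assets.
PHFetchResult *cameraRoll = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                                                     subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary
                                                                     options:nil];
PHAssetCollection *collection = cameraRoll.firstObject;
PHFetchResult *assets = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
// Use `assets` as the data source of your collection view and request a thumbnail per cell.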
Make use of the Photos framework:
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;

NSMutableArray *array2 = [NSMutableArray array]; // will hold the fetched images
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.predicate = [NSPredicate predicateWithFormat:@"title = %@", @"Custom Photo Album"];
PHAssetCollection *collection = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum
                                                                          subtype:PHAssetCollectionSubtypeAny
                                                                          options:fetchOptions].firstObject;
_assetsFetchResults = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
Use the above code and put the name of the album whose contents you want to fetch in place of "Custom Photo Album". Then request an image for each asset:
PHFetchResult *collectionResult = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
for (int h = 0; h < [collectionResult count]; h++) {
    PHAsset *asset1 = collectionResult[h];
    // _imageManager is a PHImageManager (e.g. [PHImageManager defaultManager]); frame.size is your target size.
    [_imageManager requestImageForAsset:asset1
                             targetSize:frame.size
                            contentMode:PHImageContentModeAspectFill
                                options:nil
                          resultHandler:^(UIImage *result, NSDictionary *info) {
        [array2 addObject:result];
    }];
}
// Note: the requests run asynchronously by default, so this count may still be 0 here.
NSLog(@"array count %lu", (unsigned long)[array2 count]);
Then use the array anywhere you want to display the album's images.
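One caveat: requestImageForAsset: runs asynchronously by default, so the array may still be empty right after the loop. If you need the results immediately, a hedged option is to pass a PHImageRequestOptions with the synchronous flag set, roughly like this:

PHImageRequestOptions *opts = [[PHImageRequestOptions alloc] init];
opts.synchronous = YES; // the result handler is called before requestImageForAsset: returns
opts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
// Pass `opts` instead of nil in the requestImageForAsset: call above,
// preferably from a background queue so the main thread is not blocked.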
You can use this code to access it directly:
imagePickerController.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
but you may want to build a better UI.
Try one of these frameworks, which give you both the camera and the camera roll:
https://github.com/hyperoslo/ImagePicker
https://github.com/hyperoslo/Gallery
I have a requirement in which the user needs to pick a GIF from a list of GIF files in the library. I can fetch both images and videos without any issue, but when I use kUTTypeGIF as the media type, it crashes with this error:
Terminating app due to uncaught exception
'NSInvalidArgumentException', reason: 'No available types for source
0'
Here is my code:
#import "ViewController.h"
#import <MobileCoreServices/MobileCoreServices.h>
#interface ViewController ()<UIImagePickerControllerDelegate, UINavigationControllerDelegate>
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
}
-(IBAction)btnChooseGif:(id)sender {
UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
imagePicker.delegate = self;
imagePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
imagePicker.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeGIF, nil]; // Here is the crash
[self presentViewController:imagePicker animated:YES completion:nil];
}
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *,id> *)info
{
}
@end
How can I solve this? And if the kUTTypeGIF media type is not supported here, how can I show the user a list of all GIF files to choose from? I need to display only GIF files in the UIImagePickerController.
iOS does not give you an easy way to determine -- while using UIImagePickerController -- what the underlying file format is for the pictures stored in the camera roll. Apple's philosophy here is that an image should be thought of as a UIImage object and that you should not care what the ultimate file format is.
So you cannot use UIImagePickerController to filter for GIF files. Here are a couple of possibilities:
1) Once you pick an image, you can determine what kind of file it is. Here's an example question that asks how to determine whether an image is a PNG or a JPEG. Once the user picks a file, you'll know whether it's a GIF, a JPEG, a PNG, or something else.
2) You could convert any UIImage to a GIF file. Here's a question that points to a library that might help.
3) You could iterate across the entire camera roll and convert/save those images into your app's documents directory as GIF files. Start with the enumeration found in this related question, then run each picture through the ImageIO framework to convert it to a GIF file (the code for which I pointed out in solution #2). You can then roll your own picker.
P.S. Your own code wasn't going to work because, as Nathan pointed out, GIF is not a media type. Here is a function that logs the available media types:
-(IBAction)btnChooseGif:(id)sender {
NSArray *availableMedia = [UIImagePickerController availableMediaTypesForSourceType: UIImagePickerControllerSourceTypePhotoLibrary];
NSLog(#"availableMedia is %#", availableMedia);
UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
imagePicker.delegate = self;
imagePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
imagePicker.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeImage, nil];
[self presentViewController:imagePicker animated:YES completion:nil];
}
If you only want to fetch assets from the Photos library without a picker, you can use PHFetchResult to get an array of PHAsset objects. Below is the list of media type enums available in the Photos framework:
typedef NS_ENUM(NSInteger, PHAssetMediaType) {
PHAssetMediaTypeUnknown = 0,
PHAssetMediaTypeImage = 1,
PHAssetMediaTypeVideo = 2,
PHAssetMediaTypeAudio = 3,
} PHOTOS_ENUM_AVAILABLE_IOS_TVOS(8_0, 10_0);
You can use it as :
PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
and request the image data from an asset as:
PHImageManager *manager = [PHImageManager defaultManager];
PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
requestOptions.synchronous = true;
[manager requestImageDataForAsset:asset options:requestOptions resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
NSLog(#"Data UTI :%# \t Info :%#",dataUTI,info);
}];
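Since the question is about GIFs specifically, one option is to inspect the dataUTI returned above and keep only GIF assets. A rough sketch (gifDatas is a hypothetical NSMutableArray you own):

#import <MobileCoreServices/MobileCoreServices.h>

// Inside the resultHandler above: keep only assets whose data is a GIF.
if (dataUTI != nil && UTTypeConformsTo((__bridge CFStringRef)dataUTI, kUTTypeGIF)) {
    [gifDatas addObject:imageData]; // gifDatas: hypothetical array of GIF NSData
}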
Hope this will help you!!
In iOS 11 you can get all GIFs via the animated smart album:
func getGif() -> PHFetchResult<PHAsset> {
    if let gifCollection = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumAnimated, options: nil).firstObject {
        return PHAsset.fetchAssets(in: gifCollection, options: nil)
    }
    return PHFetchResult<PHAsset>()
}
When I take a video using UIImagePickerController and save it to SavedPhotosAlbum using
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:(NSURL *)movieURL
                            completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save video fail: %@", error);
    } else {
        NSLog(@"Save video succeed.");
        [self getsavedvideo];
    }
}];
the saved file contains a name and creation date but not the GPS info. I am surprised by this since I am using the built-in camera and the image picker. Why wouldn't Apple just include the GPS coordinates (even if I had to ask the user's permission to use location services)?
Anyway, I digress.
I have not found any way to add the GPS coordinates to the video as it is being saved, and I haven't been able to find any way to add this information in the completionBlock.
If someone knows how, please let me know.
So I wrote a bit of code, called via [self getsavedvideo]; above after the video is saved, which gets the saved video. I can read the movie name and creation date, and it confirms there is no location info.
-(void)getsavedvideo {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // Within the group enumeration block, filter to enumerate just videos.
        [group setAssetsFilter:[ALAssetsFilter allVideos]];
        // Choose the asset at the last index.
        [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
            // The end of the enumeration is signaled by asset == nil.
            if (alAsset) {
                ALAssetRepresentation *representation = [alAsset defaultRepresentation];
                // Stop the enumerations.
                *stop = YES; *innerStop = YES;
                // Do something interesting with the AV asset.
                NSString *fileName = [representation filename];
                NSDate *myDate = [alAsset valueForProperty:ALAssetPropertyDate];
                CLLocation *location = [alAsset valueForProperty:ALAssetPropertyLocation];
                NSLog(@"fileName!!!! %@", fileName);
                NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
                [dateFormatter setDateFormat:@"dd-MM-yyyy 'at' HH:mm"];
                NSLog(@"date for this file is %@ %@", [[[alAsset defaultRepresentation] url] absoluteString], [dateFormatter stringFromDate:myDate]);
                NSLog(@"locate!!!! %@", location);
            }
        }];
    } failureBlock:^(NSError *error) {
        NSLog(@"No groups");
    }];
}
So, does anyone know how to append the GPS information and save it back into the video's metadata?
I have been racking my brain on this for a few days and haven't really found anything helpful.
Please help.
So I figured this out, at least for a device running iOS 8 or newer.
Add this code in the -(void)getsavedvideo method that was called from the completionBlock above.
Make sure you have the required import (the code below uses the Photos framework):
#import <Photos/Photos.h>
if ([PHAsset class]) { // If this class is available, we're running iOS 8 or later.
    PHFetchOptions *fetchOptions = [PHFetchOptions new];
    fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    fetchOptions.fetchLimit = 1;
    PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:fetchOptions];
    PHAsset *lastImageAsset = [fetchResult lastObject];
    [[PHImageManager defaultManager] requestImageForAsset:lastImageAsset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
        if ([info objectForKey:PHImageErrorKey] == nil && ![[info objectForKey:PHImageResultIsDegradedKey] boolValue]) {
            NSArray *resources = [PHAssetResource assetResourcesForAsset:lastImageAsset];
            NSString *orgFilename = ((PHAssetResource *)resources[0]).originalFilename;
            [lastImageAsset requestContentEditingInputWithOptions:nil
                                                completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
                NSURL *imageURL = contentEditingInput.fullSizeImageURL;
                NSString *urlstring = [imageURL absoluteString];
                NSLog(@"urlstring %@", urlstring);
            }];
            // currentLocation is a CLLocation you already have, e.g. from a CLLocationManager.
            CLLocationCoordinate2D locationNew = CLLocationCoordinate2DMake(currentLocation.coordinate.latitude, currentLocation.coordinate.longitude);
            NSDate *nowDate = [NSDate date];
            CLLocation *myLocation = [[CLLocation alloc] initWithCoordinate:locationNew altitude:0.0 horizontalAccuracy:1.0 verticalAccuracy:1.0 timestamp:nowDate];
            [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                // Create a change request from the asset to be modified.
                PHAssetChangeRequest *request = [PHAssetChangeRequest changeRequestForAsset:lastImageAsset];
                // Set a property of the request to change the asset itself.
                request.location = myLocation;
            } completionHandler:^(BOOL success, NSError *error) {
                NSLog(@"Finished updating asset. %@", (success ? @"Success." : error));
            }];
        }
    }];
}
else {
    // If you aren't running iOS 8 or newer, I haven't found a way to add/change the metadata,
    // but I will keep looking. If you leave this branch empty, no new metadata is added.
}
You can add other metadata too: just create the data in the form needed and then use request.<property you want to change> = newData.
You can check whether the data already exists, and whether there is metadata for it, by querying lastImageAsset (the PHAsset fetched above) for the property, e.g.
NSDate *originalFileDate = [lastImageAsset creationDate]; etc.
There is a list of the properties at
https://developer.apple.com/reference/photos/phasset
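As a rough illustration of that pattern (the property names come from the PHAssetChangeRequest API; the values are placeholders):

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *request = [PHAssetChangeRequest changeRequestForAsset:lastImageAsset];
    request.creationDate = [NSDate date];  // overwrite the creation date
    request.favorite = YES;                // mark the asset as a favourite
} completionHandler:^(BOOL success, NSError *error) {
    NSLog(@"Finished updating asset metadata. %@", (success ? @"Success." : error));
}];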
I am creating an application that fetches images from the photo library of the phone and removes duplicate images from it. I searched a lot but did not find any way to delete an image from the photo library.
I created a demo for this; here is the code.
I used UIImageJPEGRepresentation to convert each image into data and compared the data, which gives me a result. Is there any other image property that we can compare?
UIImage *img1 = [UIImage imageNamed:@"1.jpg"];
UIImage *img2 = [UIImage imageNamed:@"2.jpg"];
NSData *data1 = UIImageJPEGRepresentation(img1, 1.0);
NSLog(@"%@", data1);
NSData *data2 = UIImageJPEGRepresentation(img2, 1.0);
NSLog(@"%@", data2);
if ([data1 isEqualToData:data2]) {
    NSLog(@"yes");
} else {
    NSLog(@"no");
}
As you know, we don't have access to modify anything outside the sandbox, so before iOS 8 it was not possible to delete photos from the photo library. iOS 8 and later do support deleting photos from the library, but before removing them the system asks the user to confirm the deletion. If the user allows it, the photos are deleted.
Here is the code I have used in my app to delete photos from the photo library.
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 8.0) {
    PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
    [library performChanges:^{
        PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithALAssetURLs:delet options:nil];
        [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
    } completionHandler:^(BOOL success, NSError *error) {
        // Do something here when there is an error.
    }];
}
Here delet is the array of asset URLs of the images you get from the library with the help of the AssetsLibrary framework.
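For completeness, a rough sketch of how such an array of asset URLs might be collected with the AssetsLibrary framework (all names here are placeholders):

NSMutableArray *delet = [NSMutableArray array];
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
        if (asset) {
            // Collect the URLs of the assets you have identified as duplicates.
            [delet addObject:[[asset defaultRepresentation] url]];
        }
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"Enumeration failed: %@", error);
}];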
PHFetchResult *moments = [PHAssetCollection fetchMomentsWithOptions:nil];
for (PHAssetCollection *moment in moments)
{
    PHFetchResult *assetsFetchResults = [PHAsset fetchAssetsInAssetCollection:moment options:nil];
    for (PHAsset *asset in assetsFetchResults)
    {
        // Do something with the assets, for example add them to an array.
    }
}
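If you are scanning for duplicates, one possible shortcut is to compare cheap PHAsset properties first (dimensions and creation date) and only fall back to comparing image data for the remaining candidates. A rough sketch, where asset1 and asset2 are hypothetical PHAssets from the enumeration above:

// Hypothetical first-pass duplicate check on two PHAssets.
BOOL probablySame =
    asset1.pixelWidth  == asset2.pixelWidth  &&
    asset1.pixelHeight == asset2.pixelHeight &&
    [asset1.creationDate isEqualToDate:asset2.creationDate];
if (probablySame) {
    // Only now load the image data (e.g. via PHImageManager) and compare bytes.
}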
I am using the Photos framework to select photos from the Camera Roll. After selecting the assets from the grid, I use PHImageManager to access each of the selected images and then store these images in an array to show in a collection view of mine.
I am using this piece of code to achieve that:
-(void)extractFullSizeImagesFromAssets{
PHImageRequestOptions* options = [[PHImageRequestOptions alloc] init];
options.version = PHImageRequestOptionsVersionCurrent;
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.resizeMode = PHImageRequestOptionsResizeModeExact;
options.networkAccessAllowed = TRUE;
for (int i = 0; i < self.assets.count; i++) {
PHAsset * asset = [self.assets objectAtIndex:i];
CGSize fullSizeImage = CGSizeMake(1000, (asset.pixelHeight / asset.pixelWidth) * 1000);
[[PHImageManager defaultManager] requestImageForAsset:asset
targetSize:fullSizeImage
contentMode:PHImageContentModeAspectFit
options:options
resultHandler:^(UIImage *image, NSDictionary *info){
// [self.arr_images addObject:image];
[_arr_fullSizeImages addObject:image];
}];
}
}
Now my array arr_fullSizeImages contains the extracted images in a different, random order from the one in which I selected the assets. For example, if I select 5 images from the Camera Roll, sometimes the image that was at index 3 in the Camera Roll ends up at index 5 in arr_fullSizeImages.
I am not able to track down the reason for this behaviour. Please identify the source of the mistake and how to solve it as well.
Thanks.
This is the expected behaviour, as requestImageForAsset is executed asynchronously by default.
If you want synchronous behaviour (and no random order), just set
options.synchronous = YES;
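A minimal sketch of how that might look around the loop above, assuming you dispatch the whole loop off the main thread so the synchronous requests don't block the UI:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.synchronous = YES; // the handler fires before the request returns, so order is preserved
    // ... run the requestImageForAsset: loop here with these options ...
    dispatch_async(dispatch_get_main_queue(), ^{
        // Reload the collection view once every image has been added in order.
    });
});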
I am using the following method to request a number of photos and add them to an array for later use:
-(void)fetchImages {
    self.assets = [[PHFetchResult alloc] init];
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    self.assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = self.assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
This works fine when the number of photos is less than 50. After that, memory jumps to 150-160 MB, I get the message "Connection to assetsd was interrupted or assetsd died", and the app crashes.
How can I release the assets (PHFetchResult) from memory after I get the ones I want? (Do I need to?)
I would like to be able to add up to 150 photos.
Any ideas?
Thanks
You should not put the results from a PHFetchResult into an array. The idea of PHFetchResult is to point to many images in the Photos library without storing them all in RAM (I'm not sure exactly how it does this); just use the PHFetchResult object like an array and it handles the memory issues for you. For example, connect a collection view controller to the PHFetchResult object directly and use PHImageManager to request images only for visible cells, etc.; see the sketch after the documentation link below.
From apple documentation:
"Unlike an NSArray object, however, a PHFetchResult object dynamically loads its contents from the Photos library as needed, providing optimal performance even when handling a large number of results."
https://developer.apple.com/library/ios/documentation/Photos/Reference/PHFetchResult_Class/
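As a rough sketch of driving a collection view directly from the fetch result (PhotoCell, its imageView, self.fetchResult, and self.imageManager are placeholder names), assuming a PHCachingImageManager and thumbnail requests only for cells that actually get created:

// In your collection view controller:
// self.fetchResult is a PHFetchResult of PHAsset; self.imageManager is a PHCachingImageManager.

- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section {
    return self.fetchResult.count;
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    PhotoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell" forIndexPath:indexPath];
    PHAsset *asset = self.fetchResult[indexPath.item];
    [self.imageManager requestImageForAsset:asset
                                 targetSize:CGSizeMake(160, 160)
                                contentMode:PHImageContentModeAspectFill
                                    options:nil
                              resultHandler:^(UIImage *result, NSDictionary *info) {
        cell.imageView.image = result; // only cells that are created ever trigger a request
    }];
    return cell;
}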
Your code inside the fetchImages method needs some refactoring; take a look at this suggestion:
-(void)fetchImages {
    PHFetchOptions *options = [[PHFetchOptions alloc] init];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
    PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
    self.photosToVideofy = [[NSMutableArray alloc] init];
    CGSize size = CGSizeMake(640, 480);
    PHImageRequestOptions *photoRequestOptions = [[PHImageRequestOptions alloc] init];
    photoRequestOptions.synchronous = YES;
    for (NSInteger i = self.firstPhotoIndex; i < self.lastPhotoIndex; i++)
    {
        PHAsset *asset = assets[i];
        [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFit options:photoRequestOptions resultHandler:^(UIImage *result, NSDictionary *info) {
            if (result) {
                [self.photosToVideofy addObject:result];
            }
        }];
    }
    NSLog(@"There are %lu photos to Videofy", (unsigned long)self.photosToVideofy.count);
}
But the problem is memory consumption. Let's do some calculations.
A single image, using ARGB with 4 bytes per pixel:
640 x 480 x 4 ≈ 1.2 MB
And you want to store 150 images in RAM, so:
150 x 1.2 MB = 180 MB
For example, an iPhone 4 with 512 MB will crash if you use more than about 300 MB, but it can be less if other apps are also consuming a lot of RAM.
I think you should consider storing the images to files instead of keeping them in RAM.
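A rough sketch of that idea, assuming JPEG output into the temporary directory and keeping only the file paths in memory (photoPathsToVideofy is a hypothetical NSMutableArray of paths):

// Inside the resultHandler, instead of [self.photosToVideofy addObject:result]:
NSData *jpegData = UIImageJPEGRepresentation(result, 0.8);
NSString *fileName = [NSString stringWithFormat:@"frame_%ld.jpg", (long)i];
NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
if ([jpegData writeToFile:filePath atomically:YES]) {
    [self.photoPathsToVideofy addObject:filePath];
}
// Later, load each image lazily with [UIImage imageWithContentsOfFile:path] when building the video.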
This might be intentional (can't tell without looking at the rest of your code), but self.photosToVideofy is never released: since you're accessing it in a block, the object to which you pass the block ([PHImageManager defaultManager]) will always have a reference to the array.
Try explicitly clearing your array when you're done with the images. The array itself still won't be released, but the objects it contains will (or can be if they're not referenced anywhere else).
The best solution is to remove the array from the block. But, that would require changing the logic of your code.
You have to set
photoRequestOptions.synchronous = NO;
instead of
photoRequestOptions.synchronous = YES;
Worked for me, iOS 10.2