How to convert PHAsset to UIImage in Objective-C - ios

I have an array filled with PHAsset objects (https://developer.apple.com/library/prerelease/ios/documentation/Photos/Reference/PHAsset_Class/index.html), and I want to know how I can convert them into a UIImage and then save them in an Array.
The array with the PHAsset objects is called self.assets and here is what I have so far:
PHImageManager *manager = [PHImageManager defaultManager];
CGFloat scale = UIScreen.mainScreen.scale;
NSMutableArray *images = [NSMutableArray arrayWithCapacity:[self.assets count]];
for (int i = 0; i < [self.assets count]; i++) {
    CGSize targetSize = CGSizeMake(scale, scale);
    [manager requestImageForAsset:[self.assets objectAtIndex:i]
                       targetSize:targetSize
                      contentMode:PHImageContentModeAspectFill
                          options:self.requestOptions
                    resultHandler:^(UIImage *image, NSDictionary *info) {
        [images addObject:image];
    }];
}
self.requestOptions is a property in the .h
@property (nonatomic, strong) PHImageRequestOptions *requestOptions;
and in the viewDidLoad I am doing this:
self.requestOptions = [[PHImageRequestOptions alloc] init];
self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
But after doing some debugging, I keep seeing that self.assets has the following values:
(
<PHAsset: 0x1743828a0> 3B6D658D-EC76-43A1-9793-35D889E9CF15/L0/001 mediaType=1/0, assetSource=3, (2448x2448), creationDate=2015-07-27 02:02:46 +0000, location=1, hidden=0, favorite=0 ,
<PHAsset: 0x174382970> 50F05575-71D2-446B-BD1E-8E3250E375AD/L0/001 mediaType=1/0, assetSource=3, (2448x2448), creationDate=2015-07-27 02:02:47 +0000, location=1, hidden=0, favorite=0
)
and images is empty. Does anyone know how I can convert the PHAssets into UIImages and add them to the images array? Any help is appreciated. Thanks!

For anyone struggling as much as I did on this issue, this is the way to go.
First set the requestOptions as:
self.requestOptions = [[PHImageRequestOptions alloc] init];
self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
// this one is key
self.requestOptions.synchronous = YES;
and if there are multiple assets in an array filled with PHAsset objects, then add this code:
self.assets = [NSMutableArray arrayWithArray:assets];
PHImageManager *manager = [PHImageManager defaultManager];
NSMutableArray *images = [NSMutableArray arrayWithCapacity:[assets count]];
// assets contains PHAsset objects.
__block UIImage *ima;
for (PHAsset *asset in self.assets) {
    // Do something with the asset
    [manager requestImageForAsset:asset
                       targetSize:PHImageManagerMaximumSize
                      contentMode:PHImageContentModeDefault
                          options:self.requestOptions
                    resultHandler:^void(UIImage *image, NSDictionary *info) {
        ima = image;
        [images addObject:ima];
    }];
}
and now the images array contains all the images as UIImage objects.
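One caveat worth adding (my note, not part of the original answer): with synchronous = YES each request blocks the calling thread, so running this loop on the main thread will freeze the UI. A minimal sketch, assuming the same manager, self.assets, and self.requestOptions as above, that moves the work to a background queue:
dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    NSMutableArray<UIImage *> *images = [NSMutableArray array];
    for (PHAsset *asset in self.assets) {
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:self.requestOptions
                        resultHandler:^(UIImage *image, NSDictionary *info) {
            if (image) { [images addObject:image]; }
        }];
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        // use `images` on the main thread, e.g. reload a collection view
    });
});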

Swift 3.0 Answer
let photoAsset = asset
let manager = PHImageManager.default()
var options: PHImageRequestOptions?
options = PHImageRequestOptions()
options?.resizeMode = .exact
options?.isSynchronous = true
manager.requestImage(
    for: photoAsset,
    targetSize: PHImageManagerMaximumSize,
    contentMode: .aspectFill,
    options: options
) { [weak self] result, _ in
    completion(result) // `completion` comes from the enclosing function
}
options?.isSynchronous = true is very important
BOOL synchronous; // return only a single result, blocking until available (or failure). Defaults to NO

The image may occasionally be nil, so check it before adding it to the array; inserting nil into an NSMutableArray will crash the app.
if (image) {
    ima = image;
    [images addObject:ima];
}
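A related aside (my addition, not from the answer): with opportunistic delivery the result handler can fire twice, first with a low-quality preview. The documented PHImageResultIsDegradedKey in the info dictionary lets you skip that preliminary pass. A sketch, assuming the same asset, manager, options, and images array as above:
[manager requestImageForAsset:asset
                   targetSize:PHImageManagerMaximumSize
                  contentMode:PHImageContentModeDefault
                      options:self.requestOptions
                resultHandler:^(UIImage *image, NSDictionary *info) {
    // Skip the low-quality preview that opportunistic delivery can send first.
    BOOL degraded = [info[PHImageResultIsDegradedKey] boolValue];
    if (image && !degraded) {
        [images addObject:image];
    }
}];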

Related

get dropbox thumbnails with getThumbnailBatch

I'm using the dropbox objc API and I'm trying to get all thumbnails in a specific dropbox folder.
But I'm completely stuck at DBFILESGetThumbnailBatchArg. How do I initiate paths to all images in a folder?
This is the line I'm stuck at:
[[client.filesRoutes getThumbnailBatch:<#(nonnull NSArray<DBFILESThumbnailArg *> *)#>]
setResponseBlock:^(
DBFILESGetThumbnailBatchResult * _Nullable result,
DBFILESGetThumbnailBatchError * _Nullable routeError,
DBRequestError * _Nullable networkError) { etc etc..
Documentation says
DBFILESThumbnailArg *arg = [[DBFILESThumbnailArg alloc] initWithPath:<#(nonnull NSString *)#>];
DBFILESGetThumbnailBatchArg *batchArg = [[DBFILESGetThumbnailBatchArg alloc]
initWithEntries:<#(nonnull NSArray<DBFILESThumbnailArg *> *)#>];
How do I init a list of paths of DBFILESThumbnailArg?
Link to documentation:
https://dropbox.github.io/dropbox-sdk-obj-c/api-docs/latest/Classes/DBFILESRouteObjects.html#/c:objc(cs)DBFILESRouteObjects(cm)DBFILESGetThumbnailBatch
As you found, the getThumbnailBatch method expects an NSArray<DBFILESThumbnailArg *>, so calling it would look like this:
NSArray<DBFILESThumbnailArg *> *entries = @[[[DBFILESThumbnailArg alloc] initWithPath:@"/test1.jpg"], [[DBFILESThumbnailArg alloc] initWithPath:@"/test2.jpg"]];
[[client.filesRoutes getThumbnailBatch:entries]
    setResponseBlock:^(DBFILESGetThumbnailBatchResult *result, DBFILESGetThumbnailBatchError *routeError, DBRequestError *networkError) {
        if (result) {
            NSLog(@"result:");
            NSLog(@"%@", result);
        } else if (routeError) {
            NSLog(@"routeError:");
            NSLog(@"%@", routeError);
        } else if (networkError) {
            NSLog(@"networkError:");
            NSLog(@"%@", networkError);
        }
}];
I solved this using a NSMutableArray, posting my solution if others come looking:
//Create a temporary NSMutableArray
NSMutableArray *thumbArgMutable = [[NSMutableArray alloc] init];
for (NSString *image in _images)
{
    //Create DBFILESThumbnailArg from NSString
    DBFILESThumbnailArg *arg = [[DBFILESThumbnailArg alloc] initWithPath:image];
    //Add path as DBFILESThumbnailArg to NSMutableArray
    [thumbArgMutable addObject:arg];
}
//Copy the NSMutableArray to an immutable NSArray of DBFILESThumbnailArg
NSArray<DBFILESThumbnailArg *> *thumbArg = [thumbArgMutable copy];
//create a DBFILESGetThumbnailBatchArg and init it with the copied entries
DBFILESGetThumbnailBatchArg *thumbArgBatch = [[DBFILESGetThumbnailBatchArg alloc] initWithEntries:thumbArg];
DBUserClient *client = [[DBUserClient alloc] initWithAccessToken:@"TOKEN"];
//use property entries from DBFILESGetThumbnailBatchArg
[[client.filesRoutes getThumbnailBatch:thumbArgBatch.entries]
    setResponseBlock:^(DBFILESGetThumbnailBatchResult * _Nullable result,
                       DBFILESGetThumbnailBatchError * _Nullable routeError,
                       DBRequestError * _Nullable networkError)
{
    if (result) {
        NSLog(@"%@\n", result);
        //loop over all downloaded thumbnails
        for (DBFILESGetThumbnailBatchResultEntry *data in result.entries)
        {
            //extract data from each base64 encoded thumbnail string
            NSData *thumbData = [[NSData alloc] initWithBase64EncodedString:data.success.thumbnail options:0];
            //create UIImage from data
            UIImage *thumbImage = [UIImage imageWithData:thumbData];
        }
    }
    else { //if download failed
        NSLog(@"%@\n%@\n", routeError, networkError);
    }
}];

Receiving array of Images from CoreData

I've created NSManagedObject* imagesArrayData that stores strings (paths) to images stored in the documents directory:
- (void)setImagesArray:(NSMutableArray *)imagesArray {
    NSMutableArray *newImagesArray = [NSMutableArray new];
    int i = 1;
    for (UIImage *image in imagesArray) {
        //generate path to created file
        NSString *fileName = [NSString stringWithFormat:@"%@_%d", self.name, i];
        NSString *filePath = [self documentsPathForFileName:fileName];
        //save image to disk
        NSData *imageData = UIImagePNGRepresentation(image);
        [imageData writeToFile:filePath atomically:YES];
        //add image path to CoreData
        [newImagesArray addObject:filePath];
        i++;
    }
    //set new value of imagesArray
    imagesArrayData = [NSKeyedArchiver archivedDataWithRootObject:newImagesArray];
}
Instead of exposing the paths in the header file, I expose an imagesArray property:
-(NSMutableArray *)imagesArray {
    NSMutableArray *images = [NSMutableArray new];
    NSArray *imagePaths = [NSKeyedUnarchiver unarchiveObjectWithData:imagesArrayData];
    for (NSString *imagePath in imagePaths) {
        UIImage *image = [[UIImage alloc] initWithContentsOfFile:imagePath];
        [images addObject:image];
    }
    return images;
}
The problem is that whenever I access [imagesArray objectAtIndex:xxx], the imagesArray getter is called, and it takes time to recreate the full array. When switching quickly between images, the UI slows down.
What would be an elegant way to overcome this problem? Maybe creating another array full of images and updating it from time to time? Maybe something else? Please help.
One thing you could do is refactor your getter to lazily load the array. If it is already defined, simply return it. If not, build it:
-(NSMutableArray *)imagesArray
{
    if (!_imagesArray)
    {
        _imagesArray = [NSMutableArray new]; // assign the ivar; don't shadow it with a local
        NSArray *imagePaths =
            [NSKeyedUnarchiver unarchiveObjectWithData:imagesArrayData];
        for (NSString *imagePath in imagePaths)
        {
            UIImage *image = [[UIImage alloc] initWithContentsOfFile:imagePath];
            [_imagesArray addObject:image];
        }
    }
    return _imagesArray;
}
I'm not sure what you mean about updating an array of images from time to time.
If your array of image names changes, you will need some method to respond to those changes.
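One way to do that (a sketch of mine, not from the answer) is to invalidate the cached array in the setter, so the next read of imagesArray rebuilds it lazily:
- (void)setImagesArray:(NSMutableArray *)imagesArray {
    _imagesArray = nil; // drop the cached UIImages; the lazy getter will rebuild them
    // ... existing code that writes the files and archives the new paths ...
}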

BAD_ACCESS when calling CSSearchableIndex indexSearchableItems

I am trying to implement the CoreSpotlight API in my app but can't seem to figure out why I am getting a BAD_ACCESS exception with my implementation:
CSSearchableItemAttributeSet *attributeSet = [CSSearchableItemAttributeSet new];
attributeSet.title = route.Options.name;
attributeSet.keywords = [route.Options.name componentsSeparatedByString:@" "];
UIImage *image = [UIImage imageNamed:@"pin_busstop.png"];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
attributeSet.thumbnailData = imageData;
CSSearchableItem *item = [[CSSearchableItem alloc] initWithUniqueIdentifier:route.ObjectID domainIdentifier:@"com.whatever.itsmyappsname.loadwithroute" attributeSet:attributeSet];
[[CSSearchableIndex defaultSearchableIndex] indexSearchableItems:[NSArray arrayWithObject:item] completionHandler:^(NSError * _Nullable error) {
    NSLog(@"It worked");
}];
Looking at the call stack for the exception, I can see that it occurs on the indexSearchableItems:completionHandler: call. However, I can step past that call without the exception triggering. Maybe it has to do with the completion handler, but it happens regardless of whether I have one or not. I have CoreSpotlight/CoreSpotlight.h and MobileCoreServices/MobileCoreServices.h imported both in my .h file and in the target.
You're (I'm) creating the CSSearchableItemAttributeSet object incorrectly. Instead of:
CSSearchableItemAttributeSet * attributeSet = [CSSearchableItemAttributeSet new];
Use:
CSSearchableItemAttributeSet * attributeSet = [[CSSearchableItemAttributeSet alloc]
initWithItemContentType:(NSString *)kUTTypeImage];
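For completeness, a short sketch of the corrected setup (kUTTypeImage comes from the MobileCoreServices import the asker already has; the title value here is a stand-in for route.Options.name from the question):
#import <CoreSpotlight/CoreSpotlight.h>
#import <MobileCoreServices/MobileCoreServices.h> // defines kUTTypeImage

CSSearchableItemAttributeSet *attributeSet =
    [[CSSearchableItemAttributeSet alloc] initWithItemContentType:(NSString *)kUTTypeImage];
attributeSet.title = @"Example title"; // stand-in for route.Options.name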

Is it possible to add own metadata in captured Images in Swift

I'm very new to Swift and iOS programming. As mentioned above, I'd like to insert my own metadata into captured images before I save them to the album.
I'm trying to get this done with the code below, but the saved image does not contain my own metadata, only its generated metadata. Can anybody please tell me what I'm doing wrong?
Or is it not possible to add custom metadata to captured images?
Thanks a lot for your help.
@IBAction func btnPressed(sender: UIButton) {
    capturePicture()
}
func capturePicture() {
    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    session.addOutput(stillImageOutput)
    if let connection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) {
            (imageDataSampleBuffer, error) -> Void in
            if error == nil {
                var asset = ALAssetsLibrary()
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                // The Metadata of the Image
                var metadata: NSDictionary = CMCopyDictionaryOfAttachments(nil, imageDataSampleBuffer, CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)).takeUnretainedValue()
                // My metadata I want to add for testing purposes
                var meta: NSDictionary = ["Ersteller": "Dennis", "Datum": "25.04.14", "Ort": "Köln"]
                asset.writeImageDataToSavedPhotosAlbum(imageData, metadata: meta as [NSObject: AnyObject], completionBlock: { (path: NSURL!, error: NSError!) -> Void in
                    println("\(path)")
                    println("\(error)")
                })
            }
        }
    }
}
Just convert the code below (written in Objective-C) to Swift. You need to create an IPTC or TIFF dictionary, add values under the appropriate IPTC/TIFF keys, and write the dictionary (metadata) into the image.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    //Here we get the current system date and time and store them as a description of the photo
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"dd-MM-yyyy"];
    NSLog(@"Date Formatter : %@", [dateFormatter stringFromDate:[NSDate date]]);
    //hh:mm:ss
    NSDateFormatter *timeFormatter = [[NSDateFormatter alloc] init];
    [timeFormatter setDateFormat:@"hh:mm:ss"];
    NSLog(@"Time Formatter : %@", [timeFormatter stringFromDate:[NSDate date]]);
    //ADD IPTC Dictionary Data as META DATA
    NSMutableDictionary *iptcDict = [NSMutableDictionary dictionary];
    [iptcDict setValue:[[DataEngine sharedInstance] getAlbumName] forKey:(NSString *)kCGImagePropertyIPTCObjectTypeReference]; //folder name
    [iptcDict setValue:@"Renish Dadhaniya - 101" forKey:(NSString *)kCGImagePropertyIPTCObjectAttributeReference]; //add image ID - get using query from database
    [iptcDict setValue:@"Renish Sweet Memory" forKey:(NSString *)kCGImagePropertyIPTCObjectName]; //add image name
    [iptcDict setValue:[dateFormatter stringFromDate:[NSDate date]] forKey:(NSString *)kCGImagePropertyIPTCDateCreated]; //add image date
    [iptcDict setValue:[timeFormatter stringFromDate:[NSDate date]] forKey:(NSString *)kCGImagePropertyIPTCTimeCreated]; //add image time
    NSMutableDictionary *dict = [NSMutableDictionary dictionary];
    [dict setValue:iptcDict forKey:(NSString *)kCGImagePropertyIPTCDictionary];
    //Get image URL
    __block NSURL *imageAssestURL = nil;
    // asSetLib is assumed to be an ALAssetsLibrary instance declared elsewhere
    [asSetLib writeImageToSavedPhotosAlbum:image.CGImage metadata:dict completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Image could not be saved to the assets library: %@", error);
            imageAssestURL = nil;
        }
        else {
            NSLog(@"Image saved successfully to assetURL: %@", assetURL);
            imageAssestURL = assetURL;
        }
    }];
    [picker dismissViewControllerAnimated:YES completion:nil];
}
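Note that ALAssetsLibrary is deprecated as of iOS 9. A hedged sketch of one replacement path, under the assumption that you already have the JPEG bytes (imageData) and the metadata dictionary (dict) from above: embed the metadata with ImageIO, then save the resulting data through the Photos framework.
#import <ImageIO/ImageIO.h>
#import <Photos/Photos.h>

// Merge `dict` into the image file's metadata using ImageIO.
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
NSMutableData *destData = [NSMutableData data];
CGImageDestinationRef dest = CGImageDestinationCreateWithData(
    (__bridge CFMutableDataRef)destData, CGImageSourceGetType(source), 1, NULL);
CGImageDestinationAddImageFromSource(dest, source, 0, (__bridge CFDictionaryRef)dict);
CGImageDestinationFinalize(dest);
CFRelease(dest);
CFRelease(source);

// Save the annotated bytes to the photo library (iOS 9+).
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetCreationRequest *request = [PHAssetCreationRequest creationRequestForAsset];
    [request addResourceWithType:PHAssetResourceTypePhoto data:destData options:nil];
} completionHandler:^(BOOL success, NSError *error) {
    NSLog(@"saved: %d, error: %@", success, error);
}];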

How to get only images in the camera roll using Photos Framework

The following code also loads images that are located in iCloud or Photo Streams. How can we limit the fetch to only images in the camera roll?
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
After adding the Camera Roll and Photo Stream albums, Apple added the following PHAssetCollectionSubtype types in iOS 8.1:
PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) - fetches the Photo Stream album.
PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) - fetches the Camera Roll album.
Haven't tested if this is backward-compatible with iOS 8.0.x though.
Through some experimentation we discovered a hidden property not listed in the documentation (assetSource). Basically you do a regular fetch request, then use a predicate to filter the ones from the camera roll. For camera-roll assets this value should be 3.
Sample code:
//fetch all assets, then sub fetch only the range we need
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)
assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
    results.addObject(obj)
}
var cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
If, like me, you were looking for Objective-C code and kept finding deprecated AssetsLibrary answers instead of the newer Photos framework, this should help:
Swift
func getAllPhotosFromCameraRoll() -> [UIImage] {
    // TODO: Add `NSPhotoLibraryUsageDescription` to info.plist
    PHPhotoLibrary.requestAuthorization { print($0) } // TODO: Move this line of code to somewhere before attempting to access photos
    var images = [UIImage]()
    let requestOptions: PHImageRequestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .exact
    requestOptions.deliveryMode = .highQualityFormat
    requestOptions.isSynchronous = true
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    let manager: PHImageManager = PHImageManager.default()
    for i in 0..<fetchResult.count {
        let asset = fetchResult.object(at: i)
        manager.requestImage(
            for: asset,
            targetSize: PHImageManagerMaximumSize,
            contentMode: .default,
            options: requestOptions,
            resultHandler: { (image: UIImage?, info: [AnyHashable: Any]?) -> Void in
                if let image = image {
                    images.append(image)
                }
            })
    }
    return images
}
Objective C
Global Variables:
NSArray *imageArray;
NSMutableArray *mutableArray;
The method below will help you:
-(void)getAllPhotosFromCamera
{
    imageArray = [[NSArray alloc] init];
    mutableArray = [[NSMutableArray alloc] init];
    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    requestOptions.synchronous = YES;
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    NSLog(@"%d", (int)result.count);
    PHImageManager *manager = [PHImageManager defaultManager];
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];
    // result contains PHAsset objects.
    __block UIImage *ima;
    for (PHAsset *asset in result) {
        // Do something with the asset
        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
            ima = image;
            [images addObject:ima];
        }];
    }
    imageArray = [images copy]; // You can use the NSMutableArray `images` directly
}
If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to require that the image be local.
networkAccessAllowed
A Boolean value that specifies whether Photos can download the requested image from iCloud.
Discussion
If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download's progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler's info dictionary indicates that the image is not available unless you enable network access.
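A small sketch of that in practice (my addition; assumes a PHAsset named asset, and both the option and the info key are documented):
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = NO; // the default; stated explicitly for clarity
[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:PHImageManagerMaximumSize
                                          contentMode:PHImageContentModeDefault
                                              options:options
                                        resultHandler:^(UIImage *image, NSDictionary *info) {
    if ([info[PHImageResultIsInCloudKey] boolValue]) {
        NSLog(@"image is only in iCloud; nothing was returned");
    } else if (image) {
        NSLog(@"image is available locally");
    }
}];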
This can help. You can use your own data model instead of AlbumModel I used.
func getCameraRoll() -> AlbumModel {
    var cameraRollAlbum: AlbumModel!
    let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
    cameraRoll.enumerateObjects({ (collection: PHAssetCollection, count: Int, stop: UnsafeMutablePointer<ObjCBool>) in
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)
        if assets.count > 0 {
            let newAlbum = AlbumModel(name: collection.localizedTitle!, count: assets.count, collection: collection, assets: assets)
            cameraRollAlbum = newAlbum
        }
    })
    return cameraRollAlbum
}
Here is the Objective-C version provided by Apple.
-(NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    for (int i = 0; i < fetchResult.count; i++) {
        unsigned long temp = 0;
        temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if (temp > pictures) {
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];
    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        // return;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i < [assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}
Where you can grab the following details:
PHFetchResult *fetchResult = self.sectionFetchResults[1];
PHCollection *collection = fetchResult[6];
value 1,6 used to get camera images
value 1,0 used to get screen shots
value 1,1 used to get hidden
value 1,2 used to get selfies
value 1,3 used to get recently added
value 1,4 used to get videos
value 1,5 used to get recently deleted
value 1,7 used to get favorites
Apple demo link
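Those numeric indices depend on the order of the fetched collections, so they are fragile. A more robust sketch (my addition) fetches the smart album by its documented subtype instead:
// Fetch the Camera Roll smart album by subtype rather than by index.
PHFetchResult *smartAlbums =
    [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                             subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary
                                             options:nil];
PHAssetCollection *cameraRoll = smartAlbums.firstObject;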
Declare your properties:
@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
I've been banging my head over this too. I've found no way to filter for only assets on the device with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I'm able to use requestContentEditingInputWithOptions or requestImageDataForAsset to determine if the asset is on the device or not, but this is asynchronous and seems like it's using way too much resources to do for every asset in the list. There must be a better way.
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
for (int i = 0; i < [fetchResult count]; i++) {
    PHAsset *asset = fetchResult[i];
    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        if ([[info objectForKey:PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
            NSLog(@"asset is in cloud");
        } else {
            NSLog(@"asset is on device");
        }
    }];
}
If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns true if the full original is available on device.
Admittedly this is also fragile, but testing shows it works for all of the assetSource types (photostream, iTunes sync, etc).
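A minimal sketch of that check applied to a fetch result (assuming the fetchResult from the snippet above):
NSMutableArray<PHAsset *> *localAssets = [NSMutableArray array];
for (PHAsset *asset in fetchResult) {
    // YES only when the full original is available on this device.
    if ([asset canPerformEditOperation:PHAssetEditOperationContent]) {
        [localAssets addObject:asset];
    }
}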
