I'm very new to Swift and iOS programming. As mentioned above, I'd like to insert my own metadata into captured images before I save them to the album.
I'm trying to do this with the code below. The saved image does not contain my own metadata, only its generated metadata. Can anybody please tell me what I'm doing wrong?
Or is it simply not possible to add my own metadata dictionary to captured images?
Thanks a lot for your help.
@IBAction func btnPressed(sender: UIButton) {
    capturePicture()
}

func capturePicture() {
    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    session.addOutput(stillImageOutput)
    if let connection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) {
            (imageDataSampleBuffer, error) -> Void in
            if error == nil {
                var asset = ALAssetsLibrary()
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                // The metadata of the image
                var metadata: NSDictionary = CMCopyDictionaryOfAttachments(nil, imageDataSampleBuffer, CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)).takeUnretainedValue()
                // My metadata I want to add for testing purposes
                var meta: NSDictionary = ["Ersteller": "Dennis", "Datum": "25.04.14", "Ort": "Köln"]
                asset.writeImageDataToSavedPhotosAlbum(imageData, metadata: meta as [NSObject : AnyObject], completionBlock: { (path: NSURL!, error: NSError!) -> Void in
                    println("\(path)")
                    println("\(error)")
                })
            }
        }
    }
}
Just convert the Objective-C code below to Swift. You just need to create an IPTC or TIFF dictionary, add values under the appropriate IPTC/TIFF keys, and write that dictionary (the metadata) to the image; a rough Swift sketch of the conversion follows the listing.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];

    // Here we get the current system date and time and store them as the photo's description
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"dd-MM-yyyy"];
    NSLog(@"Date Formatter : %@", [dateFormatter stringFromDate:[NSDate date]]);

    // hh:mm:ss
    NSDateFormatter *timeFormatter = [[NSDateFormatter alloc] init];
    [timeFormatter setDateFormat:@"hh:mm:ss"];
    NSLog(@"Time Formatter : %@", [timeFormatter stringFromDate:[NSDate date]]);

    // Add an IPTC dictionary as metadata
    NSMutableDictionary *iptcDict = [NSMutableDictionary dictionary];
    [iptcDict setValue:[[DataEngine sharedInstance] getAlbumName] forKey:(NSString *)kCGImagePropertyIPTCObjectTypeReference]; // folder name
    [iptcDict setValue:@"Renish Dadhaniya - 101" forKey:(NSString *)kCGImagePropertyIPTCObjectAttributeReference];             // image ID - get using a query from the database
    [iptcDict setValue:@"Renish Sweet Memory" forKey:(NSString *)kCGImagePropertyIPTCObjectName];                              // image name
    [iptcDict setValue:[dateFormatter stringFromDate:[NSDate date]] forKey:(NSString *)kCGImagePropertyIPTCDateCreated];       // image date
    [iptcDict setValue:[timeFormatter stringFromDate:[NSDate date]] forKey:(NSString *)kCGImagePropertyIPTCTimeCreated];       // image time

    NSMutableDictionary *dict = [NSMutableDictionary dictionary];
    [dict setValue:iptcDict forKey:(NSString *)kCGImagePropertyIPTCDictionary];

    // Get the image URL
    ALAssetsLibrary *asSetLib = [[ALAssetsLibrary alloc] init]; // assets library instance (declared here so the snippet is self-contained)
    __block NSURL *imageAssestURL = nil;
    [asSetLib writeImageToSavedPhotosAlbum:image.CGImage metadata:dict completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Image could not be saved to the assets library: %@", error);
            imageAssestURL = nil;
        }
        else {
            NSLog(@"Image saved successfully to assetURL: %@", assetURL);
            imageAssestURL = assetURL;
        }
    }];

    [picker dismissViewControllerAnimated:YES completion:nil];
}
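Since the answer stops at "just convert it to Swift", here is a rough, untested sketch of that conversion for the capture code in the question (Swift 1.2 era, to match it; the exact bridging casts may need adjusting for your Swift version). It maps the custom keys onto standard IPTC fields (Byline for "Ersteller", DateCreated for "Datum", City for "Ort") and merges them into the metadata that came with the capture, because arbitrary top-level keys like "Ersteller" are typically ignored when the file is written.

import ImageIO
import AssetsLibrary

// Rough sketch: embed custom values as standard IPTC fields and merge them into the
// captured metadata before writing. `imageData` and `capturedMetadata` are assumed to
// come from the captureStillImageAsynchronouslyFromConnection callback above.
func writeImageDataWithIPTC(imageData: NSData, capturedMetadata: NSDictionary) {
    let library = ALAssetsLibrary()

    // Build the IPTC sub-dictionary.
    let iptc = NSMutableDictionary()
    iptc[kCGImagePropertyIPTCByline as String] = "Dennis"        // "Ersteller"
    iptc[kCGImagePropertyIPTCDateCreated as String] = "25.04.14" // "Datum"
    iptc[kCGImagePropertyIPTCCity as String] = "Köln"            // "Ort"

    // Merge it into the metadata delivered with the capture.
    let metadata = capturedMetadata.mutableCopy() as! NSMutableDictionary
    metadata[kCGImagePropertyIPTCDictionary as String] = iptc

    library.writeImageDataToSavedPhotosAlbum(imageData, metadata: metadata as [NSObject : AnyObject]) { (path: NSURL!, error: NSError!) -> Void in
        println("\(path) \(error)")
    }
}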
I'm using the Dropbox Objective-C API and I'm trying to get all thumbnails in a specific Dropbox folder.
But I'm completely stuck at DBFILESGetThumbnailBatchArg. How do I initialize paths to all images in a folder?
This is the line I'm stuck at:
[[client.filesRoutes getThumbnailBatch:<#(nonnull NSArray<DBFILESThumbnailArg *> *)#>]
setResponseBlock:^(
DBFILESGetThumbnailBatchResult * _Nullable result,
DBFILESGetThumbnailBatchError * _Nullable routeError,
DBRequestError * _Nullable networkError) { etc etc..
Documentation says
DBFILESThumbnailArg *arg = [[DBFILESThumbnailArg alloc] initWithPath:<#(nonnull NSString *)#>];
DBFILESGetThumbnailBatchArg *batchArg = [[DBFILESGetThumbnailBatchArg alloc]
initWithEntries:<#(nonnull NSArray<DBFILESThumbnailArg *> *)#>];
How do I init a list of paths of DBFILESThumbnailArg?
Link to documentation:
https://dropbox.github.io/dropbox-sdk-obj-c/api-docs/latest/Classes/DBFILESRouteObjects.html#/c:objc(cs)DBFILESRouteObjects(cm)DBFILESGetThumbnailBatch
As you found, the getThumbnailBatch method expects an NSArray<DBFILESThumbnailArg *>, so calling it would look like this:
NSArray<DBFILESThumbnailArg *> *entries = @[[[DBFILESThumbnailArg alloc] initWithPath:@"/test1.jpg"],
                                            [[DBFILESThumbnailArg alloc] initWithPath:@"/test2.jpg"]];

[[client.filesRoutes getThumbnailBatch:entries]
    setResponseBlock:^(DBFILESGetThumbnailBatchResult *result, DBFILESGetThumbnailBatchError *routeError, DBRequestError *networkError) {
        if (result) {
            NSLog(@"result:");
            NSLog(@"%@", result);
        } else if (routeError) {
            NSLog(@"routeError:");
            NSLog(@"%@", routeError);
        } else if (networkError) {
            NSLog(@"networkError:");
            NSLog(@"%@", networkError);
        }
    }];
I solved this using an NSMutableArray; posting my solution in case others come looking:
// Create a temporary NSMutableArray
NSMutableArray<DBFILESThumbnailArg *> *thumbArgMutable = [[NSMutableArray alloc] init];
for (NSString *image in _images)
{
    // Create a DBFILESThumbnailArg from the NSString path
    DBFILESThumbnailArg *arg = [[DBFILESThumbnailArg alloc] initWithPath:image];
    // Add the path as a DBFILESThumbnailArg to the NSMutableArray
    [thumbArgMutable addObject:arg];
}

// Copy the NSMutableArray into an immutable NSArray of DBFILESThumbnailArg
NSArray<DBFILESThumbnailArg *> *thumbArg = [thumbArgMutable copy];
// Create a DBFILESGetThumbnailBatchArg and init it with the copied array
DBFILESGetThumbnailBatchArg *thumbArgBatch = [[DBFILESGetThumbnailBatchArg alloc] initWithEntries:thumbArg];

DBUserClient *client = [[DBUserClient alloc] initWithAccessToken:@"TOKEN"];

// Use the entries property from DBFILESGetThumbnailBatchArg
[[client.filesRoutes getThumbnailBatch:thumbArgBatch.entries]
    setResponseBlock:^(DBFILESGetThumbnailBatchResult * _Nullable result,
                       DBFILESGetThumbnailBatchError * _Nullable routeError,
                       DBRequestError * _Nullable networkError)
    {
        if (result) {
            NSLog(@"%@\n", result);
            // Loop over all downloaded thumbnails
            for (DBFILESGetThumbnailBatchResultEntry *data in result.entries)
            {
                // Extract the data from each base64-encoded thumbnail string
                NSData *thumbData = [[NSData alloc] initWithBase64EncodedString:data.success.thumbnail options:0];
                // Create a UIImage from the data
                UIImage *thumbImage = [UIImage imageWithData:thumbData];
            }
        }
        else { // if the download failed
            NSLog(@"%@\n%@\n", routeError, networkError);
        }
    }];
I'm using the Contacts and ContactsUI frameworks and picking a contact via this:
CNContactPickerViewController *contactPicker = [CNContactPickerViewController new];
contactPicker.delegate = self;
[self presentViewController:contactPicker animated:YES completion:nil];
and
- (void)contactPicker:(CNContactPickerViewController *)picker didSelectContact:(CNContact *)contact
{
    NSArray *array = [[NSArray alloc] initWithObjects:contact, nil];
    NSError *error;
    NSData *data = [CNContactVCardSerialization dataWithContacts:array error:&error];
    NSLog(@"ERROR_IF_ANY :: %@", error.description);
}
This contact object has contact.imageData, and it shows up in the logs. But when I tried to cross-check this data with
NSArray *contactList = [NSArray arrayWithArray:[CNContactVCardSerialization contactsWithData:data error:nil]];
CNContact *contactObject = [contactList objectAtIndex:0];
contactObject.imageData is null.
Why am I getting null here when the contact has an image in the Contacts app?
I'd like to improve upon and modernise for Swift 3 the excellent answer by kudinovdenis.
Just put the following extension into your project
import Foundation
import Contacts
extension CNContactVCardSerialization {
    internal class func vcardDataAppendingPhoto(vcard: Data, photoAsBase64String photo: String) -> Data? {
        let vcardAsString = String(data: vcard, encoding: .utf8)
        let vcardPhoto = "PHOTO;TYPE=JPEG;ENCODING=BASE64:".appending(photo)
        let vcardPhotoThenEnd = vcardPhoto.appending("\nEND:VCARD")
        if let vcardPhotoAppended = vcardAsString?.replacingOccurrences(of: "END:VCARD", with: vcardPhotoThenEnd) {
            return vcardPhotoAppended.data(using: .utf8)
        }
        return nil
    }

    class func data(jpegPhotoContacts: [CNContact]) throws -> Data {
        var overallData = Data()
        for contact in jpegPhotoContacts {
            let data = try CNContactVCardSerialization.data(with: [contact])
            if contact.imageDataAvailable {
                if let base64imageString = contact.imageData?.base64EncodedString(),
                    let updatedData = vcardDataAppendingPhoto(vcard: data, photoAsBase64String: base64imageString) {
                    overallData.append(updatedData)
                }
            } else {
                overallData.append(data)
            }
        }
        return overallData
    }
}
and then you can use it similarly to the existing serialisation method:
CNContactVCardSerialization.data(jpegPhotoContacts: [contact1, contact2])
Note that this takes care of serialisation; you'll need to write a similar method for deserialisation if you are also importing (a rough sketch follows).
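In case it helps, here is an untested Swift 3 sketch of what that deserialisation counterpart could look like, assuming the photo was written exactly as the single-line "PHOTO;TYPE=JPEG;ENCODING=BASE64:" field produced above. The system deserialiser handles the regular fields; whether it reads the PHOTO field itself varies, so the photo is re-attached explicitly on a mutable copy.

import Foundation
import Contacts

extension CNContactVCardSerialization {
    // Untested sketch: split the combined data back into individual vCards, let the
    // system parse each one, then restore the photo from our custom PHOTO line.
    class func contactsRestoringPhotos(from data: Data) throws -> [CNContact] {
        guard let vcardString = String(data: data, encoding: .utf8) else { return [] }
        let photoPrefix = "PHOTO;TYPE=JPEG;ENCODING=BASE64:"

        // One card per "BEGIN:VCARD ... END:VCARD" block.
        let cards = vcardString
            .components(separatedBy: "END:VCARD")
            .map { $0.trimmingCharacters(in: .whitespacesAndNewlines) }
            .filter { !$0.isEmpty }
            .map { $0 + "\nEND:VCARD" }

        var contacts: [CNContact] = []
        for card in cards {
            guard let cardData = card.data(using: .utf8) else { continue }
            for contact in try CNContactVCardSerialization.contacts(with: cardData) {
                // Pull the base64 photo back out, if our custom PHOTO line is present.
                if let photoLine = card.components(separatedBy: .newlines).first(where: { $0.hasPrefix(photoPrefix) }),
                    let imageData = Data(base64Encoded: photoLine.replacingOccurrences(of: photoPrefix, with: "")) {
                    let mutable = contact.mutableCopy() as! CNMutableContact
                    mutable.imageData = imageData
                    contacts.append(mutable)
                } else {
                    contacts.append(contact)
                }
            }
        }
        return contacts
    }
}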
As a workaround you can create a PHOTO field inside the vCard.
NSError *error = nil;
NSData *vCardData = [CNContactVCardSerialization dataWithContacts:@[contact] error:&error];
NSString *vcString = [[NSString alloc] initWithData:vCardData encoding:NSUTF8StringEncoding];
NSString *base64Image = contact.imageData.base64Encoding;
NSString *vcardImageString = [[@"PHOTO;TYPE=JPEG;ENCODING=BASE64:" stringByAppendingString:base64Image] stringByAppendingString:@"\n"];
vcString = [vcString stringByReplacingOccurrencesOfString:@"END:VCARD" withString:[vcardImageString stringByAppendingString:@"END:VCARD"]];
vCardData = [vcString dataUsingEncoding:NSUTF8StringEncoding];
For some reason, CNContactVCardSerialization does not include the contact's photo. After serialization the vCard looks like:
BEGIN:VCARD
VERSION:3.0
PRODID:-//Apple Inc.//iPhone OS 9.3.2//EN
N:Contact;Test;;;
FN: Test Contact
END:VCARD
After inserting the PHOTO field into the vCard you will get:
BEGIN:VCARD
VERSION:3.0
PRODID:-//Apple Inc.//iPhone OS 9.3.2//EN
N:Contact;Test;;;
FN: Test Contact
PHOTO;TYPE=JPEG;ENCODING=BASE64:<photo base64 string>
END:VCARD
After this insertion the contact will look fine in CNContactViewController.
For N contacts, you can add image data to the VCF using a simple method like the one below.
- (NSData *)getVCFDataWithImagesFromContacts:(NSArray *)arrContacts
{
    //--- Convert the contacts array into VCF data.
    NSError *error;
    NSData *vcfData = [CNContactVCardSerialization dataWithContacts:arrContacts error:&error];

    //--- Convert the VCF data into a string.
    NSString *strVCF = [[NSString alloc] initWithData:vcfData encoding:NSUTF8StringEncoding];

    //--- Split the VCF into individual contacts.
    NSMutableArray *arrSplit = [[strVCF componentsSeparatedByString:@"END:VCARD"] mutableCopy];
    [arrSplit removeLastObject]; //--- the last object is just "\r\n"; comment this line out otherwise.

    //--- Validate the array count.
    if (arrSplit.count == arrContacts.count)
    {
        for (int index = 0; index < arrContacts.count; index++)
        {
            //--- Get the current contact and its VCF contact string.
            CNContact *contact = arrContacts[index];
            NSString *strContact = arrSplit[index];

            //--- Get the base64 string of the (resized) image.
            UIImage *scaledImage = [ViewController imageWithImage:[UIImage imageWithData:contact.imageData]
                                                     scaledToSize:CGSizeMake(50, 50)];
            NSString *base64Image = [UIImagePNGRepresentation(scaledImage)
                                     base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn];

            //--- Append the image tag to the contact string.
            NSString *vcardImageString = [[@"PHOTO;ENCODING=BASE64;JPEG:" stringByAppendingString:base64Image] stringByAppendingString:@"\r\n"];
            strContact = [strContact stringByAppendingString:[NSString stringWithFormat:@"%@%@", vcardImageString, @"END:VCARD\r\n"]];

            //--- Update the contact string in the array.
            [arrSplit replaceObjectAtIndex:index withObject:strContact];
            NSLog(@"strContact :%@", strContact);
        }
    }

    //--- Combine all contacts back together into the VCF.
    vcfData = [[arrSplit componentsJoinedByString:@""] dataUsingEncoding:NSUTF8StringEncoding];
    strVCF = [[NSString alloc] initWithData:vcfData encoding:NSUTF8StringEncoding]; //--- VCF data
    NSLog(@"Contact VCF error :%@", error.localizedDescription);
    return vcfData;
}

+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
I have an array filled with PHAsset objects (https://developer.apple.com/library/prerelease/ios/documentation/Photos/Reference/PHAsset_Class/index.html), and I want to know how I can convert them into UIImages and then save them in an array.
The array with the PHAsset objects is called self.assets and here is what I have so far:
PHImageManager *manager = [PHImageManager defaultManager];
CGFloat scale = UIScreen.mainScreen.scale;
NSMutableArray *images = [NSMutableArray arrayWithCapacity:[self.assets count]];

for (int i = 0; i < [self.assets count]; i++) {
    CGSize targetSize = CGSizeMake(scale, scale);
    [manager requestImageForAsset:[self.assets objectAtIndex:i]
                       targetSize:targetSize
                      contentMode:PHImageContentModeAspectFill
                          options:self.requestOptions
                    resultHandler:^(UIImage *image, NSDictionary *info) {
        [images addObject:image];
    }];
}
self.requestOptions is a property in the .h
@property (nonatomic, strong) PHImageRequestOptions *requestOptions;
and in the viewDidLoad I am doing this:
self.requestOptions = [[PHImageRequestOptions alloc] init];
self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
But after doing some debugging, I keep seeing that self.assets has the following values:
(
<PHAsset: 0x1743828a0> 3B6D658D-EC76-43A1-9793-35D889E9CF15/L0/001 mediaType=1/0, assetSource=3, (2448x2448), creationDate=2015-07-27 02:02:46 +0000, location=1, hidden=0, favorite=0 ,
<PHAsset: 0x174382970> 50F05575-71D2-446B-BD1E-8E3250E375AD/L0/001 mediaType=1/0, assetSource=3, (2448x2448), creationDate=2015-07-27 02:02:47 +0000, location=1, hidden=0, favorite=0
)
and images is empty. Does anyone know how I can convert the PHAssets into UIImages and add them to the images array? Any help is appreciated. Thanks!
For anyone struggling with this issue as much as I did, this is the way to go.
First set the requestOptions as:
self.requestOptions = [[PHImageRequestOptions alloc] init];
self.requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
self.requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
// this one is key
self.requestOptions.synchronous = YES;
and if there are multiple assets in an array filled with PHAsset objects, then add this code:
self.assets = [NSMutableArray arrayWithArray:assets];
PHImageManager *manager = [PHImageManager defaultManager];
NSMutableArray *images = [NSMutableArray arrayWithCapacity:[assets count]];

// assets contains PHAsset objects.
__block UIImage *ima;
for (PHAsset *asset in self.assets) {
    // Do something with the asset
    [manager requestImageForAsset:asset
                       targetSize:PHImageManagerMaximumSize
                      contentMode:PHImageContentModeDefault
                          options:self.requestOptions
                    resultHandler:^void(UIImage *image, NSDictionary *info) {
        ima = image;
        [images addObject:ima];
    }];
}
and now the images array contains all the images as UIImage objects.
Swift 3.0 Answer
let photoAsset = asset
let manager = PHImageManager.default()

var options: PHImageRequestOptions?
options = PHImageRequestOptions()
options?.resizeMode = .exact
options?.isSynchronous = true

manager.requestImage(
    for: photoAsset,
    targetSize: PHImageManagerMaximumSize,
    contentMode: .aspectFill,
    options: options
) { [weak self] result, _ in
    completion(result)
}
options?.isSynchronous = true is very important
BOOL synchronous; // return only a single result, blocking until available (or failure). Defaults to NO
The image may occasionally be nil. It's better to check for this; adding nil to the array will crash the app.
if (image) {
    ima = image;
    [images addObject:ima];
}
I'm trying to select/compress a video from the photo library, but when I go to get the duration and creation date, they both return null (duration defaults to 0.0 seconds). I'm not sure if I'm doing something wrong here.
- (void)imagePickerController:(UIImagePickerController *)uploadPick didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if (CFStringCompare((CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo)
    {
        NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];

        // Video duration:
        MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
        VideoTime.text = [NSString stringWithFormat:@"Time: %.2f", mp.duration];

        // Video creation date
        NSDictionary *metadataDictionary = (NSDictionary *)[info valueForKey:UIImagePickerControllerMediaMetadata];
        NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
        [dateFormatter setDateStyle:NSDateFormatterMediumStyle];
        NSString *stringDate = [dateFormatter stringFromDate:metadataDictionary.fileCreationDate];
        [dateFormatter release];
        VideoDateTaken.text = [NSString stringWithFormat:@"Date Taken: %@", stringDate];
    }
}
According to the documentation, UIImagePickerControllerMediaMetadata is only valid for still images:
This key is valid only when using an image picker whose source type is
set to UIImagePickerControllerSourceTypeCamera, and applies only to
still images.
In order to get the metadata you want, use an ALAsset and its metadata method; a rough sketch follows.
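For illustration, here is an untested sketch of that approach, written in Swift for brevity (the same AssetsLibrary calls exist in Objective-C, pre-iOS 9): look the asset up via the picker's reference URL, then read the date, duration, and full metadata from it.

import UIKit
import AssetsLibrary

// Illustrative sketch. `info` is assumed to be the dictionary passed to
// imagePickerController:didFinishPickingMediaWithInfo:.
func readVideoMetadata(info: [String: AnyObject]) {
    guard let referenceURL = info[UIImagePickerControllerReferenceURL] as? NSURL else { return }

    let library = ALAssetsLibrary()
    library.assetForURL(referenceURL, resultBlock: { asset in
        guard let asset = asset else { return }
        // Creation date and duration are exposed as asset properties.
        let creationDate = asset.valueForProperty(ALAssetPropertyDate) as? NSDate
        let duration = asset.valueForProperty(ALAssetPropertyDuration) as? NSNumber
        // The full metadata dictionary of the default representation.
        let metadata = asset.defaultRepresentation().metadata()
        print(creationDate, duration, metadata)
    }, failureBlock: { error in
        print(error)
    })
}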
I am trying to save some values from my app using NSCoding. I'm able to save the value but not able to retrieve it.
Here's where I am declaring the protocol:
@interface AddReminderEventViewController : UIViewController <UIPickerViewDelegate, UIPickerViewDataSource, NSCoding>
Here's where I'm complying with the protocol:
- (void)encodeWithCoder:(NSCoder *)enCoder
{
    [enCoder encodeObject:self.eventRepeatDurationDate forKey:kEventRepeatDuration];
    [enCoder encodeObject:self.eventIDsMutableArray forKey:kEventIDsMutableArray];
    [enCoder encodeObject:self.eventRepeatDurationString forKey:@"mytest"];
}
and here:
- (id)initWithCoder:(NSCoder *)decoder
{
    if (self = [super init]) {
        self.eventRepeatDurationDate = [[decoder decodeObjectForKey:kEventRepeatDuration] retain];
        self.eventIDsMutableArray = [[decoder decodeObjectForKey:kEventIDsMutableArray] retain];
        self.eventRepeatDurationString = [[decoder decodeObjectForKey:@"mytest"] retain];
    }
    return self;
}
and here's where I call the methods to do the archiving and unarchiving:
[self saveDataToDisk];
[self loadDataFromDisk];
and here are the bodies of these methods along with their NSLog output:
- (void)saveDataToDisk {
    NSString *reminderEventIDsPathString = @"~/Library/Application Support/ReminderIDs.archive";
    reminderEventIDsPathString = [reminderEventIDsPathString stringByExpandingTildeInPath];
    NSLog(@"WATCH1: reminderEventIDsPathString is %@", reminderEventIDsPathString);

    NSMutableDictionary *rootObject;
    rootObject = [NSMutableDictionary dictionary];
    [rootObject setValue:eventRepeatDurationString forKey:@"mytest"];
    NSLog(@"1rootObject IS %@", rootObject);
    [NSKeyedArchiver archiveRootObject:rootObject toFile:reminderEventIDsPathString];
}
reminderEventIDsPathString is /Users/tester/Library/Application Support/iPhone Simulator/5.0/Applications/E26D57DE-C4E1-4318-AEDD-7207F41010A9/Library/Application Support/ReminderIDs.archive
2012-01-16 15:47:48.578 [29658:15503] 1rootObject IS {mytest = 7;}
and here is the unarchiver code along with its NSLog contents:
- (void)loadDataFromDisk {
    NSString *testValue = [[NSString alloc] init];
    NSString *reminderEventIDsPathString = @"~/Library/Application Support/ReminderIDs.archive";
    reminderEventIDsPathString = [reminderEventIDsPathString stringByExpandingTildeInPath];
    NSLog(@"WATCH2: reminderEventIDsPathString is %@", reminderEventIDsPathString);

    NSMutableDictionary *rootObject;
    rootObject = [[NSKeyedUnarchiver unarchiveObjectWithFile:reminderEventIDsPathString] retain];
    NSLog(@"2rootObject IS %@", rootObject);
    NSLog(@"WATCH3 - %@", [rootObject objectForKey:@"mytest"]);

    if ([rootObject valueForKey:@"mytest"]) {
        testValue = [rootObject valueForKey:@"mytest"];
        NSLog(@"WATCH: testValue is %@", testValue);
    }
}
2012-01-16 15:48:14.965 [29658:15503] WATCH2: reminderEventIDsPathString is /Users/tester/Library/Application Support/iPhone Simulator/5.0/Applications/E26D57DE-C4E1-4318-AEDD-7207F41010A9/Library/Application Support/ReminderIDs.archive
2012-01-16 15:48:17.879 [29658:15503] 2rootObject IS (null)
What am I missing that prevents me from unarchiving the contents? I'm focusing on just the simplest of the values in my encoder/decoder methods to test it, but I can't even get the string value to work.
Thanks
The path where you save and load your reminders is wrong. Try replacing it with this:
NSString *reminderEventIDsPathString = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingPathComponent:@"ReminderIDs.archive"];