I want to find the DPI for an image that has been captured with the iPhone/iPad camera.
This is how I am trying to get the DPI:
CFDictionaryRef exifDict = CMGetAttachment(imageDataSampleBuffer,
                                           kCGImagePropertyExifDictionary,
                                           NULL);
originalExifDict = (__bridge NSMutableDictionary *)(exifDict);
[originalExifDict objectForKey:(NSString *)kCGImagePropertyDPIHeight];
[originalExifDict objectForKey:(NSString *)kCGImagePropertyDPIWidth];
However, both entries in the dictionary come back as 0.
What is the correct way to find the DPI?
Thanks in advance for the help.
CGSize size;
NSNumber *width = (__bridge NSNumber *)CFDictionaryGetValue(exifDict, kCGImagePropertyDPIWidth);
NSNumber *height = (__bridge NSNumber *)CFDictionaryGetValue(exifDict, kCGImagePropertyDPIHeight);
size.width = [width floatValue];
size.height = [height floatValue];
// Let me know whether this works.
The information isn't in the metadata that comes with your imageDataSampleBuffer. It is written (as 72 dpi) at the time the image is saved, unless you have first set it yourself by editing the metadata before the save.
For most purposes it is meaningless. However, some software uses it to calculate the "correct size" of an image when placing it in a document: a 3000-pixel-square image at 300 dpi will appear 10 inches (c. 25.4 cm) square; at 72 dpi it will be nearly 42 inches (c. 105.8 cm) square. Also, some online image uploaders (especially those used by stock photo libraries and the like) insist on images having a high-ish dpi.
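If you do need a specific dpi in the saved file, you can write it yourself with ImageIO before saving. A minimal sketch, assuming you already have a UIImage named image and want 300 dpi (the property keys are real ImageIO constants; the 300 value is just an example):

#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeJPEG

NSDictionary *properties = @{
    (NSString *)kCGImagePropertyDPIWidth  : @300,
    (NSString *)kCGImagePropertyDPIHeight : @300,
};
NSMutableData *jpegData = [NSMutableData data];
CGImageDestinationRef destination =
    CGImageDestinationCreateWithData((__bridge CFMutableDataRef)jpegData,
                                     kUTTypeJPEG, 1, NULL);
// The properties passed here are written into the output file's metadata.
CGImageDestinationAddImage(destination, image.CGImage,
                           (__bridge CFDictionaryRef)properties);
CGImageDestinationFinalize(destination);
CFRelease(destination);
// jpegData now contains a JPEG tagged with 300 dpi.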
If you are using UIImagePickerController, use the code below:
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL
         resultBlock:^(ALAsset *asset) {
             NSDictionary *metadata = asset.defaultRepresentation.metadata;
             NSMutableDictionary *imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:metadata];
             NSLog(@"imageMetaData from AssetLibrary %@", imageMetadata);
             NSNumber *dpi = [imageMetadata objectForKey:@"DPIHeight"];
             NSLog(@"Dpi: %@", dpi);
         }
        failureBlock:^(NSError *error) {
            NSLog(@"error %@", error);
        }];
I have a strange issue getting UIImages with PHImageManager.
Everything works fine when the requested UIImage has not been edited in the iPhone's Photos app:
[[PHImageManager defaultManager] requestImageForAsset:[assets objectAtIndex:0] targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:requestOptions resultHandler:^void(UIImage *image, NSDictionary *info)
{
    DebugLog(@"%ld", assets.count);
    @autoreleasepool
    {
        if (image)
        {
            // Write image to disk
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *fileName = [NSString stringWithFormat:@"%ld_%@.jpg", (long)assets.count, [NSDate date]];
            NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:fileName];
            [UIImageJPEGRepresentation(image, 0.5) writeToFile:filePath atomically:YES];
        }
    }
}];
But if the user has edited the photo with the iPhone's built-in photo editor (e.g. cropped it), PHImageManager fails to retrieve that photo.
Without editing, the info dictionary provided to the resultHandler is normal:
{
PHImageFileOrientationKey = 0;
PHImageFileSandboxExtensionTokenKey = "5565e83d376d8c4e018b8ea41401e58e5aabbc18;00000000;00000000;000000000000001a;com.apple.app-sandbox.read;00000001;01000003;000000000039a762;/private/var/mobile/Media/DCIM/113APPLE/IMG_3433.PNG";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/113APPLE/IMG_3433.PNG";
PHImageFileUTIKey = "public.png";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultOptimizedForSharing = 0;
PHImageResultRequestIDKey = 55;
PHImageResultWantedImageFormatKey = 9999;
}
But after editing in the Photos app, the dictionary becomes:
{
PHImageFileOrientationKey = 0;
PHImageResultDeliveredImageFormatKey = 4031;
PHImageResultIsDegradedKey = 1;
PHImageResultRequestIDKey = 56;
PHImageResultWantedImageFormatKey = 9998;
}
Has anyone faced this issue before?
Thank you for your replies.
Okay, I found out what happens there.
PHImageRequestOptions has an option to request the unadjusted version of the photo:
requestOptions.version = PHImageRequestOptionsVersionUnadjusted;
With this, I successfully get the image (albeit in its unadjusted form).
If anyone finds a solution that still returns the adjusted image, please comment.
Cheers.
I am too late to help you now, but maybe someone else will have the same issue in the future.
Instead of PHImageRequestOptionsVersionUnadjusted, use PHImageRequestOptionsDeliveryModeHighQualityFormat.
Or, in Swift:
requestOptions.deliveryMode = PHImageRequestOptionsDeliveryMode.highQualityFormat
If you check the documentation for highQualityFormat:
Photos provides only the highest-quality image available, regardless of how much time it takes to load. If the isSynchronous property is true or if using the requestImageData(for:options:resultHandler:) method, this behavior is the default and only option (that is, specifying other delivery mode options has no effect).
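Putting it together in Objective-C, a minimal sketch (assuming asset is the PHAsset you want; the networkAccessAllowed line is optional but helps when the original lives only in iCloud):

PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
requestOptions.networkAccessAllowed = YES; // allow download if the original is in iCloud

[[PHImageManager defaultManager] requestImageForAsset:asset
                                           targetSize:PHImageManagerMaximumSize
                                          contentMode:PHImageContentModeDefault
                                              options:requestOptions
                                        resultHandler:^(UIImage *image, NSDictionary *info) {
    // With this delivery mode the handler is called once, with the
    // full-quality (adjusted) image, instead of with degraded previews first.
    if (image) {
        // use the image
    }
}];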
I am exporting a QuickTime video using AVAssetExportSession and setting the metadata on it as follows:
AVMutableMetadataItem *newMetaDataCommentItem = [[AVMutableMetadataItem alloc] init];
[newMetaDataCommentItem setKeySpace:AVMetadataKeySpaceQuickTimeMetadata];
[newMetaDataCommentItem setKey:AVMetadataQuickTimeMetadataKeyComment];
[newMetaDataCommentItem setValue:@"Test metadata value"];
NSMutableArray *metaData = [NSMutableArray array];
[metaData addObject:newMetaDataCommentItem];
exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                            presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = [[SNMovieManager instance] urlForFinalMovie];
exporter.metadata = metaData;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = video;
I then import the video file to my Mac and run mdls on it, and I can see the value has been set correctly: kMDItemComment = "Test metadata value"
The bit I can't do is read that value back. I am using the following to read the file; the asset is correct, but the metadata property is always an empty dictionary.
[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if ([[result valueForProperty:@"ALAssetPropertyType"] isEqualToString:@"ALAssetTypeVideo"])
    {
        ALAssetRepresentation *rep = result.defaultRepresentation;
        NSDictionary *metadata = rep.metadata;
        [images addObject:(id)rep.fullScreenImage];
    }
}];
Does anyone know if I am taking the correct approach here? If not, what is the correct approach to read this comment back out?
Thanks
Simon
It would be appreciated if you could provide more of the code related to the photo-library save process.
Otherwise, there is only one answer: metadata will return nil if the representation is one that the system cannot interpret.
The returned dictionary holds the properties of the video at a specified location in a file source.
I think your problem is in the metadata-reading code.
You should get an AVURLAsset first and read the metadata from it; ALAssetRepresentation's metadata is different.
[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if ([[result valueForProperty:@"ALAssetPropertyType"] isEqualToString:@"ALAssetTypeVideo"])
    {
        AVURLAsset *videoAsset = [AVURLAsset assetWithURL:[[result defaultRepresentation] url]];
        if ([[videoAsset metadataForFormat:AVMetadataFormatQuickTimeMetadata] count]) {
            AVMetadataItem *meta = [[videoAsset metadataForFormat:AVMetadataFormatQuickTimeMetadata] objectAtIndex:0];
            NSLog(@"%@", meta);
            NSLog(@"%lu", (unsigned long)[[videoAsset metadataForFormat:AVMetadataFormatQuickTimeMetadata] count]);
        }
    }
}];
I'm having a problem getting the latitude and longitude data from an image (one that has geolocation details). I have imported the EXIF framework and am using the following code:
NSData *jpegData = UIImageJPEGRepresentation(image, 0.5);
EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData:jpegData];
EXFMetaData *exifData = jpegScanner.exifMetaData;
id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
NSLog(@"Latitude: %@ Longitude: %@", latitudeValue, longitudeValue);
But it returns NULL for both latitude and longitude. Can anyone please tell me what I'm doing wrong in the above code? Please help me out. Thanks in advance!
You can do it with the ALAssetsLibrary framework (note that the result block must be declared before it is used):
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *photoAsset) {
    CLLocation *location = [photoAsset valueForProperty:ALAssetPropertyLocation];
    NSMutableDictionary *exifDataDict = [[NSMutableDictionary alloc] init];
    if (location != nil) {
        [exifDataDict setObject:[NSNumber numberWithDouble:location.coordinate.latitude] forKey:@"latitude"];
        [exifDataDict setObject:[NSNumber numberWithDouble:location.coordinate.longitude] forKey:@"longitude"];
    }
};
[assetsLibrary assetForURL:photoUrl resultBlock:resultBlock failureBlock:nil];
I had a similar issue once. While dealing with it, I got the impression that UIImage strips all or some of the EXIF data. EXIFJpeg worked fine for me when the image data was read directly from a file, bundle, web service, etc., but I did not manage to extract any reasonable EXIF when I kept the image in memory as a UIImage object and then used UIImageJPEGRepresentation to get the image data and read the EXIF from that data.
I will not sign this in blood, but that was my impression, and using the "raw" data from the file did actually work for me. So I received the file from some server, extracted the EXIF including geotags (if any), and only after that created the UIImage.
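If you would rather avoid a third-party EXIF parser, the same "read from the raw data" approach works with ImageIO. A minimal sketch, assuming rawData is the NSData you received from the server, before any UIImage is created from it:

#import <ImageIO/ImageIO.h>

CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)rawData, NULL);
if (source != NULL) {
    NSDictionary *properties = (__bridge_transfer NSDictionary *)
        CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    // The GPS tags live in their own sub-dictionary of the image properties.
    NSDictionary *gps = properties[(NSString *)kCGImagePropertyGPSDictionary];
    NSNumber *latitude  = gps[(NSString *)kCGImagePropertyGPSLatitude];
    NSNumber *longitude = gps[(NSString *)kCGImagePropertyGPSLongitude];
    NSLog(@"Latitude: %@ Longitude: %@", latitude, longitude);
    CFRelease(source);
}

Remember that these values are unsigned; combine them with kCGImagePropertyGPSLatitudeRef / kCGImagePropertyGPSLongitudeRef ("N"/"S", "E"/"W") to get signed coordinates.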
I am updating my app to allow photo uploads to include GPS metadata when using UIImagePickerControllerSourceTypeSavedPhotosAlbum. The accuracy of the GPS data is very important. I am running into an issue where the location data derived using ALAsset is different from the photo's actual EXIF data that I can see when opening the same photo in Photoshop.
I have used two methods to read the GPS data in Xcode:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
    CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
    latitudeString = [NSString stringWithFormat:@"%g", location.coordinate.latitude];
    longitudeString = [NSString stringWithFormat:@"%g", location.coordinate.longitude];
};
AND
ALAssetRepresentation *representation = [myasset defaultRepresentation];
NSDictionary *metadata = [representation metadata];
NSDictionary *gpsDict = [metadata objectForKey:@"{GPS}"];
NSNumber *latitudeNumber = [gpsDict objectForKey:@"Latitude"];
NSNumber *longitudeNumber = [gpsDict objectForKey:@"Longitude"];
if ([[gpsDict valueForKey:@"LatitudeRef"] isEqualToString:@"S"])
{
    //latitudeNumber = [NSNumber numberWithDouble:-latitudeNumber.doubleValue];
}
if ([[gpsDict valueForKey:@"LongitudeRef"] isEqualToString:@"W"])
{
    //longitudeNumber = [NSNumber numberWithDouble:-longitudeNumber.doubleValue];
}
On a representative photo I am using as an example, both sets of code above give me a latitude of 47.576333, which converts to 47,34,35 N.
If I look at the EXIF data in Photoshop, the latitude is 47,34,59 N.
These numbers are close, but they aren't the same. This happens with about 30% of my photos. Any idea why?
Edit: Photoshop does not give seconds; it gives 34.59 minutes, which is indeed accurate.
Your conversion is wrong; Photoshop is more correct.
47.576333 (decimal degrees, DEG) converts to 47° 34.5799' (DM), which can be rounded to 47° 34.58',
which is the format Photoshop evidently displays.
Converted to DMS, it gives your value: 47° 34' 35" N.
So you mixed up the DMS (degrees, minutes, seconds) representation with DM (degrees, decimal minutes).
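A small sketch of the arithmetic, using the latitude from the question, so the rounding is explicit:

double decimalDegrees = 47.576333;
int degrees = (int)decimalDegrees;                        // 47
double decimalMinutes = (decimalDegrees - degrees) * 60;  // 34.57998... -> 47° 34.58' (DM)
int minutes = (int)decimalMinutes;                        // 34
double seconds = (decimalMinutes - minutes) * 60;         // 34.8 -> rounds to 35 (DMS)
NSLog(@"DM:  %d° %.2f'", degrees, decimalMinutes);        // 47° 34.58'
NSLog(@"DMS: %d° %d' %.0f\"", degrees, minutes, seconds); // 47° 34' 35"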
Let's say I want to find out the size of an image up front, so that if a user tries to load a 10,000 x 10,000-pixel image in my iPad app I can present them with a dialog rather than crash. [UIImage imageNamed:] or [UIImage imageWithContentsOfFile:] would load my potentially large image into memory immediately.
If I use Core Image instead, say like this:
CIImage *ciImage = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:imgPath]];
Then ask my new CIImage for its size:
CGSize imgSize = ciImage.extent.size;
Will that load the entire image into memory to tell me this, or will it just look at the metadata of the file to discover the size of the image?
The imageWithContentsOfURL function loads the image into memory, yes.
Fortunately, Apple introduced CGImageSource in iOS 4 for reading image metadata without loading the actual pixel data into memory; you can read about how to use it in this blog post (conveniently, it provides a code sample for getting image dimensions).
EDIT: Pasted code sample here to protect against link rot:
#import <ImageIO/ImageIO.h>

NSURL *imageFileURL = [NSURL fileURLWithPath:...];
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageFileURL, NULL);
if (imageSource == NULL) {
    // Error loading image
    ...
    return;
}
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache, nil];
CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
if (imageProperties) {
    NSNumber *width = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
    NSNumber *height = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
    NSLog(@"Image dimensions: %@ x %@ px", width, height);
    CFRelease(imageProperties);
}
CFRelease(imageSource);
The full API reference is also available here.
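For the stated use case (warning the user instead of crashing), the snippet above is easy to wrap in a guard. A hypothetical helper, where maxDimension is whatever limit your app chooses and the method name is made up for illustration:

// Returns YES if either pixel dimension exceeds maxDimension.
// Only the file's metadata is read; no pixel data is decoded.
- (BOOL)imageAtURL:(NSURL *)url exceedsDimension:(CGFloat)maxDimension {
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    if (source == NULL) return NO; // unreadable file; handle separately
    NSDictionary *options = @{ (NSString *)kCGImageSourceShouldCache : @NO };
    NSDictionary *properties = (__bridge_transfer NSDictionary *)
        CGImageSourceCopyPropertiesAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    CFRelease(source);
    NSNumber *width  = properties[(NSString *)kCGImagePropertyPixelWidth];
    NSNumber *height = properties[(NSString *)kCGImagePropertyPixelHeight];
    return (width.floatValue > maxDimension) || (height.floatValue > maxDimension);
}

If this returns YES, present your dialog and skip [UIImage imageWithContentsOfFile:] entirely; the large image is never decoded into memory.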