When a user makes some changes (cropping, red-eye removal, ...) to photos in the built-in Photos.app on iOS, the changes are not applied to the fullResolutionImage returned by the corresponding ALAssetRepresentation.
However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation.
Furthermore, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary via the key @"AdjustmentXMP".
I would like to apply these changes to the fullResolutionImage myself to preserve consistency. I've found out that on iOS 6+, CIFilter's filterArrayFromSerializedXMP:inputImageExtent:error: can convert this XMP metadata into an array of CIFilter objects:
ALAssetRepresentation *rep; // the asset's default representation, obtained elsewhere
NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];

CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];

NSError *error = nil;
NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                             inputImageExtent:image.extent
                                                        error:&error];
if (error) {
    NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
}

CIContext *context = [CIContext contextWithOptions:nil];

for (CIFilter *filter in filterArray) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = [filter outputImage];
}
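To actually use the result, the filtered CIImage still has to be rendered back into a CGImage with the context created above; a minimal sketch:
// Render the filtered CIImage back to a CGImage (the caller owns the returned image).
CGImageRef filteredCGImage = [context createCGImage:image fromRect:[image extent]];
UIImage *filteredImage = [UIImage imageWithCGImage:filteredCGImage
                                             scale:rep.scale
                                       orientation:(UIImageOrientation)rep.orientation];
CGImageRelease(filteredCGImage);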
However, this works only for some filters (cropping, auto-enhance) but not for others like red-eye removal. In these cases, the CIFilters have no visible effect. Therefore, my questions are:
Is anyone aware of a way to create a red-eye removal CIFilter, in a way consistent with the Photos.app? The filter with the key kCIImageAutoAdjustRedEye is not enough; e.g., it does not take parameters for the position of the eyes.
Is there a possibility to generate and apply these filters under iOS 5?
ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];
// Create a buffer to hold the data for the asset's image.
uint8_t *buffer = (uint8_t *)malloc(representation.size);
// Copy the data from the asset into the buffer.
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];
if (length == 0)
    return nil;
// Convert the buffer into an NSData object, which takes ownership and frees the buffer when done.
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];
// Set up a dictionary with a UTI hint. The UTI hint identifies the type
// of image we are dealing with (that is, a JPEG, PNG, or possibly a RAW file).
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint, nil];
// Create a CGImageSource with the NSData. An image source can
// contain any number of thumbnails and full images.
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);
[adata release];
// Get a copy of the image properties from the CGImageSourceRef.
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
CFNumberRef imageWidth = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);
int w = 0;
int h = 0;
CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);
// Clean up memory.
CFRelease(imagePropertiesDictionary);
CFRelease(sourceRef);
Related
Is there any way to get the same NSData of an image (JPG, PNG) after saving it with PHPhotoLibrary?
Of course, iOS will modify some metadata and EXIF data (timestamp, etc.) after saving, but I'm asking about the UIImage data (including the same EXIF data).
I didn't copy the EXIF in my code here, but it doesn't work either way.
So let's walk through the code.
Save the image and get its hash:
UIImage *tmp = [[UIImage alloc] initWithData:tmpData];
tmpData = UIImageJPEGRepresentation(tmp, 1.0);
self.str1 = [tmpData MD5];

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetResourceCreationOptions *options = [[PHAssetResourceCreationOptions alloc] init];
    options.originalFilename = @"XXX";
    PHAssetCreationRequest *createReq = [PHAssetCreationRequest creationRequestForAsset];
    [createReq addResourceWithType:PHAssetResourceTypePhoto data:tmpData options:options];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    NSLog(@":%d", success);
}];
Load the same image:
[asset requestContentEditingInputWithOptions:nil completionHandler:^(PHContentEditingInput * _Nullable contentEditingInput, NSDictionary * _Nonnull info) {
    PHImageRequestOptions *option = [[PHImageRequestOptions alloc] init];
    option.synchronous = YES;
    option.version = PHImageRequestOptionsVersionOriginal;
    option.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    option.resizeMode = PHImageRequestOptionsResizeModeNone;
    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:option resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
        UIImage *image = [UIImage imageWithData:imageData];
        NSData *tmpDAt = UIImageJPEGRepresentation(image, 1.0);
        NSString *md5 = [tmpDAt MD5];
        if ([md5 isEqualToString:self.str1]) {
            NSLog(@"My Expectation");
        }
    }];
}];
The interesting thing I found is that if I crop my image to 1×1 as a test, I receive an error (JPEGDecompressSurface : Picture decode failed:) during save (it seems the OS can't modify the image), so in that case I do get the same hash before and after saving!
I presume the difference is due to your JPEGs having different timestamps (and possibly other differences) in their EXIF metadata.
Have you tried using UIImagePNGRepresentation instead of UIImageJPEGRepresentation? Hopefully PNG representations will match.
JPEG compression is a lossy form of compression. Every time you convert to JPEG you will lose data; there is no way around it. Removing PHPhotoLibrary from the equation, if you run the following:
UIImage *tmp = [[UIImage alloc] initWithData:tmpData];
tmpData = UIImageJPEGRepresentation(tmp, 1.0);
str1 = [tmpData MD5];

tmp = [[UIImage alloc] initWithData:tmpData];
tmpData = UIImageJPEGRepresentation(tmp, 1.0);
str2 = [tmpData MD5];
You will find that str1 and str2 are different.
If you want the same data, you will have to either keep the original JPEG data that generated the image or use a lossless compression method like the one used in PNG files.
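For illustration, a rough sketch of the same experiment with PNG (assuming the same MD5 category used in the question): because PNG is lossless, the decoded pixels are unchanged, so re-encoding should produce the same bytes and hence the same hash, provided the encoder itself is deterministic.
UIImage *tmp = [[UIImage alloc] initWithData:tmpData];
NSData *pngData1 = UIImagePNGRepresentation(tmp);
NSString *hash1 = [pngData1 MD5]; // MD5 category as used above

tmp = [[UIImage alloc] initWithData:pngData1];
NSData *pngData2 = UIImagePNGRepresentation(tmp);
NSString *hash2 = [pngData2 MD5];

// Unlike the JPEG case, hash1 and hash2 should be equal, since no image data is lost.
NSLog(@"PNG hashes equal: %d", [hash1 isEqualToString:hash2]);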
Scenario:
I have an image in the iPhone camera roll. I access it using ALAssetsLibrary and get an ALAsset object. I get a UIImage and an NSData object from it using something like the following code.
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}
I then copy the image from the camera roll onto my Mac using Image Capture. I then use NSImage in my Mac code to open the copied image and try to get an NSData representation using the following code.
NSImage *image = [[NSImage alloc] initWithContentsOfURL:fileURL];
NSBitmapImageRep *imgRep = [[image representations] objectAtIndex:0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];
Problem:
Unfortunately, the two NSData representations I get are very different. I want to be able to get the same NSData representation in both cases (since it is the same file). I can then go on to hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally I would want the following two functions:
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or
-(NSData *) getDataFromUIImage:(UIImage*)image;
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or
-(NSData *) getDataFromNSImage:(NSImage*)image;
Such that the NSData representations I get in OS X and iOS are exactly the same, given that they come from the same source image.
What I have tried:
I have tried to play around with how I get the UIImage object from the ALAsset object, and I have tried UIImagePNGRepresentation (and the corresponding call for getting NSData in OS X). I have also tried different parameters for getting the representation in OS X, but nothing has come through. I have also tried to create a CGImageRef on both platforms, convert that to bitmaps, and read them pixel by pixel, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a different coordinate system).
I did eventually find a way to do what I wanted. The ALAssetRepresentation class's getBytes:fromOffset:length:error: method can be used to get the NSData object, which is the same as [NSData dataWithContentsOfURL:fileURL] in OS X. Note that doing it from the UIImage is not possible, since the UIImage performs some processing on the image. Here is what the requested functions would look like.
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc(rep.size);
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url
{
    return [NSData dataWithContentsOfURL:url];
}
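Since the goal is to hash the two NSData objects and compare them, here is a minimal hashing sketch using CommonCrypto; the helper name MD5HexString is my own, and any hashing scheme would do:
#import <CommonCrypto/CommonDigest.h>

// Hypothetical helper: hex MD5 digest of an NSData, usable on both iOS and OS X.
static NSString *MD5HexString(NSData *data) {
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return [hex copy];
}

// Usage sketch: the hashes should match, because both functions return the raw file bytes.
// NSString *iosHash = MD5HexString([self getDataFromALAsset:asset]);
// NSString *macHash = MD5HexString([self getDataFromFileAtURL:url]);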
I have an issue with the new iOS 7 photo filters feature.
I have a photo library in my app. While showing photo thumbnails in a UICollectionView, I receive images with filters and crops already applied. There are two methods that return "ready for use" images:
[asset thumbnail]
[[asset defaultRepresentation] fullScreenImage]
On the contrary, when I want to share the full-size image, I receive the unchanged photo without any filters, whether I use:
[[asset defaultRepresentation] fullResolutionImage]
or read the image data through getBytes:fromOffset:length:error:
Is it possible to get the full-size image with the appropriate filters applied?
So far I have figured out only one way to get what I want. All assets store their modification info (filters, crops, etc.) in the metadata dictionary under the key @"AdjustmentXMP". We can interpret this data and apply all the filters to the fullResolutionImage, as in this SO answer. Here is my complete solution:
...
ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
CGImageRef fullResImage = [assetRepresentation fullResolutionImage];
NSString *adjustment = [[assetRepresentation metadata] objectForKey:@"AdjustmentXMP"];
if (adjustment) {
    NSData *xmpData = [adjustment dataUsingEncoding:NSUTF8StringEncoding];
    CIImage *image = [CIImage imageWithCGImage:fullResImage];

    NSError *error = nil;
    NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                                 inputImageExtent:image.extent
                                                            error:&error];
    CIContext *context = [CIContext contextWithOptions:nil];

    if (filterArray && !error) {
        for (CIFilter *filter in filterArray) {
            [filter setValue:image forKey:kCIInputImageKey];
            image = [filter outputImage];
        }
        // Note: the CGImage returned by createCGImage:fromRect: is owned by the caller
        // and should eventually be released with CGImageRelease.
        fullResImage = [context createCGImage:image fromRect:[image extent]];
    }
}
UIImage *result = [UIImage imageWithCGImage:fullResImage
                                      scale:[assetRepresentation scale]
                                orientation:(UIImageOrientation)[assetRepresentation orientation]];
In order to prevent lagging in my app, I'm trying to compress images larger than 1 MB (mostly pictures taken with the iPhone's normal camera).
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageSize = UIImageJPEGRepresentation(image, 1);
NSLog(@"original size %u", [imageSize length]);

UIImage *image2 = [UIImage imageWithData:UIImageJPEGRepresentation(image, 0)];
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1);
NSLog(@"new size %u", [newImageSize length]);

UIImage *image3 = [UIImage imageWithData:UIImageJPEGRepresentation(image2, 0)];
NSData *newImageSize2 = UIImageJPEGRepresentation(image3, 1);
NSLog(@"new size %u", [newImageSize2 length]);

picView = [[UIImageView alloc] initWithImage:image3];
However, the NSLog I get outputs something along the lines of
original size 3649058
new size 1835251
new size 1834884
The difference between the 1st and 2nd compression is almost negligible. My goal is to get the image size below 1 MB. Did I overlook something, or is there an alternative approach to achieve this?
EDIT: I want to avoid scaling the image's height and width, if possible.
A couple of thoughts:
The UIImageJPEGRepresentation function does not return the "original" image. For example, if you employ a compressionQuality of 1.0, it does not, technically, return the "original" image, but rather it returns a JPEG rendition of the image with compressionQuality at its maximum value. This can actually yield an object that is larger than the original asset (at least if the original image is a JPEG). You're also discarding all of the metadata (information about where the image was taken, the camera settings, etc.) in the process.
If you want the original asset, you should use PHImageManager:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
PHAsset *asset = [result firstObject];

PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSString *filename = [(NSURL *)info[@"PHImageFileURLKey"] lastPathComponent];

    // do what you want with the `imageData`
}];
In iOS versions prior to 8, you'd have to use assetForURL of the ALAssetsLibrary class:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    NSLog(@"size of original asset %llu", [representation size]);

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:

    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream

    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    NSUInteger bytesRead;
    while (bytesRemaining > 0) {
        bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here

} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
Please note that this assetForURL runs asynchronously.
If you want an NSData with compression, you can use UIImageJPEGRepresentation with a compressionQuality less than 1.0. Your code actually does this with a compressionQuality of 0.0, which should offer maximum compression. But you don't save that NSData; instead you use it to create a UIImage and then get a new UIImageJPEGRepresentation with a compressionQuality of 1.0, thus losing much of the compression you originally achieved.
Consider the following code:
// a UIImage of the original asset (discarding meta data)
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

// this may well be larger than the original asset
NSData *jpgDataHighestCompressionQuality = UIImageJPEGRepresentation(image, 1.0);
[jpgDataHighestCompressionQuality writeToFile:[docsPath stringByAppendingPathComponent:@"imageDataFromJpeg.jpg"] atomically:YES];
NSLog(@"compressionQuality = 1.0; length = %u", [jpgDataHighestCompressionQuality length]);

// this will be smaller, but with some loss of data
NSData *jpgDataLowestCompressionQuality = UIImageJPEGRepresentation(image, 0.0);
NSLog(@"compressionQuality = 0.0; length = %u", [jpgDataLowestCompressionQuality length]);

UIImage *image2 = [UIImage imageWithData:jpgDataLowestCompressionQuality];

// ironically, this will be larger than jpgDataLowestCompressionQuality
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1.0);
NSLog(@"new size %u", [newImageSize length]);
In addition to the JPEG compression quality outlined in the prior point, you could also just resize the image, and you can combine this with the JPEG compressionQuality as well; a rough sketch follows.
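A sketch of combining both, assuming an arbitrary 50% scale factor and a 0.7 compressionQuality chosen purely for illustration:
UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
CGSize targetSize = CGSizeMake(original.size.width / 2.0, original.size.height / 2.0);

// Draw the image into a smaller bitmap context.
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
[original drawInRect:CGRectMake(0.0, 0.0, targetSize.width, targetSize.height)];
UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Encode the smaller bitmap with a reduced (but not minimum) JPEG quality.
NSData *resizedJpgData = UIImageJPEGRepresentation(resized, 0.7);
NSLog(@"resized length = %lu", (unsigned long)[resizedJpgData length]);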
You cannot keep compressing the image again and again; if that worked, how small do you think everything could get?
One way to make your image smaller is to change its size, for example from 640×960 to 320×480. But you will lose quality.
I would first apply UIImageJPEGRepresentation(image, 0.75), and then change the size, perhaps to two-thirds or half of the image's width and height.
My application should be able to write custom metadata entries to PNG images for export to the UIPasteboard.
By piecing together various posts on the subject, I've been able to come up with the class given below.
Triggering the copyPressed method with a button, I'm able to set custom metadata on JPG images (EXIF):
Image[6101:907] found jpg exif dictionary
Image[6101:907] checking image metadata on clipboard
Image[6101:907] {
ColorModel = RGB;
Depth = 8;
Orientation = 1;
PixelHeight = 224;
PixelWidth = 240;
"{Exif}" = {
ColorSpace = 1;
PixelXDimension = 240;
PixelYDimension = 224;
UserComment = "Here is a comment";
};
"{JFIF}" = {
DensityUnit = 0;
JFIFVersion = (
1,
1
);
XDensity = 1;
YDensity = 1;
};
"{TIFF}" = {
Orientation = 1;
};
}
Although I'm able to read the PNG metadata just fine, I can't seem to write to it:
Image[6116:907] found png property dictionary
Image[6116:907] checking image metadata on clipboard
Image[6116:907] {
ColorModel = RGB;
Depth = 8;
PixelHeight = 224;
PixelWidth = 240;
"{PNG}" = {
InterlaceType = 0;
};
}
However, nothing in the documentation suggests this should fail and the presence of many PNG-specific metadata constants suggests it should succeed.
My application should use PNG to avoid JPG's lossy compression.
Why can I not set custom metadata on an in-memory PNG image in iOS?
Note: I've seen this SO question, but it doesn't address the problem here, which is how to write metadata to PNG images specifically.
IMViewController.m
#import "IMViewController.h"
#import <ImageIO/ImageIO.h>
@interface IMViewController ()
@end
@implementation IMViewController
- (IBAction)copyPressed:(id)sender
{
    // [self copyJPG];
    [self copyPNG];
}
-(void)copyPNG
{
    NSData *pngData = UIImagePNGRepresentation([UIImage imageNamed:@"wow.png"]);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)pngData, NULL);
    NSDictionary *metadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSMutableDictionary *mutableMetadata = [metadata mutableCopy];
    NSMutableDictionary *dict = [[mutableMetadata objectForKey:(NSString *)kCGImagePropertyPNGDictionary] mutableCopy];
    if (dict) {
        NSLog(@"found png property dictionary");
    } else {
        NSLog(@"creating png property dictionary");
        dict = [NSMutableDictionary dictionary];
    }

    // set values on the root dictionary
    [mutableMetadata setObject:@"Name of Software" forKey:(NSString *)kCGImagePropertyPNGDescription];
    [mutableMetadata setObject:dict forKey:(NSString *)kCGImagePropertyPNGDictionary];

    // set values on the internal dictionary
    [dict setObject:@"works" forKey:(NSString *)kCGImagePropertyPNGDescription];

    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@">>> Could not create image destination <<<");
        CFRelease(source);
        return;
    }

    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)mutableMetadata);
    BOOL success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@">>> Error Writing Data <<<");
    }

    // release the Core Foundation objects created above
    CFRelease(destination);
    CFRelease(source);

    UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
    [pasteboard setData:data forPasteboardType:@"public.png"];

    [self showPNGMetadata];
}
-(void)copyJPG
{
    NSData *jpgData = UIImageJPEGRepresentation([UIImage imageNamed:@"wow.jpg"], 1);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSDictionary *metadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSMutableDictionary *mutableMetadata = [metadata mutableCopy];
    NSMutableDictionary *exif = [[mutableMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    if (exif) {
        NSLog(@"found jpg exif dictionary");
    } else {
        NSLog(@"creating jpg exif dictionary");
        exif = [NSMutableDictionary dictionary];
    }

    // set values on the exif dictionary
    [exif setObject:@"Here is a comment" forKey:(NSString *)kCGImagePropertyExifUserComment];
    [mutableMetadata setObject:exif forKey:(NSString *)kCGImagePropertyExifDictionary];

    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@">>> Could not create image destination <<<");
        CFRelease(source);
        return;
    }

    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)mutableMetadata);
    BOOL success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@">>> Could not create data from image destination <<<");
    }

    // release the Core Foundation objects created above
    CFRelease(destination);
    CFRelease(source);

    UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
    [pasteboard setData:data forPasteboardType:@"public.jpeg"];

    [self showJPGMetadata];
}
-(void)showJPGMetadata
{
    NSLog(@"checking image metadata on clipboard");
    UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
    NSData *data = [pasteboard dataForPasteboardType:@"public.jpeg"];
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    NSDictionary *metadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSLog(@"%@", metadata);
    CFRelease(source);
}
-(void)showPNGMetadata
{
    NSLog(@"checking image metadata on clipboard");
    UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
    NSData *data = [pasteboard dataForPasteboardType:@"public.png"];
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    NSDictionary *metadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSLog(@"%@", metadata);
    CFRelease(source);
}
@end
If you try to save your image with the modified metadata:
[data writeToFile:[NSTemporaryDirectory() stringByAppendingPathComponent:@"test.png"]
       atomically:YES];
and then view its properties in Finder, you will see that the kCGImagePropertyPNGDescription field was set successfully.
But if you then read the metadata of this new file back, kCGImagePropertyPNGDescription will be lost:
ColorModel = RGB;
Depth = 8;
PixelHeight = 1136;
PixelWidth = 640;
"{PNG}" = {
InterlaceType = 0;
};
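For reference, one way to read the saved file's properties back with ImageIO and see that the description is gone; this is a sketch assuming the temporary path used above:
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"test.png"];
NSURL *fileURL = [NSURL fileURLWithPath:path];
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
if (src) {
    NSDictionary *props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(src, 0, NULL);
    // The PNG dictionary no longer contains the description that was written.
    NSLog(@"%@", [props objectForKey:(__bridge NSString *)kCGImagePropertyPNGDictionary]);
    CFRelease(src);
}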
After some research I found that PNG doesn't carry metadata the way JPEG does, but it may contain XMP metadata. However, it seems that ImageIO doesn't work with XMP here.
Maybe you can try to use ImageMagick or libexif instead.
Useful links:
PNG Specification
Reading/Writing image XMP on iPhone / Objective-c
Does PNG support metadata fields like Author, Camera Model, etc?
Does PNG contain EXIF data like JPG?
libexif.sourceforge.net