I use setImageData:metadata:completionBlock: of ALAsset to update the EXIF (metadata) of an asset.
I only want to update the metadata, but this method requires image data as its first parameter. I used the code below to generate that data, but it modified my image (I checked the file size and file hash).
ALAssetRepresentation *dr = asset.defaultRepresentation;
// Decoding to a UIImage and re-encoding with UIImageJPEGRepresentation produces new bytes,
// which is why the size and hash change.
UIImage *image = [UIImage imageWithCGImage:dr.fullResolutionImage scale:dr.scale orientation:(UIImageOrientation)dr.orientation];
NSData *data = UIImageJPEGRepresentation(image, 1);
Is there any other method I could use to update just the EXIF of an ALAsset, or some way to generate the right imageData for setImageData:metadata:completionBlock:?
I found a way to generate imageData. Code below:
Byte *buffer = (Byte *)malloc((size_t)dr.size);
NSUInteger k = [dr getBytes:buffer fromOffset:0 length:(NSUInteger)dr.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:k freeWhenDone:YES];
So I can use the data above with setImageData:metadata:completionBlock: to update only the EXIF of the ALAsset.
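For completeness, a minimal sketch of the whole update, assuming asset is an editable ALAsset and newMetadata is the metadata dictionary to write (both names are illustrative):

ALAssetRepresentation *rep = asset.defaultRepresentation;
Byte *rawBuffer = (Byte *)malloc((size_t)rep.size);
NSUInteger bytesRead = [rep getBytes:rawBuffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
NSData *originalData = [NSData dataWithBytesNoCopy:rawBuffer length:bytesRead freeWhenDone:YES];

// setImageData:metadata:completionBlock: only works on assets the app is allowed to modify.
if (asset.editable) {
    [asset setImageData:originalData
               metadata:newMetadata
        completionBlock:^(NSURL *assetURL, NSError *error) {
            NSLog(@"updated asset: %@, error: %@", assetURL, error);
        }];
}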
Related
I am converting an image into a base64 string like this.
ALAssetRepresentation *rep = [imageAsset defaultRepresentation];
Byte *buffer = (Byte *)malloc((size_t)rep.size);
UIImage *copyOfOriginalImage = [UIImage imageWithCGImage:[rep fullResolutionImage]];
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
// Note: buffer/buffered are never used below (and buffer leaks);
// what actually gets encoded is a freshly re-encoded JPEG, not the original bytes.
imageData = UIImageJPEGRepresentation(copyOfOriginalImage, 1.0);
Then I convert this NSData into a base64 string like this.
strEncoded = [imageData base64EncodedStringWithOptions:NSDataBase64Encoding76CharacterLineLength]; // the bare literal 76 is not a valid option value; the options are bit flags
I used an online base64 converter to test my image (the same image in the online tool and within my app).
FROM THE TOOL
9j/4AAQSkZJRgABAQAASABIAAD/4QBYRXhpZgAATU0AKgAAAAgAAgESAAMAAAABAAYAAIdpAAQAAAABAAAAJgAAAAAAA6ABAAMAAAABAAEAAKACAAQAAAABAAAMwKADAAQAAAABAAAJkAAAAAD/7QA4
FROM THE APP
\/9j\/4AAQSkZJRgABAQAASABIAAD\/4QBMRXhpZgAATU0AKgAAAAgAAgESAAMAAAABAAEAAIdpAAQAAAABAAAAJgAAAAAAAqACAAQAAAABAAAMwKADAAQAAAABAAAJkAAAAAD\/
The starting portions are different, and the string from the online tool is accepted by my server, but the one from the app is not.
What is the reason for this?
Please help me.
Thanks
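For what it's worth, the first thread above points at one likely culprit: the app is encoding a freshly re-encoded JPEG rather than the original file bytes. A sketch that base64-encodes the raw asset representation instead (assuming the same imageAsset as above) would be:

ALAssetRepresentation *rep = [imageAsset defaultRepresentation];
Byte *buffer = (Byte *)malloc((size_t)rep.size);
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
NSData *rawData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
// NSDataBase64Encoding76CharacterLineLength matches the 76-column output many online tools produce.
NSString *strEncoded = [rawData base64EncodedStringWithOptions:NSDataBase64Encoding76CharacterLineLength];

Separately, the \/ sequences in the app output look like JSON string escaping applied somewhere in transit, which the server may not be unescaping.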
My server is giving me my jpg image in the following NSData format:
/9j/4AAQSkZJRgABAQAASABIAAD/4QBMRXhpZgAATU0AKgAAAAgAAgESAAMAAAABAAEAAIdpAAQA
AAABAAAAJgAAAAAAAqACAAQAAAABAAAGqKADAAQAAAABAAAI4AAAAAD/7QA4UGhvdG9zaG9wIDMu
MAA4QklNBAQAAAAAAAA4QklNBCUAAAAAABDUHYzZjwCyBOmACZjs+EJ+/8AAEQgI4AaoAwERAAIR
AQMRAf/EAB8AAAEFAQEBAQEBAAAAAAAAAAABAgMEBQYHCAkKC//EALUQAAIBAwMCBAMFBQQEAAAB
fQECAwAEEQUSITFBBhNRYQcicRQygZGhCCNCscEVUtHwJDNicoIJChYXGBkaJSYnKCkqNDU2Nzg5/
I am saving it to a file, and when reading that file back, the img object in my code below is nil, although the imgData object holds the saved data.
- (void)selectedAttachedFiledownloadedSuccessfully
{
    NSLog(@"\nFile has downloaded\n");
    NSData *imgData = [NSData dataWithContentsOfFile:[self pathOfTheImage]];
    NSString *imageExt = [self contentTypeForImageData:imgData];
    UIImage *img = [UIImage imageWithData:imgData];
    self.imgView.image = img;
}
Checking the NSData against the image format signatures, it does not match any of them, and my code below returns nil.
- (NSString *)contentTypeForImageData:(NSData *)data {
    uint8_t c;
    [data getBytes:&c length:1];
    switch (c) {
        case 0xFF:
            return @"image/jpeg";
        case 0x89:
            return @"image/png";
        case 0x47:
            return @"image/gif";
        case 0x49:
        case 0x4D:
            return @"image/tiff";
    }
    return nil;
}
I don't know what I am doing wrong here. Can anyone guide me through this, please?
You might have used a string encoding (such as NSUTF8StringEncoding) somewhere when storing the data.
Image bytes are binary, not text, so they should not be passed through a string encoding at all.
The NSData you are getting back from the server might be corrupted. If the data is not in the proper format, it will not give you back a proper image.
Use + (instancetype)dataWithContentsOfFile:(NSString *)path options:(NSDataReadingOptions)mask error:(NSError * _Nullable *)errorPtr to read your image back from the file and check the value of errorPtr.
Refer to this link for an explanation of the method: https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSData_Class/#//apple_ref/occ/clm/NSData/dataWithContentsOfFile:options:error:
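A short sketch of that error-checked read, reusing the asker's pathOfTheImage helper:

NSError *error = nil;
NSData *imgData = [NSData dataWithContentsOfFile:[self pathOfTheImage]
                                         options:NSDataReadingMappedIfSafe
                                           error:&error];
if (!imgData) {
    NSLog(@"failed to read image file: %@", error);
}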
The server was sending me the image as a base64Binary string. I converted it to NSData as:
NSData *data = [[NSData alloc] initWithBase64EncodedString:output[1] options:NSDataBase64DecodingIgnoreUnknownCharacters];
I save this data to my image file as:
[data writeToFile:filePath atomically:YES];
I read it back to show it in the image view as:
- (void)showImage
{
    NSData *imgData = [NSData dataWithContentsOfFile:[self pathOfTheImage]];
    UIImage *img = [UIImage imageWithData:imgData];
    self.imgView.image = img;
}
It's working fine now.
Scenario:
I have an image in the iPhone camera roll. I access it using ALAssetsLibrary and get an ALAsset object. I get a UIImage and an NSData object from it using something like the following code.
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}
I then copy the image from the camera roll onto my Mac using Image Capture. I then use NSImage in my Mac code to open the copied image and try to get an NSData representation using the following code.
NSImage *image = [[NSImage alloc] initWithContentsOfURL:fileURL];
NSBitmapImageRep *imgRep = [[image representations] objectAtIndex:0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];
Problem:
Unfortunately, the two NSData representations I get are very different. I want to be able to get the same NSData representation in both cases (since it is the same file). I can then hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally I would want the following two functions:
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or
-(NSData *) getDataFromUIImage:(UIImage*)image;
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or
-(NSData *) getDataFromNSImage:(NSImage*)image;
Such that the NSData* representation I get in OS X and iOS are exactly the same given that they come from the same source image.
What I have tried:
I have tried to play around with how I get the UIImage object from the ALAsset object, and I have tried UIImagePNGRepresentation (and the corresponding call for getting NSData in OS X). I have also tried different parameters for getting the representation in OS X, but nothing has come through. I have also tried to create a CGImageRef on both platforms, convert those to bitmaps, and read them pixel by pixel, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a different coordinate system).
I did eventually find a way to do what I wanted. The ALAssetRepresentation class's getBytes:fromOffset:length:error: method can be used to get an NSData object that is the same as [NSData dataWithContentsOfURL:fileURL] in OS X. Note that doing it from the UIImage is not possible, since UIImage performs some processing on the image. Here is what the requested functions would look like.
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc((size_t)rep.size);
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url
{
    return [NSData dataWithContentsOfURL:url];
}
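Since the stated goal is to hash the two NSData objects and compare, here is a sketch of that final step; CC_SHA1 from CommonCrypto is available on both iOS and OS X (the helper name is illustrative):

#import <CommonCrypto/CommonDigest.h>

static NSString *SHA1HexForData(NSData *data) {
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}

Two files match when the digests computed on each platform are equal.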
I have a .png file in my resources folder (the actual size is 411 KB).
When I convert the UIImage to NSData and access its length property, it gives me the wrong value.
Code...
UIImage *image = [UIImage imageNamed:@"sample.png"];
NSData *imgData = [[NSData alloc] initWithData:UIImageJPEGRepresentation(image, 1.0)];
NSUInteger imageSize = imgData.length;
NSLog(@"Image size in KB is %lu", (unsigned long)(imageSize / 1024)); // -------- returns 631 KB
Please let me know if there is any other property that helps.
So here is my requirement:
I want to know the size of the image I pick from UIImagePickerController. The exact size of the image when I see it in the Finder and the size returned to me after picking it from the library are totally different. Is there any other property that can be used instead of length?
You are converting a PNG to a JPEG, so a different file size should be expected.
If you wish to get the file size of the original PNG image, do the following.
NSString *path = [[NSBundle mainBundle] pathForResource:@"sample" ofType:@"png"];
NSData *rawData = [NSData dataWithContentsOfFile:path];
NSLog(@"%lu", (unsigned long)rawData.length);
try this:
unsigned int len = (unsigned int)[data length];
uint32_t little = (uint32_t)NSSwapHostIntToLittle(len);
NSData *byteData = [NSData dataWithBytes:&little length:4];
When you loaded the image you decompressed it. When you created "imgData" the image did not get recompressed with the same algorithm. There would be no reason to expect the two to have the same size.
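A small demonstration of that point (the file name is illustrative): the bytes on disk and the bytes after a decode/re-encode round trip will generally differ in size.

NSString *path = [[NSBundle mainBundle] pathForResource:@"sample" ofType:@"png"];
NSData *onDisk = [NSData dataWithContentsOfFile:path];       // original compressed bytes
UIImage *decoded = [UIImage imageWithContentsOfFile:path];   // decompressed bitmap
NSData *reencoded = UIImageJPEGRepresentation(decoded, 1.0); // freshly compressed JPEG
NSLog(@"on disk: %lu bytes, re-encoded: %lu bytes", (unsigned long)onDisk.length, (unsigned long)reencoded.length);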
In order to prevent lagging in my app, I'm trying to compress images larger than 1 MB (mostly pics taken with the iPhone's normal camera).
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageSize = UIImageJPEGRepresentation(image, 1);
NSLog(@"original size %lu", (unsigned long)[imageSize length]);
UIImage *image2 = [UIImage imageWithData:UIImageJPEGRepresentation(image, 0)];
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1);
NSLog(@"new size %lu", (unsigned long)[newImageSize length]);
UIImage *image3 = [UIImage imageWithData:UIImageJPEGRepresentation(image2, 0)];
NSData *newImageSize2 = UIImageJPEGRepresentation(image3, 1);
NSLog(@"new size %lu", (unsigned long)[newImageSize2 length]);
picView = [[UIImageView alloc] initWithImage:image3];
However, the NSLog I get outputs something along the lines of
original size 3649058
new size 1835251
new size 1834884
The difference between the 1st and 2nd compression is almost negligible. My goal is to get the image size below 1 MB. Did I overlook something, or is there an alternative approach to achieve this?
EDIT: I want to avoid scaling the image's height and width, if possible.
A couple of thoughts:
The UIImageJPEGRepresentation function does not return the "original" image. For example, if you employ a compressionQuality of 1.0, it does not, technically, return the "original" image, but rather it returns a JPEG rendition of the image with compressionQuality at its maximum value. This can actually yield an object that is larger than the original asset (at least if the original image is a JPEG). You're also discarding all of the metadata (information about where the image was taken, the camera settings, etc.) in the process.
If you want the original asset, you should use PHImageManager:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
PHAsset *asset = [result firstObject];
PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSString *filename = [(NSURL *)info[@"PHImageFileURLKey"] lastPathComponent];
    // do what you want with the `imageData`
}];
In iOS versions prior to 8, you'd have to use assetForURL of the ALAssetsLibrary class:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    NSLog(@"size of original asset %llu", [representation size]);

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:
    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream
    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    NSUInteger bytesRead;
    while (bytesRemaining > 0) {
        bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
Please note that this assetForURL runs asynchronously.
If you want an NSData with compression, you can use UIImageJPEGRepresentation with a compressionQuality less than 1.0. Your code actually does this with a compressionQuality of 0.0, which should offer maximum compression. But you don't save that NSData; instead you use it to create a UIImage and then get a new UIImageJPEGRepresentation with a compressionQuality of 1.0, thus losing much of the compression you originally achieved.
Consider the following code:
// a UIImage of the original asset (discarding metadata)
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

// this may well be larger than the original asset
NSData *jpgDataHighestCompressionQuality = UIImageJPEGRepresentation(image, 1.0);
[jpgDataHighestCompressionQuality writeToFile:[docsPath stringByAppendingPathComponent:@"imageDataFromJpeg.jpg"] atomically:YES];
NSLog(@"compressionQuality = 1.0; length = %lu", (unsigned long)[jpgDataHighestCompressionQuality length]);

// this will be smaller, but with some loss of data
NSData *jpgDataLowestCompressionQuality = UIImageJPEGRepresentation(image, 0.0);
NSLog(@"compressionQuality = 0.0; length = %lu", (unsigned long)[jpgDataLowestCompressionQuality length]);

UIImage *image2 = [UIImage imageWithData:jpgDataLowestCompressionQuality];

// ironically, this will be larger than jpgDataLowestCompressionQuality
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1.0);
NSLog(@"new size %lu", (unsigned long)[newImageSize length]);
In addition to the JPEG compression quality outlined in the prior point, you could also just resize the image, and you can marry this with the JPEG compressionQuality too; see the sketch below.
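A sketch of that resize-plus-quality approach, assuming a uniform scale factor (the method name is illustrative):

- (NSData *)jpegDataForImage:(UIImage *)image scale:(CGFloat)scale quality:(CGFloat)quality {
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);
    UIGraphicsBeginImageContextWithOptions(newSize, YES, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImageJPEGRepresentation(resized, quality);
}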
You cannot compress the image again and again; if that worked, everything could be compressed over and over, and how small do you think it would get?
One way to make your image smaller is to change its size, for example from 640x960 to 320x480. But you will lose quality.
I would first apply UIImageJPEGRepresentation(image, 0.75), and then change the size, perhaps to two-thirds or half of the image's width and height.
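If scaling really must be avoided (per the asker's edit), one common pattern, offered as an assumption rather than something from the answers above, is to step the quality down from the original image until the data fits the 1 MB budget:

CGFloat quality = 1.0;
NSData *data = UIImageJPEGRepresentation(image, quality);
// Always re-encode from the original image, not from previously compressed data,
// so each step actually reduces the size.
while (data.length > 1024 * 1024 && quality > 0.1) {
    quality -= 0.1;
    data = UIImageJPEGRepresentation(image, quality);
}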