NSData to UIImage Conversion is not working? - ios

My server is giving me my jpg image in the following NSData format:
/9j/4AAQSkZJRgABAQAASABIAAD/4QBMRXhpZgAATU0AKgAAAAgAAgESAAMAAAABAAEAAIdpAAQA
AAABAAAAJgAAAAAAAqACAAQAAAABAAAGqKADAAQAAAABAAAI4AAAAAD/7QA4UGhvdG9zaG9wIDMu
MAA4QklNBAQAAAAAAAA4QklNBCUAAAAAABDUHYzZjwCyBOmACZjs+EJ+/8AAEQgI4AaoAwERAAIR
AQMRAf/EAB8AAAEFAQEBAQEBAAAAAAAAAAABAgMEBQYHCAkKC//EALUQAAIBAwMCBAMFBQQEAAAB
fQECAwAEEQUSITFBBhNRYQcicRQygZGhCCNCscEVUtHwJDNicoIJChYXGBkaJSYnKCkqNDU2Nzg5/
I am saving it to a file, and when reading that file back, the img object in my code below comes out nil, although the imgData object does hold the saved data.
- (void)selectedAttachedFiledownloadedSuccessfully
{
    NSLog(@"\nFile has downloaded\n");
    NSData *imgData = [NSData dataWithContentsOfFile:[self pathOfTheImage]];
    NSString *imageExt = [self contentTypeForImageData:imgData];
    UIImage *img = [UIImage imageWithData:imgData];
    self.imgView.image = img;
}
Checking the NSData against known image signatures, it doesn't match any of them, and my code below returns nil:
- (NSString *)contentTypeForImageData:(NSData *)data {
    uint8_t c;
    [data getBytes:&c length:1];
    switch (c) {
        case 0xFF:
            return @"image/jpeg";
        case 0x89:
            return @"image/png";
        case 0x47:
            return @"image/gif";
        case 0x49:
        case 0x4D:
            return @"image/tiff";
    }
    return nil;
}
I don't know what I am doing wrong here. Can anyone guide me through this, please?

You might have used a string encoding (such as NSUTF8StringEncoding) when storing the data.
Image bytes are binary data, not text, so they should be written to disk as raw NSData and never round-tripped through an NSString encoding.
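A minimal sketch of the string-free round trip, reusing the variable and helper names from the question's code:
// No NSString or encoding step is involved; raw bytes are written and read back.
[imgData writeToFile:[self pathOfTheImage] atomically:YES];
NSData *roundTrip = [NSData dataWithContentsOfFile:[self pathOfTheImage]];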

The NSData you are getting back from the server might be corrupted. If the data is not in the proper format, it will not give you back a proper image.
Use + (instancetype)dataWithContentsOfFile:(NSString *)path options:(NSDataReadingOptions)mask error:(NSError * _Nullable *)errorPtr to read your image back from the file, and check the value of errorPtr.
Refer to this link for an explanation of the above method: https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSData_Class/#//apple_ref/occ/clm/NSData/dataWithContentsOfFile:options:error:
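A minimal sketch of that error-checked read, reusing the question's pathOfTheImage helper:
NSError *error = nil;
NSData *imgData = [NSData dataWithContentsOfFile:[self pathOfTheImage]
                                         options:NSDataReadingMappedIfSafe
                                           error:&error];
if (!imgData) {
    // The error explains why the read failed (missing file, permissions, etc.).
    NSLog(@"could not read image file: %@", error);
}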

The server was sending me the image as a base64Binary string; the leading /9j/ is in fact the Base64 encoding of the JPEG signature bytes FF D8 FF, which is why the byte check above never matched. I converted it to NSData as:
NSData *data = [[NSData alloc] initWithBase64EncodedString:output[1] options:NSDataBase64DecodingIgnoreUnknownCharacters];
I save this data to my image file as:
[data writeToFile:filePath atomically:YES];
I read it back to show it in the image view as:
- (void)showImage
{
    NSData *imgData = [NSData dataWithContentsOfFile:[self pathOfTheImage]];
    UIImage *img = [UIImage imageWithData:imgData];
    self.imgView.image = img;
}
It's working fine now.
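As a quick sanity check after decoding, the first byte can be sniffed with the question's own helper:
// After Base64 decoding, the first byte is 0xFF (the JPEG SOI marker),
// so the content-type check from the question now succeeds.
NSString *type = [self contentTypeForImageData:data];
NSLog(@"detected content type: %@", type); // image/jpeg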

Related

NSImage and UIImage give different NSData representations

Scenario:
I have an image in the iPhone camera roll. I access it using ALAssetLibrary and get an ALAsset object. I get a UIImage and NSData object from it using something like the following code.
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}
I then copy the image from the camera roll using Image Capture onto my mac. I then use NSImage in my Mac Code to open the copied image and try to get a NSData representation using the following code.
NSImage *image = [[NSImage alloc] initWithContentsOfURL:fileURL];
NSBitmapImageRep *imgRep = [[image representations] objectAtIndex:0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];
Problem:
Unfortunately, the two NSData representations I get are very different. I want to be able to get the same NSData representation in both cases (since it is the same file). I can then go on to hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally I would want the following two functions:
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or
-(NSData *) getDataFromUIImage:(UIImage*)image;
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or
-(NSData *) getDataFromNSImage:(NSImage*)image;
Such that the NSData* representation I get in OS X and iOS are exactly the same given that they come from the same source image.
What I have tried:
I have tried to play around with how I get the UIImage object from the ALAsset object, and I have tried UIImagePNGRepresentation (and the corresponding call for getting NSData in OS X). I have also tried different parameters for getting the representation in OS X, but nothing has come through. I have also tried to create a CGImageRef on both platforms, convert those to bitmaps, and read them pixel by pixel, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a different co-ordinate system).
I did eventually find a way to do what I wanted. The ALAssetRepresentation class's getBytes:fromOffset:length:error: method can be used to get an NSData object that is the same as [NSData dataWithContentsOfURL:fileURL] in OS X. Note that doing it from the UIImage is not possible, since UIImage performs some processing on the image. Here is what the requested functions would look like.
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc(rep.size);
    // Note: the offset parameter is a long long, so 0 (not 0.0) is the correct literal.
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url
{
    return [NSData dataWithContentsOfURL:url];
}
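With identical bytes on both platforms, hashing becomes meaningful. A minimal sketch using CommonCrypto, which is available on both iOS and OS X (SHA-1 is chosen here purely for illustration):
#import <CommonCrypto/CommonDigest.h>

// Returns a hex digest of the data; run on both platforms and compare the strings.
static NSString *SHA1HexForData(NSData *data) {
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}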

Release Core Foundation object

Hey I use this method to return NSData.
-(NSData*)getPersonPicture:(NSDictionary *)person {
    NSData *imageData = nil;
    if (![person valueForKey:FIELD_PERSON_IMAGEDATA]) {
        return imageData;
    }
    if ([[[person valueForKey:FIELD_PERSON_IMAGEDATA] description] containSubString:@"http"]) {
        return [NSData dataWithContentsOfURL:[NSURL URLWithString:[person valueForKey:FIELD_PERSON_IMAGEDATA]]];
    } else {
        ABRecordRef _person = ABAddressBookGetPersonWithRecordID(_aBook, [[person valueForKey:FIELD_PERSON_IMAGEDATA] integerValue]);
        imageData = (__bridge NSData*)ABPersonCopyImageDataWithFormat(_person, kABPersonImageFormatThumbnail);
        return imageData;
    }
}
I can't figure out when I need to release this imageData. I can't leave it like this, right?
If you are using ARC, then ARC needs to take ownership of the Core Foundation object, which means ARC becomes responsible for releasing it. You can accomplish this with the macro CFBridgingRelease:
imageData = CFBridgingRelease(ABPersonCopyImageDataWithFormat(_person, kABPersonImageFormatThumbnail));
return imageData;
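Equivalently, CFBridgingRelease can be spelled as a __bridge_transfer cast; both hand ownership of the +1 Core Foundation object to ARC:
imageData = (__bridge_transfer NSData *)ABPersonCopyImageDataWithFormat(_person, kABPersonImageFormatThumbnail);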
Using non-ARC:
(Note: usually, you should be leveraging ARC!)
-(NSData*)getPersonPicture:(NSDictionary *)person {
    NSData *imageData = nil;
    if (![person valueForKey:FIELD_PERSON_IMAGEDATA]) {
        return imageData;
    }
    if ([[[person valueForKey:FIELD_PERSON_IMAGEDATA] description] containSubString:@"http"]) {
        return [NSData dataWithContentsOfURL:[NSURL URLWithString:[person valueForKey:FIELD_PERSON_IMAGEDATA]]];
    } else {
        ABRecordRef _person = ABAddressBookGetPersonWithRecordID(_aBook, [[person valueForKey:FIELD_PERSON_IMAGEDATA] integerValue]);
        CFDataRef cfData = ABPersonCopyImageDataWithFormat(_person, kABPersonImageFormatThumbnail);
        // The Copy rule means we own cfData; a plain autorelease balances that
        // ownership (a retain-then-autorelease here would leak the copy).
        imageData = [(NSData *)cfData autorelease];
        return imageData;
    }
}

Calling imageWithData:UIImageJPEGRepresentation() multiple times only compresses image the first time

In order to prevent lagging in my app, I'm trying to compress images larger than 1 MB (mostly pics taken with the iPhone's normal camera).
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageSize = UIImageJPEGRepresentation(image, 1);
NSLog(@"original size %u", [imageSize length]);
UIImage *image2 = [UIImage imageWithData:UIImageJPEGRepresentation(image, 0)];
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1);
NSLog(@"new size %u", [newImageSize length]);
UIImage *image3 = [UIImage imageWithData:UIImageJPEGRepresentation(image2, 0)];
NSData *newImageSize2 = UIImageJPEGRepresentation(image3, 1);
NSLog(@"new size %u", [newImageSize2 length]);
picView = [[UIImageView alloc] initWithImage:image3];
However, the NSLog I get outputs something along the lines of
original size 3649058
new size 1835251
new size 1834884
The difference between the 1st and 2nd compression is almost negligible. My goal is to get the image size below 1 MB. Did I overlook something/is there an alternative approach to achieve this?
EDIT: I want to avoid scaling the image's height and width, if possible.
A couple of thoughts:
The UIImageJPEGRepresentation function does not return the "original" image. For example, if you employ a compressionQuality of 1.0, it does not, technically, return the "original" image, but rather it returns a JPEG rendition of the image with compressionQuality at its maximum value. This can actually yield an object that is larger than the original asset (at least if the original image is a JPEG). You're also discarding all of the metadata (information about where the image was taken, the camera settings, etc.) in the process.
If you want the original asset, you should use PHImageManager:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
PHAsset *asset = [result firstObject];
PHImageManager *manager = [PHImageManager defaultManager];
[manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    NSString *filename = [(NSURL *)info[@"PHImageFileURLKey"] lastPathComponent];
    // do what you want with the `imageData`
}];
In iOS versions prior to 8, you'd have to use assetForURL of the ALAssetsLibrary class:
NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    NSLog(@"size of original asset %llu", [representation size]);
    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:
    NSMutableData *data = [NSMutableData data];
    // now loop, reading data into buffer and writing that to our data stream
    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];
    NSUInteger bytesRead;
    while (bytesRemaining > 0) {
        bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }
    // ok, successfully read original asset;
    // do whatever you want with it here
} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
Please note that this assetForURL runs asynchronously.
If you want a NSData with compression, you can use UIImageJPEGRepresentation with a compressionQuality less than 1.0. Your code actually does this with a compressionQuality of 0.0, which should offer maximum compression. But you don't save that NSData, but rather use it to create a UIImage and you then get a new UIImageJPEGRepresentation with a compressionQuality of 1.0, thus losing much of the compression you originally achieved.
Consider the following code:
// a UIImage of the original asset (discarding meta data)
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
// this may well be larger than the original asset
NSData *jpgDataHighestCompressionQuality = UIImageJPEGRepresentation(image, 1.0);
[jpgDataHighestCompressionQuality writeToFile:[docsPath stringByAppendingPathComponent:@"imageDataFromJpeg.jpg"] atomically:YES];
NSLog(@"compressionQuality = 1.0; length = %u", [jpgDataHighestCompressionQuality length]);
// this will be smaller, but with some loss of data
NSData *jpgDataLowestCompressionQuality = UIImageJPEGRepresentation(image, 0.0);
NSLog(@"compressionQuality = 0.0; length = %u", [jpgDataLowestCompressionQuality length]);
UIImage *image2 = [UIImage imageWithData:jpgDataLowestCompressionQuality];
// ironically, this will be larger than jpgDataLowestCompressionQuality
NSData *newImageSize = UIImageJPEGRepresentation(image2, 1.0);
NSLog(@"new size %u", [newImageSize length]);
In addition to the JPEG compression quality outlined in the prior point, you could also just resize the image, and you can marry resizing with the JPEG compressionQuality, as sketched below.
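A minimal sketch of that combined approach; the helper name, scale factor, and quality value below are illustrative, not taken from the question:
// Scale the image down first, then apply lossy JPEG compression.
- (NSData *)jpegDataForImage:(UIImage *)image scale:(CGFloat)scale quality:(CGFloat)quality {
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImageJPEGRepresentation(resized, quality);
}
Calling this with, say, scale 0.5 and quality 0.7, and repeating with smaller values if needed, is one way to work the data under the 1 MB target.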
You cannot compress the same image again and again and expect it to keep shrinking; if that worked, everything could be compressed down to nothing. One way to make your image smaller is to change its size, for example from 640x960 to 320x480, but you will lose quality. I would first apply UIImageJPEGRepresentation(image, 0.75) and then change the size, perhaps to two-thirds or half of the image's width and height.

NSData doesn't get the image from image url

NSString *imgvalue = [[NSString alloc] initWithString:item.imgItem];
printf("\n img1 value is %s", [imgvalue UTF8String]);
cell.imageView.image = [UIImage imageNamed:@"unknown.jpg"];
if (imgvalue != nil)
{
    NSData *imageData;
    @try
    {
        printf("\n image value in image data is %s", [imgvalue UTF8String]);
        imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imgvalue]];
        printf("\n imageData Length is %d", [imageData length]);
    }
    @catch (NSException *e)
    {
        //printf("Exception message is %s",[e);
    }
    @finally
    {
        UIImage *imageFromImageData = [[UIImage alloc] initWithData:imageData];
        //[image setImage:imageFromImageData];
        cell.imageView.image = imageFromImageData;
        [imageData release];
        [imageFromImageData release];
    }
}
After getting the imgvalue, I copied that URL, and when I checked it in the browser it showed me the image, but the image doesn't get stored into the NSData. Please help me.
I checked your code and everything seems to be working.
Please check imageData with
NSLog(@"%u", [imageData length]);
Change this line
NSString *imgvalue=[[NSString alloc]initWithString:item.imgItem];
to
NSString *imgvalue = @"http://animals.catchsmile.com/cat-3/";
Using this, I got data in the log.
I used a static string for testing; you can use the item.imgItem value as well.
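If a hard-coded string works but item.imgItem does not, a common culprit is stray whitespace or unescaped characters in the dynamic value. A hedged sketch, assuming that is the issue here:
NSString *trimmed = [imgvalue stringByTrimmingCharactersInSet:
                     [NSCharacterSet whitespaceAndNewlineCharacterSet]];
NSString *escaped = [trimmed stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:escaped]];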

How to update exif of ALAsset without changing the image?

I use setImageData:metadata:completionBlock: of ALAsset to update the EXIF (metadata) of an asset.
I just want to update the metadata, but this method requires imageData as its first parameter. I used the code below to generate the imageData, but it modified my image (I checked the file size and file hash).
ALAssetRepresentation *dr = asset.defaultRepresentation;
UIImage *image = [UIImage imageWithCGImage:dr.fullResolutionImage scale:dr.scale orientation:dr.orientation];
NSData *data = UIImageJPEGRepresentation(image, 1);
Is there any other method I could use to update just the EXIF of an ALAsset? Or any way to generate the right imageData for the method setImageData:metadata:completionBlock:?
I found a way to generate imageData. Code below:
Byte *buffer = (Byte *)malloc(dr.size);
// As above, the offset is a long long, so pass 0 rather than 0.0.
NSUInteger k = [dr getBytes:buffer fromOffset:0 length:dr.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:k freeWhenDone:YES];
So I can use the data above with setImageData:metadata:completionBlock: to update only the EXIF of the ALAsset.
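A hedged usage sketch of writing those unmodified bytes back; newMetadata is a hypothetical dictionary holding the revised EXIF keys:
if (asset.editable) {
    [asset setImageData:data
               metadata:newMetadata // hypothetical updated EXIF dictionary
        completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"failed to update metadata: %@", error);
            }
        }];
}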
