UIImage to NSData returns nil - iOS

I created a QR code image and displayed it in a UIImageView
NSString *QRMessage = @"MESSAGE";
NSData *data = [QRMessage dataUsingEncoding:NSISOLatin1StringEncoding allowLossyConversion:NO];
CIFilter *filter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
[filter setValue:data forKey:@"inputMessage"];
[filter setValue:@"Q" forKey:@"inputCorrectionLevel"];
imageQRCode = filter.outputImage; // imageQRCode is a CIImage
ivQR.image = [UIImage imageWithCIImage:imageQRCode]; // ivQR is a UIImageView
I'm trying to save the image so the user can somehow send the QR code to another person. I first tried saving it to the "Clipboard" like this...
UIPasteboard *pasteBoard = [UIPasteboard pasteboardWithName:UIPasteboardNameGeneral create:NO];
[pasteBoard setPersistent:YES];
[pasteBoard setImage:ivQR.image];
... but it appears nothing is saved in the Clipboard.
So then I tried converting the UIImage to NSData and adding it as an attachment like so:
MFMailComposeViewController *picker = [[MFMailComposeViewController alloc] init];
picker.mailComposeDelegate = self;
UIImage *imageToSend = [UIImage imageWithCIImage:imageQRCode];
NSData *data = UIImageJPEGRepresentation(imageToSend,1);
[picker addAttachmentData:data mimeType:@"image/jpeg" fileName:@"QR.jpg"];
But again nothing seems to be attached.
I did some testing, and it appears that the NSData returned by UIImageJPEGRepresentation() is nil. The image does in fact get displayed on my phone, so I'm wondering whether I'm just converting the data incorrectly?
Most of my googling tells me that the way I'm converting is correct, but most of the examples use a picture from the user's phone or a picture bundled with the app itself. My picture is created at runtime, so does that make a difference?
My goal is to let the user copy the QR image to the clipboard or add it as a mail attachment. Any assistance is greatly appreciated.

When converting a CIImage to a UIImage, I've found that imageWithCIImage: often doesn't work. I generally use the CIContext method createCGImage:fromRect: and then create the UIImage from that:
CIImage *ciImage = ...
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
See the samples in the Core Image Programming Guide.
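Applied to the QR code from the question, the conversion might look like this (a minimal sketch; imageQRCode, ivQR, and picker are the names from the question). The key point is that a UIImage created with imageWithCIImage: is not backed by a CGImage, which is why UIImageJPEGRepresentation() returns nil for it:

```objc
// Render the CIImage into a real CGImage first; a UIImage created with
// imageWithCIImage: has no CGImage backing, so UIImageJPEGRepresentation()
// and UIImagePNGRepresentation() return nil for it.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:imageQRCode fromRect:[imageQRCode extent]];
UIImage *qrImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

// Now both the pasteboard and the mail attachment should work.
[[UIPasteboard generalPasteboard] setImage:qrImage];
NSData *pngData = UIImagePNGRepresentation(qrImage); // PNG keeps the QR edges sharp
[picker addAttachmentData:pngData mimeType:@"image/png" fileName:@"QR.png"];
```

PNG is suggested here instead of JPEG because lossless compression keeps the QR modules crisp, which helps scanners read the code.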

Related

Why doesn't UIImage get successfully decoded?

Why doesn't the UIImage in this code snippet get restored back to its original state when I try to encode and decode it using NSKeyedArchiver?
I expect "decodedImage" to contain the image after decoding, but instead it is just nil.
// Any image here seems to repro the issue
UIImage *image = [UIImage imageNamed:@"soda.jpg"];
// This prints YES (1), just a sanity check.
NSLog(@"Confirms %d", [[UIImage class] conformsToProtocol:@protocol(NSCoding)]);
NSMutableData *data = [[NSMutableData alloc] init];
NSKeyedArchiver *coder = [[NSKeyedArchiver alloc] initForWritingWithMutableData:data];
[coder encodeObject:image forKey:@"image"];
[coder finishEncoding];
// I would expect this to be large; instead it's < 1 KB.
NSLog(@"Data length is: %lu", (unsigned long)data.length);
NSKeyedUnarchiver *decoder = [[NSKeyedUnarchiver alloc] initForReadingWithData:data];
// This prints YES (1)
NSLog(@"containsValueForKey returns %d", [decoder containsValueForKey:@"image"]);
// decodedImage is nil here, even though containsValueForKey returned YES
UIImage *decodedImage = [decoder decodeObjectForKey:@"image"];
[decoder finishDecoding];
In this case, I'm not looking for a workaround such as converting the UIImage to NSData first and encoding that. I'm trying to reproduce an unrelated piece of code that does something like this, and I want to understand it.
The code works as expected if I first round-trip the image through NSData and back to UIImage. Why?
UIImage *originalImage = [UIImage imageNamed:@"soda.jpg"];
NSData *imageData = UIImagePNGRepresentation(originalImage);
UIImage *image = [UIImage imageWithData:imageData];
Try this. Decode the Base64 string to NSData, then create the image from that data:
+ (NSData *)decodeBase64ToImage:(NSString *)strEncodeData
{
    NSData *data = [[NSData alloc] initWithBase64EncodedString:strEncodeData options:NSDataBase64DecodingIgnoreUnknownCharacters];
    return data;
}
self.btnLicenseFront.image = [UIImage imageWithData:[Themes decodeBase64ToImage:licenseFront]];
I used your code and tried it with two images:
1. A correct image file
The output is
> [36133:5889153] Confirms 1
> [36133:5889153] Data length is: 68267
> [36133:5889153] containsValueForKey returns 1
> [36133:5889153] decodedImage is 1879681920
2. Incorrect/corrupt image file
The output is
> [36130:5888794] Confirms 1
> [36130:5888794] Data length is: 136
> [36130:5888794] containsValueForKey returns 1
> [36130:5888794] decodedImage is 0
So it looks like your source JPG file is corrupt or invalid.
I have found a solution.
It turns out that this only happens when the image is loaded through [UIImage imageNamed:]. If the UIImage is created through [UIImage imageWithContentsOfFile:], the issue does not occur.
I believe this must be a bug on the iOS side. The imageNamed: way of creating a UIImage is specifically for images inside the bundle. There must be some optimization that causes NSCoding to not function as intended, since the UIImage seems to not actually contain the image data (decoding returns nil instead of recreating the UIImage from the bundle image as expected).
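A sketch of that workaround (the "soda.jpg" file name comes from the question; the bundle lookup is an assumption about where the image lives):

```objc
// imageWithContentsOfFile: loads the actual image data, rather than the
// cached, bundle-optimized object that imageNamed: can return.
NSString *path = [[NSBundle mainBundle] pathForResource:@"soda" ofType:@"jpg"];
UIImage *image = [UIImage imageWithContentsOfFile:path];

// Archiving this image round-trips as expected.
NSData *archived = [NSKeyedArchiver archivedDataWithRootObject:image];
UIImage *restored = [NSKeyedUnarchiver unarchiveObjectWithData:archived];
```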

Unarchive UIImage object returns CGSizeZero image using NSKeyedUnarchiver on iOS 8

I have this code working on iOS 7:
NSData *imageData = [NSKeyedArchiver archivedDataWithRootObject:self.imageView.image];
UIImage *imageCopy = [NSKeyedUnarchiver unarchiveObjectWithData:imageData];
NSLog(@"%@", NSStringFromCGSize(imageCopy.size));
but on iOS 8, imageCopy's size is always zero. The same thing happens when I archive a UIImageView: the unarchived image view's image has a zero size. I found out that in iOS 7 the UIImage header is:
UIImage : NSObject <NSSecureCoding, NSCoding>
but on iOS 8 it is :
UIImage : NSObject <NSSecureCoding>
It looks like the NSCoding protocol is missing on iOS 8. I have to encode the actual image data, UIImagePNGRepresentation(self.imageView.image), instead of the image itself to make sure I get a good image back.
Does anyone know why this happens? Is it for backward compatibility? I noticed that in earlier iOS versions UIImage doesn't conform to NSCoding.
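The workaround mentioned above, archiving the PNG data instead of the UIImage object, might look like this (a minimal sketch; self.imageView is the property from the question):

```objc
// Archive the raw PNG bytes rather than the UIImage object itself.
NSData *pngData = UIImagePNGRepresentation(self.imageView.image);
NSData *archived = [NSKeyedArchiver archivedDataWithRootObject:pngData];

// Unarchive and rebuild the image; its size survives the round trip.
NSData *restoredData = [NSKeyedUnarchiver unarchiveObjectWithData:archived];
UIImage *imageCopy = [UIImage imageWithData:restoredData];
```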
UIImage : NSObject <NSSecureCoding> is not a problem, because NSSecureCoding inherits from NSCoding.
Anyway, I confirmed that the problem can be reproduced with the following code:
UIImage *img = [UIImage imageNamed:@"myImage"];
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:img];
UIImage *imgCopy = [NSKeyedUnarchiver unarchiveObjectWithData:data];
NSLog(@"%@, %@", imgCopy, NSStringFromCGSize(imgCopy.size)); // -> (null), {0, 0}
On the other hand, the following code works as expected:
UIImage *img = [UIImage imageNamed:@"myImage"];
UIImage *img2 = [UIImage imageWithCGImage:img.CGImage scale:img.scale orientation:img.imageOrientation];
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:img2];
UIImage *imgCopy = [NSKeyedUnarchiver unarchiveObjectWithData:data];
NSLog(@"%@, %@", imgCopy, NSStringFromCGSize(imgCopy.size)); // -> <UIImage: 0x7fa013e766f0>, {50, 53}
I don't know why; maybe it's a bug.
I think this is related to the imageAsset or traitCollection property introduced in iOS 8.

cannot create a CIImage

I cannot understand what I'm doing wrong here, but no matter how I try, I cannot create a CIImage:
UIImage *origImage = [[UIImage alloc] init];
origImage = [UIImage imageNamed:imageName];
imageName = [imageName substringToIndex:[imageName length] - 4];
NSURL *path1 = [[NSBundle mainBundle] URLForResource:imageName withExtension:@"jpg"];
NSLog(@"the path is %@", path1);
NSLog(@"the inputImage is %@", imageName);
CIImage *inputImage = [[CIImage alloc] initWithCGImage:origImage.CGImage];
CIImage *inputImage1 = [[CIImage alloc] initWithImage:_originalImage];
CIImage *inputA = [[CIImage alloc] initWithContentsOfURL:path1];
CIImage *empty = [[CIImage alloc] initWithImage:origImage];
A breakpoint at the end of this code shows a UIImage, a string and a url, all of which are as expected.
The header includes CoreImage.h, as well as UIKit. _originalImage is a UIImage property, which is assigned earlier.
Stumped on this for a few days, any help really appreciated. Thanks.
Using your code I get all four CIImages. Is it possible that there is something wrong with your JPG file?
I resolved this via the project settings.
I found that if the optimization level under Apple LLVM 6.0 - Code Generation was set to anything other than None [-O0], none of these images was created.
Without optimization, however, it works as expected.
Not sure why, but I'm not complaining.

Converting from JSON string to UIImage always gives null?

I get this kind of JSON (note: I truncated "content" with dots because the byte array is too long):
{
"id":"53abc6a7975a9c10c292f670",
"nfcId":"testse",
"company":"TESt",
"qrId":"testvalue",
"address":"ajs;ldfh",
"mimeType":"IMAGE",
"url":"",
"content":"iVBORw0KGgoAAAANSUhEUgAA....."
}
I'm trying to parse this JSON and display the information.
The "content" field holds a byte array that was converted from an image on the server.
I use this code in Xcode to convert those bytes to NSData and then to UIImage, so I can display it in a UIImageView:
NSData *dataImage = [jsonArray[key] dataUsingEncoding:NSUTF8StringEncoding];
NSLog(@"data = %@", dataImage);
UIImage *img = [UIImage imageWithData:dataImage];
NSLog(@"img = %@", img);
The image is always nil, although dataImage does contain data.
I also tried all kinds of encodings as the NSData parameter:
dataUsingEncoding:NSASCIIStringEncoding
dataUsingEncoding:NSUTF8StringEncoding
dataUsingEncoding:NSUTF16StringEncoding
dataUsingEncoding:NSUTF32StringEncoding
I've used code like this before
NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"data:image/png;base64,%@", jsonArray[key]]];
NSData *imageData = [NSData dataWithContentsOfURL:url];
UIImage *img = [UIImage imageWithData:imageData];
Note that initWithBase64EncodedString: is only available from iOS 7 onwards.
I tried this code just right now and it works:
NSData* dataImage = [[NSData alloc] initWithBase64EncodedString:jsonArray[key] options:0];
UIImage *img = [UIImage imageWithData:dataImage];
The "content" field is a Base64-encoded string.

Send a UIImage through an email

Given that I have a UIImage that is stored in a local variable:
UIImage *myImage = (Some image extracted somewhere)
How do I add this image as an attachment through MFMailComposeViewController ?
The image isn't stored on the user's device; it's simply available at runtime.
So I can't access this image through a file name, as other questions/examples have demonstrated.
Thank you!
NSData *imageData = UIImageJPEGRepresentation(photo, 0.9);
NSString *attachmentName = @"Any name.jpg";
[mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:attachmentName];
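Putting it together, a minimal sketch of composing the mail (myImage is the variable from the question; the subject string and delegate wiring are illustrative assumptions, and checking canSendMail first avoids a crash on devices with no mail account):

```objc
if ([MFMailComposeViewController canSendMail]) {
    MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
    mailer.mailComposeDelegate = self;
    [mailer setSubject:@"Image attached"];

    // Attach the in-memory image; no file on disk is needed.
    NSData *imageData = UIImageJPEGRepresentation(myImage, 0.9);
    [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:@"image.jpg"];

    [self presentViewController:mailer animated:YES completion:nil];
}
```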