Cached NSData object can't be used after retrieval - iOS

I am caching an NSData object containing image data retrieved from the web. The image displays correctly before caching. When I retrieve the data object from the cache, the data can no longer be used to create a UIImage, even though the data objects are identical.
Please see the relevant snippets of my code below:
NSData *webData = [NSData dataWithContentsOfURL:webPath];    // retrieve from web
UIImage *webImage = [UIImage imageWithData:webData];         // works fine
[webData writeToURL:filePath atomically:YES];                // cache to disk
NSData *cacheData = [NSData dataWithContentsOfURL:filePath]; // get from cache
if ([cacheData isEqualToData:webData]) NSLog(@"Equal");      // data objects are equal
UIImage *cacheImage = [UIImage imageWithData:cacheData];     // cacheImage is nil
I can fix the problem by changing the way I store my data to the cache:
NSData *temp = UIImageJPEGRepresentation(webImage, 1.0);
[temp writeToURL:filePath atomically:YES];
Now the webData and cacheData are no longer equal, but cacheImage is not nil and displays properly.
EDIT - After a bit more testing, I realized I get the same problem using UIImageJPEGRepresentation as well.
Anyone know why this would be?
Thanks.

Figured out that the problem was that I was trying to do all of this before my view controller was fully loaded.
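A minimal sketch of that timing fix, assuming hypothetical webPath, filePath, and imageView properties (not the asker's actual code):
// Hypothetical sketch: defer the cache round-trip until viewDidLoad.
// webPath, filePath, and imageView are assumed properties.
- (void)viewDidLoad {
    [super viewDidLoad];
    NSData *webData = [NSData dataWithContentsOfURL:self.webPath]; // fetch from web
    [webData writeToURL:self.filePath atomically:YES];             // cache to disk
    NSData *cacheData = [NSData dataWithContentsOfURL:self.filePath];
    self.imageView.image = [UIImage imageWithData:cacheData];      // now non-nil
}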

Related

How to save and retrieve UIImages using "imageWithData" maintaining proper image format?

I'm working on creating and storing OpenGL ES1 3D models, and want to include image files to be used as textures within the same file as the 3D model data. I am having trouble loading the image data in a usable format. I'm using UIImageJPEGRepresentation to convert the image data and store it into an NSData object. I then append it to an NSMutableData object, along with all the 3D data, and write it out to a file.

The data seems to write and read without error, but I encounter problems when trying to use the image data to create a CGImageRef, which I use to generate the texture data for the 3D model. The image data seems to be in an unrecognized format after it is loaded from the file, because it generates the error "CGContextDrawImage: invalid context 0x0." when I attempt to create the CGImageRef. I suspect that the image data is getting misaligned somehow, causing it to be rejected.

I appreciate any help. I'm stumped at this point. All of the data sizes and offsets add up and look fine. Saves and loads happen without error. The image data just seems off a bit, but I don't know why.
Here's my code:
//======================================================
- (BOOL)save3DFile:(NSString *)filePath {
    // load TEST IMAGE into UIImage
    UIImage *image = [UIImage imageNamed:@"testImage.jpg"];
    // convert image to JPEG-encoded NSData
    NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
    // save length of imageData to global "imDataLen" to use later in "load3DFile"
    imDataLen = [imageData length];

    // TEST: this works fine for CGImageRef creation in "loadTexture"
    // traceView.image = [UIImage imageWithData:[imageData subdataWithRange:NSMakeRange(0, imDataLen)]];
    // [self loadTexture];
    // TEST: this also works fine for CGImageRef creation in "loadTexture"
    // traceView.image = [UIImage imageWithData:txImData];
    // [self loadTexture];

    fvoh.fileVersion  = FVO_VERSION;
    fvoh.obVertDatLen = obVertDatLen;
    fvoh.obFaceDatLen = obFaceDatLen;
    fvoh.obNormDatLen = obNormDatLen;
    fvoh.obTextDatLen = obTextDatLen;
    fvoh.obCompCount  = obCompCount;
    fvoh.obVertCount  = obVertCount;
    fvoh.obElemCount  = obElemCount;
    fvoh.obElemSize   = obElemSize;
    fvoh.obElemType   = obElemType;

    NSMutableData *obSvData = [NSMutableData dataWithBytes:&fvoh length:sizeof(fvoh)];
    [obSvData appendBytes:obElem length:obFaceDatLen];
    [obSvData appendBytes:mvElem length:obVertDatLen];
    [obSvData appendBytes:mvNorm length:obNormDatLen];
    [obSvData appendBytes:obText length:obTextDatLen];
    [obSvData appendBytes:&ds length:sizeof(ds)];

    // next, we append image data, and write all data to a file
    // seems to work fine, no errors, at this point
    // (this line turns out to be the bug - see the answer below)
    [obSvData appendBytes:imageData length:[imageData length]];

    BOOL success = [obSvData writeToFile:filePath atomically:YES];
    return success;
}
//======================================================
- (void)load3DFile:(NSString *)filePath {
    NSData *fvoData;
    NSUInteger offSet, fiLen, fhLen, dsLen;

    [[FileList sharedFileList] setCurrFile:(NSString *)filePath];
    fvoData = [NSData dataWithContentsOfFile:filePath];
    fiLen = [fvoData length];
    fhLen = sizeof(fvoh);
    dsLen = sizeof(ds);

    const char *fvoBytes = [fvoData bytes];
    memcpy(&fvoh, fvoBytes, fhLen); offSet = fhLen;
    //+++++++++++++++++++++++++++++++
    obVertDatLen = fvoh.obVertDatLen;
    obFaceDatLen = fvoh.obFaceDatLen;
    obNormDatLen = fvoh.obNormDatLen;
    obTextDatLen = fvoh.obTextDatLen;
    obCompCount  = fvoh.obCompCount;
    obVertCount  = fvoh.obVertCount;
    obElemCount  = fvoh.obElemCount;
    obElemSize   = fvoh.obElemSize;
    obElemType   = fvoh.obElemType;
    //+++++++++++++++++++++++++++++++
    memcpy(obElem, fvoBytes + offSet, obFaceDatLen); offSet += obFaceDatLen;
    memcpy(mvElem, fvoBytes + offSet, obVertDatLen); offSet += obVertDatLen;
    memcpy(mvNorm, fvoBytes + offSet, obNormDatLen); offSet += obNormDatLen;
    memcpy(obText, fvoBytes + offSet, obTextDatLen); offSet += obTextDatLen;
    memcpy(&ds, fvoBytes + offSet, dsLen); offSet += dsLen;

    // the following seem to read the data into "imageData" just fine, no errors
    // NSData *imageData = [fvoData subdataWithRange:NSMakeRange(offSet, imDataLen)];
    // NSData *imageData = [fvoData subdataWithRange:NSMakeRange((fiLen - imDataLen), imDataLen)];
    // NSData *imageData = [NSData dataWithBytes:fvoBytes + offSet length:imDataLen];
    NSData *imageData = [NSData dataWithBytes:fvoBytes + (fiLen - imDataLen) length:imDataLen];

    // but the contents of imageData seem to end up in an unexpected format, causing error:
    // "CGContextDrawImage: invalid context 0x0." during CGImageRef creation in "loadTexture"
    traceView.image = [UIImage imageWithData:imageData];
    [self loadTexture];
}
//======================================================
- (void)loadTexture {
    CGImageRef image = traceView.image.CGImage;
    CGContextRef texContext;
    GLubyte *bytes = nil;
    GLsizei width = 0, height = 0;

    if (image) {
        width  = (GLsizei)CGImageGetWidth(image);
        height = (GLsizei)CGImageGetHeight(image);
        bytes  = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));
        // draw the image into an RGBA bitmap backed by "bytes"
        texContext = CGBitmapContextCreate(bytes, width, height, 8, width * 4,
                                           CGImageGetColorSpace(image),
                                           kCGImageAlphaPremultipliedLast);
        CGContextDrawImage(texContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), image);
        CGContextRelease(texContext);
    }
    if (bytes) {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, bytes);
        free(bytes);
    }
}
//======================================================
I failed to receive any answers to this question, but I finally stumbled across the answer myself. In the save3DFile code, instead of adding the image data to NSMutableData *obSvData using 'appendBytes:length:' as illustrated below:
[obSvData appendBytes: imageData length:[imageData length]];
I instead use 'appendData' as shown here:
[obSvData appendData: imageData];
where imageData was previously filled with the contents of a UIImage and converted to JPEG format in the process as follows:
NSData *imageData = UIImageJPEGRepresentation(image,1.0);
See the complete code listing above for context. Anyway, using 'appendData:' instead of 'appendBytes:length:' made all the difference, allowing me to store the image data in the same file along with all the other 3D model data (vertices, indices, normals, et cetera), reload all of that data without problems, and successfully create 3D models with textures from a single file.
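The reason, presumably, is that appendBytes:length: expects a pointer to raw bytes, so passing the NSData object pointer itself appends the object's own in-memory representation rather than the JPEG payload. The equivalent correct appendBytes: call would have been:
// correct appendBytes: usage - pass the byte pointer, not the object pointer
[obSvData appendBytes:[imageData bytes] length:[imageData length]];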

How do I save an image in Core Data and then retrieve it? Using Swift

I don't want to use NSUserDefaults, so how do I save an image in Core Data and then retrieve it?
My image is in this variable.
var image: UIImage = image1
Can you please give me some sample code to do this?
UIImage -> NSData
NSData *imageData = UIImagePNGRepresentation(image); // or UIImageJPEGRepresentation(image, 1.0)
NSData -> UIImage
UIImage *image = [UIImage imageWithData:data];
Convert the UIImage into NSData and save it as the entity's attribute with the type Binary Data.
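A minimal sketch of that round trip in Objective-C (matching the answer above), assuming a hypothetical entity named "Photo" with a Binary Data attribute "imageData" and an available NSManagedObjectContext *context:
// save: insert a Photo object and store the PNG bytes in its attribute
NSManagedObject *photo = [NSEntityDescription insertNewObjectForEntityForName:@"Photo"
                                                       inManagedObjectContext:context];
[photo setValue:UIImagePNGRepresentation(image) forKey:@"imageData"];
NSError *error = nil;
if (![context save:&error]) NSLog(@"Save failed: %@", error);
// retrieve: fetch the object back and rebuild the UIImage from its bytes
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Photo"];
NSManagedObject *fetched = [[context executeFetchRequest:request error:&error] firstObject];
UIImage *restored = [UIImage imageWithData:[fetched valueForKey:@"imageData"]];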
This question is already answered on Stack Overflow; refer to this:
How to store an image in core data

Can't display BLOB image in iOS application

I'm trying to display a BLOB image (retrieved from a web server using JSON) in my iOS app, but when I run my application I get an empty UIImageView. Here is my code:
NSData *dataURL = [NSData dataWithContentsOfURL:[NSURL URLWithString:encodedUrl]];
NSData *profileImage1 = [[NSData alloc] initWithBytes:[dataURL bytes] length:[dataURL length]];
UIImage *profileImage2 = [UIImage imageWithData:profileImage1];
[profilImage setImage:profileImage2];
How can I fix this problem?
[UIImage imageWithData:data] only parses known image file formats like JPEG, PNG, etc. (full info in the documentation). Passing raw BLOBs isn't supported by UIImage. You need to do some decoding to be able to use the data for the UIImage. You can use GTMBase64 for encoding and decoding the data. Read the docs and other posts and you'll find out how to change your code.
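For example, assuming the JSON actually delivers the BLOB as a base64-encoded string (base64String below is hypothetical), Foundation's built-in decoder (iOS 7+) can rebuild the image data without a third-party library:
// decode the assumed base64 payload and build the image from the raw bytes
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:base64String
                                                      options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *profileImage = [UIImage imageWithData:decoded];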
Hope this helps.

How to get UIImage(ImageURL) height and width without converting to NSData

In my project I need to show different sizes of images in a zig-zag fashion, so I convert the image URLs coming from the service to NSData and then get the UIImage. My code is:
NSURL *url = [NSURL URLWithString:[[_result objectAtIndex:i] valueForKey:@"PImage"]];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data];
This way I can get the image size (width and height), but my problem is that I need to create a UIView according to each image's size. This code works fine for me, but it takes too much time (almost 25 seconds) to load 8 images. I figured the conversion to NSData is what takes the time. Is there any way to get the image size (width and height) without converting it into NSData?
Thanks for spending time on this.
You can get image properties without actually loading the whole image data from disk using the ImageIO framework:
@import ImageIO;
...
NSURL *imageURL = … // init URL somehow
CGImageSourceRef imgSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
NSDictionary *imageProps = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
NSLog(@"%@", imageProps);
CFRelease(imgSource);
Image width and height will be stored in the dictionary under the PixelWidth and PixelHeight keys (tested with a PNG image; other image formats may use different keys).
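For example, a minimal sketch reading those dimensions via the ImageIO key constants:
// read the pixel dimensions out of the properties dictionary,
// without ever decoding the full image
NSNumber *width  = imageProps[(__bridge NSString *)kCGImagePropertyPixelWidth];
NSNumber *height = imageProps[(__bridge NSString *)kCGImagePropertyPixelHeight];
NSLog(@"Image is %@ x %@ pixels", width, height);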
Instead of converting the URL to data and then to UIImage, use EGOImageView or AsyncImageView. You can simply pass the URL to them, then set the frame based on the size of the image.

How to turn image data in JSON from WCF Data Services into UIImage in iOS

How do I get image data out of JSON returned by a WCF Data Service?
I've got a WCF Data Service serving some objects that include binary data for an image (the column type in SQL Server is image). It comes across as text gibberish in the JSON. In the iOS client I'm writing for this service, I want to display this data as an image ([UIImage imageWithData:myData]).
Here is what I'm doing:
NSData *networkData = nil; //assume this magically gets data from the service
NSError *err = nil;
//assume the returned JSON has one object and is not an array
NSDictionary *dict = [NSJSONSerialization JSONObjectWithData:networkData
                                                     options:NSJSONReadingAllowFragments
                                                       error:&err];
//handle error if needed
NSString *imageString = [dict objectForKey:@"Image"];
NSData *imageData = [imageString dataUsingEncoding:NSUTF8StringEncoding];
// the service is using UTF-8
UIImage *image = [UIImage imageWithData:imageData];
Putting that image in a UIImageView doesn't show anything. What am I doing wrong?
Turns out that binary data returned from WCF Data Services is base64 encoded. I'm using the NSData-Base64 category to parse it, and it's working great. So the solution looks like this:
NSData *imageData = [NSData dataFromBase64String:imageString];
When I imported those files into my project, ARC threw up some errors, but I was able to solve them by removing a couple of autorelease calls in the .m file.
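Putting it together, a sketch of the corrected flow (dataFromBase64String: is the category method mentioned above, replacing the UTF-8 conversion from the question):
// the base64 string from the JSON holds the real image bytes
NSString *imageString = [dict objectForKey:@"Image"];
NSData *imageData = [NSData dataFromBase64String:imageString]; // NSData+Base64 category
UIImage *image = [UIImage imageWithData:imageData];
On iOS 7 and later, Foundation's built-in -[NSData initWithBase64EncodedString:options:] would work as well, without the ARC patching.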
