Having trouble saving image to entity field on iOS - ios

On my server I have a table that I want to bring over to an iOS SQLite table.
My server table has a field called data which is of type Image. I populated the field with a C# app that converts an image to a byte array and then writes that byte array to the SQL Image column.
On iOS, I make a SOAP request to my WCF service and get all the data from my table. I verified the data is received. My problem is writing the received image data to my entity's binary data field. I use the following code for that:
NSString *key = (NSString *) [keys objectAtIndex:i]; // I made sure key is valid
NSData *data = (NSData *) [rowData GetValue:key]; // I made sure data is retrieved
[tblRow setValue:data forKey:key]; // After calling this, data for the key is nil.
Portion of Image Data Content
/9j/4AAQSkZJRgABAQEAYABgAAD/2wBDAAgGBgcGBQgHBwcJCQgKDBQNDAsLDBkSEw8UHRofHh0aHBwgJC4nICIsIxwcKDcpLDAxNDQ0Hyc5PTgyPC4zNDL/2wBDAQkJCQwLDBgNDRgyIRwhMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjL/wAARCAMdA2oDASIAAhEBAxEB/8QAHwAAAQUBAQEBAQEAAAAAAAAAAAECAwQFBgcICQoL/8QAtRAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6/8QAHwEAAwEBAQEBAQEBAQAAAAAAAAECAwQFBgcICQoL/8QAtREAAgECBAQDBAcFBAQAAQJ3AAECAxEEBSExBhJBUQdhcRMiMoEIFEKRobHBCSMzUvAVYnLRChYkNOEl8RcYGRomJygpKjU2Nzg5OkNERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0dXZ3eHl6goOEhYaHiImKkpOUlZaXmJmaoqOkpaanqKmqsrO0tba3uLm6wsPExcbHyMnK0tPU1dbX2Nna4uPk5ebn6Onq8vP09fb3+Pn6/9oADAMBAAIRAxEAPwDn7DRkuEtmmnETXUqww5UkM7dF4HGffAq/L4QC3VzaRyGW6tmjWeNUY7C4JXJAxyFP074qx4aa11nTbGKS5jjjYyBFLAOzNEygqO7KWDf8Brobi3ggWWS91aCG4ZIbi6mJ2vuWSYtIF6hS06qvptA9KmxVzh5fBt3dfuHbCBvmCnqQfu/yH403UPCGoTQeWbYSxbvLOMAJ2/Ou2i00K9jbyWlqv2SKSMNbRY3Myoqy4P8AECpOfpzxVWfS/

I am not the one writing the WCF service that sends me the image data, and I learned that the service applies base64 encoding to the data. Setting the base64 string on the NSData attribute and trying to save it was what failed. Once I decoded the data, everything worked fine.
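For reference, here is a minimal sketch of the decoding step, assuming the value coming out of GetValue: is really the base64 text as an NSString (tblRow, rowData and key are the names from the code above; initWithBase64EncodedString:options: requires iOS 7 or later):

NSString *base64String = (NSString *)[rowData GetValue:key];

// Decode the base64 text into raw bytes before handing it to the entity.
// NSDataBase64DecodingIgnoreUnknownCharacters skips any line breaks the
// service may have inserted into the encoded payload.
NSData *imageData =
    [[NSData alloc] initWithBase64EncodedString:base64String
                                        options:NSDataBase64DecodingIgnoreUnknownCharacters];

if (imageData != nil) {
    [tblRow setValue:imageData forKey:key]; // the attribute is no longer nil
} else {
    NSLog(@"Payload for key %@ was not valid base64", key);
}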

Related

PFFile and JSON?

In my chatting application, I'm using Parse for a user table, getting IDs, images, etc. I recently added this functionality, and I have encountered a problem. When I send a message, I create an NSDictionary with information about the message such as time, message, sender, sender objectId, etc. But when I try to add the PFFile (image file) associated with the user, I get an error saying that PFFile cannot be converted to JSON (the PubNub message format). How can I add the PFFile as part of the NSDictionary used in the message so that it is compatible with JSON, or is there another way?
I'm not familiar with asynchronous tasks, but in my code I have a method - (NSDictionary *)parseMessageToDisplay:(NSDictionary *)message {} where the input is the message received from PubNub, and it returns a format better suited to being displayed in a UITableView. If I added the ID of the file or user to my dictionary, how could I get my image as a UIImage or NSData and return it from my method in an NSDictionary? Sorry if this post seems long, I'm just trying to provide a lot of information.
In order to use parse.com, PFFile in particular, you'll probably want that NSDictionary to be a PFObject instead. A PFFile reference can be saved as an attribute of a PFObject -- in fact that's the only way it can be saved.
Thanks to @danh for this suggestion; you really saved me. Apparently Parse creates a URL for every PFFile, so I can just send that URL (NSString *) with my NSDictionary to PubNub, and then in my - (NSDictionary *)parseMessageToDisplay:(NSDictionary *)message method just use [NSData dataWithContentsOfURL:[NSURL URLWithString:imageURLString]]; and get the data from that. YAY! No long-running, confusing asynchronous tasks to make my day terrible!
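A minimal sketch of that flow, assuming the sender's PFFile lives under a hypothetical "picture" key on the PFUser and the URL travels in the PubNub dictionary as "imageURL" (both names are placeholders; note that dataWithContentsOfURL: is synchronous, so avoid calling it on the main thread for large images):

// Sending side: put the PFFile's URL string into the message dictionary.
PFFile *pictureFile = [[PFUser currentUser] objectForKey:@"picture"]; // hypothetical key
NSDictionary *message = @{ @"message"  : @"Hello!",
                           @"sender"   : [PFUser currentUser].objectId ?: @"",
                           @"imageURL" : pictureFile.url ?: @"" };

// Receiving side, e.g. inside parseMessageToDisplay:
NSString *imageURLString = message[@"imageURL"];
NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:imageURLString]];
UIImage *senderImage = imageData ? [UIImage imageWithData:imageData] : nil;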

REST web service to recover an array of pictures

I want to implement a web service to retrieve an array of my entity PictureCaptures:
PictureCaptures
---------------
- description : string
- captureDate : DateTime
- photoBinary : byte[]
The web service will be mainly called by an iOS application.
What's the best way to implement it, given the byte array attribute?
Am I supposed to return the byte array without any transformation, as a simple JSON attribute? If so, how do I interpret the JSON response? (In this case JSONObjectWithData:options:error: doesn't work: there is too much data and it runs into memory issues.)
Thank you for your help.
I would suggest you add two resources: one for the meta data (description, captureDate and so on) and one for the binary data. Let the meta data resource contain a link to the binary photo data.
Like this:
GET /images/1234
Response:
{
  "description": "Nice photo",
  "captureDate": "2012-04-23T18:25:43.511Z",
  "photoData": "http://example.org/images/1234/photo"
}
and http://example.org/images/1234/photo returns the raw photo data
(See also The "right" JSON date format for a discussion on date formats.)
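With this split design, the iOS client makes two requests: one for the small JSON metadata document and one for the raw photo bytes, so there is no base64 inflation and no giant JSON payload to parse. A minimal sketch, assuming the URLs above and NSURLSession (iOS 7+, error handling trimmed):

NSURLSession *session = [NSURLSession sharedSession];
NSURL *metaURL = [NSURL URLWithString:@"http://example.org/images/1234"];

// 1. Fetch the small JSON metadata document.
[[session dataTaskWithURL:metaURL completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    NSDictionary *meta = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
    NSURL *photoURL = [NSURL URLWithString:meta[@"photoData"]];

    // 2. Fetch the raw photo bytes separately.
    [[session dataTaskWithURL:photoURL completionHandler:^(NSData *photoData, NSURLResponse *photoResponse, NSError *photoError) {
        UIImage *image = [UIImage imageWithData:photoData];
        // ... hand the image to the UI on the main queue ...
    }] resume];
}] resume];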
When you get the JSON response, you should convert the byte array (a base64 string) to NSData.
First add the Base64.h and Base64.m files to the project (you can find them easily on the internet),
then import Base64.h.
From your JSON data:
NSString *data = [yourJSONDict objectForKey:@"photoBinary"];
NSData* imageData = [data base64DecodedData];
UIImage *imag=[UIImage imageWithData:imageData];
[yourImageView setImage:imag];
This might help you.
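If you are targeting iOS 7 or later, you can skip the third-party Base64 category; NSData can decode base64 directly (a sketch using the same yourJSONDict and yourImageView names as above):

NSString *encoded = [yourJSONDict objectForKey:@"photoBinary"];
NSData *imageData =
    [[NSData alloc] initWithBase64EncodedString:encoded
                                        options:NSDataBase64DecodingIgnoreUnknownCharacters];
[yourImageView setImage:[UIImage imageWithData:imageData]];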

iOS NSJSONSerialization of JPG image encoded binary data from mongodb

I think I need some assistance in figuring out the correct NSJSONSerialization option to make my problem go away.
On my app I allow the user to select an image from the gallery - the image undergoes the following:
NSData *imageData = UIImageJPEGRepresentation(self.profileImageView.image, 0.0);
then
NSString *stringOfImageData = [imageData base64EncodedStringWithOptions:0];
before it is serialized like this:
NSData *jsonData = [NSJSONSerialization dataWithJSONObject:postDict
options:NSJSONWritingPrettyPrinted
error:&error];
and then sent to my REST API. I then decode it in Python using base64 like so:
profileImageData = base64.b64decode(request.json['image'])
It is then loaded into GridFS (MongoDB). When extracting the data to send back to the app, I first encode it to base64 before using dumps() to send it back:
dumps(base64.b64encode(fs.get_last_version(request.json['userID']).read()))
Within iOS after receiving the data it goes through the below de-serialization:
[NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingMutableContainers|NSJSONReadingMutableLeaves error:&error]
I have narrowed my problem down to the last NSJSONSerialization call. After the data is received by the app, I am able to print it to the screen. After the serialization I get nil :(
The serialization and deserialization have been working great for strings, integers, etc.; it just doesn't work when I'm trying to move image data.
Thanks
EDIT: I am able to run a curl request against the API, and using an online base64-to-image converter I can see my image. So the issue is definitely on the iOS side, decoding a JSON-encoded base64 string.
EDIT: When I repeatedly run the deserialization, every 20th time or so the data is correctly converted. I think the solution might have to be to break up the data coming in.
EDIT: Error:
parsed error:Error Domain=NSCocoaErrorDomain Code=3840 "The operation couldn’t be completed. (Cocoa error 3840.)" (Unterminated string around character 17.) UserInfo=0x109c08790 {NSDebugDescription=Unterminated string around character 17.}
What you don't say is how you are receiving the data. My guess is you are trying to decode the data before you receive all of it, but since I don't know how you receive it, that is only a guess.
To better understand what's going on, try logging the size and hash of the data, to see if the length varies. You can also save each received data object to the file system - put them in the Documents folder and you can access them from your Mac. If the size never varies you will then have to compare a good data object to a bad one.
In fact you can write a little code to save an image as data and a base64 string, upload it, then pull it back and save it. Now compare the data and strings. Once you find a difference, look at it: what is its offset from the start? How is it different?
When you understand all this you will be able to fix it.
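If the data is arriving through an NSURLConnection delegate (an assumption; the question doesn't say how it is received), the classic mistake is parsing inside didReceiveData:, which can fire many times per response. A sketch of accumulating first and parsing only when the load finishes (responseData is an assumed NSMutableData property):

// Accumulate chunks; a large base64 payload rarely arrives in one callback.
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    if (self.responseData == nil) {
        self.responseData = [NSMutableData data];
    }
    [self.responseData appendData:data];
}

// Only parse once the whole body has been received.
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    NSError *error = nil;
    id json = [NSJSONSerialization JSONObjectWithData:self.responseData
                                              options:0
                                                error:&error];
    if (json == nil) {
        NSLog(@"JSON parse failed: %@", error);
    }
    self.responseData = nil;
}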

sqlite3* to NSData

Can a pointer to sqlite3 handler be converted to NSData?
I would then be able to encrypt and decrypt it.
I have:
sqlite3 *sqliteHandle;
NSData *dataDB = [NSData dataWithBytes:&sqliteHandle length:sizeof(sqliteHandle)];
But this only gives me the pointer, not the actual data.
The sqlite3 * is a pointer to the SQLite connection object; it is not likely you can get data out of that directly. You need to execute a query first, then extract the data from the result.
You probably want to call sqlite3_column_blob to get the raw data of a column.
Check out the documentation: http://www.sqlite.org/capi3ref.html#sqlite3_column_blob
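A minimal sketch of that pattern, assuming a hypothetical images table with an id column and a data BLOB column (the table, column and row id are placeholders):

sqlite3_stmt *statement = NULL;
const char *sql = "SELECT data FROM images WHERE id = ?";

if (sqlite3_prepare_v2(sqliteHandle, sql, -1, &statement, NULL) == SQLITE_OK) {
    sqlite3_bind_int(statement, 1, 1234); // hypothetical row id

    if (sqlite3_step(statement) == SQLITE_ROW) {
        // Copy the raw BLOB bytes of column 0 into an NSData object.
        const void *bytes = sqlite3_column_blob(statement, 0);
        int length = sqlite3_column_bytes(statement, 0);
        NSData *blobData = [NSData dataWithBytes:bytes length:length];
        // blobData can now be encrypted, written out, etc.
    }
    sqlite3_finalize(statement);
}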

CoreData transformable type - can I store NSString or NSData?

Here's the situation. I need to store either NSData (an image) or a string which will be used to pull an image from the bundle. Is this legal?
if (aCondition) {
    [managedObject setValue:filePath forKey:imageKey];
} else {
    [managedObject setValue:imageData forKey:imageKey];
}
If this is legal, when it comes time to retrieve this info how can I determine what type of value I had originally saved?
Make the attribute type NSData (binary data) and save both values that way. Assuming the path can never be longer than some number of bytes (say 128) and that an image has to be larger than, say, 500 bytes, you can decide when the property is read whether to transform it into a string or leave it as data.
If you don't like that approach, add a boolean and use it to indicate the type of the data.
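A sketch of the length-based approach on read, using the 500-byte cutoff from above and assuming the path was stored as UTF-8 data (managedObject and imageKey are the names from the question; the threshold is something you would tune for your data):

NSData *stored = [managedObject valueForKey:imageKey];
UIImage *image = nil;

if (stored.length < 500) {
    // Small payload: treat it as a UTF-8 path / bundle resource name.
    NSString *filePath = [[NSString alloc] initWithData:stored
                                               encoding:NSUTF8StringEncoding];
    image = [UIImage imageNamed:filePath];
} else {
    // Large payload: treat it as the raw image bytes themselves.
    image = [UIImage imageWithData:stored];
}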
