I am posting data to a server from my iPad using JSON, but along with my data I need to be able to send images as well. I tried adding my image data to a dictionary and serializing that into JSON, but NSJSONSerialization rejects NSData (it shows up as NSCFData in the error). What would be the easiest way to post my images to the server? From other posts on this topic, people have been converting to Base64. Would I have to do that, or is there an easier or faster way? If I have to encode to Base64, is there any tutorial on that?
I convert it to base64. Check out this tutorial to get started!
http://www.cocoawithlove.com/2009/06/base64-encoding-options-on-mac-and.html
Example:
UIImage *image = ...; // some UIImage
NSData *data = UIImagePNGRepresentation(image);
// Built into NSData since iOS 7; the linked tutorial's category offers an equivalent.
NSString *base64EncodedString = [data base64EncodedStringWithOptions:0];
You can then send the string via JSON.
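If it helps to see the round trip end to end, here is a minimal sketch in Python (as the server side might do it); the field name "image" and the payload bytes are just assumptions for illustration:

```python
import base64
import json

# Stand-in for the bytes produced by UIImagePNGRepresentation.
png_bytes = b"\x89PNG\r\n\x1a\nfake-image-payload"

# Client: Base64-encode the bytes and put the resulting string in the JSON body.
payload = json.dumps({"image": base64.b64encode(png_bytes).decode("ascii")})

# Server: parse the JSON and decode the field back into raw bytes.
decoded = base64.b64decode(json.loads(payload)["image"])
assert decoded == png_bytes
```

Since the Base64 string is plain ASCII, it survives JSON serialization with no trouble.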
I have a large piece of NSData I want to send to my server, but I also want to send a dictionary of string keys mapped to string values in the same request.
How do I POST both in the same request?
Almost all guides show wrapping everything in an NSDictionary, using NSJSONSerialization to turn it into NSData, and POSTing that. But I can't have NSData and NSStrings in the same NSDictionary — serialization just crashes — so I assume I have to keep them separate. How would that look?
Essentially, how do I serialize JSON into NSData and then also send a separate NSData blob with it?
let body = NSMutableDictionary()
body.setValue("myString", forKey: "stringType")
body.setValue(data?.base64EncodedString(options: .lineLength64Characters), forKey: "dataType")
This way you can have both the data and the string in the dictionary.
Here 'data?.base64EncodedString(options: .lineLength64Characters)' returns a Base64-encoded string, so your dictionary contains only strings; on the server end you have to convert it back to data.
Hope this solves your issue.
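A minimal sketch of what the server end of this might look like, in Python, assuming the same "stringType"/"dataType" keys as above:

```python
import base64
import json

# Hypothetical binary payload standing in for the NSData blob.
blob = b"binary blob"

# What the client POSTs: only strings inside the JSON body.
wire = json.dumps({
    "stringType": "myString",
    "dataType": base64.b64encode(blob).decode("ascii"),
})

# Server end: parse the JSON, then decode the data field back to bytes.
received = json.loads(wire)
assert received["stringType"] == "myString"
assert base64.b64decode(received["dataType"]) == blob
```

Note that b64decode tolerates the line breaks inserted by the .lineLength64Characters option.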
On the iOS side, I receive a stream of images via NSInputStream. The NSMutableData ends up containing the images back to back: [JPG IMAGE][JPG IMAGE][PNG IMAGE].
Is there any elegant way of extracting the individual images from the NSMutableData?
EDIT: Thanks for all the replies. I do not know where the images start or end. I convert each buffer to NSData and append it to the NSMutableData. Do I need to include a delimiter between the images?
Server (.NET): sends images continuously, so on the client side (iOS) I will never encounter NSStreamEvent.EndEncountered. Closing the NetworkStream throws "ObjectDisposedException was unhandled: cannot access a disposed object" (fixing it now).
Client: does not know when the server has finished sending; all it has is an NSMutableData (fixing it now). I convert each buffer to NSData and append it to the NSMutableData.
Basically what I want to achieve is:
1. I press connect on my client (iOS app).
2. The client establishes a connection with the server (.NET).
3. The server creates and sends images to the client.
4. The client displays the images.
5. Connect and disconnect are initiated by the client.
Update: I got it working as follows:
1. On the server side, after each image is sent, I append a delimiter.
2. On the client side, I append incoming data to the NSMutableData as usual and check the last few bytes for the delimiter. If they match, I can safely split the NSMutableData on the delimiter.
With that, I am able to receive and display multiple images!
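The chunking step in the update above is language-agnostic; here is a minimal Python sketch of it, where the delimiter bytes are an illustrative assumption (a real delimiter must be chosen so it cannot collide with bytes inside an image):

```python
# Hypothetical marker appended by the server after each image.
DELIMITER = b"<END-OF-IMAGE>"

buffer = bytearray()

def append_chunk(chunk: bytes) -> list:
    """Append received bytes; return any complete images found so far."""
    buffer.extend(chunk)
    parts = bytes(buffer).split(DELIMITER)
    # The last element is an incomplete tail (or empty); keep it buffered.
    complete, tail = parts[:-1], parts[-1]
    buffer[:] = tail
    return complete

# Simulate two network reads that split an image mid-delimiter.
images = []
images += append_chunk(b"\xff\xd8jpeg-one<END-OF-")
images += append_chunk(b"IMAGE>\x89PNGtwo<END-OF-IMAGE>")
assert images == [b"\xff\xd8jpeg-one", b"\x89PNGtwo"]
```

The same buffer-split-keep-tail logic maps directly onto appending to NSMutableData in the stream delegate.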
You can use
[UIImage imageWithData:[yourMutableData copy]]
The copy is not strictly necessary; it just turns your mutable data into an immutable copy.
You can use
NSData *data = yourData;
UIImage *image = [UIImage imageWithData:data];
It is possible, but it will be quite tricky in your case, because you indicated that the NSMutableData will contain more than one type of image (JPEG/PNG).
If we assume the data arrived intact, the best way to explore and separate it is by the file signatures:
For PNG it's the first 8 bytes (89 50 4E 47 0D 0A 1A 0A)
For JPEG it's the first 3 bytes (FF D8 FF)
Use these to separate the huge blob of data into an array of chunks: go byte by byte and check whether the subsequence at each position starts a JPEG or a PNG file. Then simply initialize your UIImage objects like this:
// imageData1, imageData2, imageData3 are the chunks produced by the scan above
let dataArray: [Data] = [imageData1, imageData2, imageData3]
var imageArray: [UIImage] = []
for imgData in dataArray {
    if let image = UIImage(data: imgData) {
        imageArray.append(image)
    }
}
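For reference, the signature-based splitting described above can be sketched as follows (shown in Python for brevity; this is a naive scan, since a JPEG payload can itself contain FF D8 FF sequences, e.g. embedded thumbnails):

```python
# File signatures (magic numbers) for the two formats in the blob.
PNG_SIG = b"\x89PNG\r\n\x1a\n"   # first 8 bytes of every PNG
JPEG_SIG = b"\xff\xd8\xff"        # first 3 bytes of every JPEG

def split_on_signatures(blob: bytes) -> list:
    """Naively split a concatenated blob at every PNG/JPEG signature."""
    starts = [i for i in range(len(blob))
              if blob.startswith(PNG_SIG, i) or blob.startswith(JPEG_SIG, i)]
    starts.append(len(blob))
    return [blob[a:b] for a, b in zip(starts, starts[1:])]

# Fabricated example: two "JPEGs" followed by one "PNG".
blob = JPEG_SIG + b"one" + JPEG_SIG + b"two" + PNG_SIG + b"three"
chunks = split_on_signatures(blob)
assert len(chunks) == 3
assert chunks[2].startswith(PNG_SIG)
```

This is why the delimiter approach from the question's update is the more robust choice in practice.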
I want to implement a web service to retrieve an array of my entity PictureCaptures:
PictureCaptures
---------------
- description : string
- captureDate : DateTime
- photoBinary : byte[]
The web service will be mainly called by an iOS application.
What's the best way to implement it, given the byte-array attribute?
Am I supposed to return the byte array without any transformation, as a simple JSON attribute? If so, how do I interpret the JSON response? (In this case JSONObjectWithData:options:error: doesn't work: too much data, and it causes memory issues.)
Thank you for your help.
I would suggest you add two resources: one for the metadata (description, captureDate and so on) and one for the binary data. Let the metadata resource contain a link to the binary photo data.
Like this:
GET /images/1234
Response:
{
  "description": "Nice photo",
  "captureDate": "2012-04-23T18:25:43.511Z",
  "photoData": "http://example.org/images/1234/photo"
}
and http://example.org/images/1234/photo returns the raw photo data
(See also "The 'right' JSON date format" for a discussion of date formats.)
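A minimal sketch of the metadata resource in Python, using the same example fields and URL as above (the URL and values are illustrative only):

```python
import json

def metadata_response(image_id: int) -> str:
    """Build the small metadata JSON; the photo bytes live at a separate URL."""
    return json.dumps({
        "description": "Nice photo",
        "captureDate": "2012-04-23T18:25:43.511Z",
        "photoData": "http://example.org/images/%d/photo" % image_id,
    })

# The client parses this tiny document, then fetches the binary separately.
meta = json.loads(metadata_response(1234))
assert meta["photoData"].endswith("/1234/photo")
```

Because the JSON body stays small, JSONObjectWithData:options:error: on the client never has to hold the image bytes in memory.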
When you get the JSON response you should convert the byte array to NSData.
First add Base64.h and Base64.m to the project (you can find them easily on the internet), then import Base64.h.
From your JSON data:
NSString *data = [yourJSONDict objectForKey:@"photoBinary"];
NSData *imageData = [data base64DecodedData]; // from the Base64 category
UIImage *image = [UIImage imageWithData:imageData];
[yourImageView setImage:image];
This might help you.
I think I need some assistance in figuring out the correct NSJSONSerialization option to make my problem go away.
In my app I allow the user to select an image from the gallery; the image undergoes the following:
NSData *imageData = UIImageJPEGRepresentation(self.profileImageView.image, 0.0);
then
NSString *stringOfImageData = [imageData base64EncodedStringWithOptions:0];
before it is serialized like this:
NSData *jsonData = [NSJSONSerialization dataWithJSONObject:postDict
options:NSJSONWritingPrettyPrinted
error:&error];
and then sent to my REST API. I then decode it in Python using base64 like so:
profileImageData = base64.b64decode(request.json['image'])
It is then loaded into GridFS (MongoDB). When extracting the data to send back to the app, I first encode it to Base64 before using dumps() to send it back:
dumps(base64.b64encode(fs.get_last_version(request.json['userID']).read()))
Within iOS after receiving the data it goes through the below de-serialization:
[NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingMutableContainers|NSJSONReadingMutableLeaves error:&error]
I have narrowed my problem down to the last NSJSONSerialization call. After the data is received by the app, it can be printed to the screen; after deserialization I get nil :(
Serialization and deserialization have been working great for strings, integers, etc. They just don't work when I'm trying to move image data.
Thanks
EDIT: I am able to run a curl request against the API and then using an online base64 to image converter I can see my image. So it definitely means the issues is with the iOS side of decoding a json encoded base64 string.
EDIT: When I repeatedly run the deserialization - every 20th time or so the data is correctly converted. I think the solution might have to be to break up the data coming in.
EDIT: Error:
parsed error:Error Domain=NSCocoaErrorDomain Code=3840 "The operation couldn’t be completed. (Cocoa error 3840.)" (Unterminated string around character 17.) UserInfo=0x109c08790 {NSDebugDescription=Unterminated string around character 17.}
What you don't say is how you are receiving the data. My guess is that you are trying to decode the data before you have received all of it, but since I don't know how you receive it, it's a guess.
To better understand what's going on, try logging the size and hash of the data, to see if the length varies. You can also save each received data object to the file system - put them in the Documents folder and you can access them from your Mac. If the size never varies you will then have to compare a good data object to a bad one.
In fact, you can write a little code to save an image as data and as a Base64 string, upload it, then pull it back and save it. Now compare the data and the strings. Once you find a difference, look at it: what is its offset from the start? How is it different?
When you understand this all you will be able to fix it.
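One quick way to convince yourself that a truncated payload produces exactly this kind of error ("Unterminated string"): cut a valid JSON document short and try to parse it. A minimal sketch in Python, mirroring the server side:

```python
import base64
import json

# A JSON document whose top level is one long Base64 string, as dumps() produces.
full = json.dumps(base64.b64encode(b"image bytes" * 100).decode("ascii"))

# Parsing the complete payload works fine.
assert json.loads(full) == base64.b64encode(b"image bytes" * 100).decode("ascii")

# Parsing a truncated copy fails, just like Cocoa error 3840.
truncated = full[:20]
try:
    json.loads(truncated)
    parsed = True
except ValueError:
    parsed = False
assert parsed is False
```

This supports the receive-before-complete theory: the occasional success every 20th run would be the times the whole payload happened to arrive before parsing.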
I would like to POST an image to a web service in the format below:
picture=%89PNG%0D%0A%1A%0A%00%00%00%0DIHDR%00%00%00%01%00%00%00%01%01%03%00%00%00%25%DBV%CA%00%00%00%04gAMA%00%00%B1%8F%0B%FCa%05%00%00%00%01sRGB%00%AE%CE%1C%E9%00%00%00+cHRM%00%00z%26%00%00%80%84%00%00%FA%00%00%00%80%E8%00%00u0%00%00%EA%60%00%00%3A%98%00%00%17p%9C%BAQ%3C%00%00%00%06PLTE%11%0B%0C%FF%FF%FFPU%C0J%00%00%00%01tRNS%94%B7%84%8F%3B%00%00%00%01bKGD%01%FF%02-%DE%00%00%00%09pHYs%00%00%00H%00%00%00H%00F%C9k%3E%00%00%00%0AIDAT%08%D7c%60%00%00%00%02%00%01%E2%21%BC3%00%00%00%25tEXtdate%3Acreate%002013-01-06T12%3A31%3A53%2B02%3A00%92R%3A%D3%00%00%00%25tEXtdate%3Amodify%002013-01-06T12%3A31%3A53%2B02%3A00%E3%0F%82o%00%00%00%19tEXtSoftware%00Adobe+ImageReadyq%C9e%3C%00%00%00%00IEND%AEB%60%82
My problem is how to convert an image to this format.
Thanks for any help.
I found a solution. If someone needs it:
NSData *imageData = UIImagePNGRepresentation(self.photoImageView.image);
// Note: the bytes still have to be percent-escaped (as in the sample body above)
// before being sent as a form field.
NSString *imageStringASCII = [[NSString alloc] initWithData:imageData encoding:NSASCIIStringEncoding];
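For what it's worth, the sample body above looks like standard application/x-www-form-urlencoded percent-encoding of the raw PNG bytes. A minimal sketch in Python (the 8 bytes shown are the standard PNG file header):

```python
from urllib.parse import quote_plus, unquote_to_bytes

png_header = b"\x89PNG\r\n\x1a\n"  # first 8 bytes of every PNG file

# Form-encode the raw bytes, as in "picture=%89PNG%0D%0A%1A%0A...".
encoded = quote_plus(png_header)
assert encoded == "%89PNG%0D%0A%1A%0A"

# The server reverses it with standard URL decoding.
assert unquote_to_bytes(encoded) == png_header
```

Encoding the whole UIImagePNGRepresentation output this way reproduces the format in the question.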
Convert your image into Base64 (i.e., encode it with the help of a Base64 library) and upload the image to the server. On the server, decode it, convert it back into an image, and store it.