AFNetworking iOS JSON parsing incorrect only in Lebanon

My application has a weird problem. I have a login web service used to authenticate users; it works well for everyone except a tester who is in Lebanon. For her, the request always fails: it turns out the JSON response is not getting parsed.
My first guess was that her network is behind a proxy server that converts the JSON to HTML, so I asked her to switch to a cellular network, but that didn't solve the problem either.
Please refer to the debug message in the screenshot below.
Any suggestions on what might be wrong would be greatly appreciated.

You'd really need the exact data that was received. JSON parsing is totally independent of any localisation; on the other hand, whatever service produced the JSON data may not be. There is a good chance that, being in Lebanon, that customer receives non-ASCII data (which should be just fine) while other customers don't. It is possible that the server sends that data not in UTF-8 but in, say, some Windows encoding; that would be fine for ASCII but not for non-ASCII data. Or it could be that the server figures out that full UTF-8 is needed rather than ASCII and transmits a byte order mark, which is not legal JSON and would likely produce the error message that you received.
To reproduce, I'd try to set things up so that non-ASCII data is used, for example a username or password containing non-ASCII characters.
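To see the byte-order-mark failure mode concretely, here is a minimal Ruby sketch (Ruby standing in for whatever JSON parser the app actually uses): a UTF-8 BOM breaks the parse, and stripping it makes the identical payload valid.

    require 'json'

    # A response body that begins with a UTF-8 byte order mark (EF BB BF).
    body = "\xEF\xBB\xBF{\"user\":\"Léa\"}"

    begin
      JSON.parse(body)                  # fails: a BOM is not legal JSON
    rescue JSON::ParserError => e
      puts "parse failed: #{e.message}"
    end

    # Stripping the BOM first makes the same payload parse cleanly.
    p JSON.parse(body.delete_prefix("\uFEFF"))   # => {"user"=>"Léa"}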

Related

Google/Youtube API Server Key format for pre-validation

Context: I'm updating my WordPress plugin to authenticate against the YouTube v3 API using a server key that has to be requested and entered by the user of the plugin.
Problem: I would like to perform validation of some kind on that key before using it, but can't seem to find documentation of the format a Google API server key adheres to. Based on a (very limited) number of examples, it seems as though a key:
is 39 characters long
is case-sensitive
consists of letters, numbers and at least dashes
So the question, obviously: Is this documented somewhere? Can anyone confirm or expand?
thanks,
frank
I couldn't find any published key format either, maybe because they want to keep the freedom to change the format in the future. If you want to be on the safe side, you should probably just do sanity checks well above the observed format, for example <= 1024 bytes and non-control ASCII characters, or even Base64, or just don't do any validation at all and let Google do that.
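A loose check along those lines in Ruby (the length cap and character class are deliberately generous assumptions, not a documented format):

    # Sanity check only: printable non-control ASCII, capped at 1024 bytes.
    # The bounds are generous guesses well above the observed key format.
    def plausible_api_key?(key)
      key.is_a?(String) &&
        key.bytesize.between?(1, 1024) &&
        key.match?(/\A[\x21-\x7E]+\z/)    # printable ASCII, no spaces/controls
    end

    plausible_api_key?('AIzaSy-example-not-a-real-key')  # => true
    plausible_api_key?("bad key\n")                      # => false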
How about taking the key and passing it to a server-side script that attempts to use the key for some call? Then if it works return success, else fail, and call this asynchronously for the validation. That just seems more reliable than trying to decode or anticipate the format of the hash.
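Sketched in Ruby, with a guessed-at cheap YouTube Data API v3 request (the endpoint and parameters here are assumptions worth verifying against the current docs):

    require 'net/http'
    require 'uri'

    # Validate the key by making one real, inexpensive call and
    # checking whether the API accepts it.
    def key_works?(key)
      uri = URI('https://www.googleapis.com/youtube/v3/videos')
      uri.query = URI.encode_www_form(part: 'id', id: 'dQw4w9WgXcQ', key: key)
      Net::HTTP.get_response(uri).is_a?(Net::HTTPSuccess)
    end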

Sending Signed XML to secure WebService returns BadSignature

I am using Delphi 7's HTTPReqResp component to send a digitally signed SOAP XML document to an HTTPS web service. I use EldoS XML BlackBox and have set all the transformAlgorithms, CanonicalizationMethod, signaturemethod, etc. to the ones the web service requires, and have confirmed this with a tech support officer.
I have validated the signature using XML BlackBox and also this XML Verifier website.
Both ways confirm the signature is correct. However, when I send the XML document via HTTPReqResp.execute, the response I get back is BadSignature (The signature value is invalid).
Originally, I received different error messages due to XML problems (malformed, etc.). It appears that the service does all the standard formatting checks first and then attempts to validate the signature. Since I get back the BadSignature response, the rest of the XML must be correct.
I suppose I have two questions here:
Does the HTTPReqResp component alter the XML?
Is it likely that the web service alters the XML?
The site is using Access Manager WebSEAL.
It's very likely that the receiving partner is getting a modified document somehow. Some minor modifications shouldn't affect the signature (that's the idea, at least) so you may want to check the following:
"Recommended" encoding used by the receiving partner. A very annoying practice by some receiving partners is to favor one form of encoding and completely ignore others. XML signatures should use utf-8 but I've seen servers that only accept iso-8859-1
Make sure you don't accidentally change the encoding after signing (see the sketch after this list).
Verify that the receiving partner is using a sane canonicalization method.
Verify with your receiving partner that no extraneous elements are being added to your document.
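On the "don't change the encoding after signing" point, the safe pattern in any language is to treat the signed document as opaque bytes from the moment it is signed. A Ruby sketch of the idea (the endpoint and file name are placeholders):

    require 'net/http'
    require 'digest'
    require 'uri'

    # Read the exact bytes that were signed; never re-read as text or transcode.
    signed = File.binread('signed.xml')
    puts Digest::SHA256.hexdigest(signed)      # fingerprint before sending

    uri = URI('https://example.com/service')   # placeholder endpoint
    req = Net::HTTP::Post.new(uri)
    req['Content-Type'] = 'text/xml; charset=utf-8'
    req.body = signed                          # post the bytes untouched

    res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |h| h.request(req) }
    puts res.code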
Also, have you tried posting this using the SecureBlackBox components? They also have an HTTP client that can do SSL, which you can use to verify the exact bytes being sent over the wire.

Rails, sending mail to an address with accented characters

I am sending emails via ActionMailer, on Ruby 1.9 and Rails 3.0.
All is good: I am sending emails with accented characters in subject lines and bodies without issue. My default charset is UTF-8.
However, when I try to send an email to an address containing accented characters, it fails miserably. I first got errors about the email address being invalid and needing to be fully qualified.
To get around that, I needed to specify the email address in the format '"" '.
However, it is sending now, but the address in the mail client appears as =?UTF-8?Q?..., which is correct: Rails is rightly encoding my UTF-8 address into the header for me.
BUT
My mail client is not recognising this in its display, so it renders all garbled on screen; garbled as in the literal text =?UTF-8?Q?... appears in the "To" field of the client.
The charset is UTF-8 and the transfer encoding is quoted-printable.
What am I missing? It is doing my head in!
Also, as a test, I sent an email from my Mac mail client to an address with accented characters. This renders fine in my client, but the headers are totally different: the charset is an ISO one and the transfer encoding is base64. So I am thinking I need to somehow change ActionMailer to encode my mails differently, i.e. using an ISO charset and base64 encoding, to get it to play nicely?
I tried this but to no avail; I am either doing it wrong or completely missing the point. From reading the various forums and sites on this, I gather I need to encode the header fields in a certain way, but I am failing to find answers that tell me exactly what that encoding is and, more specifically, how to do it in Rails.
Please help! :-)
Finally solved this: if you wrap the local part of the email address in quotes and leave the domain part unquoted, it works a treat. It seems the mailer encodes the full email address if you don't wrap it in quotes, and hence breaks the encoding on the way to the server.
e.g.
somébody@here.com won't work
whereas
"somébody"@here.com will work
and routes through fine and displays fine in all clients.
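In ActionMailer terms, that looks something like this (the mailer class and its arguments are made up for illustration):

    # Hypothetical Rails 3 mailer showing the quoted local part.
    class WelcomeMailer < ActionMailer::Base
      default :from => 'noreply@example.com'

      def welcome(local_part, domain)
        # Quote only the local part, e.g. "somébody"@here.com,
        # so the domain is left alone when the header is encoded.
        address = %("#{local_part}"@#{domain})
        mail(:to => address, :subject => 'Bienvenue !')
      end
    end

    # WelcomeMailer.welcome('somébody', 'here.com').deliver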
Currently not all mail servers support UTF-8 email addresses (aka SMTPUTF8), and a lot of them will do crazy things (even malforming headers). Can you check to ensure that your encoding header made it all the way through the mail server and wasn't stripped out?
The MTA would have to support RFC 6530 to handle UTF-8 addresses, so it may not be your application's fault.

Axis2 client made with WSDL2Java uses UTF-8 instead of UTF-16

I'm using Axis2 1.6.1 to create a web service, both the server and the client. The web service is pretty simple: it receives two strings and returns an array of bytes. The issue I'm finding is that the client sends the request encoded as UTF-8, so when I send text in Spanish with accents, the accented characters get replaced by strange ones. How can I force the client to use UTF-16?
Thanks
Jose Luis
You need to set the encoding as a service client property. Please have a look here [1].
[1] http://wso2.org/library/230

Sending binary data to (Rails) RESTful endpoint via JSON/XML?

I am currently putting together a Rails-based web application which will only serve and receive data via JSON and XML. However, some requirements include the ability to upload binary data (images).
Now, to my understanding, JSON is not entirely meant for that... but how do you generally tackle the problem of receiving binary files/data over those two entry points to your application?
I suggest encoding the binary data in something like base64. This would make it safe to use in XML or JSON format.
http://en.wikipedia.org/wiki/Base64
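A minimal Ruby round-trip of that idea (the file names are placeholders):

    require 'base64'
    require 'json'

    # Encode the image so it can travel safely inside a JSON document.
    payload = {
      :filename => 'avatar.png',
      :data     => Base64.strict_encode64(File.binread('avatar.png'))
    }.to_json

    # Receiving side: decode back to the original bytes.
    parsed = JSON.parse(payload)
    File.binwrite('copy.png', Base64.strict_decode64(parsed['data']))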
Maybe you could have a look at the Base64 algorithm.
It is used to "transform" arbitrary data into ASCII characters. You can encode and decode it; it's used for web services, and even in .NET serialization.
Hope this helps a little.
Edit: I saw the "new post" notice while writing; someone was faster. Rails base64
If you are using Rails with JSON and XML then you are using HTTP. POST is a part of HTTP and is the best way to transfer binary data. Base64 is a very inefficient way of doing this.
If your server is sending data, I would recommend putting a path to the file on the server in the XML or JSON. That way your server doesn't have to base64 encode the data and your client, which already supports HTTP GET, can pull down the data without decoding it. (GET /path/to/file)
For sending files, have your server and/or client generate a unique file name and use a two-step process: the client sends the xml or json message with fileToBeUploaded: "name of file.ext" and, after sending the message, POSTs the data under the aforementioned filename (see the sketch below). Again, client and server won't have to encode and decode the data. This can also be done in one request using a multipart request.
Base64 is easy but will quickly chew up CPU and/or memory depending on the size of the data and frequency of requests. On the server-side, it's also not an operation which is cached whereas the operation of your web server reading the file from disk is.
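A Ruby sketch of that two-step flow from the client side (the host, paths and field name are hypothetical):

    require 'net/http'
    require 'json'

    host = 'example.com'    # placeholder server

    Net::HTTP.start(host, 443, :use_ssl => true) do |http|
      # Step 1: the JSON message announcing the file that is about to arrive.
      meta = Net::HTTP::Post.new('/api/uploads')
      meta['Content-Type'] = 'application/json'
      meta.body = { :fileToBeUploaded => 'avatar.png' }.to_json
      http.request(meta)

      # Step 2: the raw bytes under the agreed name; no Base64 round-trip.
      data = Net::HTTP::Post.new('/api/uploads/avatar.png')
      data['Content-Type'] = 'application/octet-stream'
      data.body = File.binread('avatar.png')
      http.request(data)
    end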
If your images are not too large, putting them in the database with a RoR :binary type makes a lot of sense. If you have database replicas, the images get copied for free to the other sites, there's no concern about orphaned or widowed images, and the atomic transaction issues become far simpler.
On the other hand, Nessence is right that Base64, as with any encoding layer, does add network, memory and CPU load to the transactions. If network bandwidth is your top issue, make sure your web service accepts and offers deflate/gzip compressed connections. This will reduce the cost of the Base64 data on the network layer, albeit at the cost of even more memory and CPU load.
These are architectural issues that should be discussed with your team and/or client.
Finally, let me give you a heads up about RoR's XML REST support. The Rails :binary database type will become <object type="binary" encoding="base64">...</object> XML objects when you render to XML using code like this from the default scaffolding:
def show
  # Look up the record and render it; :binary columns come out Base64-encoded.
  @myobject = MyObject.find(params[:id])
  respond_to do |format|
    format.xml { render :xml => @myobject }
  end
end
This works great for GET operations, and the PUT and POST operations are about as easy to write. The catch is that the Rails PUT and POST operations don't accept the same tags. This is because the from_xml code does not interpret the type="binary" attribute, but instead looks for type="binaryBase64". There is a bug, with a patch, at the Rails Lighthouse site to correct this.
