Using NSURLConnection with URLs with illegal characters - iOS

I have to call a third-party web service that expects, and only accepts, illegal characters (according to the RFC) in its param string, like the example below.
http://example.com?param1={foo=bar}
In this example the braces are the illegal characters and should be encoded; however, this web service will not accept the parameters if those characters have been encoded.
NSURL correctly refuses to create an NSURL object from a string like the example above: URLWithString returns nil.
The webservice is provided by a large corporate entity, so changing it would require submitting a bug report to them, which may or may not be actioned soon, if at all, especially considering that the API works as is.
My question is: what are some possible solutions to this problem that I can implement without changing the web service?
Current ideas (with their downsides):
Using CFStream to craft custom HTTP requests (a horrifically large amount of work)
Using a web-based proxy that could send the request on my application's behalf (an additional external dependency)
Thanks

What you are trying to do is impossible. You need to URL-encode these characters and the server will then automatically decode them.
Even if you could somehow hack NSURL, there would still be many parts of the whole process that would choke on a malformed URL.
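To illustrate the encoding this answer describes, here is a minimal sketch in Python (rather than Objective-C) showing what percent-encoding the braces would look like; the server in question would have to decode them for this to work.
from urllib.parse import quote

# '{' and '}' are not legal in a URL query and must be percent-encoded.
# A standards-compliant server decodes %7B / %7D back into the braces.
value = "{foo=bar}"
url = "http://example.com?param1=" + quote(value, safe="")
print(url)  # http://example.com?param1=%7Bfoo%3Dbar%7D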

Related

How can I parse HTTP headers in a Chrome extension?

I'm working on a Google Chrome extension that does some processing based on HTTP response headers using the chrome.webRequest.onHeadersReceived event. I'm able to receive the headers, but the documentation seems to indicate that the headers are just represented as simple string name-value pairs.
The values for some HTTP headers, particularly Content-Type and Content-Disposition in my case, can store multiple pieces of information, including parameters with special mechanisms to escape characters. I want to be able to semantically interpret the header values rather than just see them as strings.
However, I don't want to have to write my own HTTP header value parsing code to meet the HTTP specifications and be able to cope with real-world values; I consider this an entire project of its own, and I want to focus on developing my extension.
How can I achieve this?
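The extension itself would need a JavaScript library for this, but just to illustrate what parsing such header values involves, here is a sketch using Python's standard email machinery, which already understands the shared "value; param=..." syntax of these headers:
from email.message import EmailMessage

# Content-Type and Content-Disposition share the "value; param=..." syntax,
# and the email package's header parser copes with the quoting rules for us.
msg = EmailMessage()
msg["Content-Type"] = 'text/html; charset="UTF-8"'
msg["Content-Disposition"] = 'attachment; filename="report 2013.pdf"'

print(msg.get_content_type())    # text/html
print(msg.get_param("charset"))  # UTF-8
print(msg.get_filename())        # report 2013.pdf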

How can I give the response file of a universal HTTP request a unique name?

I've developed an HTTP API server (intended to be called by third-party applications, not necessarily by a web browser) which has one universal call to get (download) any and all types of files by passing a Name parameter in the query string for the requested file. All calls, no matter which file they are for, are handled by the same custom request handler of mine called Get (not to be confused with the standard HTTP GET); the query string's Name property identifies the unique file to get.
So a request may look like:
http://MyServerURL.com/Get?Key=SomeAPIKeyForAuthentication&Name=SomeUniqueIdentifier
First of all, I know I can obviously make the server fetch a file using only the URI, for example...
http://MyServerURL.com/SomeUniqueIdentifier?Key=SomeAPIKeyForAuthentication
...but the design is specifically meant to use this one universal Get command, so I need to keep this unique identifier in the query string. The applications which connect to this API will never need to know this filename, but there may be occasions when a URL is manually provided for someone to open in their browser to download a file.
However, whenever a file is downloaded through a web browser, since the call is Get, the saved filename also winds up being just Get.
Is there any trick in HTTP which I can implement on my server that will force the downloaded filename to be the unique identifier rather than just Get? For example, some method such as a redirect?
I'm using the Indy 10 TIdHTTPWebBrokerBridge in Delphi XE2 as the web server. I'm looking for a way in this component (technically in its corresponding TWebModule handler), when it handles this Get request, to make the response's filename whatever string I want (in this case, SomeUniqueIdentifier). I've heard the term "URL rewriting", but that's a rather different topic and I don't think it's what I need, though it might be.
That seems to be a rather long-winded way of saying you want to set the filename for an HTTP download independently of the URL used to fetch it. In which case you simply send a Content-Disposition header specifying the desired filename. See section 19.5.1 of RFC 2616.
e.g.
Content-Disposition: attachment; filename="stackoverlow.ans"
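Not Delphi, but as a minimal sketch of the idea in Python (the .dat extension and port number are made up), the handler only has to set that header before writing the file bytes; the same header can be added to the response inside the TWebModule handler.
from http.server import BaseHTTPRequestHandler, HTTPServer

class GetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        data = b"file contents go here"  # placeholder for the real file bytes
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Disposition",
                         'attachment; filename="SomeUniqueIdentifier.dat"')
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        # The browser saves the download as SomeUniqueIdentifier.dat,
        # regardless of the /Get path in the URL.
        self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("", 8080), GetHandler).serve_forever()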

Sending Signed XML to secure WebService returns BadSignature

I am using Delphi 7's HTTPReqResp component to send a digitally signed SOAP XML document to an HTTPS web service. I use Eldos XML BlackBox and have set all the transformAlgorithms, CanonicalizationMethod, signaturemethod, etc. to the ones the web service requires, and have confirmed this with a tech support officer.
I have validated the signature using XML BlackBox and also this XML Verifier website.
Both ways confirm the signature is correct. However, when I send the XML document via HTTPReqResp.execute, the response I get back is BadSignature (The signature value is invalid).
Originally, I received back different error messages due to XML errors (malformed, etc.). It appears that the service does all the standard formatting checks first, then attempts to validate the signature. Since I get back the BadSignature response, the rest of the XML must be correct.
I suppose I have 2 questions here.
Does the HTTPReqResp component alter the XML?
Is it likely that the web service alters the XML?
The site is using Access Manager WebSEAL.
It's very likely that the receiving partner is getting a modified document somehow. Some minor modifications shouldn't affect the signature (that's the idea, at least) so you may want to check the following:
"Recommended" encoding used by the receiving partner. A very annoying practice by some receiving partners is to favor one form of encoding and completely ignore others. XML signatures should use utf-8 but I've seen servers that only accept iso-8859-1
Make sure you don't accidentally change the encoding after signing (see the sketch below these points).
Verify that the receiving partner is using a sane canonicalization method.
Verify with your receiving partner that no extraneous elements are being added to your document.
Also, have you tried to post this using the SecureBlackBox components? They also have an HTTP client that can do SSL, which can also be used to verify the bytes being sent over the wire.

AFNetworking AFHTTPClient Different content types for success and Fail

I am trying to access a web service, via an AFHTTPClient subclass, that has a complication:
If the request succeeds, the content is returned as JSON. If it fails for some reason, the error from the server is returned formatted as XML.
At the moment, the only way I figure I can deal with this is to not attempt to use the specific XML/JSON RequestOperations, treat everything as a plain HTTP request, and then attempt to parse it manually myself, depending on what the response looks like.
Sadly, I have no control over the web service, or I'd make sure it was all JSON.
Does anybody have any better suggestions for handling this?
[EDIT]
I guess one way of making it slightly cleaner, would be creating a new subclass of AFHTTPRequestOperation, that handled the detection of content type internally, and then passed back either parsed JSON or a GDataXML object depending upon what was returned from the server.
Thanks
This might not be the cleanest or most optimal solution, but you could do a check with an existing JSON library that the response is in fact valid JSON. If it is, proceed as usual; if it isn't, treat it with your hand-carved parsing solution.
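In pseudocode terms (shown here as a Python sketch rather than Objective-C), the check described above amounts to:
import json
import xml.etree.ElementTree as ET

def parse_response(body: str):
    """Try JSON first (the success case); fall back to XML (the error case)."""
    try:
        return ("json", json.loads(body))
    except ValueError:
        # Not valid JSON -- assume it is the XML-formatted error document.
        return ("xml", ET.fromstring(body))

print(parse_response('{"ok": true}')[0])         # json
print(parse_response('<error>nope</error>')[0])  # xml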

Rails XML API design practices

I'm building an XML-based web service in Rails to serve as the backend for an iPhone app, and I'm wondering how I can best achieve an auth scheme that will let me use both GET and POST requests -- i.e. one that doesn't require auth data to be sent in the body of an XML payload.
The wrinkle here is that I'm not using regular HTTP auth. Instead, I'm creating a SHA1 digest of the iPhone's hardware ID (concatenated with a "secret" string before digesting) and sending it along with the unhashed ID. I validate it on the server by re-creating the digest with the hardware ID from the request and matching it against the hashed hardware ID from the request.
My question is this: should I create my service so that every action on every resource expects a payload of POSTed XML containing the security context in a common XML structure, or is there a better way to do it?
In other words, I'd like to use GET for things like /show, /index, etc. But as my app currently stands, I can't do that, since I need to send an XML payload containing the security context.
Perhaps there's a good way to achieve effectively the same thing with headers, a la Google's web APIs?
Every security context looks like this:
<request-wrapper>
  <security-context>
    <username>joefoo</username>
    <hardware-id>AE7D128BCA9206E59901</hardware-id>
    <hashed-hardware-id>cfd7983850301f97f6fdc26b553d1b6170f18bde</hashed-hardware-id>
  </security-context>
  ...
  (remainder of request payload)
  ...
</request-wrapper>
This is my first XML service in Rails, so I'd appreciate any general practice advice in this vein as well.
Thanks!
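For reference, the digest scheme described in the question amounts to something like the following sketch (Python rather than Ruby or Objective-C; the secret value and the concatenation order are assumptions):
import hashlib

SECRET = "my-app-secret"  # the "secret" string baked into the client (illustrative)

def hashed_hardware_id(hardware_id: str) -> str:
    # SHA1 over the hardware ID concatenated with the shared secret.
    return hashlib.sha1((hardware_id + SECRET).encode("utf-8")).hexdigest()

# Client: send both the unhashed ID and the digest with the request.
hw_id = "AE7D128BCA9206E59901"
sent_hash = hashed_hardware_id(hw_id)

# Server: recompute from the unhashed ID and compare against what was sent.
print(sent_hash == hashed_hardware_id(hw_id))  # True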
Your authentication scheme is subject to replay attacks if the "secret string" stays the same over the lifetime of the device.
Additionally, the "secret key" (if it is embedded in your application) can be dumped via strings (or another tool), breaking your scheme entirely.
I would instead use an asymmetric key to set up a one-time secret key, and then use it to hash a counter or something. If you need the hardware ID for some reason, hash it plus the counter. This is basically a dumbed-down SSL implementation, so you might as well just do that, frankly (generating your own certificate, and doing the rare mutual authentication; but still...).
Remember, inventing your own security scheme is almost always a bad idea.
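A rough sketch of the hash-a-counter idea (Python; the key exchange is omitted and all names are illustrative, with HMAC used as the standard way to mix a key into such a hash):
import hashlib
import hmac

# Assume session_key was negotiated out of band (e.g. via an asymmetric
# exchange) so it is never embedded in the shipped binary.
session_key = b"negotiated-one-time-key"

def sign_request(counter: int, hardware_id: str) -> str:
    # A fresh counter value per request means a captured signature
    # cannot simply be replayed.
    message = f"{counter}:{hardware_id}".encode("utf-8")
    return hmac.new(session_key, message, hashlib.sha1).hexdigest()

print(sign_request(1, "AE7D128BCA9206E59901"))
print(sign_request(2, "AE7D128BCA9206E59901"))  # different signature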
I'm thinking that it might be best to simply use custom headers for this and then access them in my controller filters w/ things like:
request.headers['username']
request.headers['hardware-id']
request.headers['hashed-hardware-id']
Any thoughts on whether this is a good/bad idea?
How about creating an SHA1 digest of the entire XML request, instead of just the hardware-id? That way you're making replay attacks a lot harder. Sure, without a timestamp and (possibly) a nonce to make each request unique, a hacker could still replay the exact same request multiple times (maybe using up account credits or whatever), but at least they couldn't take the digest from an existing request and change the request details to make it do whatever they wanted.
Suggested steps:
Take your XML (without any hashed-hardware-id in it) and turn it into a byte array.
Create an SHA1 digest of the XML byte array (concatenated with your secret string, as in your current scheme, so that only your app and server can produce it).
Base-64 encode the XML byte array and the SHA1 digest byte array (separately).
Send the base-64-encoded XML as one request parameter, and the base-64-encoded signature as the other, either using GET or POST.
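A direct transcription of those steps as a Python sketch (the parameter names and the way the secret is mixed in are illustrative):
import base64
import hashlib

xml_payload = b"<request-wrapper>...</request-wrapper>"  # the raw XML bytes
secret = b"shared-secret"  # illustrative; mix it in however your scheme dictates

# Steps 2-3: digest the bytes, then base-64 encode both the XML and the digest.
digest = hashlib.sha1(xml_payload + secret).digest()
xml_b64 = base64.b64encode(xml_payload).decode("ascii")
sig_b64 = base64.b64encode(digest).decode("ascii")

# Step 4: send both as request parameters (parameter names are made up).
params = {"payload": xml_b64, "signature": sig_b64}
print(params)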
