I'm writing an HTTP API wrapper to integrate with particular IP Cameras. Part of the process requires login credentials in the URI, for example:
http://user:pass_with_#_sign@address:port/somespecificpath/
I'm using Indy's TIdHTTP.Get to send the request to the device. However, the password contains a '#' character. If this URI is placed in any browser with the plain password, the # character throws it off. Therefore, I need to encode the password's # as %23...
http://user:pass_with_%23_sign@address:port/somespecificpath/
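For reference, this is roughly how the encoded URI above can be built in code (a minimal sketch; only the '#' is replaced here, and a general-purpose encoder would also handle other reserved characters):
// StringReplace and Format come from SysUtils.
EncodedPass := StringReplace('pass_with_#_sign', '#', '%23', [rfReplaceAll]);
Url := Format('http://user:%s@address:port/somespecificpath/', [EncodedPass]);
// Url is now 'http://user:pass_with_%23_sign@address:port/somespecificpath/'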
When I paste this URI into any browser, it successfully logs in and does what it needs. However, when I pass the exact same URI into TIdHTTP.Get, it does not successfully log in, and therefore I cannot do anything as long as the password contains # (or %23). Changing the password to not include a # is far too sloppy of a solution. Indy must be messing something up with this URI/password.
Is this a bug in Indy, or is there something else I need to do to make Indy accept such an encoded password?
UPDATE
I added a new account on one of the cameras with username and password without any special characters which need encoding, and authentication still does not work. It seems as if Indy is stripping out the login credentials completely from the URI, and doesn't even send these credentials. Next thing I need to do is monitor the URI which is actually sent.
UPDATE 2
I did a packet capture with Wireshark and verified that the credentials ...user:pass@... are not sent at all - they're stripped from the URI that Indy actually sends.
UPDATE 3
TLama suggested that I get a capture of what's sent when using a browser. I tried this, and sure enough even when using a browser the capture doesn't show these login credentials either... even though it works from a browser. So I have to figure out another way to identify whether or not these credentials are sent.
UPDATE 4
I tried (before seeing Remy's answer) to provide these credentials in Request.Username and Request.Password instead of in the URI, and I still have no success. I keep getting "Unauthorized" back from the device.
UPDATE 5
The documentation for this API mentions nothing relevant to how users are authenticated other than this paragraph:
Grandstream Video Surveillance API (Application Programming Interface) supports HTTP 1.0 protocol (RFC1945). This document explains in detail the parameter of functions in client side, via the supported GET/POST method. Users will require administrator privilege to retrieve or set the parameters.
And on that note, I did switch the TIdHTTP protocol version to 1.0.
UPDATE 6
Hopefully the last update needed... I did another comparison of the packet captures between a browser (Chrome) and TIdHTTP. Chrome actually sends two requests: the first one does not carry any credentials, but the second one includes an Authorization: Basic header with the credentials (User:Pass), whereas TIdHTTP sends only a single request without this header.
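For reference (this line is illustrative, not copied from the capture): the Basic scheme simply Base64-encodes "username:password", so for the example credentials above the header would look like:
Authorization: Basic dXNlcjpwYXNzX3dpdGhfI19zaWdu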
UPDATE 7
7 is a lucky number :-) I just realized that the very first request I make to the device returns "Unauthorized", but all following requests I make (using the same TIdHTTP instance) are successful! So, going back to my prior update: just as I saw in the browser capture, it takes that second, repeated request for it to work.
# is an illegal character in a URL anywhere before the fragment portion, which is why it has to be encoded as %23 when used in other areas of the URL.
A username/password is not part of the URL that actually gets transmitted to a server; that is why TIdHTTP strips the credentials off before sending the request (monitor the traffic of any web browser and you will see the same thing happen).
To use HTTP authentication with TIdHTTP, you need to use the TIdHTTP.Request.Username and TIdHTTP.Request.Password properties instead (and you do not need to URL encode the values), eg:
IdHTTP1.Request.Username := 'user';
IdHTTP1.Request.Password := 'pass_with_#_sign';
IdHTTP1.Get('http://address:port/somespecificpath/');
If you pass a URL that has an encoded username/password in it, TIdHTTP will strip off the values and move them to the Request.Username and Request.Password properties for you, but they will remain in their original encoded format, eg:
IdHTTP1.Get('http://user:pass_with_%23_sign@address:port/somespecificpath/')
// this will set Request.Username to 'user',
// Request.Password to 'pass_with_%23_sign', and
// send a request for 'http://address:port/somespecificpath/'
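As a side note (this is not part of the answer above, just a sketch assuming the camera uses Basic authentication): Indy can also be told to send the Basic credentials preemptively, instead of waiting for a 401 challenge, by enabling Request.BasicAuthentication:
IdHTTP1.Request.BasicAuthentication := True; // send Authorization: Basic with the first request
IdHTTP1.Request.Username := 'user';
IdHTTP1.Request.Password := 'pass_with_#_sign';
IdHTTP1.Get('http://address:port/somespecificpath/');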
If you are being given an encoded URL to start with, you can use the TIdURI class to manually decode it prior to then calling TIdHTTP.Get(), eg:
var
  RequestUrl: string;
  Uri: TIdURI;
begin
  RequestUrl := 'http://user:pass_with_%23_sign@address:port/somespecificpath/';
  ...
  Uri := TIdURI.Create(RequestUrl);
  try
    IdHTTP1.Request.Username := TIdURI.URLDecode(Uri.UserName);
    IdHTTP1.Request.Password := TIdURI.URLDecode(Uri.Password);
    RequestUrl := Uri.URI;
  finally
    Uri.Free;
  end;
  IdHTTP1.Get(RequestUrl);
  ...
end;
Update: either way, make sure you have appropriate IdAuthentication... units, or the IdAllAuthentications unit, in your uses clause to enable Indy's HTTP authentication classes for TIdHTTP to use.
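For example (a minimal sketch of the uses clause; pick whichever authentication units your project actually needs):
uses
  IdHTTP, IdAllAuthentications; // or specific units such as IdAuthenticationDigest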
I have solved the issue by sending two sequential Get requests. After observing the packet captures between a browser and Indy, I noticed browsers would always send one request without credentials, and then another identical request with credentials. So it only sends the credentials when it needs to. Indy was only sending one request, but if I send another request right afterward, I have success.
So, the request now looks like...
FWeb.Get(U);    // first request: comes back "Unauthorized"
FWeb.Get(U, R); // second, identical request: succeeds and returns the content
Of course it really should be in the order of "If the first request is unauthorized, then send another request with credentials".
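A minimal sketch of that idea, assuming FWeb is the TIdHTTP instance and U and R are the URL and response stream from above (by default Indy raises EIdHTTPProtocolException for HTTP error responses):
try
  FWeb.Get(U, R);
except
  on E: EIdHTTPProtocolException do
    if E.ErrorCode = 401 then
      FWeb.Get(U, R)  // retry once; clear R first if it already received partial content
    else
      raise;
end;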
Related
I have a JMeter script where, during replay, a POST request is displayed as a GET request and the parameters in the request are not sent to the server. Because of this, correlations are failing at this request.
One of the parameters in the request is ViewState, which has a very large value. Is this large parameter value causing the above issue? How should I proceed?
Most probably you're sending a malformed request; therefore, instead of responding properly to the POST request, the server is redirecting you somewhere else (most probably to the login page).
Use the View Results Tree listener in HTML or Browser mode to see what page you're actually hitting.
With regards to the ViewState, "so many characters" is not the problem; the problem is that these are not random characters. ViewState is used for client-side state management, and if you fail to provide the proper value you won't be able to move further, so you need to design your test as follows:
Open the first page
Extract the ViewState using a suitable Post-Processor
Open the second page, passing the ViewState from step 1 along with the other parameters
More information: ASP.NET Login Testing with JMeter
Also, don't forget to add an HTTP Cookie Manager to your Test Plan.
What I understand is that the request may be getting redirected. This usually happens when the server expects a unique request. If you recorded the request, you may be using old headers that carry stale cookie information. Check your headers and then reconstruct the request.
Make sure you are not using old cookies anywhere; remove that cookie data from the HTTP Header Manager everywhere.
I want to make an application that redirects websites.
It has a table of "domains" and "redirect domains".
When a domain is matched, it redirects to the corresponding redirect domain.
If there is no match, it redirects to a default page.
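A minimal sketch of that lookup (the TDictionary-based table and the LookupRedirect name are illustrative, not part of the actual application):
// Requires System.Generics.Collections and System.SysUtils.
function LookupRedirect(Redirects: TDictionary<string, string>;
  const Host, DefaultURL: string): string;
begin
  // Return the configured redirect domain for Host, or the default page when there is no match.
  if not Redirects.TryGetValue(LowerCase(Host), Result) then
    Result := DefaultURL;
end;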
So I created a Delphi application with IdHTTPProxyServer.
I have configured it to even work with https using "ssleay32.dll" and "libeay32.dll".
Everything works great.
It uses the "IdHTTPProxyServerHTTPBeforeCommand" event to redirect, like this:
with AContext.Connection.IOHandler do
begin
  WriteLn('HTTP/1.0 302 Moved Temporarily');
  WriteLn('Location: ' + RedirectURL);
  WriteLn('Connection: close');
  WriteLn;
end;
But how do I distinguish between the event being called for the main URL (the one the user typed into the address bar) and for other URLs?
The "IdHTTPProxyServerHTTPBeforeCommand" event is called many times while a page is loading (stat counters, Facebook like buttons, etc.), and I don't want to redirect all of those to the default page.
If this is not possible with IdHTTPProxyServer, are there any other options in Delphi or any other language (one that can generate a native executable; C++ preferred)?
Thank You
From the perspective of a proxy (or the target HTTP server, for that matter), there is no difference whatsoever between a user-typed URL and other URLs. Every HTTP request is self-contained and independent of every other HTTP request. They have to be processed as-is on a per-request basis.
If you want to ignore dependent URLs (images, scripts, etc), you will have to know ahead of time what the initial URL is, parse the data that is retrieved from that URL, keep track of any URLs the data refers to, and then ignore those URLs if you see them being requested later. However, there is nothing in the HTTP protocol to tell you what the initial URL is. There is a Referer request header that may help at times, as it is filled in when a browser is requesting dependent resource files, but it is also filled in when the user navigates around from one page to another, so you can't rely on the Referer by itself. You will have to implement your own discovery logic to figure out the initial URL based on more analysis of the URLs being requested by a given client over time.
Only the client really knows what it is requesting and why, a proxy is just a gateway to reach it. So there is only so much smart filtering you can do in a proxy without knowing what the client is actually doing.
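As a rough illustration of the Referer idea only (a sketch; treating "no Referer header" as "typed into the address bar" is a heuristic that browsers do not guarantee, and FRedirects, FDefaultURL and LookupRedirect are the hypothetical table pieces sketched earlier):
procedure TForm1.IdHTTPProxyServer1HTTPBeforeCommand(AContext: TIdHTTPProxyServerContext);
var
  Referer, RedirectURL: string;
begin
  Referer := AContext.Headers.Values['Referer'];
  if Referer <> '' then
    Exit; // likely a dependent resource (image, script, like button, etc.)
  // AContext.Target holds the requested URL; extracting just the host from it is omitted here.
  RedirectURL := LookupRedirect(FRedirects, AContext.Target, FDefaultURL);
  with AContext.Connection.IOHandler do
  begin
    WriteLn('HTTP/1.0 302 Moved Temporarily');
    WriteLn('Location: ' + RedirectURL);
    WriteLn('Connection: close');
    WriteLn;
  end;
end;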
If a website makes a GET request, from a HTTPS page to another HTTPS page, is that secure? Specifically, is the data in the URL / query params secure?
I'm asking because, when I call Stripe.createToken, a connection is made to a URL with the credit card number in it. Even though a query parameter says _method=POST, the data is transmitted as GET query params:
Request URL: https://api.stripe.com/v1/tokens?card[number]=4242424242424242&card[cvc]=123&card[exp_month]=4&card[exp_year]=2016&key=pk_test_1236&callback=sjsonp11234&_method=POST
Request Method: GET
Status Code: 200 OK
Now, I understand this is all over HTTPS, but isn't the URL part insecure? I thought that URLs get logged in various places along the way to their destination.
URLs usually do get logged in web server logs. It is a very bad idea to send that information as part of a GET request. The hops a request takes between the client and the destination are encrypted, though. So, assuming there is no web proxy or anything similar in the way, the only place it might be logged is on https://api.stripe.com/'s web server.
See Are querystring parameters secure in HTTPS (HTTP + SSL)? for more information.
From Stripe:
Because of the nature of how HTTPS works, the only information that's transmitted in plaintext to an HTTPS connection is the hostname you're connecting to (in this case, "api.stripe.com"). All other parts of the communication - including the full URL - are encrypted such that they're only decryptable by our servers. At the transport level, including cardholder details as GET parameters of the URL is no different from including them in the POST body. We only use JSONP for Stripe.js and not for any server-side bindings, in case you are worried about having those requests come up in your server logs.
Once the details get to our server, we've made changes to the configurations on our servers to ensure that the query strings are never logged, and we have routines in place that check all log files for accidental inclusion of card numbers. We've worked with our PCI auditors (who also audit Google, Apple and AWS) to ensure that this meets the standards of PCI, and are confident that we're handling cardholder data in a way that is secure.
I've developed an HTTP API server (intended to be called by third-party applications, not necessarily by a web browser) which has one universal call to get (download) any and all types of files by passing a name parameter in the query string for the requested file. All calls, no matter for which file, are handled by the same custom request handler of mine called Get (not to be confused with the standard HTTP GET). The query string includes a Name property which identifies the unique file to get.
So a request may look like:
http://MyServerURL.com/Get?Key=SomeAPIKeyForAuthentication&Name=SomeUniqueIdentifier
First of all, I know I can obviously make the server fetch a file using only the URI, for example...
http://MyServerURL.com/SomeUniqueIdentifier?Key=SomeAPIKeyForAuthentication
...but the design is specifically meant to use this one universal Get command, so I need to keep this unique identifier in the query string. The actual applications which connect to this API will never need to know this filename, but there may be occasions when a URL is manually provided for someone to open in their browser to download a file.
However, whenever a file is downloaded through a web browser, since the call is Get, the saved filename also winds up being just "Get".
Is there any trick in HTTP which I can implement on my server that will force the downloaded filename to be the unique identifier, rather than just "Get"? For example, some method such as a redirect?
I'm using Indy 10's TIdHTTPWebBrokerBridge in Delphi XE2 as the web server. I'm looking for a way, when this component (technically its corresponding TWebModule handler) handles this Get request, to make the response's filename whatever string I want (in this case, SomeUniqueIdentifier). I've heard the term "URL re-writing", but that's a rather different topic, and I don't think it's what I need, yet it might be.
That seems to be a rather long-winded way of saying you want to set the filename for an HTTP download independently of the URL used to fetch it. In that case you simply send a Content-Disposition header specifying the desired filename. See section 19.5.1 of RFC 2616.
e.g.
Content-Disposition: attachment; filename="stackoverlow.ans"
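In the TWebModule handler described in the question, that could look roughly like this (a sketch; the action handler name, the application/octet-stream content type, and the OpenFileStreamFor helper are placeholders):
procedure TWebModule1.WebModuleGetAction(Sender: TObject; Request: TWebRequest;
  Response: TWebResponse; var Handled: Boolean);
var
  Id: string;
begin
  Id := Request.QueryFields.Values['Name'];
  Response.ContentType := 'application/octet-stream';
  // The browser saves the download under this name instead of "Get".
  Response.SetCustomHeader('Content-Disposition',
    Format('attachment; filename="%s"', [Id]));
  Response.ContentStream := OpenFileStreamFor(Id); // hypothetical: open the file identified by Id
  Handled := True;
end;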
I've got an odd problem. I am trying to automate visiting a web site by using WebRequest and WebClient. I observed all the POST request header key-value pairs and the posted data string in Firebug (the Request Headers and Post tabs). Then I simulated that request with WebRequest, setting all the header parameters and posted data. However, when I call GetResponse() on this request instance, I always get an error page back saying that some session ID is missing.
I did take care to put the session cookie returned by the previous step (opening the logon page) into the request header's cookie field. I can get the correct response back when simulating the request for the logon page (the first page), but I cannot get through this authentication page. My POST data looks like userid=John&password=123456789&domain=highmark. The same authentication page request made by the browser succeeds every time.
Am I missing something in the request that may not be shown by Firebug? If so, can you recommend a tool that can examine the entire request sent by the browser?
I have solved this issue. The problem is that I set the HttpWebRequest instance's AllowAutoRedirect=true. The effect is that when the first response comes back from the server, the HttpWebRequest automatically makes another request for the different URL given in the response header's Location field.
The defect of the HttpWebRequest class is that when it follows such a redirect, it does not include the cookies from the response's Set-Cookie header in the next request, so the server denies that page request and may redirect yet again to another page.
Also, with AllowAutoRedirect=true, the HttpWebRequest.GetResponse() method only returns the last page in the redirect chain, which is why I got a totally different response than I expected.
In the course of solving this, I also want to thank a distinguished HTTP traffic examining tool: IEInspector HTTP Analyzer (http://www.ieinspector.com/httpanalyzer/). The great feature of this tool is that it can examine not only the HTTP traffic from the browser but also the requests your own process makes through HttpWebRequest. It can also display the raw stream of those requests and responses in text format. Although it is commercial software, you can try it for 15 days. I am quite happy with what it tells me (in well-formed detail) and would like to buy it as well.