Convert curl command with --form-string to a URL

I have this command (line breaks added between the command-line parameters for readability):
curl \
-s \
--form-string "token=AppToken" \
--form-string "user=UserToken" \
--form-string "message=Msg" \
--form-string "title=Title" \
https://api.pushover.net/1/messages.json
Can you tell me if this command can be converted into a URL link?

It cannot.
That curl command makes a POST request with a multipart/form-data request body (which is what --form-string produces).
"Links" always make GET requests, never POST requests.
<a href="#"> links in HTML can only trigger GET requests with no request body (at least, not without custom JavaScript intercepting the click).
In desktop software frameworks and toolkits that provide built-in hyperlink widgets, those widgets are (in my experience) similarly designed around the assumption that they'll open a URL in the user's default browser, which will only make a GET request.
This is because following a link (i.e. executing a GET request) must always be "safe": GET requests should not mutate resource state.
Additionally, "links" cannot have a request body.
While GET requests can technically carry a request body, support for that is not widespread, and a single URI used as a hyperlink has no request-body data associated with it anyway.
GET request bodies are meant to let user agents send request/query data that is too long to fit into the URI's query string (given the common URL-length limits of a few thousand characters).
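If the goal is to trigger the same POST from a plain web page without JavaScript, the closest equivalent is an HTML form rather than a link. A minimal sketch, using the field names and placeholder values from the curl command above (whether the target API accepts a browser-submitted form is a separate question):
<form action="https://api.pushover.net/1/messages.json" method="post" enctype="multipart/form-data">
  <input type="hidden" name="token" value="AppToken">
  <input type="hidden" name="user" value="UserToken">
  <input type="hidden" name="message" value="Msg">
  <input type="hidden" name="title" value="Title">
  <button type="submit">Send</button>
</form>
Submitting that form makes the browser issue the same kind of multipart/form-data POST that the --form-string options produce; a bare link simply has no way to do that.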

Related

POST Request is Displaying as GET Request During Replay In Jmeter

I have a JMeter script where, during replay, a POST request is displayed as a GET request and the parameters in the request are not sent to the server. Because of this, correlations are failing at this request.
One of the parameters in the request is ViewState, which has a very large value. Is this large parameter value causing the issue? How should I proceed?
Most probably you're sending a malformed request, so instead of the server properly responding to your POST request you're being redirected somewhere (most probably to the login page).
Use the View Results Tree listener in HTML or Browser mode to see which page you're actually hitting.
With regard to the ViewState, "so many characters" is not the problem; the problem is that these are not random characters. ViewState is used for client-side state management, and if you fail to provide the proper value you won't be able to move further, so you need to design your test as follows:
Open first page
Extract the ViewState using a suitable Post-Processor (see the extractor sketch after this list)
Open second page
here you need to pass the ViewState extracted from the first page along with the other parameters
More information: ASP.NET Login Testing with JMeter
Also, don't forget to add an HTTP Cookie Manager to your Test Plan.
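For the extraction step, a Regular Expression Extractor attached to the first request is the usual choice. A rough sketch of its settings (the attribute order in the ASP.NET-generated HTML may differ, so adjust the expression to match the actual page source):
Reference Name: viewState
Regular Expression: name="__VIEWSTATE"[^>]*value="(.+?)"
Template: $1$
Match No.: 1
In the second request, pass the parameter as __VIEWSTATE with the value ${viewState}; ViewState values usually contain characters that need URL encoding, so enable encoding for that parameter in the HTTP Request sampler.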
What I understand is that the request may be getting redirected. This usually happens when the server expects a unique request. If you recorded the request, you may be using old headers that carry stale cookie information. Check your headers and then reconstruct the request.
Make sure you are not using old cookies anywhere; remove the cookie part from the HTTP Header Manager everywhere.

How do I make a dynamic URL for a 404 xhtml page?

I have defined a location for the page in the XML:
<error-page>
<error-code>404</error-code>
<location>/faces/public/error-page-not-found.xhtml</location>
</error-page>
but I want the URL to be like below:
faces/{variable}/public/error-page-not-found.xhtml
where the value of the variable will change depending on the situation
This question is a bit subjective, though in general HTTP errors are handled by the server, most of the time by the scripting language on the server (and occasionally by the HTTP server software directly).
For example, the Apache HTTP server allows rewrites, so you can request a page at example.com/123 even though there is no "123" file there. In the code that handles that request you would also determine whether a resource exists for it; if not, your server-side scripting code (PHP, ColdFusion, Perl, ASP.NET, etc.) would need to return an HTTP 404, and the response body would contain a small snippet such as the code you have above.
You would not need to redirect to an error page; you would simply respond with an HTTP 404 status and whatever XML you'd use to notify the visitor that there is nothing there. HTTP server software such as Apache can't really produce code; it can only reference or rewrite some file to be used for certain requests.
Generally speaking, if you have a website that uses a database, you'd do the following...
Parse the URL requested so you can determine what the visitor requested.
Determine if a resource should be retrieved for that request (e.g. make a query to the database).
Once you know whether a resource is available, either show the resource (e.g. a member's profile) or serve the appropriate HTTP status (401: not signed in at all; 403: signed in but not authorized, where no increase in privileges will grant permission; 404: not found; etc.) and display the corresponding content.
I would highly recommend that you read about Apache rewrites and PHP, especially its $_SERVER array (e.g. <?php print_r($_SERVER);?>). You'd use Apache to rewrite all requests to a single file, so even if visitors request /1, /a, /about, /contact/, etc., they all get processed by one PHP file where you first determine what the requested URL is. There are plenty of questions here and elsewhere on the web that will give you a quick jump start on handling all of that, such as this one: Redirect all traffic to index.php using mod_rewrite. If you do not know how to set up a local HTTP web server, I highly recommend looking into XAMPP; it's what I started out with years ago. Good luck!
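As a rough illustration of the front-controller approach described above (the .htaccess lines assume mod_rewrite is enabled, and resource_exists() is a hypothetical lookup standing in for your own database query):
# .htaccess: send every request that isn't an existing file to index.php
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]
<?php
// index.php: decide per request whether to serve content or a 404
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
if (!resource_exists($path)) {           // hypothetical check, e.g. a database query
    http_response_code(404);             // answer with 404 instead of redirecting
    include 'error-page-not-found.php';  // render whatever error markup you like; file name is just an example
    exit;
}
// ...otherwise look up and render the requested resource...
The same idea applies whatever the server-side language is; the <error-page> mapping in the question is the container-level equivalent, but a dynamic path segment requires handling the request in code, as sketched here.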

Can a rails application identify when a request is from curl?

A Rails application can use the request object to access the user agent and other data about the request.
How to detect browser type and its version
But with curl, a developer can set the header data and more: How to use curl to get a GET request exactly same as using Chrome?
Can a Rails application accurately detect when a request is sent by software like curl versus a browser?
No. cURL can simulate any HTTP request with the correct configuration. There is no way to tell the difference between Chrome and cURL from an HTTP request alone.
If you're trying to make it harder to scrape data from your server, you'll want to use other methods (rate-limiting, authentication, etc.). But there is no perfect solution to prevent a determined scraper.
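For example, curl can copy a browser's request headers verbatim, and the two requests then look identical on the server (the header values below are illustrative, not an exact Chrome fingerprint):
curl "https://example.com/some/page" \
  -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36" \
  -H "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" \
  -H "Accept-Language: en-US,en;q=0.9"
Rails would report the same request.user_agent for both, which is why any defence has to rely on behaviour (rate limiting, authentication, etc.) rather than on inspecting the request itself.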

Dot-dot removed from URL by Firefox

When I enter a URL like this, with ..
http://SERVER:8085/../tn/d9dd6c39d71276487ae798d976f8f629_tn.jpg
I receive a request on my web server without the .. part.
Does Firefox silently remove it? Is .. not allowed in URLs?
P.S.: wget removes .. also :-(
I have recently begun seeing this and, despite what the marked answer states, adding this to a URL does make sense and is a valid folder path in the world of IT security, where we intentionally bypass security measures in misconfigured sites; these are classified as directory traversal attacks.
Web tools (browsers, wget, curl, etc.) silently evaluate the URL path and strip out the "/../", making my job of finding vulnerabilities more difficult. To get around this, I use Firefox along with Burpsuite, a proxying assessment tool that captures the request and allows me to modify it before sending it to the server.
Doing this, I can type:
https://example.com/vpn/../vpns/cfg/etc
in my browser URL, and when I capture it using Burpsuite, it looks like:
https://example.com/vpns/cfg/etc
showing me that Firefox has in fact changed my original intended URL string. So within Burpsuite, I modify the request back to say:
GET /vpn/../vpns/cfg/etc HTTP/1.1
send it to the server, and voilà, the path remains intact and resolves to the intended location. Yes, in a normal, well-configured application with proper request handling, doing this shouldn't be necessary. But the string behaves differently in these two forms, so modifying it is necessary to make the server handle it in the way that demonstrates a configuration problem in how the application handles the request (a directory traversal vulnerability).
This can also be proven using curl. If you send a normal curl command like below, curl will do the same as Firefox and evaluate the path, removing "/vpn/.." from it before sending to the server:
curl -i -s -k "https://example.com/vpn/../vpns/cfg/etc"
However, if you add the "--path-as-is" argument, curl will not modify it and send it as-is, and the "/vpn/.." remains intact:
curl -i -s -k "https://example.com/vpn/../vpns/cfg/etc" --path-as-is
After some additional reading, I found this behavior is due in part to URI Normalization standards (https://en.wikipedia.org/wiki/URI_normalization).
This points to RFC 3986 for defining URI Syntax https://www.rfc-editor.org/rfc/rfc3986#section-5.2.4.
".." means a relative path and used for moving up in the hierarchy. So ".." is not a valid name for a folder therefore you cannot use it in the middle of URL. It just makes no sense.
So to answer your question: ".." is allowed in url but only in the beginning.
Complementary information:
"../" will be stripped by the developer tools as well (up to 54.0.1 at least), meaning you cannot use the "Edit and resend" to hand-craft a valid request like this:
GET /../tn/d9dd6c39d71276487ae798d976f8f629_tn.jpg
... which could potentially result in a directory traversal and the file being retrieved.

How can I give the response file of a universal HTTP request a unique name?

I've developed an HTTP API server (intended to be called by third-party applications, not necessarily by a web browser) which has one universal call to get (download) any and all types of files by passing a name parameter in the query string for the requested file. All calls, no matter which file they are for, are handled by the same custom request handler of mine called Get (not to be confused with the standard HTTP GET). The query string includes a Name property which identifies the unique file to get.
So a request may look like:
http://MyServerURL.com/Get?Key=SomeAPIKeyForAuthentication&Name=SomeUniqueIdentifier
First of all, I know I can obviously make the server fetch a file using only the URI, for example...
http://MyServerURL.com/SomeUniqueIdentifier?Key=SomeAPIKeyForAuthentication
...but the design is specifically meant to use this one universal Get command, so I need to keep this unique identifier in the query string. The actual applications which connect to this API will never need to know this filename, but there may be occasions when a URL is manually provided for someone to open in their browser to download a file.
However, whenever a file is downloaded through a web browser, since the call is Get, the saved filename also winds up being just Get.
Is there any trick in HTTP which I can implement on my server to force the downloaded filename to be the unique identifier rather than just Get? For example, some method such as a redirect?
I'm using Indy 10's TIdHTTPWebBrokerBridge in Delphi XE2 as the web server. I'm looking for a way, in this component (technically in its corresponding TWebModule handler), to make the response's filename whatever string I want (in this case, SomeUniqueIdentifier) when it handles this Get request. I've heard the term "URL rewriting", but that's a rather different topic and I don't think it's what I need, though it might be.
That seems to be a rather long-winded way of saying you want to set the filename for an HTTP download independently of the URL used to fetch it. In that case you simply send a Content-Disposition header specifying the desired filename. See section 19.5.1 of RFC 2616.
e.g.
Content-Disposition: attachment; filename="stackoverlow.ans"
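In terms of the raw response from the /Get handler, the idea looks like this; only the Content-Disposition line is essential, and the content type and the .zip extension are placeholders:
HTTP/1.1 200 OK
Content-Type: application/octet-stream
Content-Disposition: attachment; filename="SomeUniqueIdentifier.zip"
The browser then saves the download as SomeUniqueIdentifier.zip instead of Get, regardless of the URL that was used to fetch it.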
