Getting only response header from HTTP POST using cURL

One can request only the headers using HTTP HEAD, as option -I in curl(1).
$ curl -I /
Lengthy HTML response bodies are a pain to deal with on the command line, so I'd like to get only the headers as feedback for my POST requests. However, HEAD and POST are two different methods.
How do I get cURL to display only response headers to a POST request?

-D, --dump-header <file>
Write the protocol headers to the specified file.
This option is handy to use when you want to store the headers
that a HTTP site sends to you. Cookies from the headers could
then be read in a second curl invocation by using the -b,
--cookie option! The -c, --cookie-jar option is however a better
way to store cookies.
and
-S, --show-error
When used with -s, --silent, it makes curl show an error message if it fails.
from the man page. So
curl -sS -D - www.acooke.org -o /dev/null
dumps the headers to stdout and sends the body to /dev/null (that's a GET, not a POST, but you can do the same thing with a POST: just add whatever options you're already using for POSTing data, and add -L if you also want to follow redirects).
Note the - after the -D, which indicates that the output "file" is stdout.
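For example, the same pattern applied to a POST might look like this (httpbin.org and the name=value field are only placeholders for illustration; -d already makes it a POST):
curl -sS -D - -o /dev/null -d 'name=value' http://httpbin.org/post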

The other answers require the response body to be downloaded. But there's a way to make a POST request that will only fetch the header:
curl -s -I -X POST http://www.google.com
An -I by itself performs a HEAD request, which can be overridden by -X POST to perform a POST (or any other) request while still receiving only the header data.

The following command displays extra information:
curl -X POST http://httpbin.org/post -v > /dev/null
You can also ask the server to answer a HEAD request instead of sending the full response:
curl -X HEAD -I http://httpbin.org/
(-I already sends a HEAD request, so the -X HEAD here is redundant.)
Note: in some cases a server may send different headers for POST and HEAD, but in almost all cases the headers are the same.

For long response bodies (and various other similar situations), the solution I use is always to pipe to less, so
curl -i https://api.github.com/users | less
or
curl -s -D - https://api.github.com/users | less
will do the job.

Maybe it is a little bit extreme, but I am using this super short version:
curl -svo. <URL>
Explanation:
-v prints debug information (which includes the headers)
-o. sends the body (which we want to ignore) to the file named ., which is the current directory and therefore an invalid destination, so the output is discarded
-s no progress bar and no error messages (otherwise you would see Warning: Failed to create the file .: Is a directory)
Warning: the command always fails in terms of exit code, whether the URL is reachable or not, so do not use it in, say, conditional statements in shell scripts (see the alternative below).
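If you need a usable exit code, a sketch of an alternative that keeps the same idea but discards the body to /dev/null instead, so curl's exit status still reflects success or failure:
curl -sv -o /dev/null <URL>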

Much easier, and this one also follows redirects:
curl -IL http://example.com/in-the-shadows

While the other answers have not worked for me in all situations, the best solution I could find (it works with POST as well), taken from here, is:
curl -vs 'https://some-site.com' 1> /dev/null

headcurl.cmd (Windows version)
curl -sSkv -o NUL %* 2>&1
I don't want a progress bar (-s),
but I do want errors (-S),
not bothering about valid HTTPS certificates (-k),
getting high verbosity (-v; this is about troubleshooting, isn't it?),
and no body output, in a clean way (-o NUL).
Oh, and I want to forward stderr to stdout (2>&1), so I can grep against the whole thing (since most or all of the output goes to stderr).
%* means "pass all parameters given to this script on to curl" (see https://stackoverflow.com/a/980372/444255); usually that's just one parameter: the URL you are testing.
real-world example (on troubleshooting proxy issues):
C:\depot>headcurl google.ch | grep -i -e http -e cache
Hostname was NOT found in DNS cache
GET HTTP://google.ch/ HTTP/1.1
HTTP/1.1 301 Moved Permanently
Location: http://www.google.ch/
Cache-Control: public, max-age=2592000
X-Cache: HIT from company.somewhere.ch
X-Cache-Lookup: HIT from company.somewhere.ch:1234
Linux version
For your .bash_aliases / .bashrc. Note that $# inside an alias expands to the calling shell's argument count, not to the alias's arguments, so a small function is the cleaner equivalent:
headcurl() { curl -sSkv -o /dev/null "$@" 2>&1; }
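Usage then mirrors the Windows example above, for instance:
headcurl google.ch | grep -i -e http -e cache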


Related

Multi files upload using curl to ruby on rails app

I'm struggling with how to send an unknown number of files to my Rails app using curl.
This is my curl request to POST with one file :
curl -H 'Authorization: Token token=your_token' -X POST -F job[webapp_id]=2 -F job[file]=@test.txt localhost:3000/api/v0/jobs
It works.
I would like to allow the user to send as many files as they want with something like:
-F job[files][]=@test1.txt -F job[files][]=@test2.txt
-F job[files[]]=@test1.txt -F job[files[]]=@test2.txt
But it's not working.
I also tried with :
-F job[files[0]]=@test.txt -F job[files[1]]=@test2.txt
Still not working. I think it's because I don't know how to tweak my permitted parameters; I get an empty array.
Any idea on how to do it in one request?
RestClient may be easier to work with.
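On the curl side, a sketch of what the multipart request might look like, assuming the controller permits job[files] as an array (each -F argument is quoted so the shell leaves the brackets alone):
curl -H 'Authorization: Token token=your_token' -X POST \
  -F 'job[webapp_id]=2' \
  -F 'job[files][]=@test1.txt' \
  -F 'job[files][]=@test2.txt' \
  localhost:3000/api/v0/jobs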

curl post data and file contemporary

I have tried:
curl -v --http1.0 --data "mac=00:00:00" -F "userfile=@/tmp/02-02-02-02-02-22" http://url_address/getfile.php
but it fails with the following message:
Warning: You can only select one HTTP request!
How can I send a mix of data and file by curl? Is it possible or not?
Thank you
Read up on how -F actually works! You can add any number of data parts and file parts to the multipart form post that -F makes. -d, however, makes a "standard" clean post, and you cannot mix -d with -F.
You need to first figure out which kind of post you want, then you pick either -d or -F depending on your answer.
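A sketch of an all -F version of the command from the question (same URL and file path as above), so both the value and the file travel in one multipart request:
curl -v --http1.0 -F "mac=00:00:00" -F "userfile=@/tmp/02-02-02-02-02-22" http://url_address/getfile.php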

DELETE requests directly from the command line

I'm following this tutorial http://ruby.railstutorial.org/chapters/updating-showing-and-deleting-users#fnref-9_4, and after the code in Listing 9.44 the author says that at that point any user can be deleted with a direct DELETE request from the command line. Of course I believe that it's true, but I don't know how to check it:
any sufficiently sophisticated attacker could simply issue DELETE requests directly from the command line to delete any user on the site.
Here's an example of a command line DELETE call using cURL
$ curl -X DELETE http://localhost:3000/users/1
http://localhost:3000/ is the path to your app, 1 is the ID of the record to delete.
With cURL you can simulate the same requests you will perform using your browser. You can use any other HTTP client.
You can use cURL from the command line to make HTTP requests, including DELETE requests.
curl -i -H "Accept: application/json" -X DELETE http://localhost:3000/persons/person/1
Adapted from this blog post.

What's the easiest way to emulate a post request from unix cmdline?

My job has a reservation system that requires me to periodically go to a particular page and do a refresh. When a seat becomes available, I can then reserve on a first come first served basis.
I want to write a script that will email/text me when something becomes available. To do this I need to emulate clicking in the browser.
I know that the page is an aspx script with post requests.
Is there some way to log the click on the "go" button, dump it to a file, and transform that into a curl command? In Chrome, I can dump an HTTP Archive (HAR) file. Perhaps there are other paths. I can run Explorer or Firefox too.
You can use curl with the POST verb:
curl -X POST -u svnpenn -k \
-d '{"name":"tcl-8.5.13.tar.gz","size":130073}' \
https://api.github.com/repos/svnpenn/etc/downloads
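For a typical HTML form post, the captured request usually translates into form-encoded -d fields. The field names below are hypothetical placeholders; the real names (including ASP.NET hidden fields such as __VIEWSTATE) come from the captured request, and Chrome's DevTools can also copy a captured request directly as a curl command:
curl -X POST \
  -d 'date=2014-06-01' \
  -d 'seat=12A' \
  -d '__VIEWSTATE=...' \
  https://example.com/reservations.aspx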

curl needs to send '\r\n' - need transformation of a working solution

I need a transformation of the following working curl command:
curl --data-binary @"data.txt" http://www.example.com/request.asp
The data.txt includes this:
foo=bar
parameter1=4711
parameter2=4712
The key point is that I need to send the line breaks, and they are \r\n. It works with the file because it has the right encoding, but how do I get this curl command to run without the file? I.e., a one-liner sending the parameters with the correct \r\n at the end of each line.
All my tests with different URL encodings, etc. didn't work; I never got the same result as with the file.
I need this information because I have serious trouble to get this post run on my Ruby on Rails App using net/http.
Thanks!
One way to solve it is to generate the binary stream with something on the fly, like the printf command, and have curl read the data from stdin:
printf 'foo=bar\r\nparameter1=4711\r\nparameter2=4712' | curl --data-binary @- http://example.com
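In bash (or another shell with ANSI-C quoting), a one-liner without the pipe should also work, since $'...' expands the \r\n escapes before curl sees them:
curl --data-binary $'foo=bar\r\nparameter1=4711\r\nparameter2=4712' http://www.example.com/request.asp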
