DELETE requests directly from the command line - ruby-on-rails

I'm following this tutorial: http://ruby.railstutorial.org/chapters/updating-showing-and-deleting-users#fnref-9_4. After the code in Listing 9.44, the author says that at that point any user can be deleted with a direct DELETE request from the command line. I believe it's true, but I don't know how to check it myself:

any sufficiently sophisticated attacker could simply issue DELETE requests directly from the command line to delete any user on the site.
Here's an example of a command-line DELETE call using cURL:
$ curl -X DELETE http://localhost:3000/users/1
http://localhost:3000/ is the URL of your app and 1 is the ID of the record to delete.
With cURL you can simulate the same requests your browser would perform; you can use any other HTTP client as well.
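If you want to see whether the server actually accepted the request, a slight variant (adding -i, which prints the response status line and headers) can help:
# -i includes the response status and headers, so you can see whether the
# DELETE was accepted (e.g. a 302 redirect) or rejected
curl -i -X DELETE http://localhost:3000/users/1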

You can use cURL from the command line to make HTTP requests, including DELETE requests.
curl -i -H "Accept: application/json" -X DELETE http://localhost:3000/persons/person/1
Adapted from this blog post.

Related

How to emulate curl requests in rails app?

I want to test that a PUT to an endpoint (products/:id) works, but when I try
curl -X PUT -d listing_id_created=True localhost:3000/products/27
it gives ActionController::InvalidAuthenticityToken, which I now realise is the expected result: the PUT comes from curl, which knows nothing about the authenticity token, so none is provided.
So my question is: how do I run some simple curl PUTs (or any other verbs) to check that endpoints work correctly? Is the only solution to disable or skip the authenticity token check?
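One workaround that does not require changing the app, sketched here under some assumptions: fetch a page that renders the form first, keep the session cookie, pull the token out of the HTML, and send both back with the PUT. The authenticity_token field name is standard Rails, but the /edit path, the cookies.txt file name and the token extraction are all guesses.
# sketch: grab the session cookie and the CSRF token from the edit form,
# then replay both with the PUT; the /edit path and cookies.txt are assumptions
TOKEN=$(curl -s -c cookies.txt http://localhost:3000/products/27/edit |
  grep -o 'name="authenticity_token" value="[^"]*"' |
  sed -e 's/.*value="//' -e 's/"$//')
curl -b cookies.txt -X PUT \
  -d "authenticity_token=$TOKEN" \
  -d "listing_id_created=True" \
  http://localhost:3000/products/27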

Can't get HTTP PATCH to work on Google Cloud Run instance

I have a web server called PostgREST running, which generates a REST API on top of a Postgres DB. I have this running in Google Cloud Run and it works for the most part. The HTTP actions I need to take are POST, GET, DELETE and PATCH.
Everything works correctly except PATCH, which I use to update an existing value in the DB.
When I run the command below from the command line, no error is given, but it doesn't work:
curl "https://postgrest-q5mmtshbma-uc.a.run.app/notes?noteid=eq.3" -X PATCH -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" -d '{"note" : "updated it!"}'
When I run this against the same PostgREST version running locally, everything works correctly, so it has me thinking there might be an issue with Google Cloud Run not allowing/accepting PATCH requests. Again, POST, DELETE and GET all work fine.
Anyone have any insight what might be happening here?
I ultimately found that the issue was related to using RLS (row-level security) in the Postgres DB: I had set up specific policies for insert, update, delete and select.
I believe the "update" policy was set up incorrectly, so the update failed, but neither the Postgres DB nor the PostgREST web server reported any error to that effect.
Ultimately, when I re-created the update policy on this table, the PATCH (update) command ran successfully.
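For reference, recreating the policy looked roughly like the SQL below. This is a minimal sketch assuming a notes table with an owner column; the column name and the ownership rule are assumptions about the schema, not taken from the question.
# minimal sketch: drop and recreate the UPDATE policy on the notes table;
# "owner = current_user" is a hypothetical ownership rule, adjust to your schema
psql "$DATABASE_URL" <<'SQL'
DROP POLICY IF EXISTS notes_update ON notes;
CREATE POLICY notes_update ON notes
  FOR UPDATE
  USING (owner = current_user)
  WITH CHECK (owner = current_user);
SQL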

curl: post data and a file at the same time

I have tried:
curl -v --http1.0 --data "mac=00:00:00" -F "userfile=#/tmp/02-02-02-02-02-22" http://url_address/getfile.php
but it fails with the following message:
Warning: You can only select one HTTP request!
How can I send a mix of data and a file with curl? Is it possible or not?
Thank you
Read up on how -F actually works! You can add any number of data parts and file parts to the multipart formpost that -F makes. -d, however, makes a "standard" clean post, and you cannot mix -d with -F.
You first need to figure out which kind of post you want, then pick either -d or -F based on that answer.
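Applied to the command above, that means sending every field as an -F part. Note that a file is attached with @ (the # in the question's command would be sent as a literal string):
# everything as -F parts: the plain field and the file in one multipart post
curl -v --http1.0 \
  -F "mac=00:00:00" \
  -F "userfile=@/tmp/02-02-02-02-02-22" \
  http://url_address/getfile.php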

What's the easiest way to emulate a post request from unix cmdline?

My job has a reservation system that requires me to periodically go to a particular page and do a refresh. When a seat becomes available, I can then reserve on a first come first served basis.
I want to write a script that will email/text me when something becomes available. To do this I need to emulate clicking in the browser.
I know that the page is an aspx script with post requests.
Is there some way to log clicking on the "go" button, dump that to a file, and transform it into a curl command? In Chrome, I can dump an HTTP Archive (HAR) file. Perhaps there are other paths; I can run Internet Explorer or Firefox too.
You can use curl with the POST verb:
curl -X POST -u svnpenn -k \
-d '{"name":"tcl-8.5.13.tar.gz","size":130073}' \
https://api.github.com/repos/svnpenn/etc/downloads
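For an aspx page specifically, a recorded browser post usually carries hidden WebForms fields and a session cookie, so a replay might look like the sketch below. The URL, field values and button name are placeholders you would copy from the captured request (e.g. from Chrome's HAR export):
# rough sketch: replay a captured form post; the __VIEWSTATE/__EVENTVALIDATION
# values and the button field are placeholders from the recorded request
curl -c cookies.txt -b cookies.txt \
  -d "__VIEWSTATE=..." \
  -d "__EVENTVALIDATION=..." \
  -d "btnGo=Go" \
  http://example.com/reservations.aspx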

Getting only response header from HTTP POST using cURL

One can request only the headers using HTTP HEAD, as option -I in curl(1).
$ curl -I /
Lengthy HTML response bodies are a pain to deal with at the command line, so I'd like to get only the headers as feedback for my POST requests. However, HEAD and POST are two different methods.
How do I get cURL to display only response headers to a POST request?
-D, --dump-header <file>
Write the protocol headers to the specified file.
This option is handy to use when you want to store the headers
that a HTTP site sends to you. Cookies from the headers could
then be read in a second curl invocation by using the -b,
--cookie option! The -c, --cookie-jar option is however a better
way to store cookies.
and
-S, --show-error
When used with -s, --silent, it makes curl show an error message if it fails.
from the man page. So:
curl -sS -D - www.acooke.org -o /dev/null
dumps the headers to stdout and sends the body to /dev/null (add -L if you also need to follow redirects; and that's a GET, not a POST, but you can do the same thing with a POST: just add whatever option you're already using for POSTing data).
Note the - after the -D, which indicates that the output "file" is stdout.
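For instance, a POST variant of the same command might look like this (the -d payload is just a stand-in for whatever data you are already sending):
# -d turns the request into a POST; headers still go to stdout, body to /dev/null
curl -sS -D - -o /dev/null -d "key=value" http://www.acooke.org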
The other answers require the response body to be downloaded. But there's a way to make a POST request that will only fetch the header:
curl -s -I -X POST http://www.google.com
An -I by itself performs a HEAD request which can be overridden by -X POST to perform a POST (or any other) request and still only get the header data.
The following command displays extra information:
curl -X POST http://httpbin.org/post -v > /dev/null
You can ask the server to answer a HEAD request instead of sending the full response (-I alone already sends HEAD; adding -X HEAD is redundant):
curl -I http://httpbin.org/
Note: in some cases a server may send different headers for POST and HEAD, but in almost all cases the headers are the same.
For long response bodies (and various other similar situations), the solution I use is always to pipe to less, so
curl -i https://api.github.com/users | less
or
curl -s -D - https://api.github.com/users | less
will do the job.
Maybe it is a little extreme, but I am using this super short version:
curl -svo. <URL>
Explanation:
-v prints debug information (which includes the headers)
-o. sends the web page data (which we want to ignore) to a certain file, . in this case, which is a directory and thus an invalid destination, so the output is discarded
-s no progress bar, no error information (otherwise you would see Warning: Failed to create the file .: Is a directory)
Warning: the command always fails in terms of exit code, whether the site is reachable or not. Do not use it in, say, conditional statements in shell scripts...
Much easier, and this also follows redirects:
curl -IL http://example.com/in-the-shadows
While the other answers did not work for me in all situations, the best solution I could find (which works with POST as well) is:
curl -vs 'https://some-site.com' 1> /dev/null
headcurl.cmd (Windows version)
curl -sSkv -o NUL %* 2>&1
I don't want a progress bar: -s,
but I do want errors: -S,
not bothering about valid HTTPS certificates: -k,
getting high verbosity: -v (this is about troubleshooting, isn't it?),
no output (in a clean way): -o NUL.
Oh, and I want to forward stderr to stdout, so I can grep against the whole thing (since most or all output arrives on stderr).
%* means "pass all parameters on to this script" (https://stackoverflow.com/a/980372/444255); usually that's just one parameter: the URL you are testing.
real-world example (on troubleshooting proxy issues):
C:\depot>headcurl google.ch | grep -i -e http -e cache
Hostname was NOT found in DNS cache
GET HTTP://google.ch/ HTTP/1.1
HTTP/1.1 301 Moved Permanently
Location: http://www.google.ch/
Cache-Control: public, max-age=2592000
X-Cache: HIT from company.somewhere.ch
X-Cache-Lookup: HIT from company.somewhere.ch:1234
Linux version
for your .bashrc / .bash_aliases (as a shell function rather than an alias, since an alias cannot splice arguments into the middle of a command):
headcurl() { curl -sSkv -o /dev/null "$@" 2>&1; }
