Multi-file upload using curl to a Ruby on Rails app - ruby-on-rails

I'm struggling with how to send an unknown number of files to my Rails app using curl.
This is my curl request to POST one file:
curl -H 'Authorization: Token token=your_token' -X POST -F job[webapp_id]=2 -F job[file]=@test.txt localhost:3000/api/v0/jobs
It works.
I would like to allow the user to send as many files as they want, with something like:
-F job[files][]=@test1.txt -F job[files][]=@test2.txt
-F job[files[]]=@test1.txt -F job[files[]]=@test2.txt
But it's not working.
I also tried with :
-F job[files[0]]=@test.txt -F job[files[1]]=@test2.txt
Still not working. I think it's because I don't know how to tweak my permitted parameters. I get an empty array.
Any idea how to do it in a single request?

RestClient may be easier to work with.
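For what it's worth, the array form from the question usually does work once two things line up: the brackets are quoted so the shell passes them through untouched, and the controller permits an array. A sketch using the values from the question (the controller side is an assumption based on standard strong parameters):
curl -H 'Authorization: Token token=your_token' -X POST -F 'job[webapp_id]=2' -F 'job[files][]=@test1.txt' -F 'job[files][]=@test2.txt' localhost:3000/api/v0/jobs
On the Rails side, an empty array usually means the permit list is missing the array declaration, e.g. params.require(:job).permit(:webapp_id, files: []).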

Related

How can I test a Rails app using WGET

I want to test my Rails app using wget, specifically the part that returns JSON data. I don't really understand the syntax I should use. I have tried this:
wget --user=username@example.com --password=somepass localhost:3000/folders/1.json
and variations of it, without any success. What is the exact syntax? Would it be better to use curl instead?
-- edit --
I found at this blog:
http://blogs.operationaldynamics.com/andrew/software/research/testing-rest-the-hard-way
this suggestion:
$ wget -S -q --header "Accept: application/json" -O - http://localhost:3000/folders/1
but even when I add
--user=username@example.com --password=somepass
...I get 401 Unauthorised. The username is correct; I can log in via the browser.
curl -u username:password http://localhost:3000/folders/1.json
Read more here http://curl.haxx.se/docs/manpage.html#-u
An alternative to curl is the less well known but equally capable httpie - https://github.com/jkbrzt/httpie - I find it to be a bit more straightforward and friendly to use, and includes syntax colouring for output.
http -a user:pass :3000/folders/1.json would do the trick here, I think.
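For completeness, here is the same request with wget itself. Note that wget only sends Basic credentials after a 401 challenge unless --auth-no-challenge is given, which can matter for APIs (this sketch assumes the app uses HTTP Basic auth):
wget -q -O - --auth-no-challenge --user='username@example.com' --password='somepass' --header='Accept: application/json' localhost:3000/folders/1.json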

neo4j: load2neo not working with arrays

I'm trying to use load2neo to read in a graph in Geoff format (which I wrote out using load2neo!). The extension is installed properly and works with simple queries:
$ curl -X POST http://localhost:7474/load2neo/load/geoff -d '(alice)-[:KNOWS]->(bob)'
returns:
{"alice":70,"bob":69}
and
$ curl -X POST http://localhost:7474/load2neo/load/geoff -d '(bob)-[:KNOWS]->(carol)'
{"carol":72,"bob":71}
Both show up fine in the graph browser. But when I try to do both at once:
curl -X POST http://localhost:7474/load2neo/load/geoff -d '[(alice)-[:KNOWS]->(bob),(bob)-[:KNOWS]->(carol)]'
it fails silently. It also fails silently with:
curl -X POST http://localhost:7474/load2neo/load/geoff -d @test.geoff
with the file contents:
[(alice)-[:KNOWS]->(bob),(bob)-[:KNOWS]->(carol)]
It's not an authentication problem, and I don't think I have the syntax wrong (I've copied it directly from the files that load2neo itself wrote out, and double-checked it against the spec), but I just can't figure out why it's not working. Any ideas?
This is with load2neo 0.6.0 downloaded from the website and Neo4J 2.3.1, community edition.
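One way to make the failure less silent is to ask curl to show the whole exchange: -v prints the request plus the response status and headers, so an error from the extension becomes visible, and --data-binary (unlike -d, which strips newlines from a file read with @) posts the file byte-for-byte. A debugging sketch, not a fix:
curl -v -X POST http://localhost:7474/load2neo/load/geoff --data-binary @test.geoff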

curl: post data and file simultaneously

I have tried:
curl -v --http1.0 --data "mac=00:00:00" -F "userfile=@/tmp/02-02-02-02-02-22" http://url_address/getfile.php
but it fails with the following message:
Warning: You can only select one HTTP request!
How can I send a mix of data and a file with curl? Is it possible or not?
Thank you
Read up on how -F actually works! You can add any number of data parts and file parts in a multipart formpost that -F makes. -d however makes a "standard" clean post and you cannot mix -d with -F.
You need to first figure out which kind of post you want, then you pick either -d or -F depending on your answer.
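In other words, move the data into the multipart post as well. A sketch using the field names from the question; the mac value then arrives as an ordinary form field alongside the file part:
curl -v --http1.0 -F 'mac=00:00:00' -F 'userfile=@/tmp/02-02-02-02-02-22' http://url_address/getfile.php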

curl needs to send '\r\n' - need transformation of a working solution

I need a transformation of the following working curl command:
curl --data-binary @"data.txt" http://www.example.com/request.asp
The data.txt includes this:
foo=bar
parameter1=4711
parameter2=4712
The key is that I need to send the line breaks, and they are \r\n. It works with the file because it has the right encoding, but how do I manage to run this curl command without the file? So, a one-liner sending the parameters with the correct \r\n at the end of each.
All my tests with different URL encodings, etc. didn't work. I never got the same result as with the file.
I need this information because I'm having serious trouble getting this POST to run from my Ruby on Rails app using net/http.
Thanks!
One way to solve it is to generate the binary stream with something on the fly, like the printf command, and have curl read the data from stdin:
printf 'foo=bar\r\nparameter1=4711\r\nparameter2=4712' | curl --data-binary @- http://example.com
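If the shell is bash or zsh, ANSI-C quoting gives the same bytes without the pipe; this is just a variation on the same idea, using the example URL from the question:
curl --data-binary $'foo=bar\r\nparameter1=4711\r\nparameter2=4712' http://www.example.com/request.asp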

Getting only response header from HTTP POST using cURL

One can request only the headers using HTTP HEAD, as option -I in curl(1).
$ curl -I /
Lengthy HTML response bodies are a pain to get in command-line, so I'd like to get only the header as feedback for my POST requests. However, HEAD and POST are two different methods.
How do I get cURL to display only response headers to a POST request?
-D, --dump-header <file>
Write the protocol headers to the specified file.
This option is handy to use when you want to store the headers
that a HTTP site sends to you. Cookies from the headers could
then be read in a second curl invocation by using the -b,
--cookie option! The -c, --cookie-jar option is however a better
way to store cookies.
and
-S, --show-error
When used with -s, --silent, it makes curl show an error message if it fails.
from the man page. so
curl -sSL -D - www.acooke.org -o /dev/null
follows redirects, dumps the headers to stdout and sends the data to /dev/null (that's a GET, not a POST, but you can do the same thing with a POST - just add whatever option you're already using for POSTing data)
note the - after the -D which indicates that the output "file" is stdout.
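For the POST case the same pattern applies; a sketch with a placeholder payload (-d switches the request to POST, everything else stays as in the GET example):
curl -sSL -D - -d 'key=value' -o /dev/null www.acooke.org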
The other answers require the response body to be downloaded. But there's a way to make a POST request that will only fetch the header:
curl -s -I -X POST http://www.google.com
An -I by itself performs a HEAD request which can be overridden by -X POST to perform a POST (or any other) request and still only get the header data.
The following command displays extra information:
curl -X POST http://httpbin.org/post -v > /dev/null
You can ask the server to respond to HEAD, instead of sending the full response:
curl -X HEAD -I http://httpbin.org/
Note: in some cases, the server may send different headers for POST and HEAD, but in almost all cases the headers are the same.
For long response bodies (and various other similar situations), the solution I use is always to pipe to less, so
curl -i https://api.github.com/users | less
or
curl -s -D - https://api.github.com/users | less
will do the job.
Maybe it is a little extreme, but I am using this super-short version:
curl -svo. <URL>
Explanation:
-v prints debug information (which does include headers)
-o. sends the web page data (which we want to ignore) to a certain file, . in this case, which is a directory and therefore an invalid destination, so the output gets discarded
-s no progress bar, no error information (otherwise you would see Warning: Failed to create the file .: Is a directory)
warning: the command always fails in terms of exit code, whether the site is reachable or not. Do not use it in, say, conditional statements in shell scripts...
Much easier – this also follows redirects.
curl -IL http://example.com/in-the-shadows
While the other answers have not worked for me in all situations, the best solution I could find (working with POST as well), taken from here:
curl -vs 'https://some-site.com' 1> /dev/null
headcurl.cmd (windows version)
curl -sSkv -o NUL %* 2>&1
I don't want a progress bar (-s),
but I do want errors (-S),
I'm not bothering about valid HTTPS certificates (-k),
I'm getting high verbosity (-v; this is about troubleshooting, isn't it?),
and no output (in a clean way).
Oh, and I want to forward stderr to stdout, so I can grep against the whole thing (since most or all output comes on stderr).
%* means "pass on all parameters to this script" (https://stackoverflow.com/a/980372/444255); well, usually that's just one parameter: the URL you are testing.
real-world example (on troubleshooting proxy issues):
C:\depot>headcurl google.ch | grep -i -e http -e cache
Hostname was NOT found in DNS cache
GET HTTP://google.ch/ HTTP/1.1
HTTP/1.1 301 Moved Permanently
Location: http://www.google.ch/
Cache-Control: public, max-age=2592000
X-Cache: HIT from company.somewhere.ch
X-Cache-Lookup: HIT from company.somewhere.ch:1234
Linux version
for your .bash_aliases / .bashrc:
alias headcurl='curl -sSkv -o /dev/null $@ 2>&1'
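Since aliases don't take arguments ($@ inside an alias refers to the shell's own positional parameters, and the URL simply gets appended after the redirection), a shell function is the more robust form of the same thing:
headcurl() { curl -sSkv -o /dev/null "$@" 2>&1; }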
