Upload file to server with wget command line

I have a web page with a "Choose file" button to upload a file to the server.
I am trying to upload the file to the server with the wget command line, exactly as the "Choose file" button does in the browser.
I do this:
wget --load-cookies=cookies.txt --post-file '/root/vTiger/Organization.csv' http://vtiger.mydomain.com/link.php
What am I doing wrong?

If that 'upload' operation is a submission to a multipart form-post HTML form, then the harsh truth is that wget doesn't support it. curl does, though, with its -F option.
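For illustration, a minimal curl sketch of that multipart upload, assuming the session cookies are in cookies.txt; the form field name (userfile here) is a placeholder and has to match the name attribute of the file input in the actual HTML form:
curl -b cookies.txt -F "userfile=@/root/vTiger/Organization.csv" http://vtiger.mydomain.com/link.php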

Related

Download google sheets file as csv to cpanel using cron task

I have a specific task to accomplish which involves downloading a file from Google Sheets. I need to always have just one file downloaded, so the new file will overwrite any previous one (if it exists).
I have tried the following command but I can't quite get it to work. Not sure what's missing.
/usr/local/bin/php -q https://docs.google.com/spreadsheets/d/11rFK_fQPgIcMdOTj6KNLrl7pNrwAnYhjp3nIrctPosg/ -o /usr/local/bin/php /home/username/public_html/wp-content/uploads/wpallimport/files.csv
Managed to solve with the following:
curl --user-agent cPanel-Cron https://docs.google.com/spreadsheets/d/[...]/edit?usp=sharing --output /home/username/public_html/wp-content/uploads/wpallimport/files/file.csv
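As a sketch of how that might be scheduled, assuming an hourly cron entry; the schedule and the /export?format=csv form of the URL are assumptions (that endpoint is a common way to get CSV back directly), and the elided spreadsheet id stays as in the original:
0 * * * * /usr/bin/curl --user-agent cPanel-Cron --silent --output /home/username/public_html/wp-content/uploads/wpallimport/files/file.csv "https://docs.google.com/spreadsheets/d/[...]/export?format=csv"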

Curl command to push a war file to url in github actions

I have an issue where I want a Maven-installed WAR file named artifactId##version.war to appear in the URL of a curl command in GitHub Actions. In the curl command I tried escaping the # characters with \ and with %23, but in both cases I get the message "The Path is not formatted correctly - must begin with '/repositories//...'". When I use just the plain name artifactId##version, the upload only names the file artifactId and drops the rest. The problem isn't in getting the artifactId name, the version number, or the Maven install; it's in the curl command portion of the code. The reason I have two FILE_TO_UPLOAD variables is that one is the installed file name and the other is the name it should be uploaded under, and as I said, if I just use ## it only uploads artifactId.
run: |
  mvn install
  # CODE WHERE THE PROBLEM IS
  FILE_TO_UPLOAD="ARTIFACTID\#\#VERSION.war"
  FILE_TO_UPLOAD2="ARTIFACTID##VERSION.war"
  echo ${FILE_TO_UPLOAD} # for testing
  TARGET_URL=https://example.com/PATH/${FILE_TO_UPLOAD}
  curl -u 'username:password' --upload-file ${FILE_TO_UPLOAD2} ${TARGET_URL}
  # CODE WHERE THE PROBLEM IS
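No working fix is shown in the thread, but here is a hedged sketch of one approach that is often suggested for this symptom: keep the literal ## name for --upload-file and percent-encode the # characters only in the URL, so curl does not treat everything after the first # as a fragment. The host, path and credentials are placeholders, and whether the target repository accepts the encoded path is not confirmed here:
FILE_TO_UPLOAD="ARTIFACTID##VERSION.war"
# '#' starts a URL fragment, so encode it as %23 in the URL only
ENCODED_NAME="${FILE_TO_UPLOAD//#/%23}"
TARGET_URL="https://example.com/PATH/${ENCODED_NAME}"
curl -u 'username:password' --upload-file "${FILE_TO_UPLOAD}" "${TARGET_URL}"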

Scanning online URL for images

There is a web page www.somepage.com/images/
I know some of the images there (e.g. www.somepage.com/images/cat_523.jpg, www.somepage.com/images/dog_179.jpg)
I know there are some more, but I don't know the names of those photos. How can I scan the whole /images/ folder?
You can use wget to download all the files:
--no-parent so it never ascends above the starting directory and only grabs files below it in the hierarchy
--recursive to look into subfolders
wget --recursive --no-parent -A jpeg,jpg,bmp,gif,png http://example.com/
If the images are on the web page as img tags, you could try just searching the page source for img tags. If you are working in a terminal, you could also use a tool such as wget to download the web page and then run grep on the file to find the img tags.
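A minimal sketch of that grep approach, assuming the /images/ URL returns an HTML page (for example a directory listing) and GNU grep is available:
wget -O page.html http://www.somepage.com/images/
grep -oE 'src="[^"]+\.(jpe?g|png|gif|bmp)"' page.html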

Curl not downloading XML file as expected

When I enter the URL into a web browser, I get the usual prompt to open the XML file and view it. However, when I use the same URL within a curl batch file, it only appears to download the login aspx page.
//stuff/stuff/Report.aspx?Report=All_Nodes_IP_Report&DataFormat=XML&AccountID=<UID>&Password=<password>
My batch file looks like this:
curl -L "//stuff/stuff/Report.aspx?Report=All_Nodes_IP_Report&DataFormat=XML&AccountID=<UID>&Password=<Password>" -o "local.xml" -v
pause
What am I doing wrong? There's no proxy server between me and the report URL. The web site is https, but I can't include that as the validation checker keeps moaning at me :)
Why use curl when you can use an application called MGET that I created?
Download Link:
http://bit.ly/1i1FpGE
Syntax of the command:
MGET //stuff/stuff/Report.aspx?Report=All_Nodes_IP_Report&DataFormat=XML&AccountID=<UID>&Password=<Password> local.xml
And if you want to use HTTPS you can, but for the best experience use HTTP.
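Staying with curl, a sketch of how the login redirect is often handled when a report URL drops you on a login page: authenticate once into a cookie jar, then request the report with those cookies. The Login.aspx path and the form field names are hypothetical and would have to match the real site, and the full https URL should be quoted as in the original batch file:
curl -c cookies.txt -d "username=<UID>" -d "password=<Password>" "https://stuff/stuff/Login.aspx"
curl -b cookies.txt -L -o local.xml "https://stuff/stuff/Report.aspx?Report=All_Nodes_IP_Report&DataFormat=XML"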

What's the easiest way to emulate a post request from unix cmdline?

My job has a reservation system that requires me to periodically go to a particular page and do a refresh. When a seat becomes available, I can then reserve it on a first-come, first-served basis.
I want to write a script that will email/text me when something becomes available. To do this I need to emulate clicking in the browser.
I know that the page is an aspx script with POST requests.
Is there some way to log the click on the "go" button, dump that to a file, and transform it into a curl command? In Chrome, I can dump an HTTP Archive file. Perhaps there are other paths; I can run Explorer or Firefox too.
You can use curl with the POST verb:
curl -X POST -u svnpenn -k \
-d '{"name":"tcl-8.5.13.tar.gz","size":130073}' \
https://api.github.com/repos/svnpenn/etc/downloads
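That example posts JSON to the GitHub API; for an aspx form like the one described in the question, a sketch of replaying the captured request might look more like the following. The URL and the field names (__VIEWSTATE, __EVENTVALIDATION, btnGo) are assumptions based on typical ASP.NET forms, and the real names and values would come from the HAR dump or the page source:
curl -c cookies.txt -b cookies.txt -o result.html \
  --data-urlencode "__VIEWSTATE=<value from page>" \
  --data-urlencode "__EVENTVALIDATION=<value from page>" \
  --data-urlencode "btnGo=Go" \
  "https://example.com/reservations.aspx"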
