Upload XML file to server on another machine

How can I upload an XML file that resides on one computer (let's call it workstation) onto a BaseX server that runs on another computer (server)?
To upload an XML file to the BaseX server on workstation itself I use
basexclient -n localhost -d -w -c "CREATE DATABASE ${db_name} ${file}"
When the hostname is changed from localhost to server, this command fails with
org.basex.core.BaseXException: Resource "[complete FILE path]" not found.
IIUC, the error happens because this command does not upload the XML file itself, but just asks the server to read it from the path ${file}. The command then fails because ${file} is not available on server but only on workstation.
What command should I use to upload the XML file to the remote server?
(Obviously without copying the file to the server and then executing the command locally on the server.)

Assuming that -n means what you seem to be using it to mean, and that a local client can in fact communicate with a remote server, and assuming also that your XML document is a standalone document, I'd try something like the following (not tested), with $server, $dbname, $file, and $baseurl defined as environment variables:
(echo CREATE DATABASE ${dbname};
echo ADD TO ${baseurl};
cat ${file};
echo EXIT ) | basexclient -n ${server} -d -w
But otherwise I'd use the BaseX HTTP server and use curl or wget to send a PUT request with the file to the address http://myserver.example.org:8984/webdav/mydb/myfile.xml (and of course, if necessary, I'd use curl multiple times to make the database and then add data to it).
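For the record, here's what the HTTP route could look like with curl against the REST endpoint (an untested sketch; port 8984, the /rest path, and the admin:admin credentials are BaseX defaults and may differ in your setup):

# create the (empty) database, then PUT the document into it
curl -u admin:admin -X PUT "http://myserver.example.org:8984/rest/${dbname}"
curl -u admin:admin -T "${file}" \
    "http://myserver.example.org:8984/rest/${dbname}/$(basename "${file}")"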

Related

How to connect to IBM Cloud Redis from Ruby on Rails application

Migrating from one service to IBM Cloud for Redis.
I cannot find the correct configuration to connect using TLS. Everything I find on this is related to Heroku, and it ignores verifying the TLS/SSL connection.
I cannot find how to configure our Sidekiq/Redis to connect.
I do have a certificate from the IBM Cloud dashboard and I suspect I have to pass that along somehow.
I configured sidekiq.yml like this:
:redis:
  :url: "rediss://:<PWD>@<DB Name>:<PORT>/0"
  :namespace: "app"
  :ssl_params:
    ca_file: 'path/to/cert'
I keep getting back the error Redis::CommandError - WRONGPASS invalid username-password pair or user is disabled. However, using these same credentials in the migration script I am able to connect to the DB, so the credentials are OK. I think it is not including the certificate correctly, and I cannot find the correct way to do this.
The sidekiq.yml configuration looks good to me; just make sure this has the correct complete path
ca_file: 'path/to/cert'
and change the redis url to
:url: "rediss://<PWD>@<DB Name>:<PORT>/0"
Further info about TLS-secured connections can be found here.
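To rule out a certificate problem, you can also test the same credentials and CA file outside of Rails with redis-cli (requires redis-cli 6+; the host, port, password, and cert paths below are placeholders, and the IBM dashboard certificate is base64-encoded, so decode it to PEM first):

# decode the dashboard certificate, then try an authenticated TLS ping
base64 -d cert.b64 > cert.pem
redis-cli --tls --cacert cert.pem -h <DB Name> -p <PORT> -a '<PWD>' ping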
I'm not familiar with sidekiq.yml. But I've configured redli with redis using a python script you can find here: https://github.com/IBM-Cloud/vpc-transit/blob/master/py/test_transit.py. Maybe the configuration is similar.
The relevant code is:
def vpe_redis_test(fip, resource):
    """execute a command on fip to verify redis is accessible"""
    redis = resource["key"]
    credentials = redis["credentials"]
    cert_data = credentials["connection.rediss.certificate.certificate_base64"]
    cli_arguments = credentials["connection.cli.arguments.0.1"]
    command = f"""
#!/bin/bash
set -ex
if [ -x ./redli ]; then
    echo redli already installed
else
    curl -LO https://github.com/IBM-Cloud/redli/releases/download/v0.5.2/redli_0.5.2_linux_amd64.tar.gz
    tar zxvf redli_*_linux_amd64.tar.gz
fi
./redli \
    --long \
    -u {cli_arguments} \
    --certb64={cert_data} << TEST > redis.out
set foo working
TEST
"""

Upload file to eXist-db running in a Docker container

I'm running eXist-db in a Docker container: installing Java on Ubuntu, installing the eXist headless installation jar, and adding a data volume (Azure Files) to store all the physical files and the db data files.
I need to automatically upload files to eXist-db after I generate a new file and save it to the volume drive (using C#).
According to the eXist documentation on uploading files there are several methods to upload files to eXist, but none of them work for me.
Dashboard or eXide - not relevant since these are GUI applications.
Java Admin Client - not working because I have no GUI. I'm getting this failure: 'No X11 DISPLAY variable was set, but this program performed an operation which requires it...'
Over REST or WebDAV via a web client (using a browser or by code) I can run XQuery for queries, but how do I store new files?
So, the solution I found is to write an XQuery file, using the xmldb:store function.
This query saved the posted file using the specified name and location (in the volume), and the stored file can then be retrieved via REST or WebDAV.
But I feel that there must be a simpler solution...
Can anyone help?
BTW, here is the xmldb:store XQuery:
xquery version "3.1";

declare function local:upload() {
    let $filename := request:get-uploaded-file-name("file")
    let $log-in := xmldb:login("/db", "Admin", "admin")
    let $file := "file:///usr/new_file_location.xml"
    let $record := doc($file)
    let $store := xmldb:store("/db/akn", "new_file_name.xml", $record)
    return
        <results>
            <message>File {$file} has been stored.</message>
        </results>
};

local:upload()
When starting eXist as described in the eXist Docker documentation - with it listening on port 8080 - you can access all of eXist's standard endpoints:
http://localhost:8080/exist/webdav/db for WebDAV
http://localhost:8080/exist/rest/db for REST
http://localhost:8080/exist/xmlrpc/db for XML-RPC
http://localhost:8080/exist/apps for apps installed in /db/apps.
Of course if you've configured Docker's eXist to listen on a different port, just switch the port.
Thus, to upload files to a Dockerized eXist programmatically, the methods outlined in the documentation article you referenced, Uploading files, should all work: WebDAV, client.sh, Ant, or even curl. For WebDAV, if you haven't configured users and passwords, you'd just connect with the URL http://localhost:8080/exist/webdav/db, username "admin", and a blank password. For Ant, see the Ant tasks documentation. For curl, you would perform an HTTP PUT request to the REST interface:
curl -s -f -H 'Content-Type: application/xml' \
-T <filename> \
--user <user>:<password> \
"http://localhost:8080/exist/rest/db/apps/<collection>/<filename>"
This is also possible:
echo put /home/files/<FILEPATH>/main.xml | /usr/local/eXist-db/bin/client.sh -s
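If you need it fully non-interactive with credentials, client.sh also takes user, password, collection, and file options; the flags below are from memory of the Java Admin Client docs, so double-check them against client.sh --help:

# -s: no GUI, -u/-P: credentials, -m: target collection, -p: store the file
/usr/local/eXist-db/bin/client.sh -s -u admin -P '<password>' \
    -m /db/akn -p /home/files/main.xml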

Deploying from Jenkins server to another server: Host key verification failed

I am trying to deploy from the Jenkins server to another server, and I can't do it using a Jenkins build script.
When I am on the Jenkins server, I can deploy. For example:
:/var/lib/jenkins/workspace/MyProject$ scp my_file ubuntu@my_address:~/MyProject
Runs perfectly fine; however,
When I specify:
scp my_file ubuntu@my_address:~/MyProject
in the "Execute Shell" build step in the Jenkins window, I get the following error:
Host key verification failed.
I know that the first time I ran the above command directly on Jenkins server, I was prompted:
The authenticity of host 'my_address (my_address)' can't be established.
ECDSA key fingerprint is cf:4b:58:66:d6:d6:87:35:76:1c:aa:cf:9a:7c:78:cc.
Are you sure you want to continue connecting (yes/no)?
So I had to hit "yes" in order to continue. But since I had already done that directly in the terminal, I expected not to have to do anything extra.
The second answer to this question: Jenkins Host key verification failed
indicates that this should be enough, if I understand it correctly.
What am I missing? What can I do to fix my problem?
I got it working, I needed to do two things:
1) I had to use -o StrictHostKeyChecking=no:
scp -v -o StrictHostKeyChecking=no my_file ubuntu@my_address:~/MyProject
instead of:
scp my_file ubuntu@my_address:~/MyProject
2) I needed to copy my id_rsa to /var/lib/jenkins/.ssh.
The /var/lib/jenkins/.ssh folder and the files inside it need to be owned by jenkins.
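For example (the jenkins user and group names are the defaults and may differ on your system):

# copy the key and hand the whole .ssh directory to the jenkins user
sudo cp ~/.ssh/id_rsa /var/lib/jenkins/.ssh/
sudo chown -R jenkins:jenkins /var/lib/jenkins/.ssh
sudo chmod 700 /var/lib/jenkins/.ssh
sudo chmod 600 /var/lib/jenkins/.ssh/id_rsa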
Old question, but maybe someone will find this useful:
ssh root@jenkinsMaster 'echo "$(ssh-keyscan -t rsa,dsa jenkinsSlave)" >> /root/.ssh/known_hosts'
Make sure that you first set up the SSH connection to the remote host, then try to copy things to it:
ssh -o StrictHostKeyChecking=no user@ip_addr
scp file_name user@ip_addr:/home/user
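A safer alternative to StrictHostKeyChecking=no is to pre-populate the known_hosts of the user Jenkins runs as, so host keys are still verified (the path assumes the default Jenkins home):

# record the remote host key once for the jenkins user
ssh-keyscan -t ecdsa my_address | sudo tee -a /var/lib/jenkins/.ssh/known_hosts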

Connecting to a Progress OpenEdge database from ABL

This code works fine if I run it in the Progress Editor. If I save it as a .p file and click RUN in the right-click menu, it gives me an error that the database doesn't exist. I understand that maybe I should insert some code to connect to a database.
Does anybody know what statement I should use?
DEF STREAM st1.
OUTPUT STREAM st1 TO c:\temp\teste.csv.
FOR EACH bdName.table NO-LOCK:
    PUT STREAM st1 UNFORMATTED bdName.Table.attr ";" SKIP.
END.
OUTPUT STREAM st1 CLOSE.
Exactly as you say, you need to connect to your database. This can be done in a couple of different ways.
Connect by CONNECT statement
You can connect to a database using the CONNECT statement. Basically:
CONNECT <database name> [options]
Here's a simple statement that connects to a local database named "database" running on port 43210.
CONNECT database.db -H localhost -S 43210.
-H specifies the host running the database. This can be a name or an IP address. -S specifies the port (or service) that the database uses for connections. This can be a number or a service name (in that case it must be specified in /etc/services or similar).
However, you cannot connect to a database and work with its tables in the same program. Instead you will need to connect in one program and then run the logic in a second program:
/* runProgram.p */
CONNECT database -H dbserver -S 29000.
RUN program.p.
DISCONNECT database.
/* program.p */
FOR EACH exampletable NO-LOCK:
    DISPLAY exampletable.
END.
Connect by command line parameters
You can simply add parameters to your startup command so that the new session connects to one or more databases from the start.
Windows:
prowin32.exe -db mydatabase -H localhost -S 7777
Look at the option below (parameter file) before doing this.
Connect by command line parameter (using a parameter file)
Another option is to use a parameter file, normally with the extension .pf.
Then you will have to modify how you start your session: instead of just running prowin32.exe (if you're on Windows), you add the -pf parameter:
prowin32.exe -pf myparameterfile.pf
The parameter file will then contain all your connection parameters:
# myparameterfile.pf
-db database -H localhost -S 12345
Hashtag (#) is used for comments in parameter files.
On Linux/Unix you would run:
pro -pf myparameterfile.pf
You can also mix the different ways for different databases used in the same session.
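For example, to run the extraction unattended on Linux, you could combine a parameter file with a startup procedure (-p) in batch mode (-b); the file names here are placeholders:

# connect via the .pf file, run program.p without a UI, log the output
pro -pf myparameterfile.pf -p program.p -b > program.log 2>&1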

YAWS Websocket Trouble

I'm trying to get the YAWS websocket example from here running on my local machine. It's a basic ws-based echo server.
I have
YAWS set up and running on localhost:8080 (direct from the Debian repos; no conf changes except pointing it to a new root directory)
the code listing from the bottom of this page wrapped in <erl> tags saved as websockets_example_endpoint.yaws
this page saved as index.yaws (I literally copy/pasted the view-source for it, saved it as that file and pointed the socket request at localhost:8080 rather than yaws.hyber.org).
When I visit localhost:8080/websockets_example_endpoint.yaws in-browser, it displays the text "You're not a web sockets client! Go away!", as expected. When I visit localhost:8080, it points me to the javascript enabled form, but the "Connect" button does nothing when clicked. If I leave index.yaws pointed at yaws.hyber.org instead of localhost:8080, the echo server connects and works exactly as expected.
Can anyone give me a hint as to what I'm doing wrong (or alternatively, point me to source for a working example)?
There is a GitHub project I've created:
https://github.com/f3r3nc/yaws-web-chat/
It is also an example of embedding yaws, extended with group chat.
Note that the WebSocket standard is still under development, so yaws and the browser must support the same WS version in order to work correctly.
yaws 1.91 works with Safari Version 5.1.1 (6534.51.22) but does not work with Chrome (15.0.874.102), and probably not with 14.x.
For me the problem was that I did not have the basic_echo_callback module file, because I installed yaws from a package repository rather than building from source.
The error was evident when running yaws in interactive mode, yaws -i:
=ERROR REPORT==== 7-Dec-2016::21:33:49 ===
Cannot load callback module 'basic_echo_callback': nofile
=ERROR REPORT==== 7-Dec-2016::21:33:49 ===
Failed to start websocket process: nofileq
This is pretty much my process from scratch on Ubuntu 16.01:
sudo apt install yaws
cd ~
mkdir yaws
cd yaws
mkdir src
cd src
wget https://github.com/klacke/yaws/raw/master/examples/src/basic_echo_callback.erl
cd ..
mkdir www
cd www
wget https://raw.githubusercontent.com/klacke/yaws/master/www/websockets_example_endpoint.yaws
wget http://yaws.hyber.org/websockets_example.yaws
cd ..
#set-up server config file...
vim yaws.conf
Mine looks like:
src_dir = /home/pocketsand/yaws/src

<server localhost>
    port = 8000
    listen = 127.0.0.1
    docroot = /home/pocketsand/yaws/www
</server>
Ensure the endpoint is correct in the client:
vim www/websockets_example.yaws
...
Stop the server if it is already running, start it with yaws -i, and browse to: http://localhost:8000/websockets_example.yaws
It works because each time the server loads the configuration file, it compiles any modules in the specified src directory. If other modules are missing for other features, they will also need to be downloaded.
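To check the endpoint without a browser, you can test just the WebSocket handshake with curl (a sketch; the Sec-WebSocket-Key is an arbitrary base64 nonce, and this verifies only the upgrade, not the echo):

curl -i -N \
    -H "Connection: Upgrade" \
    -H "Upgrade: websocket" \
    -H "Sec-WebSocket-Version: 13" \
    -H "Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==" \
    http://localhost:8000/websockets_example_endpoint.yaws
# a working endpoint answers with HTTP/1.1 101 Switching Protocols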
