How to execute SQL files through SQL*Plus from the command prompt via Java, without using JDBC - sqlplus

This is a Swing UI, shown in the figure: a list box containing multiple SQL files. I can select several files and execute them by clicking "Execute Selected Scripts".
After clicking the "Execute Selected Scripts" button, Java should open a command prompt, start SQL*Plus in it, and execute all selected SQL files one by one, without using JDBC.
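A minimal sketch of what that button handler could call, using ProcessBuilder to launch SQL*Plus as an external process (no JDBC involved). The connect string "scott/tiger@ORCL" and the file names are placeholders, and sqlplus must be on the PATH:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class SqlplusRunner {

    // Build the command line: sqlplus -S user/pass@tns @/path/to/script.sql
    static List<String> buildCommand(String connectString, File script) {
        List<String> cmd = new ArrayList<>();
        cmd.add("sqlplus");
        cmd.add("-S");                           // silent mode: no banner or prompts
        cmd.add(connectString);                  // placeholder credentials
        cmd.add("@" + script.getAbsolutePath()); // @file tells sqlplus to run the script
        return cmd;
    }

    // Run one script and return the sqlplus exit code (0 = success).
    static int runScript(String connectString, File script) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(buildCommand(connectString, script));
        pb.inheritIO(); // stream sqlplus output to this process's console
        return pb.start().waitFor();
    }

    public static void main(String[] args) throws Exception {
        // In the Swing action listener, loop over the files selected in the list:
        for (String name : new String[] {"schema.sql", "data.sql"}) {
            System.out.println(String.join(" ",
                    buildCommand("scott/tiger@ORCL", new File(name))));
            // runScript("scott/tiger@ORCL", new File(name)); // enable once sqlplus is installed
        }
    }
}
```

Note that each .sql file should end with an EXIT statement; otherwise sqlplus finishes the script and then waits for interactive input instead of terminating.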

Related

Performing denodo tasks from Jenkins

I am trying to create a working prototype for performing Denodo activities from my Jenkins server.
The steps I want to perform are:
Import a VSQL file from Git into Denodo from Jenkins.
Create a view in Denodo from Jenkins.
Run this VSQL file in Denodo from Jenkins.
I am new to the Denodo world and I am not sure if Denodo has any APIs for doing this.
Can someone let me know if this is really possible? If so, where can I find a solution for this requirement? I have been searching the internet for the last few days but couldn't find a solution.
The reason you don't find much on the web about this is that the file and query language in Denodo is called VQL, not VSQL. Search for that and you will find a lot more.
Anyway, about your problem:
You have two options for CI/CD work in Denodo. If you use Jenkins and just want to create views based on actions in other systems, e.g. create a base view as soon as a new table is created in the source, you can simply send the VQL create script (containing CREATE WRAPPER and CREATE VIEW) to the server via JDBC or ODBC. For that, create a technical user in Denodo and install the driver on the Jenkins server.
The other option, if you are using Denodo 7, is the Solution Manager. It provides a REST API through which you can create revisions, test them on different environments, and deploy them. I am not sure whether you can create a revision from VQL code coming from Jenkins, but I think it should be possible.
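The first option can be sketched as a small JDBC client that a Jenkins job runs after checking the VQL file out of Git. The driver class name, URL format, host, port, database name, and credentials below are assumptions for illustration; check the JDBC driver that ships with your Denodo installation:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class DenodoDeploy {

    // Naive split of a VQL script into statements on semicolons.
    // Good enough for simple create scripts; would break on semicolons
    // inside string literals.
    static List<String> splitStatements(String vql) {
        List<String> out = new ArrayList<>();
        for (String s : vql.split(";")) {
            if (!s.trim().isEmpty()) out.add(s.trim());
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        String vql = new String(Files.readAllBytes(Paths.get(args[0])));
        // Driver class and URL format are assumptions; verify against your Denodo docs.
        Class.forName("com.denodo.vdp.jdbc.Driver");
        try (Connection c = DriverManager.getConnection(
                "jdbc:vdb://denodo-host:9999/my_vdb", "jenkins_user", "secret");
             Statement st = c.createStatement()) {
            for (String stmt : splitStatements(vql)) {
                st.execute(stmt); // e.g. CREATE WRAPPER ... / CREATE VIEW ...
            }
        }
    }
}
```

A Jenkins "Execute shell" step would then just invoke this class with the Denodo JDBC driver jar on the classpath and the checked-out VQL file as the argument.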

Jenkins - Multiple terminal output

I have a Python script that, when run, opens multiple additional terminal windows besides the one it was started from.
subprocess.call(["gnome-terminal", "-e", "..."])
This opens multiple other terminals that run the same program with different parameters.
In Jenkins, in a "Freestyle project", when I run the same script from the "Execute shell" build step, the result is not what I expected.
./python_file.py -p $MY_PARAMETER
The main console output works fine, but the other terminal windows that were supposed to open just don't appear. I want to be able to see the output of those terminals in the Jenkins console (or elsewhere?)
Should I use another kind of project? Or add a plugin? Is there an option in the project that I should check? I don't want to run the project on multiple nodes; I just need to see multiple terminals.
This is the error text :
Failed to parse arguments: Cannot open display:
It is not a common problem, I suppose, but thanks for any input!
I am not sure whether multiple-window output exists in Jenkins, but I think you can bypass this issue. gnome-terminal needs a graphical (X) display to open a window, and a Jenkins build agent normally runs headless, which is why you get "Cannot open display".
Instead of running one project in multiple consoles, I will modify my Python script so that multiple projects each run in a single console at a time. That way it is easier to control which parameters go to each project and what each one's output is.
There are a couple of ways to do that (a "multi-configuration project", or multiple "freestyle projects").

Liquibase maven update fails on second attempt

I'm trying to create a table (if it does not exist) in a remote database (Postgres in a Docker container). The SQL command is located in an SQL file.
When I execute
mvn liquibase:update
by hand (bash terminal) everything works great. The build is a success and the message for the creation of the table is displayed.
Now, if I delete the table and execute the same command, then although I see a success message for the completion of the task, the table is not created (and of course the message about the creation of the table is not displayed either).
As I've mentioned, the database is in a Docker container, so it is a remote database. I've already included the element below in my pom.xml
<promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
Why is this happening?
Is this changeset run in "runAlways" mode? If not, then when you run it the first time the change is applied and recorded as applied, and it is not executed a second time at all (regardless of how you run it).
Liquibase keeps track of the changes it has applied in a separate table called DATABASECHANGELOG. If you manually delete the table (let's call this one "MyTable") as described above, and then re-run Liquibase, it just checks the DATABASECHANGELOG table to see if the change mentioned has been run, and if it has, it will not re-run it. If you want to see the table get re-created you would have to manually remove the row from DATABASECHANGELOG, or else (as the comment above suggests) mark the changeset in the changelog as "runAlways". Typically you do NOT want runAlways to be set because then Liquibase will try to do things like create a table that already exists.
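For reference, a sketch of a changeset that combines runAlways with a precondition, so that re-running only recreates the table when it is actually missing (the id, author, and table/column names here are placeholders):

```xml
<changeSet id="create-mytable" author="me" runAlways="true">
    <!-- Skip (and mark as ran) when the table already exists,
         so runAlways does not try to recreate it every build. -->
    <preConditions onFail="MARK_RAN">
        <not><tableExists tableName="mytable"/></not>
    </preConditions>
    <createTable tableName="mytable">
        <column name="id" type="bigint"/>
    </createTable>
</changeSet>
```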

In Rails how can i restore my indexes for my tables?

I have a Rails app, and originally I had the database set up in SQL Server locally. Recently, I moved it up to Azure SQL Server, and have just copied the changes down locally again. When I imported the database, all the tables and data replicated down, but it seems all my primary keys and indexes are gone. Is it possible to run a rake or rails command to add these keys/indexes back?
You probably have an auto-generated SQL script file (from rake db:create, rake db:migrate, etc.), usually available at db/structure.sql. You can run that script; you may need to manually remove the CREATE TABLE statements, but retain the ALTER SEQUENCE, ALTER TABLE, and CREATE INDEX statements.
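For example, after stripping the CREATE TABLE statements, the lines you keep from db/structure.sql look like these (the table and index names here are illustrative):

```sql
-- Keep constraint and index statements; the tables already exist,
-- so the CREATE TABLE statements must be removed before running.
ALTER TABLE users ADD CONSTRAINT users_pkey PRIMARY KEY (id);
CREATE INDEX index_users_on_email ON users (email);
```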
I ended up creating a bacpac file from my Azure database and imported it as a data-tier application within SQL Server.
Create BACPAC File
Go to your Azure Portal.
Select "Storage" from the menu and create a new storage account. I just gave it a URL and clicked "Create Storage Account". It will take a minute for the new storage to be created.
Once the storage is created, go to SQL Databases. Open the database you wish to create a bacpac file from, then at the bottom of the portal select "Export" and follow the export wizard, selecting your storage account and providing your credentials.
SSMS Import Data-Tier Application
Right click your Databases folder under your connection in Object Explorer.
Click "Import Data-Tier Application" and follow the wizard. When you get to "Import Settings" select the bacpac file you downloaded from your Azure portal.

Neo4j web interface, make permanent custom scripts

Neo4j allows you to save scripts used in the web interface, or import them (one at a time) from a file ("Drop Cypher script file to import").
The issue: these scripts disappear when Neo4j is stopped, and I have to re-import them one by one.
Is there a way to make those scripts permanent in the interface?
