Importing mysql procedures generated from mysqldump --routines - stored-procedures

I use the mysqldump tool to make copies of my database. The problem is, when I use the --routines parameter to output my stored procedures along with my data, the generated output causes an error when I try to import it.
It goes something like this:
% mysqldump --routines MyDB | mysql MyDB2
(where MyDB2 already exists but is empty)
The error I get is the following:
ERROR 1064 (42000) at line 307: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 23
Everything works correctly if I omit the --routines.
Has anyone else encountered this?

I was able to get this to work by splitting it into two calls:
% mysqldump MyDB | mysql MyDB2
% mysqldump --routines --no-create-info --no-data --no-create-db --skip-opt MyDB | mysql MyDB2
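To confirm the routines actually made it into the copy, a quick check (a sketch, assuming the database names above) is to list them on the target:
% mysql MyDB2 -e "SHOW PROCEDURE STATUS WHERE Db = 'MyDB2'"
% mysql MyDB2 -e "SHOW FUNCTION STATUS WHERE Db = 'MyDB2'"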

If something errors when running the queries against MyDB2, it's best to:
Run mysqldump and save the output to a file (see the sketch below).
Run the file bit by bit to identify which part has the problem.
Fix that bit.
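A minimal sketch of that workflow (the line numbers are whatever your own error message reports):
% mysqldump --routines MyDB > mydb.sql
% sed -n '300,315p' mydb.sql    # look around the line the error points at (307 in the example above)
% mysql MyDB2 < mydb.sql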
I once had a problem like this where I was exporting from an old version of mysql and importing into a newer one, which had declared one of my column names a reserved word. Are your two databases on different servers running different versions of mysql? Or is there some other difference between the databases (e.g. character set)?
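If you suspect a version or character-set mismatch, comparing the two servers directly is cheap (a sketch; server1 and server2 are placeholders for your hosts):
% mysql -h server1 -e "SELECT VERSION(), @@character_set_server"
% mysql -h server2 -e "SELECT VERSION(), @@character_set_server"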

Related

Hyperledger Explorer

I have installed all the prerequisites for setting up Hyperledger Explorer, but when I start it, I get the following error in the log file:
And my config.json file is this:
The Postgres commands have also been run:
1: https://i.stack.imgur.com/eTpSY.png
2: https://i.stack.imgur.com/IocQU.png
Your database setup is not done correctly; run these commands one by one.
Database setup
Connect to PostgreSQL database
sudo -u postgres psql
Run create database script
\i app/db/explorerpg.sql
\i app/db/updatepg.sql
Run the database status commands:
\l to view the created fabricexplorer database
\d to view the created tables
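If you prefer to run the setup non-interactively, the same scripts can be fed to psql with -f (a sketch, assuming you start from the blockchain-explorer checkout and want to stop on the first error):
cd blockchain-explorer
sudo -u postgres psql -v ON_ERROR_STOP=1 -f app/db/explorerpg.sql
sudo -u postgres psql -v ON_ERROR_STOP=1 -f app/db/updatepg.sql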
This is actually a Postgres database error.
The error clearly says that the chaincode_id column doesn't exist, so that is the problem.
If you want to check which columns exist in the transactions table, just follow the steps below:
cd blockchain-explorer/app/persistence/postgreSQL/db
sudo -u postgres psql
\d transactions
Check whether the chaincode_id column exists (it won't exist yet, which is why you got this error).
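The same check can be done in one line without opening an interactive session (a sketch, assuming the database is named fabricexplorer as created above):
sudo -u postgres psql fabricexplorer -c '\d transactions'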
Solution for this type of error
If you get an error like this, first go to the blockchain-explorer/app/persistence/postgreSQL/db directory.
There you will see two files, explorerpg.sql and updatepg.sql. Open these two files and check whether the corresponding column exists in either of them. If not, you are better off downloading another version of Explorer that contains the corresponding column in one of these two files.
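A quick way to do that check (a sketch using grep):
cd blockchain-explorer/app/persistence/postgreSQL/db
grep -n chaincode_id explorerpg.sql updatepg.sql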
If the column does exist in those files, just run the commands below on Ubuntu:
cd blockchain-explorer/app/persistence/postgreSQL/db
sudo -u postgres psql
\i explorerpg.sql
\i updatepg.sql
Once those commands are done, check whether the chaincode_id column has been created:
\d transactions
This will list all the columns; check for chaincode_id there.
If chaincode_id exists, run Explorer again.

If neo4j-shell is deprecated then how do I dump the contents of the database (for backup)

I've just been looking into how to backup the database and have found that neo4j-shell -c dump > my-db-dumb.cql looks like a good solution, which exports everything to a cypher query which creates everything when run (a bit like mysqldump for MySQL).
However, according to the official documentation, neo4j-shell has been deprecated in favour of cypher-shell, and I can't find an equivalent dump function for cypher-shell. Is there one? If not, what should I do instead of neo4j-shell -c dump? Or is there a better way of backing up the database (I have the Community Edition)? One advantage of the above solution is that you don't have to stop the database.
The most useful option is to shut down the database and then take a backup using the new neo4j-admin command.
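A minimal sketch of that approach, assuming Neo4j 3.1+ with the default graph.db database and the server stopped first:
neo4j-admin dump --database=graph.db --to=/backups/graph.db.dump
neo4j-admin load --from=/backups/graph.db.dump --database=graph.db --force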
If you cannot shut down the graph, then you can manually copy the graph.db directory somewhere else and run neo4j-shell against the new location using its -path option. As of version 3.1.1, neo4j-shell still works perfectly.
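Roughly, that manual approach looks like this (a sketch; the paths are illustrative and the copy is best taken while the database is idle):
cp -r data/databases/graph.db /tmp/graph.db-copy
neo4j-shell -path /tmp/graph.db-copy -c dump > my-db-dump.cql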

I need to restore db from dump and cannot do it

I have a database dump at D:/backup.dump. I am trying to restore my database min_ro: I open psql.exe, and the prompt shows
min_ro=#
Then I write restore command:
min_ro=# psql min_ro < D:/backup.dump
Then nothing happens. My database is not restored. What is wrong? This is my first time using psql.
Update: I don't specifically need psql; I just need to restore the database from the dump and cannot do it.
psql is not a SQL statement, so it doesn't make sense to enter it at the psql prompt, which is there to run SQL statements (or psql meta-commands).
c:\> psql min_ro < D:/backup.dump
needs to be entered on the (Windows) command line, not inside psql.
You can, however, just run the SQL script (which I assume your dump is) by using the \i ("include") meta-command in psql:
c:\> psql min_ro
min_ro=# \i D:/backup.dump
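If the file was produced with pg_dump -Fc (custom format) rather than as a plain SQL script, \i will only produce syntax errors; in that case the restore has to go through pg_restore instead (a sketch, again from the Windows command line):
c:\> pg_restore -d min_ro D:/backup.dump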
When you restore your database in pgAdmin III (by right-clicking the database name and then choosing 'Restore'), you can't see .dump files in the file list by default. That was the mistake that forced me to try other ways to restore the DB from a dump.
But if you simply change the file type filter to 'All files', you can restore your database from the dump as usual.

How to dump data from mysql database to postgresql database?

I have built the depot application using MySQL... Now I need to use Postgres... So I need to dump the data from the MySQL database "depot_development" into the Postgres database "depot_develop"...
Here you can find some interesting links http://wiki.postgresql.org/wiki/Converting_from_other_Databases_to_PostgreSQL#MySQL
Have you tried to copy the tables from one database to the other:
a) export the data from MySQL as a CSV file like:
$> mysql -e "SELECT * FROM table" -h HOST -u USER -pPWD -D DB > /file/path.csv
and then,
b) import it into Postgres like:
COPY table FROM '/file/path.csv' WITH CSV;
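One caveat: mysql -e writes tab-separated output rather than comma-separated, so the COPY options may need adjusting. A hedged variant, assuming the target table already exists and using PostgreSQL's text format with an explicit tab delimiter (verify how your mysql client actually renders NULLs):
COPY table FROM '/file/path.csv' WITH (FORMAT text, DELIMITER E'\t', NULL 'NULL');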
This question is a little old, but a few days ago I was dealing with this situation and found pgloader.io.
This is by far the easiest way of doing it. You need to install it and then run a simple lisp script (script.lisp) with the following 3 lines:
/* content of script.lisp */
LOAD DATABASE
FROM mysql://dbuser@localhost/dbname
INTO postgresql://dbuser@localhost/dbname;
/* run this in the terminal */
pgloader script.lisp
And after that your PostgreSQL DB will have all of the information that you had in your MySQL DB.
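For larger migrations the same load file can carry extra options; a sketch of what that might look like (the WITH clauses are pgloader options, so check the pgloader docs for your version):
LOAD DATABASE
FROM mysql://dbuser@localhost/dbname
INTO postgresql://dbuser@localhost/dbname
WITH include drop, create tables, create indexes, reset sequences;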
On a side note, make sure you compile pgloader, since at the time of this post the installer has a bug (version 3.2.0).

How to write stored procedures to separate files with mysqldump?

The mysqldump option --tab=path writes the creation script of each table in a separate file. But I can't find the stored procedures, except in the screen dump.
I need to have the stored procedures also in separate files.
The current solution I am working on is to split the screen dump programmatically. Is there an easier way?
The code I am using so far is:
#save all routines to a single file
mysqldump -p$PASSWORD --routines --skip-dump-date --no-create-info --no-data --skip-opt $DATABASE > $BACKUP_PATH/$DATABASE.sql
#save each table to its file
mysqldump -p$PASSWORD --tab=$BACKUP_PATH --skip-dump-date --no-data --skip-opt $DATABASE
Even if I add --routines to the second command, they will not get their own files.
I created a script that writes each routine to a separate file.
https://gist.github.com/temmings/c6599ff6a04738185596
example: mysqldump ${DATABASE} --routines --no-create-info --no-data --no-create-db --compact | ./seperate.pl
The files are written to the out/ directory.
$ tree
.
└── out
├── FUNCTION.EXAMPLE_FUNCTION.sql
└── PROCEDURE.EXAMPLE_PROCEDURE.sql
The mysqldump command does not support dumping stored procedures into individual files.
But, it is possible to do it using the mysql command.
mysql --skip-column-names --raw mydatabase -e "SELECT CONCAT('CREATE PROCEDURE `', specific_name, '`(', param_list, ') AS ') AS `stmt`, body_utf8 FROM `mysql`.`proc` WHERE `db` = 'mydatabase' AND specific_name = 'myprocedure';" 1> myprocedure.sql
For a more complete example, using Windows Batch, look into my answer on another question.
MySQL - mysqldump --routines to only export 1 stored procedure (by name) and not every routine
I think the answer is: it is not possible without post-processing
This writes table definitions (not SPs) fwiw:
mysqldump -u<username> -p<password> -T<destination-directory> --lock-tables=0 <database>
One snag I ran into: make sure the destination directory has sufficient permissions. I just did chmod 777 on it.
A note on this: MySQL will write out the table structures in .sql files and the data in .txt files. I wish it would just do it normally.
