Exporting mysql database using mysqldump including procedures - stored-procedures

When exporting a database with mysqldump like this,
mysqldump -u mysqluser -p mysqlpassword databasename > /tmp/databasename.sql
will this command also export the stored procedures that are listed by the following command?
SHOW PROCEDURE STATUS WHERE db = 'databasename';
If not, how do I export a MySQL database using mysqldump along with its associated stored procedures from the Linux terminal? Also note that I cannot use phpMyAdmin for this purpose.

Try this.
mysqldump -u mysqluser -p mysqlpassword --routines databasename > /tmp/databasename.sql
Refer to this link: http://www.ducea.com/2007/07/25/dumping-mysql-stored-procedures-functions-and-triggers/

We can use the -R flag as a shorthand for the --routines flag while dumping, as the other answer suggests.
mysqldump -u mysqluser -p mysqlpassword -R databasename > /tmp/databasename.sql
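If you want a quick sanity check that the routines actually made it into the dump, something like the following should work (a minimal sketch; note that mysqldump's -p takes the password with no space after it, so a bare -p that prompts for the password is usually safer than putting it on the command line):
mysqldump -u mysqluser -p --routines --events databasename > /tmp/databasename.sql
# count the procedure/function definitions that ended up in the dump
grep -Ec "CREATE.*(PROCEDURE|FUNCTION)" /tmp/databasename.sql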

Related

pgpass file that uses variables?

I'm working on a docker image that connects to postgres using psql.
My entrypoint:
psql analytics \
--host=${INPUT_HOST} \
--username=analytics \
--port=32648
If I run this I'm prompted for a password, I enter it and am able to connect. Great.
But if I try to make the password entry automatic I get an error:
psql analytics \
--host=${INPUT_HOST} \
--username=analytics \
--port=32648 \
--password=${INPUT_PASSWORD}
/usr/lib/postgresql/13/bin/psql: option '--password' doesn't allow an argument
Try "psql --help" for more information.
I found some docs on using .pgpass; this file, which is to be added to a user's home directory, takes the form:
hostname:port:database:username:password
Now I'm going to have to do something like:
${INPUT_HOST}:5432:analytics:analytics:${INPUT_PASSWORD}
Then envsubst or sed on this file before adding to the image.
Open-ended question: is there a better/more convenient way? ${INPUT_PASSWORD} comes from a Docker secret. Is there any way I can pass a password to my call to psql?
The best way is to use a connection string:
psql "password='${INPUT_PASSWORD}' dbname=analytics host='${INPUT_HOST}' user=analytics port=32648"

Restore a SQL Server DB.bak in a Dockerfile

I am running a .NET Razor application, an instance of gitea, and a SQL Server database each in separate containers that communicate with one another. I would like to start my database image with a database schema and data (by restoring a .bak file).
I can do this with my current Dockerfile if, once it is up and running, I run these additional commands:
docker exec -it myContainer /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword -Q "RESTORE DATABASE MY_DB_NAME FROM DISK='/var/opt/mssql/backup/MY_DB_NAME.bak' WITH MOVE 'MY_DB_NAME_TEST' TO '/var/opt/mssql/data/MY_DB_NAME_TEST.mdf', MOVE 'MY_DB_NAME_TEST_log' TO '/var/opt/mssql/data/MY_DB_NAME_TEST_log.ldf'"
This gets the job done, but I want to fully automate the process so that this is configured 100% by my docker-compose.yml and Dockerfile so I need only type: docker-compose up -d.
I don't think the content of my docker-compose.yml file is relevant, but here is my Dockerfile (where I am trying to run that script that I currently need to run after docker-compose up):
FROM microsoft/mssql-server-linux
ENV SA_PASSWORD=myPassword
ENV ACCEPT_EULA=Y
COPY ./ACES_DB.bak /var/opt/mssql/backup/MY_DB_NAME.bak
RUN docker exec -it myContainer bin/sh /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword -Q "RESTORE DATABASE MY_DB_NAME FROM DISK='/var/opt/mssql/backup/MY_DB_NAME.bak' WITH MOVE 'MY_DB_NAME_TEST' TO '/var/opt/mssql/data/MY_DB_NAME_TEST.mdf', MOVE 'MY_DB_NAME_TEST_log' TO '/var/opt/mssql/data/MY_DB_NAME_TEST_log.ldf'"
Any help would be much appreciated.
A friend and I puzzled through this together and eventually found this solution. Here's what the docker file looks like:
FROM microsoft/mssql-server-linux
ENV MSSQL_SA_PASSWORD=myPassword
ENV ACCEPT_EULA=Y
COPY ./My_DB.bak /var/opt/mssql/backup/My_DB.bak
COPY restore.sql restore.sql
RUN (/opt/mssql/bin/sqlservr --accept-eula & ) | grep -q "Starting database restore" && /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'myPassword' -d master -i restore.sql
*Note that I moved the SQL restore statement to a .sql file.
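For reference, the restore.sql being copied above might contain something like this (a sketch only; the logical file names MY_DB_NAME_TEST and MY_DB_NAME_TEST_log are assumptions carried over from the question, and in practice you would confirm them with RESTORE FILELISTONLY first). One way to create it next to the Dockerfile:
# write restore.sql in one shell snippet; adjust names and paths to your backup
cat > restore.sql <<'SQL'
RESTORE DATABASE MY_DB_NAME
FROM DISK = '/var/opt/mssql/backup/My_DB.bak'
WITH MOVE 'MY_DB_NAME_TEST' TO '/var/opt/mssql/data/MY_DB_NAME_TEST.mdf',
     MOVE 'MY_DB_NAME_TEST_log' TO '/var/opt/mssql/data/MY_DB_NAME_TEST_log.ldf';
SQL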
Expanding on @joshua-abbott's answer. Here is my setup for restoring multiple DBs to the mssql 2019 Docker image, and replacing the 'default' password used to restore the DBs.
Dockerfile
FROM mcr.microsoft.com/mssql/server:2019-latest
ENV DEFAULT_MSSQL_SA_PASSWORD=myStrongDefaultPassword
ENV ACCEPT_EULA=Y
USER root
COPY restore-db.sh entrypoint.sh /opt/mssql/bin/
RUN chmod +x /opt/mssql/bin/restore-db.sh /opt/mssql/bin/entrypoint.sh
ADD data.tar.gz /var/opt/mssql/
RUN chown -R mssql:root /var/opt/mssql/data && \
chmod 0755 /var/opt/mssql/data && \
chmod -R 0650 /var/opt/mssql/data/*
USER mssql
RUN /opt/mssql/bin/restore-db.sh
CMD [ "/opt/mssql/bin/sqlservr" ]
ENTRYPOINT [ "/opt/mssql/bin/entrypoint.sh" ]
restore-db.sh
#!/bin/bash
export MSSQL_SA_PASSWORD=$DEFAULT_MSSQL_SA_PASSWORD
(/opt/mssql/bin/sqlservr --accept-eula & ) | grep -q "Server is listening on" && sleep 2
for restoreFile in /var/opt/mssql/data/*.bak
do
  fileName=${restoreFile##*/}
  base=${fileName%.bak}
  /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P $MSSQL_SA_PASSWORD -Q "RESTORE DATABASE [$base] FROM DISK = '$restoreFile'"
  rm -rf $restoreFile
done
entrypoint.sh
#!/bin/bash
/opt/mssql-tools/bin/sqlcmd \
-l 60 \
-S localhost -U SA -P "$DEFAULT_MSSQL_SA_PASSWORD" \
-Q "ALTER LOGIN SA WITH PASSWORD='${MSSQL_SA_PASSWORD}'" &
/opt/mssql/bin/permissions_check.sh "$@"
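To make the password swap concrete, a hedged build-and-run sketch (the image name and runtime password below are hypothetical): the entrypoint logs in with DEFAULT_MSSQL_SA_PASSWORD and alters the SA login to whatever MSSQL_SA_PASSWORD is supplied at runtime.
docker build -t mssql-restored .
# supply the real SA password only at run time
docker run -d -p 1433:1433 -e MSSQL_SA_PASSWORD='myRealSecretPassword1!' mssql-restored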
I voted for the answer of @Joshua Abbott, but I needed to customize it to match the question, i.e. to restore from a .bak file as required:
FROM mcr.microsoft.com/mssql/server:2017-latest
ENV ACCEPT_EULA=Y
ENV SA_PASSWORD=xxxxxxxx
ENV MSSQL_PID=Developer
ENV MSSQL_TCP_PORT=1433
WORKDIR /src
COPY ["API/db/db.bak", "dbbackups/"]
RUN (/opt/mssql/bin/sqlservr --accept-eula & ) | grep -q "Starting database restore" && /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'xxxxxxxx' -Q "RESTORE FILELISTONLY FROM DISK='/dbbackups/db.bak';"
You just need to replace xxxxxxxx with your password. You can name your container as you want using the docker-compose file/override files.
It is simple: I use SQL Server Management Studio. When you create your Docker container you declare a var for the directory; just put the backup there and then restore it in SQL Server.
You can create a stored procedure in one of your databases for creating an automatic backup. I found this and made some adaptations for my use.
------ If you create this and then execute it------
CREATE PROCEDURE [dbo].[P_M_Backup]
AS
DECLARE @name VARCHAR(50) -- database name
DECLARE @path VARCHAR(256) -- path for backup files
DECLARE @fileName VARCHAR(256) -- filename for backup
DECLARE @fileDate VARCHAR(20) -- used for file name
-- specify database backup directory
SET @path = '/var/opt/mssql/data/Backup/'
-- specify filename format
SELECT @fileDate = CONVERT(VARCHAR(20), GETDATE(), 112)
DECLARE db_cursor CURSOR READ_ONLY FOR
SELECT name
FROM master.sys.databases
WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb', 'Eikon_CDEEE') -- exclude these databases
AND state = 0 -- database is online
AND is_in_standby = 0 -- database is not read only for log shipping
OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
  SET @fileName = @path + @name + '_' + @fileDate + '.BAK'
  BACKUP DATABASE @name TO DISK = @fileName
  FETCH NEXT FROM db_cursor INTO @name
END
CLOSE db_cursor
DEALLOCATE db_cursor
/** SET @path = '/var/opt/mssql/data/Backup/': mssql/data/ is the directory where I have mounted SQL Server from Docker, and Backup is a directory inside it, so you have to change it to your own directory. **/
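As a usage note, once the procedure exists you can trigger it from outside the container with sqlcmd; a minimal sketch with a hypothetical container name and password:
docker exec my_sql_container /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'myPassword' \
  -Q "EXEC [dbo].[P_M_Backup]"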

How to backup/restore Rails db with Postgres?

I do the following on my server:
pg_dump -O -c register_production > register.sql
Then, after copying register.sql to my local environment, I try:
psql register_development < register.sql
This appears to work, but when I try to launch the Rails site locally, I get this:
PG::UndefinedTable: ERROR: relation "list_items" does not exist at character 28
How can I restore everything (including relations) from the server db to my local dev db?
I use this command to save my database:
pg_dump -F c -v -U postgres -h localhost <database_name> -f /tmp/<filename>.psql
And this to restore it:
pg_restore -c -C -F c -v -U postgres /tmp/<filename>.psql
This dumps the database in Postgres' custom format (-F c) which is compressed by default and allows for reordering of its contents. -C -c will drop the database if it exists already and then recreate it, helpful in your case. And -v specifies verbose so you can see exactly what's happening when this goes on.
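Applied to the asker's database names, that might look like this (a hedged example; without -C the dump is restored into the existing register_development database, and -c drops conflicting objects in it first):
# on the server
pg_dump -F c -v -U postgres -h localhost register_production -f /tmp/register.psql
# locally, after copying the file over
pg_restore -c -v -U postgres -h localhost -d register_development /tmp/register.psql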
Does the register_development database exist before you run the psql command? Because that form will not create it for you.
See http://www.postgresql.org/docs/8.1/static/backup.html#BACKUP-DUMP-RESTORE for more information.

restore mongodb database .bson and .json files

In this folder called my_backup I have a mongodb database dump with all my models/collections for example:
admins.bson
admins.metadata.json
categories.bson
categories.metadata.json
pages.bson
pages.metadata.json
.
.
.
I have a database called ubuntu_development on MongoDB. I am working with Rails 3 + Mongoid.
How can I import/restore all models/collections from the folder my_backup to my database ubuntu_development?
Thank you very much!
Execute this command from the console (in this case):
mongorestore my_backup --db ubuntu_development
mongorestore is followed by my_backup, which is the folder name where the previous dump of the database is saved.
--db ubuntu_development specifies the database name where we want to restore the data.
To import .bson files
mongorestore -d db_name -c collection_name path/file.bson
In case you only want a single collection, try this:
mongorestore --drop -d db_name -c collection_name path/file.bson
To import .json files
mongoimport --db db_name --collection collection_name --file name.json
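For example, applied to the question's ubuntu_development database and its admins collection from the my_backup folder, that would be something like (hedged; the path simply reuses the file names listed in the question):
mongorestore --drop -d ubuntu_development -c admins my_backup/admins.bson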
You have to run this mongorestore command via cmd and not in the Mongo shell... Have a look at the command below.
Run this command in cmd (not in the Mongo shell):
>path\to\mongorestore.exe -d dbname -c collection_name path\to\same\collection.bson
Here path\to\mongorestore.exe is the path to mongorestore.exe inside MongoDB's bin folder, dbname is the name of the database, collection_name is the name of the collection (.bson), and path\to\same\collection.bson is the path up to that collection.
Now, from the Mongo shell, you can verify whether the database was created (if it did not exist, a database with the same name will be created along with the collection).

How to import a Heroku PG dump into local machine

I'm trying to import my production Heroku database into my development machine.
My local db is PostgreSQL.
First, I'm exporting the dump from Heroku to my machine
curl -o latest.dump `heroku pgbackups:url`
Then, I try to drop the local db with rake db:drop and then I create the empty database again by using rake db:create.
The problem I'm getting is when actually trying to import the dump to the database
psql -d app_development -U myusername -f mydumpfile.sql
I begin seeing errors like this
psql:latest.dump:24: ERROR: syntax error at or near "PGDMP"
LINE 1: PGDMP
^
psql:latest.dump:28: ERROR: syntax error at or near ""
LINE 1: INCREMENT BY 1
^
psql:latest.dump:36: ERROR: syntax error at or near ""
LINE 1: id integer NOT NULL,
^
psql:latest.dump:40: ERROR: syntax error at or near ""
LINE 1: INCREMENT BY 1
^
psql:latest.dump:45: ERROR: syntax error at or near ""
LINE 1: id integer NOT NULL,
^
psql:latest.dump:49: ERROR: syntax error at or near ""
LINE 1: INCREMENT BY 1
...
psql:latest.dump:1601: invalid command \S4???(?̭?A?|c?e0<00K?A?}FϚ?????A(??~?t?I?????G(? K???l??k"?H?ȁ?ͲS?,N*?[(#??a5J??j}
psql:latest.dump:1602: invalid command \??k???|??w???h?
psql:latest.dump:1603: invalid command \=??????o?h?
psql:latest.dump:1609: invalid command \????^.?????????E???/-???+??>#?ؚE?.2)Ȯ&???? g????"7},_??]?:?f?Tr|o???)?p????h?KO?08[Rqu???|3?cW?ڮ?ahbm??H?H8??$???2?a?-أ
psql:latest.dump:1613: invalid command \D!qVS???L??*??׬R??I!???
psql:latest.dump:1614: invalid command \??-?}Q
psql:latest.dump:12565: ERROR: invalid byte sequence for encoding "UTF8": 0xb0
Any idea what is happening this and how to solve it?
You see errors because psql tries to interpret SQL queries while you're actually giving it a compressed dump (which is what Heroku uses).
While you can't read the dump directly, pg_restore -O latest.dump gives you valid SQL that you could pipe to psql, but the easy solution is the following:
pg_restore -O -d app_development latest.dump
Notes:
Use -O because you probably don't have the random username of your remote Heroku Postgres DB locally.
Heroku doesn't recommend using taps, but I don't know how risky it really is.
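If you do want to inspect the SQL first, a sketch of the pipe variant mentioned above:
# pg_restore without -d emits plain SQL on stdout
pg_restore -O latest.dump > latest.sql
psql -d app_development -f latest.sql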
Follow these 4 simple steps in your terminal (Heroku Dev Center):
Create a backup copy of your database:
$ heroku pg:backups capture DATABASE_NAME
Download the copy from Heroku (to your local machine) using curl:
$ curl -o latest.dump `heroku pg:backups public-url`
Load it*:
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U YOUR_USERNAME -d DATABASE_NAME latest.dump
get YOUR_USERNAME and choose the desired database from your config/database.yml file.
DATABASE_NAME can be your development/test/production db (Ex. mydb_development)
That's it!
I wanted to avoid having to set up Postgres on my local machine (blowing away and recreating the database is a pain if you're just looking for quick instructions). I put together some exact instructions for doing this with a local Postgres database running Docker. I'm adding a link here, since Google kept bringing me to this question (and it's a possible solution, though probably not what you're looking for):
https://gist.github.com/locofocos/badd43131f14b3e40c760741d5a26471
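For anyone going the Docker route, a hedged sketch of what the restore can look like against a local Postgres container (the container name, user, and database below are hypothetical):
docker exec -i local_postgres pg_restore --verbose --clean --no-acl --no-owner -U postgres -d app_development < latest.dump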
Heroku exports the DB as a file with the .dump extension, which you can then import into your local relational DB.
To import it into the local Postgres DB, first download the latest.dump file to your local machine and then run
pg_restore -h localhost -p 5432 -U postgres_username -d db_name -v latest.dump
and restart the rails server.
Late 2021 update for the highest voted answer to date (works great):
$ rails db:drop db:create db:migrate
$ heroku pg:backups capture DATABASE_URL
$ curl -o latest.dump `heroku pg:backups public-url`
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U YOUR_USERNAME -d DATABASE_NAME latest.dump
get YOUR_USERNAME on your local machine
DATABASE_NAME can be your development/test/production db (Ex. rails_react_bootstrap_development) from your config/database.yml file.
DATABASE_URL is not a variable or example code you need to set; it is a valid Heroku option.
