PostgreSQL 9.6 backup

I have to back up a PostgreSQL database using this command:
sudo pg_dumpall -a mydb > app111618.bak
After I type that command, I get this error:
`pg_dumpall: too many command-line arguments (first is "mydb")`
The error output also says this:
Try pg_dumpall --help for more information.
How can I fix this?

pg_dumpall is a tool to dump all databases, so it doesn't support specifying a database name (assuming that mydb is the name of the database you want to dump).
If you want to dump only a single database, use pg_dump instead:
pg_dump -a mydb > app111618.bak
If you did intend to dump all databases, just leave out the database name:
pg_dumpall -a > app111618.bak
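Either way, since -a produces a plain-text, data-only dump, it can be restored later with psql; a minimal sketch for the single-database case, assuming the target database already has its schema in place:
psql -d mydb -f app111618.bak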

Related

How to set up Travis CI and postgresql using custom db credentials?

I'm trying to set up custom Postgres credentials with Travis CI, as I'd like to avoid relying on the credentials already defined in the code being tested.
The testing code specifies that the database should be accessed at:
'sqlalchemy.url': 'postgresql://foo:bar@localhost/testing_db'
I've therefore created a database.travis.yml file:
postgresql: &postgresql
  adapter: postgresql
  username: foo
  password: bar
  database: testing_db
...and added the following to my .travis.yml:
services:
  - postgresql
before_script:
  - psql -c 'create database stalker_test;' -U postgres
  - mkdir config && cp database.travis.yml config/database.yml
However, I am still getting this during testing:
OperationalError: (psycopg2.OperationalError) FATAL: role "foo" does not exist
What am I doing wrong?
Adding the following to .travis.yml solved my issue. No need for a database.travis.yml file.
before_script:
  - psql -c "CREATE DATABASE testing_db;" -U postgres
  - psql -c "CREATE USER foo WITH PASSWORD 'bar';" -U postgres
database.yml seems to be a Ruby on Rails thing. Travis CI started with Rails/Ruby testing, so the docs might reflect that.
You most probably need to do your setup in a separate script or migration step, and not rely on Travis for anything except running the service.

Error: Postgres database import in docker container

I'm running a Ruby on Rails application in a Docker container. I want to create and then restore a database dump in the postgres container.
But I'm unable to get it working.
Below is what I've done so far:
1) Added a bash script in the /docker-entrypoint-initdb.d folder. The script just creates the database:
psql -U docker -d postgres -c 'create database dbname;'
RESULT: Database created but rails server exited with code 0. Error: web_1 exited with code 0
2) Added a script to be executed before docker-compose up:
# Run docker db container
echo "Running db container"
docker-compose run -d db
# Sleep for 10 sec so that container have time to run
echo "Sleep for 10 sec"
sleep 10
echo 'Copying db_dump.gz to db container'
docker cp db_dump/db_dump.gz $(docker-compose ps -q db):/
# Create database `dbname`
echo 'Creating database `dbname`'
docker exec -i $(docker-compose ps -q db) psql -U docker -d postgres -c 'create database dbname;'
echo 'importing database `dbname`'
docker exec -i $(docker-compose ps -q db) bash -c "gunzip -c /db_dump.gz | psql -U postgres dbname"
RESULT: Database created and data restored. But an extra db container is left running when I start the web application server with docker-compose up.
docker-compose.yml:
version: '2'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=docker
      - POSTGRES_USER=docker
  web:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0' -d
    image: uname/application
    links:
      - db
    ports:
      - "3000:3000"
    depends_on:
      - db
    tty: true
Can someone please help me create and import the database?
EDIT:
I've tried one more approach: adding a POSTGRES_DB=db_name environment variable to the docker-compose.yml file so that the database is created automatically, and then importing the dump after starting the application (docker-compose up). But I'm getting an error: web_1 exited with code 0.
I'm confused about why I'm getting this error (in the first and third approaches); something seems to be messed up in the docker-compose file.
Set up a database dump mount
You'll need to mount the dump into the container so you can access it. Something like this in docker-compose.yml:
db:
  volumes:
    - './db_dump:/db_dump'
Make a local directory named db_dump and place your db_dump.gz file there.
Start the database container
Use POSTGRES_DB in the environment (as you mentioned in your question) to automatically create the database. Start db by itself, without the rails server.
docker-compose up -d db
Import data
Wait a few seconds for the database to be available. Then, import your data.
docker-compose exec db gunzip /db_dump/db_dump.gz
docker-compose exec db psql -U postgres -d dbname -f /db_dump/db_dump
docker-compose exec db rm -f /db_dump/db_dump
You can also just make a script to do this import, stick that in your image, and then use a single docker-compose command to call that. Or you can have your entrypoint script check whether a dump file is present, and if so, unzip it and import it... whatever you need to do.
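A rough sketch of that last idea, assuming the same /db_dump mount and database name used above:
#!/bin/bash
# import the dump once, if a compressed dump is present
if [ -f /db_dump/db_dump.gz ]; then
  gunzip -c /db_dump/db_dump.gz | psql -U postgres -d dbname
fi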
Start the rails server
docker-compose up -d web
Automating this
If you are doing this by hand for prep of a new setup, then you're done. If you need to automate this into a toolchain, you can do some of this stuff in a script. Just start the containers separately, doing the db import in between, and use sleep to cover any startup delays.
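For example, a minimal wrapper script along those lines (the sleep duration is arbitrary and the paths match the mount above):
#!/bin/bash
# start only the database, give it time to come up, import, then start the web app
docker-compose up -d db
sleep 10
docker-compose exec db gunzip /db_dump/db_dump.gz
docker-compose exec db psql -U postgres -d dbname -f /db_dump/db_dump
docker-compose up -d web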
web_1 exited with code 0
Did you try checking the logs of the web_1 container? docker-compose logs web
I strongly recommend that you don't initialize your db container manually; make it happen automatically as part of container startup.
Looking at the entrypoint of the postgres image, we can just put db_dump.gz into the /docker-entrypoint-initdb.d/ directory of the container, and it will be executed automatically, so docker-compose.yml could be:
db:
  volumes:
    - './initdb.d:/docker-entrypoint-initdb.d'
And put your db_dump.gz into ./initdb.d on your local machine.
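One caveat, based on the official postgres image rather than the original answer: the entrypoint only executes files matching *.sh, *.sql or *.sql.gz, so the dump likely needs a matching name, for example:
mkdir -p initdb.d
cp db_dump/db_dump.gz initdb.d/db_dump.sql.gz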
When you use the command
docker-compose run -d db
you start a separate, one-off container. That means you are running 3 containers, where 1 is the application and 2 are dbs. The container started with the above command is not part of the service; compose is using a separate db.
So instead of running docker-compose run -d db, run docker-compose up -d db and continue with your script.
I got it working by adding a container_name for the db container. My db container had a different name (app_name_db_1) while I was connecting to a container named db.
After hard-coding container_name: db, it works.
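For reference, a sketch of the relevant docker-compose.yml fragment (assuming the db service from the question):
db:
  container_name: db
  image: postgres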

Compose: running a container that exits

I have a docker-compose.yml with postgres and a web app (ghost). I would like to run a container between postgres and ghost to initialize postgres, add a database and user permissions, and exit.
My database initialization code looks something like:
ghostdb:
  extends:
    file: ./compose/ghost.yml
    service: ghostdb
  links:
    - postgres
  volumes:
    - ./ghost-db/volumes/sql:/sql
Which in turn runs
#!/bin/bash
echo Importing SQL
until pg_isready -h postgres; do
  sleep 1
done
for f in /sql/*.sql; do
  echo Importing $f
  psql -h postgres -f $f
done
I know I can extend postgres to add this functionality, but I would rather separate these two concerns. So I have two questions:
Is there a preferable pattern for initializing a database? Is it possible to run a container that exits between postgres and ghost?
Full repository can be viewed here: https://github.com/devpaul/ghost-compose

Deploying local database to production and the other way around

I have a rails app that has a model called 'Opportunity'. Say this model has several records stored in the database on my local development environment and I now want to deploy the app. With it, I want all the data in my local database to be deployed as well.
Is this possible? I have looked at rake tasks and seeding, but neither seems to be quite what I want.
Thanks so much for your help.
For Postgres, the command to export your database is pg_dump.
To dump a database:
$ pg_dump mydb > db.out
To reload this database:
$ psql -d database -f db.out
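To get the data into production, one option is to copy the dump to the production server and load it there; a rough sketch (host and database names are placeholders):
$ scp db.out deploy@production-host:/tmp/db.out
Then, on the production server:
$ psql -d production_db -f /tmp/db.out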
Why can't you take a dump of the development db and import it into the production db?
For MySQL:
In development:
mysqldump -u username -p developmentdb tablename > for_production.sql
In production:
mysql -u username -p productiondb < for_production.sql

Postgres database username in database.yml file

How do I find out my database's username on my local machine? I'm updating the database.yml file in my Rails application, but nothing I try works.
When trying to use postgres (or other options such as mymacusername) I'm getting the following error:
FATAL: role "postgres" does not exist
Do I need to create it? How? Or can I use an existing username on my computer?
Update:
I am trying to create a user (this is soon after installation so the issue may be that it doesn't exist yet)
I ran createuser -s -U $USER
Name of role to add: postgres
But am getting the following error:
createuser: could not connect to database postgres: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/pgsql_socket/.s.PGSQL.5432"?
So this is the problem: how do I make it listen on the correct socket?
If you have root access:
sudo -u postgres psql DATABASENAME
Inside it, type
\l
to show all databases and their owners.
We simply need to create the postgres role.
Commands to create a role named postgres:
1. sudo -u postgres psql
2. CREATE ROLE postgres WITH SUPERUSER CREATEDB CREATEROLE LOGIN ENCRYPTED PASSWORD 'password';
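With that role created, the development section of database.yml might look roughly like this (database name and password are placeholders):
development:
  adapter: postgresql
  database: myapp_development
  username: postgres
  password: password
  host: localhost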
