Postgres database username in database.yml file - ruby-on-rails

How do I find out what my database's username is on my local machine? I'm updating the database.yml file in my Rails application, but nothing I try is working.
When trying to use postgres (or other options such as mymacusername) I get the following error:
FATAL: role "postgres" does not exist
Do I need to create it? How? Or can I use an existing username on my computer?
Update:
I am trying to create a user (this is shortly after installation, so the issue may be that it doesn't exist yet).
I ran createuser -s -U $USER
Name of role to add: postgres
but I am getting the following error:
createuser: could not connect to database postgres: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/pgsql_socket/.s.PGSQL.5432"?
So this is the problem: how do I get it to use the correct socket?

If you have root access:
sudo -u postgres psql DATABASENAME
Inside psql, type
\l
to list all databases and their owners.
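To see just the roles (usernames) that exist on the server, you can also run:
\du
which lists the roles and their attributes; any login role shown there can be used as the username in database.yml.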

You simply need to create the postgres role.
Commands to create a role named postgres:
1. sudo -u postgres psql
2. CREATE ROLE postgres WITH SUPERUSER CREATEDB CREATEROLE LOGIN ENCRYPTED PASSWORD 'password';
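Once the role exists, a development section of config/database.yml along these lines should be able to connect with it; the database name myapp_development is a placeholder for your own:

development:
  adapter: postgresql
  encoding: unicode
  host: localhost
  database: myapp_development
  username: postgres
  password: password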

Related

keycloak using docker having issue with credential secret key

Currently, I am working with Docker and docker-compose. Whenever I do docker-compose down and then try to bring the services up again, the Keycloak server, whose realm is imported from a JSON file at startup, starts from zero: the realm -> credential -> client secret key is different every time.
Also, I have to run these two commands before I can access http://ip:8080/auth:
./kcadm.sh config credentials --server http://localhost:8080/auth --realm master --user admin --password ****
./kcadm.sh update realms/master -s sslRequired=NONE
What database setup are you using?
If nothing is configured, Keycloak will fall back on an H2 in-memory database. Unless you do some volume mapping, any configuration and users will be deleted on docker-compose down.
You can also use environment variables to create a Keycloak user on startup; see the Keycloak Docker documentation.
Example with volume mapping to persist the H2 data and create a user:
volumes:
  keycloak_data:

services:
  keycloak:
    # ... image and other service settings ...
    volumes:
      - keycloak_data:/opt/jboss/keycloak/standalone/data
    environment:
      - KEYCLOAK_USER=test
      - KEYCLOAK_PASSWORD=test
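If you want realms and users to survive docker-compose down without relying on the embedded H2 files, the jboss/keycloak image can also be pointed at an external Postgres database through environment variables. A minimal sketch; the service names, credentials, and image tags here are assumptions to adapt to your setup:

services:
  postgres:
    image: postgres:11
    environment:
      - POSTGRES_DB=keycloak
      - POSTGRES_USER=keycloak
      - POSTGRES_PASSWORD=keycloak
    volumes:
      - postgres_data:/var/lib/postgresql/data
  keycloak:
    image: jboss/keycloak
    environment:
      - DB_VENDOR=postgres
      - DB_ADDR=postgres
      - DB_DATABASE=keycloak
      - DB_USER=keycloak
      - DB_PASSWORD=keycloak
      - KEYCLOAK_USER=admin
      - KEYCLOAK_PASSWORD=admin
    ports:
      - "8080:8080"
    depends_on:
      - postgres
volumes:
  postgres_data: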

PostgreSQL: su - Authentication failure in postgresql

I have installed PostgreSQL 10.6, Postgres client and Pgadmin 4.
I just created a new Rails application and used PostgreSQL as the database for the application, only to get a message saying
su: Authentication Failure
each time I try to log in to the postgres account to create the development database for the Rails application.
I have tried a couple of times to fix this issue and have also attempted several solutions that I found online, but all to no avail.
What should I do? I need some assistance.
To solve this, simply follow the steps below.
Change the password for a user
Log in to the server where PostgreSQL is installed.
Next, switch to the root user:
sudo su -
Log in to psql using the postgres database login role, connecting to the postgres database:
psql postgres postgres
Issue the \password command to alter the password of the user:
\password my-user
Note: This will prompt you to enter a new password twice.
And then exit the psql prompt:
\q
After which you can exit and then test the password using the command below to log in to the postgres database:
psql -U my-user -W postgres
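Equivalently, if you prefer a single SQL statement to the \password prompt, you can set the password directly from psql (my-user and new-password are placeholders):
ALTER USER "my-user" WITH PASSWORD 'new-password';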
Change the password for the postgres system user
If you do not know the password for the postgres system account, set a new one as root:
sudo passwd postgres
Note: This will prompt you to enter a new password twice.
To test and confirm the password change, switch to the postgres user account:
su - postgres
Enter the new password that you just set up.
That's all
I hope this helps.
Try using the command:
sudo bash
and then enter your own user's password (on your Mac or on any Linux distribution) instead of using su.

Could not find Docker hostname on Gitlab CI

I have an app inside a Docker container based on an Elixir image that needs to connect to a database and run tests using a GitLab runner.
The build stage works fine, but there is a problem connecting to the database to run the tests. I tried both connecting to a service and running another database container, but from the logs it looks like the problem is with the Phoenix app:
** (RuntimeError) :database is nil in repository configuration
lib/ecto/adapters/postgres.ex:121: Ecto.Adapters.Postgres.storage_up/1
lib/mix/tasks/ecto.create.ex:40: anonymous fn/3 in Mix.Tasks.Ecto.Create.run/1
(elixir) lib/enum.ex:675: Enum."-each/2-lists^foreach/1-0-"/2
(elixir) lib/enum.ex:675: Enum.each/2
(mix) lib/mix/task.ex:301: Mix.Task.run_task/3
(mix) lib/mix/cli.ex:75: Mix.CLI.run_task/2
This is what the config/test.exs file looks like:
config :app, App.Repo,
  adapter: Ecto.Adapters.Postgres,
  username: System.get_env("POSTGRES_USER"),
  password: System.get_env("POSTGRES_PASSWORD"),
  database: System.get_env("POSTGRES_DB"),
  hostname: System.get_env("POSTGRES_HOSTNAME"),
  pool: Ecto.Adapters.SQL.Sandbox
This is the output from the runner:
$ docker run --rm -t $CONTAINER echo $MIX_ENV $POSTGRES_USER $POSTGRES_HOSTNAME $POSTGRES_DB
test username db test_db
I'm trying to figure out why I get this error :database is nil, and if it is related to Gitlab, Ecto or Phoenix.
Edit
I wrote static values in the config/*.exs files (for some reason it didn't pick up the environment variables), but now it can't find the PostgreSQL hostname, even though the PostgreSQL instance is running.
I checked that the instance is running with docker ps.
Based on the message :database is nil in repository configuration, it seems like your POSTGRES_DB variable is not set. You can try changing that configuration line to
database: System.get_env("POSTGRES_DB") || "postgres"
to see whether you still get the same error. If you don't, you can debug from there.
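If the test job can run mix directly (instead of going through a separate docker run), GitLab's built-in service containers make it easier to keep the variables and hostname consistent. A minimal sketch of a .gitlab-ci.yml job, assuming an Elixir image and the environment-variable based config above; the hostname of a GitLab CI service is derived from its image name, so postgres here:

test:
  image: elixir:1.9
  services:
    - postgres:11
  variables:
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_DB: test_db
    POSTGRES_HOSTNAME: postgres
  script:
    - mix deps.get
    - MIX_ENV=test mix ecto.create
    - mix test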

How to set up Travis CI and postgresql using custom db credentials?

I'm trying to set up custom Postgres credentials with Travis CI, as I'd like to avoid touching the existing credentials definition in the code to be tested.
The testing code defines that the database should be accessed at:
'sqlalchemy.url': 'postgresql://foo:bar@localhost/testing_db'
I've therefore created a database.travis.yml file:
postgresql: &postgresql
  adapter: postgresql
  username: foo
  password: bar
  database: testing_db
...and added the following to my .travis.yml:
services:
  - postgresql

before_script:
  - psql -c 'create database stalker_test;' -U postgres
  - mkdir config && cp database.travis.yml config/database.yml
However, I am still getting this during testing:
OperationalError: (psycopg2.OperationalError) FATAL: role "foo" does not exist
What am I doing wrong?
Adding the following to .travis.yml solved my issue. No need for a database.travis.yml file.
before_script:
  - psql -c "CREATE DATABASE testing_db;" -U postgres
  - psql -c "CREATE USER foo WITH PASSWORD 'bar';" -U postgres
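Depending on how the tests use the database, you may also need to grant the foo role privileges on it, for example with one more before_script line:
  - psql -c "GRANT ALL PRIVILEGES ON DATABASE testing_db TO foo;" -U postgres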
database.yml seems to be a Ruby on Rails thing. Travis CI started with Rails/Ruby testing, so the docs might reflect that.
You most probably need to do your setup in a separate script or migration step, and not rely on Travis except for running the service.

Deploying local database to production and the other way around

I have a Rails app that has a model called 'Opportunity'. Say this model has several records stored in the database in my local development environment, and I now want to deploy the app. Along with it, I want all the data in my local database to be deployed as well.
Is this possible? I have looked at rake tasks and seeding, but neither seems to be quite what I want.
Thanks so much for your help.
For Postgres, the command to export your database is pg_dump.
To dump a database:
$ pg_dump mydb > db.out
To reload this database:
$ psql -d database -f db.out
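If you only want to move the Opportunity records rather than the whole database, pg_dump can also dump a single table. A sketch, assuming the model maps to an opportunities table and the production database is named mydb_production:
$ pg_dump -t opportunities mydb > opportunities.out
$ psql -d mydb_production -f opportunities.out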
Why can't you take a dump of the table from the development DB and import it into the production DB?
For MySQL:
In development:
mysqldump -u username -p developmentdb tablename > for_production.sql
In production:
mysql -u username -p productiondb < for_production.sql
