Docker Rails setup issue - ruby-on-rails

Can someone please help me with a Docker setup issue I have been trying to debug for days?
My containers seem to be running fine, but I cannot get the database up and running when I run docker compose run app rake db:create.
I am running Docker on Manjaro inside VirtualBox.
The error I am getting is:
PG::ConnectionBad: could not connect to server: Connection timed out Is the server running on host "172.18.0.3" and accepting TCP/IP connections on port 5432?
Below are the configurations for both docker-compose.yml and config/database.yml.
docker-compose.yml:
version: '3.8'
services:
  db:
    image: postgres:latest
    restart: always
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_PASSWORD=password
    volumes:
      - db_data:/var/lib/postgresql/data
  redis:
    image: redis
    ports:
      - 6379:6379
    volumes:
      - redis_data:/data
  app:
    build: .
    command: bash -c "rm -f tmp/pids/server.pid && bundle exec rails s -p 3000 -b '0.0.0.0'"
    volumes:
      - .:/app
      - gem_cache:/usr/local/bundle/gems
    ports:
      - "3000:3000"
    depends_on:
      - db
      - redis
    environment:
      - POSTGRES_PASSWORD=password
      - POSTGRES_HOST=db
      - POSTGRES_USERNAME=postgres
volumes:
  db_data:
  gem_cache:
  redis_data:
config/database.yml:
default: &default
  adapter: postgresql
  encoding: unicode
  host: db
  username: postgres
  password: password
  pool: 5
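One common cause of this kind of timeout is the app container running rake db:create before Postgres is actually ready to accept connections, since a plain depends_on only waits for the db container to start, not for the server inside it. A hedged sketch (not a confirmed fix for this setup) of a healthcheck plus a readiness condition in docker-compose.yml:

services:
  db:
    image: postgres:latest
    environment:
      - POSTGRES_PASSWORD=password
    healthcheck:
      # pg_isready ships with the postgres image and reports when the server accepts connections
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 5
  app:
    depends_on:
      db:
        condition: service_healthy   # wait for the healthcheck, not just container start
      redis:
        condition: service_started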

Related

Connect to PostgreSQL Database in Docker Container from DBeaver

I can't get a connection to the PostgreSQL database of my Rails app, which is running in a Docker container, to work.
The application itself runs fine; I just can't connect to the database from a client.
docker-compose.yml:
services:
  app:
    build:
      context: .
      dockerfile: app.Dockerfile
    container_name: application_instance
    command: bash -c "bundle exec puma -C config/puma.rb"
    volumes:
      - .:/app
      - node-modules:/app/node_modules
      - public:/app/public
    depends_on:
      - database
      - redis
    env_file:
      - .env
  database:
    image: postgres
    container_name: database_instance
    restart: always
    volumes:
      - db_data:/var/lib/postgresql/data
    ports:
      - "5432:5432"
    env_file:
      - .env
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_PRODUCTION_DB}
  nginx:
    build:
      context: .
      dockerfile: nginx.Dockerfile
    depends_on:
      - app
    volumes:
      - public:/app/public
    ports:
      - "80:80"
  redis:
    image: redis
    container_name: redis_instance
    ports:
      - "6379:6379"
  sidekiq:
    container_name: sidekiq_instance
    build:
      context: .
      dockerfile: app.Dockerfile
    depends_on:
      - redis
      - database
    command: bundle exec sidekiq
    volumes:
      - .:/app
    env_file:
      - .env
volumes:
  db_data:
  node-modules:
  public:
If I try to connect via DBeaver I get the following message:
Any idea what's going wrong here? The port should be exposed on my local machine. I also tried with the IP of the container, but then I get a timeout exception.
This is most likely because you have Postgres running locally on your machine (port 5432) and also in Docker (port 5432). DBeaver then connects to the database on your local machine rather than the one in Docker.
One solution is to temporarily stop your local Postgres service (on Windows: Task Manager -> Services -> (postgres service) -> stop).
I was struggling with the same issue.
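If stopping the local service is not an option, another way around the clash (a sketch, not part of the original answer) is to publish the container on a different host port and point DBeaver at localhost:5433 instead:

database:
  image: postgres
  ports:
    - "5433:5432"   # host port 5433 -> container port 5432, avoids the local install on 5432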

How to connect to postgres from within a docker file?

I'm new to Docker and Postgres. It should be mentioned that my problem only occurs with docker-compose. Here's my .yml file:
version: '3.5'
services:
  postgres:
    container_name: postgres_container
    image: postgres
    environment:
      POSTGRES_USER: ${POSTGRES_USER:-postgres}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-changeme}
      PGDATA: /data/postgres
    volumes:
      - postgres:/data/postgres
    ports:
      - "5432:5432"
    networks:
      - postgres
    restart: unless-stopped
  pgadmin:
    container_name: pgadmin_container
    image: dpage/pgadmin4
    environment:
      PGADMIN_DEFAULT_EMAIL: ${PGADMIN_DEFAULT_EMAIL:-pgadmin4@pgadmin.org}
      PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_DEFAULT_PASSWORD:-admin}
    volumes:
      - pgadmin:/root/.pgadmin
    ports:
      - "${PGADMIN_PORT:-5050}:80"
    networks:
      - postgres
    restart: unless-stopped
networks:
  postgres:
    driver: bridge
volumes:
  postgres:
  pgadmin:
When I run docker-compose run postgres bash and then run psql -U postgres, I get the following error:
psql: error: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Can someone help?
With docker-compose run postgres bash you are replacing the image's entrypoint script with bash, so the database server is not running. Try
$ docker-compose up --detach postgres
$ docker-compose exec postgres bash
instead.
You can then use psql -U postgres -h postgres, specifying the host so the connection goes over the network you've created rather than the Unix socket.

I am running prisma deploy and it shows me the error: Could not connect to server at http://localhost:3000. Please check if your server is running

Could not connect to server at http://localhost:3000. Please check if your server is running.
Get in touch if you need help: https://spectrum.chat/prisma
To get more detailed output, run $ export DEBUG="*"
I am running on Windows 10 Home and I am using Docker Toolbox.
docker-compose file:
version: '3'
services:
  prisma:
    image: prismagraphql/prisma:1.32
    restart: always
    ports:
      - '3000:3000'
    environment:
      PRISMA_CONFIG: |
        port: 3000
        databases:
          default:
            connector: mysql
            host: mysql
            port: 3306
            user: root
            password: prisma
  mysql:
    image: mysql:5.7
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: prisma
    volumes:
      - mysql:/var/lib/mysql
volumes:
  mysql: ~
prisma.yml file:
endpoint: http://localhost:3000
datamodel: datamodel.prisma
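Docker Toolbox runs containers inside a VirtualBox VM, so localhost on the Windows host usually does not reach the published port. A hedged sketch of prisma.yml pointing the endpoint at the VM's address instead (192.168.99.100 is only the usual Docker Toolbox default, an assumption here; check the real value with docker-machine ip default):

# assumes the default Docker Toolbox VM address; verify with `docker-machine ip default`
endpoint: http://192.168.99.100:3000
datamodel: datamodel.prisma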

Rails send request to other container [Failed to open TCP connection]

I am trying to build a multi-node distributed system with multiple Rails nodes, Docker, MySQL, and Redis.
Therefore, my main node needs to communicate with the other nodes.
Here is my docker-compose.yml
version: '2'
services:
  db:
    image: mysql:5.7
    restart: always
    environment:
      - MYSQL_ROOT_PASSWORD=password
      - MYSQL_DATABASE=pubsub_1_development
      - MYSQL_USER=appuser
      - MYSQL_PASSWORD=password
    ports:
      - "3308:3306"
  redis:
    image: 'redis:4.0-alpine'
  app:
    image: pubsub_2:1.0.8
    command: /bin/sh -c "rm -f ./tmp/pids/server.pid && bundle exec rails s -p 3000 -b '0.0.0.0'"
    volumes:
      - ".:/pubsub_2"
    ports:
      - "3001:3000"
    depends_on:
      - db
      - redis
      - sub_node
      - pub_node
    links:
      - db
      - redis
      - sub_node
      - pub_node
    environment:
      DB_USER: root
      DB_NAME: pubsub_1_development
      DB_PASSWORD: password
      DB_HOST: db
      REDIS_CABLE_PORT: redis://redis:6379/1
      SUB_NODE: 0.0.0.0:4001
      PUB_NODE: 0.0.0.0:4000
    stdin_open: true
    tty: true
  sub_node:
    image: sub_node:1.0.1
    command: /bin/sh -c "rm -f /sub_node/tmp/pids/server.pid && bundle exec rails s -p 4001 -b '0.0.0.0'"
    ports:
      - "4001:4001"
    environment:
      DB_USER: root
      DB_NAME: pubsub_1_development
      DB_PASSWORD: password
      DB_HOST: db
      REDIS_CABLE_PORT: redis://redis:6379/1
    tty: true
    expose:
      - "4001"
  pub_node:
    image: pub_node:1.0.1
    command: /bin/sh -c "rm -f /pub_node/tmp/pids/server.pid && bundle exec rails s -p 4000 -b '0.0.0.0'"
    ports:
      - "4000:4000"
    environment:
      DB_USER: root
      DB_NAME: pubsub_1_development
      DB_PASSWORD: password
      DB_HOST: db
      REDIS_CABLE_PORT: redis://redis:6379/1
    tty: true
    expose:
      - "4000"
However, when I try to use the app node to send a request to pub_node, it throws this error:
Errno::ECONNREFUSED: Failed to open TCP connection to 127.0.0.1:4000 (Connection refused - connect(2) for "127.0.0.1" port 4000)
from /usr/local/lib/ruby/2.5.0/net/http.rb:939:in `rescue in block in connect'
I was making the POST request with this code:
rvlist = '127.0.0.1:4000'
HTTParty.post("http://#{rvlist}/publish", options)
It works in development mode outside the Docker environment.
Networking in docker-compose:
Each container for a service joins the default network and is both reachable by other containers on that network, and discoverable by them at a hostname identical to the container name.
Reference: https://docs.docker.com/compose/networking/
In your case, any container in the Compose project can open a TCP connection to the pub_node container using its service name and port: pub_node:4000.
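A hedged sketch of how the app service's environment could refer to the other services by their Compose names rather than 0.0.0.0/127.0.0.1, assuming the Rails code reads SUB_NODE and PUB_NODE from the environment instead of hard-coding rvlist:

app:
  environment:
    # ...existing DB_* and REDIS_CABLE_PORT entries unchanged...
    SUB_NODE: sub_node:4001   # service name resolves on the Compose network
    PUB_NODE: pub_node:4000   # instead of 0.0.0.0:4000 / 127.0.0.1:4000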

How to connect with client to my postgres container in docker?

I'm following this guide:
https://docs.docker.com/compose/rails/
I have everything set up and running, and I'm trying to figure out how to connect one of my DB clients (Sequel Pro or pgAdmin) to the Postgres container.
I also tried mapping the Postgres port so it is reachable from outside the container (docker-compose.yml), without success:
version: '3'
services:
  db:
    image: postgres
    ##### Trying this...
    ports:
      - "5432:5432"
    #####
  web:
    build: .
    command: bundle exec rails s -p 3030 -b '0.0.0.0'
    volumes:
      - .:/myapp
    ports:
      - "3030:3030"
    depends_on:
      - db
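The port mapping above is usually enough for a client on the host; two common snags are a locally installed Postgres already occupying port 5432 (as in the DBeaver question above) and the db service having no POSTGRES_PASSWORD, so the client has nothing to authenticate with. A hedged sketch (the password value is just an illustration) of a db service a client such as pgAdmin could reach at localhost:5432 with user postgres:

db:
  image: postgres
  environment:
    - POSTGRES_PASSWORD=password   # assumed value for illustration; the client logs in as postgres with it
  ports:
    - "5432:5432"                  # published on the host, so the client connects to localhost:5432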
