Confused by docker networking / connections - ruby-on-rails

OK, so I am trying to deploy a Rails app to a Docker container (the host machine is a Mac). I was thinking of deploying in development first to check that everything is working.
I have set up a phpMyAdmin service and I can connect to the server by entering the server name moviedb_mariamovie_1 with the user root and the corresponding password.
But whatever I put into my database.yml for Rails doesn't work: I tried localhost, I tried 127.0.0.1, I tried "mariamovie" and I tried "moviedb_mariamovie_1", and it always says "host not found" when I run rails db:create (or anything that involves the DB, actually).
I am totally confused by this. I read the database section of the Docker manuals and I seem to be too stupid for that.
(I have other problems with this, but one thing at a time :)
docker-compose.yml:
version: "3.7"
services:
moviedb:
image: tkhobbes/moviedb
restart: unless-stopped
ports:
- 3001:3000
depends_on:
- mariamovie
environment:
MYSQL_ROOT_PASSWORD: redacted
RAILS_ENV: development
volumes:
- /Users/thomas/Documents/Production/moviedb/storage:/opt/activestorage
mariamovie:
image: mariadb
restart: unless-stopped
ports:
- 3333:3306
environment:
MYSQL_ROOT_PASSWORD: redacted
phpmymaria:
image: phpmyadmin
restart: unless-stopped
ports:
- 8021:80
depends_on:
- mariamovie
environment:
PMA_PORT: 3333
PMA_ARBITRARY: 1
  nginx:
    image: nginx:1.21-alpine
    volumes:
      - /Users/thomas/Documents/Production/moviedb/vendor/nginx:/etc/nginx/user.conf.d:ro
    ports:
      - 8020:8020
    depends_on:
      - moviedb
    restart: unless-stopped
database.yml:
default: &default
  adapter: mysql2
  encoding: utf8mb4
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  host: 127.0.0.1
  port: 3333
  username: redacted
  password: redacted

development:
  <<: *default
  database: newmovie_development
...

You're inside your Docker "network". Your database should be accessible from your Rails app (which is inside that network too) via mariamovie:3306, i.e. the compose service name and the container port, not localhost and not the published host port 3333.
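A minimal sketch of the default block in database.yml under that assumption (the credentials are placeholders standing in for the redacted values in the question):

default: &default
  adapter: mysql2
  encoding: utf8mb4
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  host: mariamovie   # the compose service name resolves on the compose network
  port: 3306         # the port MariaDB listens on inside its container
  username: root     # assumption: the same root user used for phpMyAdmin
  password: redacted

The published port 3333 only matters for clients running on the Mac itself; containers on the same compose network talk to each other on the container port, and Rails only gets this name resolution when it runs inside that network.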

Related

Localhost not found even if my docker containers are up?

I am relatively new to dev in general, to the Docker universe and to Rails in particular, so I apologize in advance if this sounds like a silly question.
I am trying to run an application in a monorepo composed of 4 services (2 websites and 2 APIs) + Postgresql, with the help of Docker Compose. The final goal is to run it on a VPS with Traefik (once I get the current app to work locally).
Here are the different services :
Postgres (through the Postgres image available in Dockerhub)
a B2C website (NextJS)
an admin website (React with create Vite)
an API (Rails). It should be linked to the Postgres database
a Strapi API (for the content of the B2C website). Strapi has its own SQLite database. Only the B2C website requires the data coming from Strapi.
When I run the docker compose up -d command, everything seems to start correctly, but when I go to one of the websites (https://localhost:3009, or 3008, or 3001) I get nothing, except for Strapi, which seems to be working correctly.
However, I don't see any error in the logs of any of the apps, including the Rails API.
I assume that I have mistakes in my config, especially in the database.yml config of the Rails api and the docker-compose.yml file.
database.yml:
default: &default
  adapter: postgresql
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  host: pg

development:
  <<: *default
  database: chana_api_v2_development

test:
  <<: *default
  database: chana_api_v2_test

production:
  <<: *default
  database: chana_api_v2_production
  username: chana
  password: <%= ENV["CHANA_DATABASE_PASSWORD"] %>
docker-compose.yml:
version: '3'
services:
  # ----------------- POSTGRES -----------------
  pg:
    image: postgres:14.6
    container_name: pg
    networks:
      - chana_postgres_network
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: chana_development
      POSTGRES_USER: chana
      POSTGRES_PASSWORD: chana
    volumes:
      - ./data:/var/lib/postgresql/data
  # ----------------- RAILS API -----------------
  api:
    build: ./api
    container_name: api
    networks:
      - chana_postgres_network
      - api_network
    volumes:
      - ./api:/chana_api
    ports:
      - "3001:3000"
    depends_on:
      - pg
  # ----------------- STRAPI -----------------
  strapi:
    build:
      context: ./strapi
      args:
        BASE_VERSION: latest
        STRAPI_VERSION: 4.5.0
    container_name: chana-strapi
    restart: unless-stopped
    env_file: .env
    environment:
      NODE_ENV: ${NODE_ENV}
      HOST: ${HOST}
      PORT: ${PORT}
    volumes:
      - ./strapi:/srv/app
      - strapi_node_modules:/srv/app/node_modules
    ports:
      - "1337:1337"
  # ----------------- B2C website -----------------
  public-front:
    build: ./public-front
    container_name: public-front
    restart: always
    command: yarn dev
    ports:
      - "3009:3000"
    networks:
      - api_network
      - chana_postgres_network
    depends_on:
      - api
      - strapi
    volumes:
      - ./public-front:/app
      - /app/node_modules
      - /app/.next
  # ----------------- ADMIN website -----------------
  admin-front:
    build: ./admin-front
    container_name: admin-front
    restart: always
    command: yarn dev
    ports:
      - "3008:3000"
    networks:
      - api_network
      - chana_postgres_network
    depends_on:
      - api
    volumes:
      - ./admin-front:/app
      - /app/node_modules
      - /app/.next
volumes:
  strapi_node_modules:
networks:
  api_network:
  chana_postgres_network:
Do you have any idea why I cannot see anything on the website pages?
I tried changing the code of the relevant files, especially database.yml, docker-compose.yml, and the Dockerfiles of each app.
Also, I tried to look into the api container (Rails) with the command docker exec -it api /bin/sh to check the database through the Rails console, and I get this error message:
ActiveRecord::ConnectionNotEstablished: could not connect to server: No such file or directory. Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Instead of typing localhost, press Ctrl and click on the URL that the app prints; sometimes it does not open on the localhost port of your website.
It looks like your apps inside Docker are binding to 127.0.0.1:3000. Normally this is fine, but in Docker, when you want to expose the app to your host machine, you need to change the app to listen on 0.0.0.0:3000; then Docker will be able to pass the app through to your host machine. Without the specific Dockerfiles this is the best I can do. I have run into this issue with Strapi and some other apps before, so hopefully it helps.
In case I wasn't clear: on the host machine you will still use localhost and the published port.
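For instance, a sketch of how the api service could force the bind address from the compose file (assuming a standard Rails server; the actual Dockerfiles are not shown in the question):

# docker-compose.yml (sketch): override the container command so Rails listens
# on all interfaces instead of only the container-local 127.0.0.1
api:
  build: ./api
  container_name: api
  command: bundle exec rails server -b 0.0.0.0 -p 3000
  ports:
    - "3001:3000"   # host port 3001 -> container port 3000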

How to refer from one container to another using localhost

I'm working on building docker containers for a Ruby-on-Rails project I'm currently working on, so that I can develop this project using the remote feature of Visual Studio Code. This project should still continue to work without using docker containers, so I cannot make breaking changes to the existing code that would compromise this.
The application server (Rails) needs to connect to a MySQL database that's running in a separate container. The database container is named db, and I can connect from the application container to this container by using the db hostname.
The database.yml config file for rails defines how to connect to the database, but this is where my problem is situated. I don't want to change the host to db instead of localhost as this would mean that regular users (that do not use Docker containers) will no longer be able to connect to the database without changing this file. How can I somehow start or change my docker config so that db is accessible as localhost instead of db inside of the application container?
database.yml:
default: &default
  adapter: mysql2
  username: ****
  password: ****
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>

development:
  <<: *default
  database: ****
  # setup local port forwarding for this to work
  host: db
  port: 3306
docker-compose.yml:
version: '3.7'
services:
  db:
    build: ./unipept-db
    environment:
      MYSQL_ROOT_PASSWORD: ****
      MYSQL_DATABASE: ****
      MYSQL_USER: ****
      MYSQL_PASSWORD: ****
    restart: always
    ports:
      - "3306:3306"
    hostname: mysql
  phpmyadmin:
    depends_on:
      - db
    image: phpmyadmin/phpmyadmin
    ports:
      - '8080:80'
    environment:
      PMA_HOST: db
      MYSQL_ROOT_PASSWORD: ****
    restart: always
  app:
    depends_on:
      - db
    build: ./unipept-application
    command: sleep infinity
    ports:
      - '5000:5000'
    volumes:
      - ~/.gitconfig:/root/.gitconfig
      - ..:/workspace
Use network_mode: "host" in your app's service config; then you can reach the DB from your app using localhost:PORT.
phpmyadmin:
  depends_on:
    - db
  image: phpmyadmin/phpmyadmin
  ports:
    - '8080:80'
  environment:
    PMA_HOST: db
    MYSQL_ROOT_PASSWORD: ****
  restart: always
  network_mode: "host"
app:
  depends_on:
    - db
  build: ./unipept-application
  command: sleep infinity
  ports:
    - '5000:5000'
  network_mode: "host"
  volumes:
    - ~/.gitconfig:/root/.gitconfig
    - ..:/workspace
PS: Published ports are discarded when using host network mode
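With host networking the app shares the host's network stack, so the development section of database.yml can keep pointing at localhost; a sketch, assuming the db service still publishes MySQL on port 3306 of the host:

development:
  <<: *default
  database: ****
  host: localhost   # reachable because the app container uses the host's network namespace
  port: 3306        # the port published by the db service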
If you make the host an environment variable in your database.yml (Rails evaluates the file as ERB), it will be configurable; you already have an example of this pattern with RAILS_MAX_THREADS. You can set:
host: <%= ENV.fetch('DB_HOST', 'localhost') %>
In development, just don't set the environment variable, and it will use localhost. In a Docker environment, do set it, and it will use that hostname.
version: '3'
services:
  db:
    image: mysql
  app:
    build: .
    environment:
      DB_HOST: db
    ports:
      - '5000:5000'
You should also pass things like database credentials the same way.
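For example, a sketch of that pattern (DB_USERNAME and DB_PASSWORD are made-up variable names, not anything Rails or Compose requires):

# database.yml (sketch)
default: &default
  adapter: mysql2
  host: <%= ENV.fetch('DB_HOST', 'localhost') %>
  username: <%= ENV.fetch('DB_USERNAME', 'root') %>
  password: <%= ENV.fetch('DB_PASSWORD', '') %>

# docker-compose.yml (sketch)
app:
  build: .
  environment:
    DB_HOST: db
    DB_USERNAME: root
    DB_PASSWORD: example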

I am running prisma deploy and it shows me the error "Could not connect to server at http://localhost:3000. Please check if your server is running"

Could not connect to server at http://localhost:3000. Please check if your server is running.
Get in touch if you need help: https://spectrum.chat/prisma
To get more detailed output, run $ export DEBUG="*"
I am running on Windows 10 Home and I have used Docker Toolbox.
docker-compose file:
version: '3'
services:
  prisma:
    image: prismagraphql/prisma:1.32
    restart: always
    ports:
      - '3000:3000'
    environment:
      PRISMA_CONFIG: |
        port: 3000
        databases:
          default:
            connector: mysql
            host: mysql
            port: 3306
            user: root
            password: prisma
  mysql:
    image: mysql:5.7
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: prisma
    volumes:
      - mysql:/var/lib/mysql
volumes:
  mysql: ~
prisma.yml file:
endpoint: http://localhost:3000
datamodel: datamodel.prisma
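Note that with Docker Toolbox the containers run inside a VirtualBox VM, so published ports are typically reachable at the docker-machine IP (commonly 192.168.99.100, check with docker-machine ip) rather than at localhost. A sketch of prisma.yml under that assumption:

# prisma.yml (sketch, assuming the default docker-machine VM IP)
endpoint: http://192.168.99.100:3000
datamodel: datamodel.prisma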

Plug Postgres database on Navicat using prisma

I'm currently building an app using Prisma and a Postgres database, and I can't connect my database to Navicat. I'm a beginner with Docker and don't completely understand how services work. My current docker-compose.yml is:
version: '3'
services:
  prisma:
    image: prismagraphql/prisma:1.8
    restart: always
    ports:
      - "4466:4466"
    environment:
      PRISMA_CONFIG: |
        port: 4466
        # uncomment the next line and provide the env var PRISMA_MANAGEMENT_API_SECRET=my-secret to activate cluster security
        # managementApiSecret: my-secret
        databases:
          default:
            connector: postgres
            host: postgres
            port: 5432
            user: prisma
            password: prisma
            migrations: true
  postgres:
    image: postgres
    restart: always
    environment:
      POSTGRES_USER: prisma
      POSTGRES_PASSWORD: prisma
    volumes:
      - postgres:/var/lib/postgresql/data
volumes:
  postgres:
What I tried in Navicat seems correct to me, but apparently it is not.
Thank you for your help!
You need to publish the port with the ports property on your postgres container:
version: '3'
services:
  prisma:
    image: prismagraphql/prisma:1.8
    restart: always
    ports:
      - "4466:4466"
    environment:
      PRISMA_CONFIG: |
        port: 4466
        # uncomment the next line and provide the env var PRISMA_MANAGEMENT_API_SECRET=my-secret to activate cluster security
        # managementApiSecret: my-secret
        databases:
          default:
            connector: postgres
            host: postgres
            port: 5432
            user: prisma
            password: prisma
            migrations: true
  postgres:
    image: postgres
    restart: always
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: prisma
      POSTGRES_PASSWORD: prisma
    volumes:
      - postgres:/var/lib/postgresql/data
volumes:
  postgres:
Then you should be able to connect to localhost:5432 with a Postgres client, like Navicat.
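Concretely, the connection settings implied by the compose file above would be something like the sketch below (the database name is an assumption; the postgres image defaults it to POSTGRES_USER when POSTGRES_DB is not set):

# Navicat connection settings (sketch)
host: localhost      # port 5432 is now published to the host
port: 5432
user: prisma
password: prisma
database: prisma     # assumption: defaults to the POSTGRES_USER value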

Getting "FATAL: role "root" does not exist" when trying to do "docker up"

I'm starting to work with an existing Rails project that uses Docker. I've used Rails for a long time but never Docker.
After I do a docker build . I try to do a docker-compose up, but I get:
FATAL: role "root" does not exist
/usr/local/bundle/gems/activerecord-4.2.5.2/lib/active_record/connection_adapters/postgresql_adapter.rb:661:in `rescue in connect': FATAL: role "root" does not exist (ActiveRecord::NoDatabaseError)
It seems to me that the Docker machine is probably trying to connect to the database as the root user, but there's no role called root, so the connection is rightly failing.
The thing I don't know is why Docker is apparently trying to connect to the database as root and how to get it to use the right user.
Here's my database.yml:
development:
  database: my_app_development
  adapter: postgresql
  encoding: unicode
  pool: 5
Any help is appreciated.
Edit: here's my docker-compose.yml:
web:
  build: .
  volumes:
    - .:/my_app
  ports:
    - "3000:3000"
  links:
    - postgres
    - redis
    - mailcatcher
  env_file:
    - 'config/application.yml'
postgres:
  image: postgres:9.4
  ports:
    - "5432"
  env_file:
    - 'config/database.yml'
redis:
  image: redis:3.0.6
mailcatcher:
  image: schickling/mailcatcher
  ports:
    - "1080:1080"
The Postgres image expects POSTGRES_USER, POSTGRES_PASSWORD and POSTGRES_DB to be provided; otherwise it will use default values. Create a .env file in the root directory of your project:
# .env file
POSTGRES_USER=testuser
POSTGRES_PASSWORD=testpass
POSTGRES_DB=db_development
and change your docker-compose file to:
web:
  build: .
  volumes:
    - .:/my_app
  ports:
    - 3000:3000
  depends_on:
    - postgres
    - redis
    - mailcatcher
postgres:
  image: postgres:9.4
  ports:
    - 5432:5432
  env_file: .env
redis:
  image: redis:3.0.6
mailcatcher:
  image: schickling/mailcatcher
  ports:
    - 1080:1080
You could also provide the environment variables without a .env file:
web:
  build: .
  volumes:
    - .:/my_app
  ports:
    - 3000:3000
  depends_on:
    - postgres
    - redis
    - mailcatcher
postgres:
  image: postgres:9.4
  ports:
    - 5432:5432
  environment:
    - POSTGRES_USER=testuser
    - POSTGRES_PASSWORD=testpass
    - POSTGRES_DB=db_development
redis:
  image: redis:3.0.6
mailcatcher:
  image: schickling/mailcatcher
  ports:
    - 1080:1080
and update your database.yml file:
default: &default
  adapter: postgresql
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  username: <%= ENV['POSTGRES_USER'] %>
  password: <%= ENV['POSTGRES_PASSWORD'] %>
  host: postgres

development:
  <<: *default
  database: db_development

test:
  <<: *default
  database: db_test

production:
  <<: *default
  database: db_production
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 25 } %>
WARNING: Don't use links, it'll be removed soon
You might want to update your compose and database YAML files as follows, with the expected DB user and password in database.yml. You can also make these environment variables, but try the defaults for the postgres Docker image first:
database.yml:
development:
  database: my_app_development
  adapter: postgresql
  encoding: unicode
  pool: 5
  username: postgres
  password:
  host: postgres # the db service name in docker-compose.yml
docker-compose.yml:
web:
  build: .
  command: bundle exec rails s -p 3000 -b '0.0.0.0'
  volumes:
    - .:/my_app
  ports:
    - "3000:3000"
  links:
    - postgres
    - redis
    - mailcatcher
postgres:
  image: postgres:9.4
  ports:
    - "5432"
redis:
  image: redis:3.0.6
mailcatcher:
  image: schickling/mailcatcher
  ports:
    - "1080:1080"
I don't think you want to keep
env_file:
  - 'config/database.yml'
and
env_file:
  - 'config/application.yml'
Then create the databases with docker-compose run web rake db:create
I added the command instruction because I don't know what your Dockerfile looks like. But if you have a successful build of the app image with docker build -t app-name ., you can remove it and just run docker-compose up.
Without an explicit username set in database.yml, Rails attempts to connect as the OS user the app process runs as inside the container, which is root, and no such role exists in Postgres. To fix this, try setting your database.yml as:
development:
  database: my_app_development
  adapter: postgresql
  encoding: unicode
  pool: 5
  username: postgres
Note the addition of the username field.
With docker-compose, volumes are preserved between runs; since the user is only created when a fresh database is initialized, you probably need to run docker-compose rm -v on the database service to delete the container and its associated volume.
Then try docker-compose up --build again.
Source: https://github.com/docker-library/postgres/issues/41#issuecomment-167603905
