How can I run psql from another docker container?

I have two containers: one for PostgreSQL and another for PHP.
version: '3.7'
networks:
  backend-network:
    driver: bridge
  frontend-network:
    driver: bridge
services:
  php-fpm:
    container_name: k4fntr_php-fpm
    build: ./docker/php-fpm
    ports:
      - "9000:9000"
    volumes:
      - ./:/var/www/k4fntr
    depends_on:
      - database
    networks:
      - backend-network
      - frontend-network
  database:
    container_name: k4fntr_database
    build: ./docker/postgres
    restart: always
    environment:
      ENV: ${APP_ENV}
      TESTING_DB: ${DB_DATABASE_TESTING}
      POSTGRES_DB: ${DB_DATABASE}
      POSTGRES_USER: ${DB_USERNAME}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    ports:
      - "15432:5432"
    volumes:
      - ./docker/postgres/pg-data:/var/lib/postgresql/data:Z
    networks:
      - backend-network
I also have a shell script which copies the production database to my local machine.
#!/bin/bash
ssh -p 22000 -C -l user host 'sh /project/copy_prod.sh' > /tmp/backup.sql
(
echo "DROP DATABASE prod_copy;"
echo "CREATE DATABASE prod_copy OWNER k4fntr;"
) | (export PGPASSWORD=secret && psql -h 127.0.0.1 -U local)
export PGPASSWORD=secret && psql -h 127.0.0.1 -U local prod_copy < /tmp/backup.sql
but I got an error:
bash: psql: command not found
Is it possible to run psql from the database container?
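Yes. Two standard options, sketched here from general Docker usage (not from the original thread), using the container name, user, and database names that appear above:

# Option 1: run psql inside the already-running database container,
# piping the dump in from the host
docker exec -i k4fntr_database psql -U local prod_copy < /tmp/backup.sql

# Option 2: install psql on the host and target the published port;
# the compose file maps 15432:5432, so the default port 5432 will not
# reach the container
PGPASSWORD=secret psql -h 127.0.0.1 -p 15432 -U local prod_copy < /tmp/backup.sql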

Related

Redis, Sidekiq and docker-compose using AWS ElastiCache (clusters)

The main problem I ran into when starting this project is wiring Sidekiq's Redis connection so that it uses an ElastiCache Redis cluster.
This is my docker-compose file:
version: "3.3"
services:
db:
image: postgres:12.9-alpine
ports:
- "5432:5432"
volumes:
- db_data_postgres:/var/lib/postgresql/data
networks:
appnet:
ipv4_address: 172.20.0.3
redis:
image: redis:latest
restart: always
ports:
- '6379:6379'
command: bash -c "redis-server" # what should go here?
networks:
appnet:
ipv4_address: 172.20.0.4
app:
build:
context: .
volumes:
- .:/app
- gem_cache:/usr/local/bundle/gems
- node_modules:/app/node_modules
depends_on:
- db
- worker_sidekiq
links:
- db
ports:
- "8000:8000"
env_file:
- .env
environment:
- REDIS_URL=redis://redis:6379
command: bash -c "rm -f tmp/pids/server.pid && RAILS_ENV=development bin/rails s -b '0.0.0.0' -p 8000"
networks:
appnet:
ipv4_address: 172.20.0.5
worker_sidekiq:
build: .
image: app
command: bash -c "bundle exec sidekiq"
volumes:
- .:/app
- gem_cache:/usr/local/bundle/gems
- node_modules:/app/node_modules
depends_on:
- redis
environment:
- REDIS_URL=redis://redis:6379
networks:
appnet:
ipv4_address: 172.20.0.6
networks:
appnet:
ipam:
config:
- subnet: 172.20.0.0/16
volumes:
gem_cache:
db_data_postgres:
node_modules:
I've tried command: redis-cli -h *.*.use1.cache.amazonaws.com -p 6379 but I got errors on start.
I have tried to figure this out for a while, and I don't know how to link the AWS Redis cluster in my docker-compose file. Any ideas?
ElastiCache cannot be reached from outside AWS by default. You can connect to these clusters through a bastion host or AWS Client VPN.
AWS Official Doc: https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/accessing-elasticache.html#access-from-outside-aws
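A common workaround for local development, sketched under the assumption that you have a bastion EC2 instance in the same VPC (the bastion and cluster hostnames below are placeholders):

# forward local port 6379 to the ElastiCache endpoint through the bastion
ssh -N -L 6379:my-cluster.xxxxxx.use1.cache.amazonaws.com:6379 ec2-user@bastion.example.com

On Docker Desktop the containers could then reach the tunnel with REDIS_URL=redis://host.docker.internal:6379; on plain Linux you would point them at the host's bridge IP instead.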

How to connect to postgres from within a docker file?

I'm new to docker and postgres. It should be mentioned that my problem only occurs with docker-compose. Here's my .yml file:
version: '3.5'
services:
  postgres:
    container_name: postgres_container
    image: postgres
    environment:
      POSTGRES_USER: ${POSTGRES_USER:-postgres}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-changeme}
      PGDATA: /data/postgres
    volumes:
      - postgres:/data/postgres
    ports:
      - "5432:5432"
    networks:
      - postgres
    restart: unless-stopped
  pgadmin:
    container_name: pgadmin_container
    image: dpage/pgadmin4
    environment:
      PGADMIN_DEFAULT_EMAIL: ${PGADMIN_DEFAULT_EMAIL:-pgadmin4@pgadmin.org}
      PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_DEFAULT_PASSWORD:-admin}
    volumes:
      - pgadmin:/root/.pgadmin
    ports:
      - "${PGADMIN_PORT:-5050}:80"
    networks:
      - postgres
    restart: unless-stopped
networks:
  postgres:
    driver: bridge
volumes:
  postgres:
  pgadmin:
When I run docker-compose run postgres bash and then run psql -U postgres, I get the following error:
psql: error: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Can someone help?
With docker-compose run postgres bash you are replacing the image's entrypoint script with bash, so the database server is not running. Try
$ docker-compose up --detach postgres
$ docker-compose exec postgres bash
instead.
You can then use psql -U postgres -h postgres, specifying the host explicitly so the client connects over the network you've created rather than the Unix socket.
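For completeness, you can also run psql directly through exec instead of opening a shell first; this is standard compose usage, not part of the original answer:

$ docker-compose up --detach postgres
$ docker-compose exec postgres psql -U postgres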

Access denied for user 'root'@'192.168.xxx.xxx' (using password: YES) in docker container

I have created a MySQL docker container using a docker-compose file, and the container is up and running fine now. My docker-compose.yaml is:
version: '3'
services:
  db:
    image: "mysql:8.0.21"
    container_name: test-mysql
    #- mysql-data:/var/lib/mysql/data
    ports:
      - 3306:3306
    environment:
      MYSQL_ROOT_PASSWORD: Test123
      MYSQL_DB=imonitor
    volumes:
      - vol-data:/var/lib/mysql
    networks:
      - my-app
  backend-app:
    build: ./back-end
    container_name: back-end-container
    environment:
      - DB_SERVER= test-mysql
      - MYSQL_DB=testdb
      - MYSQL_ALLOW_EMPTY_PASSWORD=yes
    ports:
      - 3000:3000
    networks:
      - my-app
    links:
      - my-app
  ui:
    build: ./front-end
    container_name: front-end-container
    ports:
      - 8090:80
    links:
      - backend-app
    networks:
      - my-app
networks:
  my-app:
    driver: bridge
volumes:
  vol-data:
Using an interactive terminal, I tried to get into the MySQL database:
docker exec -it 90585b0a534a bash
root@90585b0a534a:/# mysql -u root -pTest123
mysql: [Warning] Using a password on the command line interface can be insecure.
ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
I am also unable to connect using DbVisualizer. Is there any fault in the YAML file or in the configuration?
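No answer was recorded for this question, but two things stand out in the file itself (observations about the snippet above, not advice from the thread): the db service's environment block mixes map syntax (MYSQL_ROOT_PASSWORD: Test123) with list syntax (MYSQL_DB=imonitor), which is not valid YAML, and the official mysql image recognizes MYSQL_DATABASE rather than MYSQL_DB. A consistent version would look like:

environment:
  MYSQL_ROOT_PASSWORD: Test123
  MYSQL_DATABASE: imonitor

Note also that the mysql image only applies these variables when initializing an empty data directory, so changes to MYSQL_ROOT_PASSWORD take effect only on a fresh vol-data volume.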

Rails send request to other container [Failed to open TCP connection]

I am trying to build a multi-node distributed system with multiple Rails nodes + Docker + MySQL + Redis.
Therefore, my main node needs to communicate with the other nodes.
Here is my docker-compose.yml
version: '2'
services:
  db:
    image: mysql:5.7
    restart: always
    environment:
      - MYSQL_ROOT_PASSWORD=password
      - MYSQL_DATABASE=pubsub_1_development
      - MYSQL_USER=appuser
      - MYSQL_PASSWORD=password
    ports:
      - "3308:3306"
  redis:
    image: 'redis:4.0-alpine'
  app:
    image: pubsub_2:1.0.8
    command: /bin/sh -c "rm -f ./tmp/pids/server.pid && bundle exec rails s -p 3000 -b '0.0.0.0'"
    volumes:
      - ".:/pubsub_2"
    ports:
      - "3001:3000"
    depends_on:
      - db
      - redis
      - sub_node
      - pub_node
    links:
      - db
      - redis
      - sub_node
      - pub_node
    environment:
      DB_USER: root
      DB_NAME: pubsub_1_development
      DB_PASSWORD: password
      DB_HOST: db
      REDIS_CABLE_PORT: redis://redis:6379/1
      SUB_NODE: 0.0.0.0:4001
      PUB_NODE: 0.0.0.0:4000
    stdin_open: true
    tty: true
  sub_node:
    image: sub_node:1.0.1
    command: /bin/sh -c "rm -f /sub_node/tmp/pids/server.pid && bundle exec rails s -p 4001 -b '0.0.0.0'"
    ports:
      - "4001:4001"
    environment:
      DB_USER: root
      DB_NAME: pubsub_1_development
      DB_PASSWORD: password
      DB_HOST: db
      REDIS_CABLE_PORT: redis://redis:6379/1
    tty: true
    expose:
      - "4001"
  pub_node:
    image: pub_node:1.0.1
    command: /bin/sh -c "rm -f /pub_node/tmp/pids/server.pid && bundle exec rails s -p 4000 -b '0.0.0.0'"
    ports:
      - "4000:4000"
    environment:
      DB_USER: root
      DB_NAME: pubsub_1_development
      DB_PASSWORD: password
      DB_HOST: db
      REDIS_CABLE_PORT: redis://redis:6379/1
    tty: true
    expose:
      - "4000"
However, when the app node tries to send a request to pub_node, it throws this error:
Errno::ECONNREFUSED: Failed to open TCP connection to 127.0.0.1:4000 (Connection refused - connect(2) for "127.0.0.1" port 4000)
from /usr/local/lib/ruby/2.5.0/net/http.rb:939:in `rescue in block in connect'
I was making the POST with this code:
rvlist = '127.0.0.1:4000'
HTTParty.post("http://#{rvlist}/publish", options)
It works in development mode without the Docker environment.
Networking in docker-compose:
Each container for a service joins the default network and is both
reachable by other containers on that network, and discoverable by
them at a hostname identical to the container name.
Reference: https://docs.docker.com/compose/networking/
In your case, any container within the compose project can open a TCP connection to the pub_node container using its service name and port: pub_node:4000.
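Applied to the code from the question, that means targeting the service name instead of 127.0.0.1:

# inside the app container, pub_node resolves to the pub_node service
rvlist = 'pub_node:4000'
HTTParty.post("http://#{rvlist}/publish", options)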

Can't access docker url via curl or Postman

I can access my site in the browser with the following path: my-dash.docker.localhost:8000
I am trying to write an API endpoint and can't access the site via Postman or curl.
curl my-dash.docker.localhost:8000
curl: (6) Could not resolve host: my-dash.docker.localhost
This is my docker-compose.yml:
version: "2"
services:
mariadb:
image: wodby/mariadb:10.1-2.1.0
environment:
MYSQL_ROOT_PASSWORD: root
MYSQL_DATABASE: drupal
MYSQL_USER: drupal
MYSQL_PASSWORD: drupal
volumes:
- ./mariadb-init:/docker-entrypoint-initdb.d # Place init .sql file(s) here.
php:
image: wodby/drupal:8-7.1-2.1.2
environment:
PHP_SENDMAIL_PATH: /usr/sbin/sendmail -t -i -S mailhog:1025
DB_HOST: mariadb
DB_USER: drupal
DB_PASSWORD: drupal
DB_NAME: drupal
DB_DRIVER: mysql
PHP_XDEBUG: 1
PHP_XDEBUG_DEFAULT_ENABLE: 1
PHP_XDEBUG_REMOTE_CONNECT_BACK: 0 # This is needed to respect remote.host setting bellow
PHP_XDEBUG_REMOTE_HOST: "10.254.254.254" # You will also need to 'sudo ifconfig lo0 alias 10.254.254.254'
volumes:
- mydash-sync:/var/www/html:nocopy # Docker-sync for macOS users
nginx:
image: wodby/drupal-nginx:8-1.10-2.1.0
depends_on:
- php
environment:
NGINX_STATIC_CONTENT_OPEN_FILE_CACHE: "off"
NGINX_ERROR_LOG_LEVEL: debug
NGINX_BACKEND_HOST: php
NGINX_SERVER_ROOT: /var/www/html/web
volumes:
- mydash-sync:/var/www/html:nocopy # Docker-sync for macOS users
labels:
- 'traefik.backend=nginx'
- 'traefik.port=80'
- 'traefik.frontend.rule=Host:my-dash.docker.localhost'
solr:
image: wodby/drupal-solr:8-6.4-2.0.0
environment:
SOLR_HEAP: 1024m
labels:
- 'traefik.backend=solr'
- 'traefik.port=8983'
- 'traefik.frontend.rule=Host:solr.my-dash.docker.localhost'
mailhog:
image: mailhog/mailhog
labels:
- 'traefik.backend=mailhog'
- 'traefik.port=8025'
- 'traefik.frontend.rule=Host:mailhog.my-dash.docker.localhost'
traefik:
image: traefik
command: -c /dev/null --web --docker --logLevel=INFO
ports:
- '8000:80'
- '8080:8080' # Dashboard
volumes:
- /var/run/docker.sock:/var/run/docker.sock
volumes:
mydash-sync:
external: true
You have to edit your /etc/hosts file to get your computer to resolve that name. Add this at the end of the file:
127.0.0.1 my-dash.docker.localhost
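If you prefer not to edit /etc/hosts, curl can also map the name per request; both variants below are standard curl features rather than part of the original answer:

# pin the hostname to 127.0.0.1 for this request only
curl --resolve my-dash.docker.localhost:8000:127.0.0.1 http://my-dash.docker.localhost:8000

# or hit the published port directly and let Traefik route on the Host header
curl -H 'Host: my-dash.docker.localhost' http://127.0.0.1:8000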
