I am using docker-compose for Airflow, as described in the following docker-compose.yml file:
version: '3'
services:
  postgres:
    image: postgres:9.6
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5432:5432"
  webserver:
    image: puckel/docker-airflow:1.10.1
    build:
      context: https://github.com/puckel/docker-airflow.git#1.10.1
      dockerfile: Dockerfile
      args:
        AIRFLOW_DEPS: gcp_api,s3
        PYTHON_DEPS: sqlalchemy==1.2.0
    restart: always
    depends_on:
      - postgres
    environment:
      - LOAD_EX=n
      - EXECUTOR=Local
      - FERNET_KEY=jsDPRErfv8Z_eVTnGfF8ywd19j4pyqE3NpdUBA_oRTo=
    volumes:
      - ./examples/intro-example/dags:/usr/local/airflow/dags
      # Uncomment to include custom plugins
      # - ./plugins:/usr/local/airflow/plugins
    ports:
      - "8080:8080"
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
which I took from the tuanavu GitHub repo.
My aim is to import Airflow variables from a JSON file with the following command:

    airflow variables --import /variables.json

To do that, I want to override the entrypoint.sh script used by the docker image puckel/docker-airflow:1.10.1. This can be done by adding the following block:
if [ -e "/variables.json" ]; then
    airflow variables --import /variables.json
fi
at a specific place in the entrypoint.sh of that image.
Is there a way, please, that I can do this in the docker-compose.yml file?
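One way to do it purely from docker-compose.yml (a sketch under my own assumptions, not a confirmed recipe: the host file name entrypoint-patched.sh is hypothetical, and I am assuming the image keeps its entrypoint at /entrypoint.sh, as the puckel Dockerfile does) is to copy the image's entrypoint.sh locally, insert the block at the desired place, and bind-mount the edited copy over the original:

webserver:
  image: puckel/docker-airflow:1.10.1
  volumes:
    # entrypoint-patched.sh = local copy of the image's /entrypoint.sh
    # with the variables-import block added at the right spot (hypothetical name)
    - ./entrypoint-patched.sh:/entrypoint.sh
    - ./variables.json:/variables.json
    - ./examples/intro-example/dags:/usr/local/airflow/dags

Since the mount replaces the script in place, the image's ENTRYPOINT keeps working and no Dockerfile rebuild is needed.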
Related
I am running an Airflow workflow in Docker to ingest data into GCP, but I keep getting this error message in the terminal.
After running the following docker compose commands I get the error below:

    $ docker compose build
    $ docker compose up

Error:
Error response from daemon: Mounts denied:
The path /.google/credentials/google_credentials.json is not shared from the host and is not known to Docker.
You can configure shared paths from Docker -> Preferences... -> Resources -> File Sharing.
Below are the Docker Compose file, the environment file, and the shell script.
YAML
version: "3"
services:
postgres:
image: postgres:13
env_file:
- .env
volumes:
- postgres-db-volume:/var/lib/postgresql/data
healthcheck:
test: ["CMD", "pg_isready", "-U", "airflow"]
interval: 5s
retries: 5
restart: always
scheduler:
build: .
command: scheduler
restart: on-failure
depends_on:
- postgres
env_file:
- .env
volumes:
- ./dags:/opt/airflow/dags
- ./logs:/opt/airflow/logs
- ./plugins:/opt/airflow/plugins
- ./scripts:/opt/airflow/scripts
- ~/.google/credentials/:/.google/credentials.json
webserver:
build: .
entrypoint: ./scripts/entrypoint.sh
restart: on-failure
depends_on:
- postgres
- scheduler
env_file:
- .env
volumes:
- ./dags:/opt/airflow/dags
- ./logs:/opt/airflow/logs
- ./plugins:/opt/airflow/plugins
- /.google/credentials/google_credentials.json:/.google/credentials:ro
- ./scripts:/opt/airflow/scripts
user: "${AIRFLOW_UID:-50000}:0"
ports:
- "8082:8080"
healthcheck:
test: ["CMD-SHELL", "[ -f /home/airflow/airflow-webserver.pid ]"]
interval: 30s
timeout: 30s
retries: 3
volumes:
postgres-db-volume:
environment file <.env>
AIRFLOW_UID=50000
#PG_HOST=pgdatabase
#PG_USER=root
#PG_PASSWORD=root
#PG_PORT=5432
#PG_DATABASE=ny_taxi
# Custom
COMPOSE_PROJECT_NAME=de-queries
GOOGLE_APPLICATION_CREDENTIALS=/.google/credentials/google_credentials.json
AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT=google-cloud-platform://?extra__google_cloud_platform__key_path=/.google/credentials/google_credentials.json
# AIRFLOW_UID=
GCP_PROJECT_ID=
GCP_GCS_BUCKET=
# Postgres
POSTGRES_USER=airflow
POSTGRES_PASSWORD=airflow
POSTGRES_DB=airflow
# Airflow
AIRFLOW__CORE__EXECUTOR=LocalExecutor
AIRFLOW__SCHEDULER__SCHEDULER_HEARTBEAT_SEC=10
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
AIRFLOW_CONN_METADATA_DB=postgres+psycopg2://airflow:airflow@postgres:5432/airflow
AIRFLOW_VAR__METADATA_DB_SCHEMA=airflow
_AIRFLOW_WWW_USER_CREATE=True
_AIRFLOW_WWW_USER_USERNAME=airflow
_AIRFLOW_WWW_USER_PASSWORD=airflow
AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=True
AIRFLOW__CORE__LOAD_EXAMPLES=False
shell file <scripts/entrypoint.sh>
#!/usr/bin/env bash
export GOOGLE_APPLICATION_CREDENTIALS=${GOOGLE_APPLICATION_CREDENTIALS}
export AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT=${AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT}
airflow db upgrade
airflow users create -r Admin -u admin -p admin -e admin@example.com -f admin -l airflow
# "$_AIRFLOW_WWW_USER_USERNAME" -p "$_AIRFLOW_WWW_USER_PASSWORD"
airflow webserver
I have tried changing my environment variable to a relative path:

    GOOGLE_APPLICATION_CREDENTIALS=Users/<username>/.google/credentials/google_credentials.json

I have also tried using the terminal to export the path with:

    export GOOGLE_APPLICATION_CREDENTIALS=/.google/credentials/google_credentials.json
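For what it's worth (my reading of the error, not part of the original post): the webserver bind-mounts /.google/credentials/google_credentials.json, an absolute path at the host's root, which Docker Desktop does not share by default; the scheduler also mounts the directory ~/.google/credentials/ onto a file-like container path. A sketch of a likely fix, mounting the home-directory path on both services (container paths are assumptions based on the .env above):

webserver:
  volumes:
    # mount the directory from the user's home, which Docker Desktop shares by default
    - ~/.google/credentials:/.google/credentials:ro

With the directory mounted at /.google/credentials, GOOGLE_APPLICATION_CREDENTIALS=/.google/credentials/google_credentials.json from the .env then resolves to a real file inside the container.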
I have a project structure:

    configs
      - config.yaml
    server
      ...
    docker-compose.yaml

The docker-compose.yaml file is:
version: '3.8'
services:
  volumes:
    - /configs:/configs
  postgres:
    image: postgres:12
    restart: always
    ports:
      - '5432:5432'
    volumes:
      - ./db_data:/var/lib/postgresql/data
      - ./server/scripts/init.sql:/docker-entrypoint-initdb.d/create_tables.sql
    env_file:
      - local.env
    healthcheck:
      test: [ "CMD", "pg_isready", "-q", "-d", "devdb", "-U", "postgres" ]
      timeout: 45s
      interval: 10s
      retries: 10
  app:
    build:
      context: ./server/app
      dockerfile: Dockerfile
    env_file:
      - local.env
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=devdb
volumes:
  configs:
The app uses config.yml, and I'm wondering how to add the configs folder to the container. I tried to do this:

    volumes:
      - /configs:/configs

but it gives me "services.volumes must be a mapping".
How can this be resolved?
You need to put the volumes directive inside a service. Probably something like this:
app:
  build:
    context: ./server/app
    dockerfile: Dockerfile
  env_file:
    - local.env
  environment:
    - POSTGRES_USER=postgres
    - POSTGRES_PASSWORD=postgres
    - POSTGRES_DB=devdb
  volumes:
    - ./configs:/configs
If multiple containers need it, you'll have to repeat it in multiple services.
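If the repetition bothers you, a YAML anchor can hold the shared list (a sketch; x-config-volumes is an arbitrary extension-field name, and extension fields are allowed since compose file format 3.4, so they work with the 3.8 file above):

x-config-volumes: &config-volumes
  - ./configs:/configs

services:
  app:
    volumes: *config-volumes
  other-service:              # hypothetical second consumer
    volumes: *config-volumes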
I just learned how to use docker-compose and I'm having some problems dockerizing my php-magento project.
My project looks like this:

app (magento)
nginx
mysql
redis

I'm getting an error when I try to execute these lines, or when I add the redis connection to the magento env.
(Screenshots in the original post: the app Dockerfile, the error without redis, and the error with redis.)
But if I comment these lines out, it works fine and I can execute them after the container is up.
I imagine that it's something to do with the containers' network, but it's just a guess; I already added depends_on and made sure that app starts after db and redis.
Can someone help?
Docker-compose:

version: '3.8'
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile-app
      args:
        ...
    volumes:
      ...
    ports:
      - 1000:80
    healthcheck:
      test: ["CMD", "wait-for-it", "-h", "localhost", "-p", "80", "-t", "1", "-q"]
      interval: 1m
      timeout: 10s
      retries: 3
      start_period: 60s
    environment:
      ...
    #depends_on:
    #  - nginx
    #entrypoint: ["sleep", "1200"]
  nginx:
    build:
      context: .
      dockerfile: Dockerfile-nginx
    ports:
      - "80:80"
    restart: on-failure
    volumes:
      ...
    environment:
      VIRTUAL_HOST: localhost
    #entrypoint: ["sleep", "1200"]
  redis:
    image: redis
    ports:
      - "6379:6379"
    volumes:
      ...
    restart: always
  database:
    image: mysql:5.7
    ports:
      - 3306:3306
    environment:
      ...
    volumes:
      ...
volumes:
  ...
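A guess at the cause (my note, not from the original post): if the commented-out lines are Magento setup commands in Dockerfile-app, they run at build time, when no database or redis container exists yet, so the connection fails; after docker compose up they succeed because the services are reachable. A sketch of one fix, under that assumption, is to run the setup at start time and gate it on the database's health (the setup and web-server commands shown are placeholders):

services:
  app:
    build:
      context: .
      dockerfile: Dockerfile-app
    # placeholder: run the Magento setup once the dependencies are up,
    # then start the web process
    command: bash -c "bin/magento setup:install && apache2-foreground"
    depends_on:
      database:
        condition: service_healthy
      redis:
        condition: service_started
  database:
    image: mysql:5.7
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost"]
      interval: 10s
      retries: 10

Note that the long depends_on form with condition requires a Compose version that implements the Compose Specification.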
Using the docker compose files below, I am unable to bring up my app correctly. Docker says my LAPIS_ENV environment variable is not set, but I am setting it in my second compose file, which I expect to be merged into the first one. I have tried including them in reverse order, to no avail.
docker-compose.yml:
version: '2.4'
services:
  backend:
    mem_limit: 50mb
    memswap_limit: 50mb
    build:
      context: ./backend
      dockerfile: Dockerfile
    depends_on:
      - postgres
    volumes:
      - ./backend:/var/www
      - ./data:/var/data
    restart: unless-stopped
    command: bash -c "/usr/local/bin/docker-entrypoint.sh ${LAPIS_ENV}"
  postgres:
    build:
      context: ./postgres
      dockerfile: Dockerfile
    environment:
      PGDATA: /var/lib/postgresql/data/pgdata
      POSTGRES_HOST_AUTH_METHOD: trust
    volumes:
      - postgres:/var/lib/postgresql/data
      - ./postgres/pg_hba.conf:/var/lib/postgres/data/pg_hba.conf
      - ./data/backup:/pgbackup
    restart: unless-stopped
volumes:
  postgres:
docker-compose.dev.yml:

version: '2.4'
services:
  backend:
    environment:
      LAPIS_ENV: development
    ports:
      - 8080:80
And the script I use to bring them up:

#!/usr/bin/env bash
docker compose -f docker-compose.yml -f docker-compose.dev.yml up
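For reference (my note, not part of the original question): ${LAPIS_ENV} inside command: is interpolated by Compose itself when the files are parsed, from the invoking shell or a .env file, not from the service's environment: mapping, which only takes effect inside the container. A sketch of one fix is to escape the dollar sign so the container's shell does the expansion instead:

services:
  backend:
    # $$ becomes a literal $ for bash -c, which then reads LAPIS_ENV
    # from the environment: mapping set in docker-compose.dev.yml
    command: bash -c "/usr/local/bin/docker-entrypoint.sh $$LAPIS_ENV"

Alternatively, putting LAPIS_ENV=development in a .env file next to docker-compose.yml lets the original ${LAPIS_ENV} interpolation work as written.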
Here is my docker-compose.yml:
version: '3.4'
services:
  nginx:
    restart: always
    image: nginx:latest
    ports:
      - 80:80
    volumes:
      - ./misc/nginx.conf:/etc/nginx/conf.d/default.conf
      - /static:/static
    depends_on:
      - web
  web:
    restart: always
    image: celery-with-docker-compose:latest
    build: .
    command: bash -c "python /code/manage.py collectstatic --noinput && python /code/manage.py migrate && /code/run_gunicorn.sh"
    volumes:
      - /static:/data/web/static
      - /media:/data/web/media
      - .:/code
    env_file:
      - ./.env
    depends_on:
      - db
    volumes:
      - ./app:/deploy/app
  worker:
    image: celery-with-docker-compose:latest
    restart: always
    build:
      context: .
    command: bash -c "pip install -r /code/requirements.txt && /code/run_celery.sh"
    volumes:
      - .:/code
    env_file:
      - ./.env
    depends_on:
      - redis
      - web
  db:
    restart: always
    image: postgres
    env_file:
      - ./.env
    volumes:
      - pgdata:/var/lib/postgresql/data
    ports:
      - "5432:5432"
  redis:
    restart: always
    image: redis:latest
    privileged: true
    command: bash -c "sysctl vm.overcommit_memory=1 && redis-server"
    ports:
      - "6379:6379"
volumes:
  pgdata:
When I run docker stack deploy -c docker-compose.yml cryptex I get:

    Non-string key at top level: true

And docker-compose -f docker-compose.yml config gives me:

    ERROR: In file './docker-compose.yml', the service name True must be a quoted string, i.e. 'True'.

I'm using the latest versions of docker and compose. Also, I'm new to compose v3 and started using it to get access to the docker stack command. If you see any mistakes or redundancies in the config file, please let me know. Thanks.
You have to validate your docker compose file; most likely it has an invalid value inside.
Validating your file is as simple as docker-compose -f docker-compose.yml config. As always, you can omit the -f docker-compose.yml part when running this in the same folder as the file itself, or when the COMPOSE_FILE environment variable points at it.
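A detail worth spelling out (my assumption; the offending line is not visible in the post): "Non-string key" errors typically mean YAML 1.1 parsed a bare word such as true, on, or yes used as a mapping key into a boolean, and docker-compose's own error message points the same way by calling the service name "True". Quoting the key fixes the parse:

# YAML 1.1 reads the bare word as the boolean True, not a string key
true: {}        # -> Non-string key at top level: true
"true": {}      # quoted: parses as the string "true"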