New DAGs not shown in Airflow running on Docker

Problem: new DAGs are not shown in Airflow running on Docker, and airflow dags list-import-errors reports no errors.
Docker image: the official Airflow image
DAGs path inside docker-compose.yaml (this is the default path):
volumes:
- ./dags:/opt/airflow/dags
I put the DAG files inside the dags folder in the main project directory.
However, the DAGs are still not shown in either the webserver UI or airflow dags list. Running airflow dags list-import-errors also yields no results.
When I open a terminal inside the container, I can see my DAGs in the dags folder via ls. I also tried making the owner root using chown, but both of my DAGs still do not show up in the list.
Airflow itself runs successfully (via docker compose), since I can see the example DAGs, just not my own.
Any help will be appreciated. Thanks!

I would try a few things:
rename the files to something like gcp_dag.py and python_dag.py
ensure import airflow is present in each file
ensure you create a DAG object in each file
add an empty __init__.py file to the dags folder
It would also be helpful to see the contents of at least one of those files; a minimal example of what the scheduler looks for is sketched below.
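For reference, a minimal file that the scheduler should pick up could look like the sketch below. This is not the asker's code, just a sketch assuming Airflow 2.x; the gcp_dag name and the echo task are placeholders:
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A DAG object must be created at module level so the scheduler can register it.
with DAG(
    dag_id="gcp_dag",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Placeholder task so the DAG has at least one node.
    hello = BashOperator(task_id="hello", bash_command="echo hello")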

Do not name your Python DAG files test_*. That probably does not matter, but this prefix is usually a convention for unit tests.
Make sure the word "DAG" appears in your DAG files (or set dag-discovery-safe-mode to False, see https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#dag-discovery-safe-mode): files can be ignored if they do not contain the strings "DAG" or "airflow".
Exec into your scheduler container and check whether it can see and read the DAGs and whether the environment variables are set properly. Running airflow info in your scheduler container (provided it runs with the same environment as the scheduler process) should show all of the configuration.
Check whether your scheduler is running.
Look at the scheduler logs and see whether it is scanning the right folders; enabling DEBUG logging might help: https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#logging-level
See whether the DAGs are visible via the airflow dags subcommands, for example the commands below.
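For example, from the host (the <scheduler-container> name is a placeholder; use whatever docker ps shows for your scheduler service):
docker exec -it <scheduler-container> airflow info
docker exec -it <scheduler-container> airflow dags list
docker exec -it <scheduler-container> airflow dags list-import-errors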

It might also be a problem with permissions. Check which user owns the DAG files or, better, try to view the files from inside the container, like this:
docker exec -it <container> bash
cat /opt/airflow/dags/test_gcp.py

Related

Airflow run bash command on an existing docker container

I have Airflow running in a Docker container and I want to trigger a Python script that resides in another container. I tried the regular BashOperator, but that seems to work only locally. I also looked at the DockerOperator, but that one seems to want to create a new container.
The Airflow container must be able to access the Python script to be executed. If the script lives in another container, either mount a volume that Airflow can also access, or run the task with the KubernetesPodOperator; a rough sketch of the latter is below.
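This sketch assumes the apache-airflow-providers-cncf-kubernetes provider is installed and that the script ships in its own image; the image name, script path and dag_id are placeholders, not anything from the question:
from datetime import datetime

from airflow import DAG
# On newer provider versions the module is airflow.providers.cncf.kubernetes.operators.pod
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

with DAG(
    dag_id="run_external_script",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_script = KubernetesPodOperator(
        task_id="run_script",
        name="run-script",
        namespace="default",
        image="my-registry/my-script-image:latest",  # placeholder image containing the script
        cmds=["python", "/app/my_script.py"],  # placeholder path inside that image
    )
With the volume-mount route instead, the script directory is mounted into the Airflow containers and a plain BashOperator can call it.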

How to run a plotly dashboard on my localhost through Airflow - Docker?

I added to my DAG a Python file that builds an interactive dashboard on my local server, but when it runs in the DAG the site can't be reached. Do I need to set something up in my container?

Run Airflow with docker-compose from a different folder

Using the docker-compose.yml from here and running docker-compose up, I am able to run Airflow without problems (on Ubuntu). But this only works if the parent folder is named airflow. If it is named anything else, the airflow-init service fails:
ERROR: for airflow-init Container "f28089f55f79" is unhealthy.
ERROR: Encountered errors while bringing up the project.
I want to be able to run Airflow using docker-compose up when my project lives in a different folder, e.g.: myproject/docker-compose.yaml
What should I do to make this work?
Checking the Apache Airflow page and the docs, $AIRFLOW_HOME by default is ~/airflow.
Just change this variable.

Where is Docker storing the source files for the downloaded image (Apache Airflow)?

I downloaded the official Docker image for Apache Airflow, but I cannot seem to locate any of the Airflow files on my Mac. Where did they go? I tried to echo $AIRFLOW_HOME, but it returns nothing.
I can see the sample DAGs in the UI, but where are they on my machine? I opened the docker-compose.yaml, but couldn't find much.
Also, when I run docker-compose up airflow-init and airflow info, I get a bunch of paths like /opt/airflow that don't even exist on my Mac.

Drupal folders within docker

I successfully installed Drupal 7 with Docker, using docker4drupal. Now that I am starting to edit my Drupal site, my question is: where are the folders containing Drupal?
Let's say I installed a new theme and want to swap the banner images. How do I access the Drupal folder containing the images, or, to ask more precisely, where does Docker store them?
My docker-compose line is:
- codebase:/var/www/html
I know that mounting it with:
./:/var/www/html
would install Drupal in the same directory as my docker-compose.yml, but for some reason it doesn't work and still doesn't show me where the files are.
Any help is welcome!
If you are not using volumes to mount your existing code, the code resides inside the Docker container. You can access it only by getting inside the container using docker exec. If you are using the default docker-compose.yml that came with the repo, the container name will be "docker4drupal_nginx_1" (since nginx is the default).
Run this code to get inside the container:
docker exec -it docker4drupal_nginx_1 /bin/bash
exec allows you to execute commands inside the container.
-it allows you to start an interactive terminal
/bin/bash allows you to start the bash terminal inside the container
Once you are inside the container, run ls and you will see the Drupal files, including "web".
MORE USEFUL
However, this is not a useful approach if you want to work on the files, probably with an editor. Instead, mount a directory from the host machine. First, make a new directory named "codebase" next to your docker-compose.yml file.
Then, update the docker-compose.yml so that:
- codebase:/var/www/html
becomes
- ./codebase:/var/www/html
Do this in both the php and nginx service definitions. Of course, you should do this after running docker-compose down on your previous setup. Then restart the containers using docker-compose up -d.
Then, you will notice that the Drupal files are present in the codebase directory.
If you look at the bottom of the yml file, you will see that "codebase" is defined as a Docker volume. This means the storage is managed by Docker, and it will be stored somewhere under /var/lib/docker/ along with the container itself; the commands below show one way to find the exact location.
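One way is to inspect the named volume. The docker4drupal_codebase name is an assumption (compose prefixes volume names with the project name), so check docker volume ls first:
docker volume ls
docker volume inspect docker4drupal_codebase --format '{{ .Mountpoint }}'
This is exactly why the bind-mount approach above is more convenient for day-to-day editing.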
Hope this helps.
