With this docker-compose file I run my local images on Windows:
```yaml
version: "3.7"

x-airflow-environment: &airflow-environment
  AIRFLOW__CORE__AIRFLOW_HOME: /usr/local/airflow
  AIRFLOW__CORE__DAGS_FOLDER: /usr/local/airflow/dags
  AIRFLOW__CORE__BASE_LOG_FOLDER: /usr/local/airflow/logs
  AIRFLOW__CORE__EXECUTOR: LocalExecutor
  AIRFLOW__CORE__LOAD_EXAMPLES: "False"
  AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
  AIRFLOW__CORE__FERNET_KEY: W2imcBfFi9Bjy0Xy-zBDg6v2Xhkf573CmNCOscx1Efc=
  AIRFLOW__WEBSERVER__DAG_DEFAULT_VIEW: graph

services:
  postgres:
    image: banco-post:nginx
    environment:
      POSTGRES_USER: airflow
      POSTGRES_DB: airflow
      POSTGRES_PASSWORD: airflow

  init:
    image: init_1:nginx
    environment:
      <<: *airflow-environment
    depends_on:
      - postgres
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./plugins:/usr/local/airflow/plugins
      - ./logs:/usr/local/airflow/logs
    entrypoint: /bin/bash
    command: >
      -c "airflow list_users || (airflow initdb
      && airflow create_user --role Admin --username airflow --password airflow -e [email protected] -f airflow -l airflow)"
    restart: on-failure

  webserver:
    image: webserver:nginx
    ports:
      - 8080:8080
    environment:
      <<: *airflow-environment
    depends_on:
      - init
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./plugins:/usr/local/airflow/plugins
      - ./logs:/usr/local/airflow/logs
    entrypoint: /bin/bash
    command: -c "airflow webserver"
    restart: always

  scheduler:
    image: scheduler:nginx
    environment:
      <<: *airflow-environment
    depends_on:
      - webserver
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./plugins:/usr/local/airflow/plugins
      - ./logs:/usr/local/airflow/logs
    entrypoint: /bin/bash
    command: -c "airflow scheduler"
    restart: always
```
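For context, the short volume syntax used above (`- ./dags:/usr/local/airflow/dags`) resolves relative host paths against the directory containing the docker-compose file. I also tried spelling out an absolute host path instead, along these lines (the `/home/user/airflow` path here is just an illustrative placeholder, not my real layout):

```yaml
services:
  webserver:
    volumes:
      # <absolute host path>:<container path> — host path is a placeholder
      - /home/user/airflow/dags:/usr/local/airflow/dags
```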
The problem is that I cannot access the local folders on my Ubuntu 18.04 host to load the DAGs. Here are my DAGs on Ubuntu:

Here one can see that the DAGs do not appear:

It’s as if my containers didn’t really "see" the local folder on Ubuntu. How do I fix this?