Question
I am using Apache Airflow to run my DAGs.
I am getting this exception:
*** Log file does not exist: /opt/airflow/logs/download2/download2/2020-07-26T15:00:00+00:00/1.log
*** Fetching from: http://fb3393f5f01e:8793/log/download2/download2/2020-07-26T15:00:00+00:00/1.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='fb3393f5f01e', port=8793): Max retries exceeded with url: /log/download2/download2/2020-07-26T15:00:00+00:00/1.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8ba66d7b70>: Failed to establish a new connection: [Errno 111] Connection refused',))
My docker-compose file for the webserver, scheduler and Postgres is:
version: "2.1"
services:
postgres_airflow:
image: postgres:12
environment:
- POSTGRES_USER=airflow
- POSTGRES_PASSWORD=airflow
- POSTGRES_DB=airflow
ports:
- "5432:5432"
postgres_Service:
image: postgres:12
environment:
- POSTGRES_USER=developer
- POSTGRES_PASSWORD=secret
- POSTGRES_DB=service_db
ports:
- "5433:5432"
scheduler:
image: apache/airflow
restart: always
depends_on:
- postgres_airflow
- postgres_Service
- webserver
env_file:
- .env
volumes:
- ./dags:/opt/airflow/dags
command: scheduler
healthcheck:
test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
interval: 30s
timeout: 30s
retries: 3
webserver:
image: apache/airflow
restart: always
depends_on:
- pg_airflow
- pg_metadata
- tenants-registry-api
- metadata-api
env_file:
- .env
volumes:
- ./dags:/opt/airflow/dags
- ./scripts:/opt/airflow/scripts
ports:
- "8080:8080"
entrypoint: ./scripts/airflow-entrypoint.sh
healthcheck:
test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
interval: 30s
timeout: 30s
retries: 3
I am getting this exception while using the PythonVirtualenvOperator.
My DAG file is:
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 1, 1)
}

dag = DAG('download2',
          schedule_interval='0 * * * *',
          default_args=default_args,
          catchup=False)


def hello_world_py():
    return "data"


with dag:
    t1 = PythonOperator(
        task_id='download2',
        python_callable=hello_world_py,
        op_kwargs=None,
        provide_context=True,
        dag=dag
    )
My .env file:
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql://airflow:airflow@postgres_airflow:5432/airflow
AIRFLOW__CORE__FERNET_KEY=XXXX
AIRFLOW_CONN_METADATA_DB=postgres://developer:secret@postgres_Service:5432/service_db
AIRFLOW__VAR__METADATA_DB_SCHEMA=service_db
AIRFLOW__WEBSERVER__BASE_URL=http://0.0.0.0:8080/
I have also explicitly set AIRFLOW__CORE__REMOTE_LOGGING=False to disable remote logging, but I am still getting the exception. I also tried placing everything inside the bridge network. Nothing worked for me, though the DAG itself passes.
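(By "placing everything inside the bridge network" I mean attaching each service to one shared user-defined bridge network, roughly like the sketch below; the network name airflow_net is just an example, not something from my actual file:)

networks:
  airflow_net:
    driver: bridge

services:
  webserver:
    # existing service definition unchanged, just attach the network
    networks:
      - airflow_net
  scheduler:
    networks:
      - airflow_net
  postgres_airflow:
    networks:
      - airflow_net
  postgres_Service:
    networks:
      - airflow_net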
I also tried adding a worker service:
worker:
  image: apache/airflow
  restart: always
  depends_on:
    - scheduler
  volumes:
    - ./dags:/opt/airflow/dags
  env_file:
    - .env
  ports:
    - 8793:8793
  command: worker
That did not work for me either.
Answer 1:
You need to expose the worker log-server port (the worker_log_server_port setting in airflow.cfg, 8793 by default) in docker-compose, like:
worker:
  image: apache/airflow
  ...
  ports:
    - 8793:8793
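If you want to pin that port explicitly rather than rely on the default, the same setting can also be overridden from the compose file. This is only a sketch and assumes Airflow 1.10.x, where worker_log_server_port sits in the [celery] section of airflow.cfg and therefore maps to the AIRFLOW__CELERY__WORKER_LOG_SERVER_PORT environment variable:

worker:
  image: apache/airflow
  ports:
    - 8793:8793
  environment:
    # assumption: Airflow 1.10.x layout, [celery] worker_log_server_port
    - AIRFLOW__CELERY__WORKER_LOG_SERVER_PORT=8793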
Source: https://stackoverflow.com/questions/63103207/airflow-log-file-exception