Question
We use DBT with GCP and BigQuery for our transformations, and the simplest approach to scheduling our daily dbt run seems to be a BashOperator in Airflow. Currently we have two separate directories / GitHub projects, one for DBT and another for Airflow. To schedule DBT to run with Airflow, it seems like our entire DBT project would need to be nested inside our Airflow project, so that we can point to it for our dbt run bash command.
Is it possible to trigger our dbt run and dbt test without moving our DBT directory inside our Airflow directory? With the airflow-dbt package, could the dir in the default_args perhaps point to the GitHub link for the DBT project?
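(For context, the airflow-dbt usage I'm referring to looks roughly like the sketch below; this is just my reading of that package's operators, and the project path is a placeholder. As far as I can tell, dir expects a local filesystem path rather than a URL.)

```python
# Rough sketch of the airflow-dbt pattern referenced above. The path in "dir"
# is a placeholder for wherever the DBT project lives on the Airflow worker;
# the operators shell out to dbt from that directory.
from airflow import DAG
from airflow.utils.dates import days_ago
from airflow_dbt.operators.dbt_operator import DbtRunOperator, DbtTestOperator

default_args = {
    "dir": "/path/to/dbt/project",  # placeholder local path, not a GitHub URL
    "start_date": days_ago(1),
}

with DAG(dag_id="dbt_daily", default_args=default_args, schedule_interval="@daily") as dag:
    dbt_run = DbtRunOperator(task_id="dbt_run")
    dbt_test = DbtTestOperator(task_id="dbt_test")
    dbt_run >> dbt_test
```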
Answer 1:
My advice would be to leave your dbt and airflow codebases separated. There is indeed a better way:
- dockerise your dbt project in a simple python-based image where you COPY the codebase
- push that to DockerHub or ECR or any other docker repository that you are using
- use the DockerOperator in your airflow DAG to run that docker image with your dbt code
I'm assuming that you use the airflow LocalExecutor here and that you want to execute your dbt run workload on the server where airflow is running. If that's not the case and you have access to a Kubernetes cluster, I would suggest using the KubernetesPodOperator instead.
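For illustration, here is a minimal sketch of what the DAG side could look like with the DockerOperator. The image name your-registry/dbt-project:latest and the /dbt paths are assumptions, not something from your setup; the import path is the one from the apache-airflow-providers-docker package (older Airflow 1.10 installs import it from airflow.operators.docker_operator instead).

```python
# Minimal sketch: run the dockerised dbt project from an Airflow DAG.
# "your-registry/dbt-project:latest" and the /dbt paths are assumptions;
# use whatever image name and in-image layout your Dockerfile actually produces.
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

IMAGE = "your-registry/dbt-project:latest"  # hypothetical image pushed in step 2

with DAG(
    dag_id="dbt_docker",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = DockerOperator(
        task_id="dbt_run",
        image=IMAGE,
        command="dbt run --profiles-dir /dbt --project-dir /dbt",
        auto_remove=True,  # clean up the container when the task finishes
        docker_url="unix://var/run/docker.sock",  # local Docker daemon on the Airflow host
    )

    dbt_test = DockerOperator(
        task_id="dbt_test",
        image=IMAGE,
        command="dbt test --profiles-dir /dbt --project-dir /dbt",
        auto_remove=True,
        docker_url="unix://var/run/docker.sock",
    )

    dbt_run >> dbt_test
```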
Answer 2:
I accepted the other answer based on the consensus via upvotes and the supporting comment; however, I'd like to post a second solution that I'm currently using:
- The dbt and airflow repos / directories are next to each other.
- In our airflow's docker-compose.yml, we've added our DBT directory as a volume so that airflow has access to it.
- In our airflow's Dockerfile, install DBT and copy our dbt code.
- Use BashOperator to run dbt run and dbt test (see the sketch after this list).
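Here is a minimal sketch of that last step. The mount point /opt/dbt and the profiles location are placeholders rather than my actual paths; adjust them to wherever docker-compose.yml mounts the dbt project inside the airflow containers. The import shown is the Airflow 2 path (airflow.operators.bash_operator on 1.10).

```python
# Minimal sketch: call dbt via BashOperator against the mounted project.
# DBT_DIR is a placeholder for wherever docker-compose.yml mounts the dbt repo;
# --profiles-dir assumes profiles.yml also lives in that directory.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt"  # placeholder: volume mount point of the dbt project

with DAG(
    dag_id="dbt_daily_bash",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir {DBT_DIR}",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test
```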
Source: https://stackoverflow.com/questions/64890144/how-to-run-dbt-in-airflow-without-copying-our-repo