I would like to create an S3 connection without interacting with the Airflow GUI. Is it possible through airflow.cfg or the command line?
We are using an AWS role, and the following connection extra works for us: {"aws_account_id":"xxxx","role_arn":"yyyyy"}
So, manually creating the connection for S3 in the GUI works; now we want to automate this and make it part of the Airflow deployment process. Is there a workaround?
You can use the Airflow CLI. Unfortunately there is no support for editing connections, so you would have to remove and re-add the connection as part of your deployment process, e.g.:
airflow connections -d --conn_id 'aws_default'
airflow connections -a --conn_id 'aws_default' --conn_uri 'aws:' --conn_extra '{"region_name": "eu-west-1"}'
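Since --conn_extra takes a JSON string, it can help to build and validate that payload programmatically before shelling out to the CLI. A minimal sketch in Python; the helper name and the optional region field are illustrative, not part of the original answer:

```python
import json

def build_aws_role_extra(aws_account_id, role_arn, region=None):
    """Build the JSON string to pass as --conn_extra (field names from the question)."""
    extra = {"aws_account_id": aws_account_id, "role_arn": role_arn}
    if region:
        # region_name is the key used in the answer's example above
        extra["region_name"] = region
    return json.dumps(extra)

print(build_aws_role_extra("99999999", "bbbbb", region="eu-west-1"))
```

Serializing with json.dumps rather than hand-writing the string avoids quoting mistakes when the value is interpolated into a shell command.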
The query part of the URI will be transformed into JSON and copied into the extra field of the connection, so you can do this:
export AIRFLOW_CONN_S3_DEFAULT='s3://s3/?aws_account_id=99999999&role_arn=bbbbb'
It looks silly, but it should work. See the Connection documentation. (Note the single quotes: an unquoted & would be interpreted by the shell, and the query parameters are separated by &, not commas.)
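To see why the query part ends up in the extra field, you can approximate the parsing with the standard library. This is a hedged sketch of the transformation, not Airflow's actual implementation:

```python
from urllib.parse import urlsplit, parse_qsl

def extras_from_conn_uri(uri):
    # Approximates how Airflow copies the URI's query string
    # into the connection's extra field as key/value pairs.
    query = urlsplit(uri).query
    return dict(parse_qsl(query))

print(extras_from_conn_uri("s3://s3/?aws_account_id=99999999&role_arn=bbbbb"))
# → {'aws_account_id': '99999999', 'role_arn': 'bbbbb'}
```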
I was able to figure it out after reading through S3_hook.py.
For example:
export AIRFLOW_CONN_S3_DEFAULT='{"aws_account_id":"99999999","role_arn":"bbbbb"}'
Here:
- s3_default is the connection id (the part of the variable name after the AIRFLOW_CONN_ prefix).
- The AWS account id and role_arn are passed as a JSON document in the environment variable's value (single-quoted so the shell leaves the braces and double quotes intact).
It's probably late, but there is now a section in the documentation about this:
When referencing the connection in the Airflow pipeline, the conn_id should be the name of the variable without the prefix. For example, if the conn_id is named postgres_master the environment variable should be named AIRFLOW_CONN_POSTGRES_MASTER (note that the environment variable must be all uppercase). Airflow assumes the value returned from the environment variable to be in a URI format (e.g. postgres://user:password@localhost:5432/master or s3://accesskey:secretkey@S3).
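The naming rule quoted above can be sketched as a small helper. The function name is illustrative; only the AIRFLOW_CONN_ prefix and the uppercasing come from the documentation:

```python
import os

def env_var_for_conn_id(conn_id):
    # Per the docs: the variable is AIRFLOW_CONN_ plus the
    # conn_id, and must be all uppercase.
    return "AIRFLOW_CONN_" + conn_id.upper()

# The docs' example: conn_id postgres_master, URI-formatted value.
os.environ[env_var_for_conn_id("postgres_master")] = (
    "postgres://user:password@localhost:5432/master"
)
print(env_var_for_conn_id("postgres_master"))
# → AIRFLOW_CONN_POSTGRES_MASTER
```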
Source: https://stackoverflow.com/questions/44707085/creating-connection-outside-of-airflow-gui