google-cloud-composer

How can we use SFTPToGCSOperator in a GCP Composer environment (1.10.6)?

Submitted by 痞子三分冷 on 2020-06-23 08:46:10

Question: I want to use SFTPToGCSOperator in a GCP Composer environment (1.10.6). I know there is a limitation: the operator is only present in the latest version of Airflow, not in Composer's latest version, 1.10.6. See the reference - https://airflow.readthedocs.io/en/latest/howto/operator/gcp/sftp_to_gcs.html I found an alternative to the operator and created a plugin class, but then I ran into an issue with the SFTPHook class, so I am now using the older version of the SFTPHook class. See the reference below - from
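
This is not the poster's actual plugin, but a minimal sketch of that approach under Airflow 1.10.6, using the contrib SFTPHook and GoogleCloudStorageHook; the connection ids, source path, bucket and object names are placeholders, and the class would be dropped into the Composer plugins/ folder (or registered via an AirflowPlugin) to be importable from DAGs.

# Sketch of a plugin-style SFTP-to-GCS operator for Airflow 1.10.6 (not the
# backported provider operator); all ids and paths below are placeholders.
from tempfile import NamedTemporaryFile

from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
from airflow.contrib.hooks.sftp_hook import SFTPHook
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook


class SFTPToGCSOperator(BaseOperator):
    """Copy a single file from an SFTP server to a GCS bucket."""

    @apply_defaults
    def __init__(self, source_path, destination_bucket, destination_object,
                 sftp_conn_id="sftp_default", gcp_conn_id="google_cloud_default",
                 *args, **kwargs):
        super(SFTPToGCSOperator, self).__init__(*args, **kwargs)
        self.source_path = source_path
        self.destination_bucket = destination_bucket
        self.destination_object = destination_object
        self.sftp_conn_id = sftp_conn_id
        self.gcp_conn_id = gcp_conn_id

    def execute(self, context):
        sftp_hook = SFTPHook(ftp_conn_id=self.sftp_conn_id)
        gcs_hook = GoogleCloudStorageHook(google_cloud_storage_conn_id=self.gcp_conn_id)
        with NamedTemporaryFile("wb") as tmp:
            # Download from SFTP to a local temp file, then upload it to GCS.
            sftp_hook.retrieve_file(self.source_path, tmp.name)
            gcs_hook.upload(self.destination_bucket, self.destination_object, tmp.name)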

Google Composer - How do I install Microsoft SQL Server ODBC drivers on environments?

Submitted by 时光毁灭记忆、已成空白 on 2020-06-17 03:37:54

Question: I am new to GCP and Airflow and am trying to run my Python 3 pipelines via a simple pyodbc connection. I believe I have found what I need to install on the machines (Microsoft doc: https://docs.microsoft.com/en-us/sql/connect/odbc/linux-mac/installing-the-microsoft-odbc-driver-for-sql-server?view=sql-server-2017), but I am not sure where to go in GCP to run these commands. I have gone down several deep holes looking for answers, but don't know how to solve the problem. Here
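
For context, this is a minimal sketch of the kind of pyodbc connection such a pipeline would open once the "ODBC Driver 17 for SQL Server" from the linked Microsoft doc is actually present on the worker; the server, database and credentials are placeholders.

# Assumes the msodbcsql17 driver is installed on the machine running this code;
# server, database and credentials are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-server.example.com;"
    "DATABASE=my_database;"
    "UID=my_user;"
    "PWD=my_password"
)
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())
conn.close()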

Using Cloud Functions as operators in a GC Composer DAG

Submitted by 独自空忆成欢 on 2020-05-30 07:59:25

Question: Fellow coders, for a project I'm interested in using Google Cloud Composer to handle several workflows that consist of operations which can be shared between workflows. It seems to me that Cloud Functions are a perfect way of performing these operations as tasks in a Composer DAG. From what I understand, I would need an operator that invokes a Cloud Function with data that is specific to the task in the specific DAG. I found a Google Cloud Function operator in the Airflow documentation,
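
One possible sketch of this pattern, assuming the Cloud Function has an HTTP trigger and the call is either unauthenticated or handled by a suitably configured connection: invoke it with SimpleHttpOperator and pass the task-specific data in the request body. The DAG id, connection id, endpoint and payload below are placeholders.

# Sketch: calling an HTTP-triggered Cloud Function from an Airflow 1.10 DAG.
# The connection "my_cloud_function_conn" is assumed to have its host set to
# the functions base URL, e.g. https://REGION-PROJECT.cloudfunctions.net
import json

from airflow import DAG
from airflow.operators.http_operator import SimpleHttpOperator
from airflow.utils.dates import days_ago

with DAG("invoke_cloud_function_example", start_date=days_ago(1),
         schedule_interval=None) as dag:

    call_function = SimpleHttpOperator(
        task_id="call_my_function",
        http_conn_id="my_cloud_function_conn",
        endpoint="my-function-name",
        method="POST",
        data=json.dumps({"input": "task-specific payload"}),
        headers={"Content-Type": "application/json"},
    )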

Why is an 'airflow_monitoring' DAG automatically generated in GCP Composer?

Submitted by 妖精的绣舞 on 2020-05-29 10:35:47

Question: When creating an Airflow environment on GCP Composer, a DAG named "airflow_monitoring" is automatically created and is impossible to delete. Why? How should I handle it? Should I copy this file into my DAG folder and resign myself to making it part of my code? I noticed that each time I upload my code, it stops the execution of this DAG because it cannot be found in the DAG folder, until it magically reappears. I have already tried deleting it from the DAG folder, deleting the logs,

How to export large data from Postgres to S3 using Cloud Composer?

Submitted by 廉价感情. on 2020-05-15 18:34:06

Question: I have been using the Postgres-to-S3 operator to load data from Postgres to S3. Recently, though, I had to export a very large table, and my Airflow Composer job fails without any logs. This could be because we use the NamedTemporaryFile function of Python's tempfile module to create a temporary file, and we use this temporary file to load to S3. Since we are using Composer, the file is loaded into Composer's local memory, and since it is very large, the job fails. Refer
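
This is not the poster's operator, but a sketch of one way around the single-huge-temp-file problem: stream the table through a server-side cursor and upload it to S3 in fixed-size chunks, so only one chunk lives on local disk at a time. The connection ids, table, bucket, key prefix and chunk size are placeholders, and the CSV formatting is deliberately simplistic.

# Sketch: chunked Postgres-to-S3 export to avoid materialising one huge temp file.
# All connection ids, names and sizes below are placeholders.
import csv
from tempfile import NamedTemporaryFile

from airflow.hooks.postgres_hook import PostgresHook
from airflow.hooks.S3_hook import S3Hook


def export_postgres_to_s3_in_chunks(rows_per_chunk=1000000):
    pg_hook = PostgresHook(postgres_conn_id="my_postgres_conn")
    s3_hook = S3Hook(aws_conn_id="my_aws_conn")

    conn = pg_hook.get_conn()
    cursor = conn.cursor(name="export_cursor")  # server-side cursor: rows are streamed
    cursor.execute("SELECT * FROM my_big_table")

    part = 0
    while True:
        rows = cursor.fetchmany(rows_per_chunk)
        if not rows:
            break
        with NamedTemporaryFile("w", suffix=".csv") as tmp:
            # Write one chunk to disk, ship it to S3, then let the temp file go.
            csv.writer(tmp).writerows(rows)
            tmp.flush()
            s3_hook.load_file(tmp.name,
                              key="exports/my_big_table/part-%05d.csv" % part,
                              bucket_name="my-bucket", replace=True)
        part += 1

    cursor.close()
    conn.close()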

Where is the Airflow webserver running on Google Composer?

Submitted by 我是研究僧i on 2020-05-14 09:05:09

Question: I have the following pods:

NAME                                                        READY  STATUS     RESTARTS  AGE
airflow-database-init-job-ggk95                             0/1    Completed  0         3h
airflow-redis-0                                             1/1    Running    0         3h
airflow-scheduler-7594cd584-mlfrt                           2/2    Running    9         3h
airflow-sqlproxy-74f64b8b97-csl8h                           1/1    Running    0         3h
airflow-worker-5fcd4fffff-7w2sg                             2/2    Running    0         3h
airflow-worker-5fcd4fffff-m44bs                             2/2    Running    0         3h
airflow-worker-5fcd4fffff-mm55s                             2/2    Running    0         3h
composer-agent-0034135a-3fed-49a6-b173-9d3f9d0569db-ktwwt   0/1    Completed  0         3h
composer-agent-0034135a-3fed-49a6

Any success stories installing a private dependency on GCP Composer Airflow?

Submitted by 匆匆过客 on 2020-03-22 07:55:09

Question: Background info: normally, within a container environment, I can easily install my private dependency with a requirements.txt like this:

--index-url https://user:pass@some_repo.jfrog.io/some_repo/api/pypi/pypi/simple
some-private-lib

The package "some-private-lib" is the one I want to install. Issue: within the GCP Composer environment, I tried the gcloud command (gcloud composer environments update ENV_NAME --update-pypi-packages-from-file ./requirements.txt --location LOCATION), but it
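
If I recall the Composer documentation correctly, an alternative to embedding the index URL in requirements.txt is to upload a pip.conf containing the private index to the config/pip/ folder of the environment's GCS bucket, and then install the package by name. A sketch, with the bucket name as a placeholder and the repository URL taken from the question above:

# pip.conf, uploaded to gs://<your-environment-bucket>/config/pip/pip.conf
[global]
extra-index-url = https://user:pass@some_repo.jfrog.io/some_repo/api/pypi/pypi/simple

After that, the gcloud update command from the question (or the equivalent PyPI-package update) should be able to resolve some-private-lib from the private index.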