Is it possible to use service accounts to schedule queries with BigQuery's "Scheduled Query" feature?

Asked 2021-01-14 05:41 · 3 answers · 420 views

We are using the Beta Scheduled query feature of BigQuery. Details: https://cloud.google.com/bigquery/docs/scheduling-queries

We have a few scheduled ETL queries running.

3 Answers
  • 2021-01-14 06:01

    BigQuery scheduled queries now support creating a scheduled query with a service account and updating a scheduled query to use a service account. Will these work for you?
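
    Sketching what that looks like through the API: newer versions of the Python Data Transfer client accept a `service_account_name` on the create-transfer-config request. The helper below only builds the request body as a plain dict (the client accepts dicts in place of proto objects); the project, dataset, and service-account names are placeholders, and the actual API call is left commented out since it needs credentials.

```python
def build_scheduled_query_request(project, location, dataset, query, sa_email):
    """Build a request body for DataTransferServiceClient.create_transfer_config.

    Returned as a plain dict; all values here are illustrative placeholders.
    """
    return {
        "parent": f"projects/{project}/locations/{location}",
        "transfer_config": {
            "destination_dataset_id": dataset,
            "display_name": "scheduled_query_with_sa",
            "data_source_id": "scheduled_query",
            "params": {"query": query},
            "schedule": "every 24 hours",
        },
        # Runs execute as this service account instead of the creator.
        "service_account_name": sa_email,
    }

request = build_scheduled_query_request(
    "my-project", "us", "my_dataset",
    "SELECT CURRENT_DATE() AS date",
    "etl-sa@my-project.iam.gserviceaccount.com",
)
# client.create_transfer_config(request=request)  # requires GCP credentials
```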

  • 2021-01-14 06:10

    As far as I know, unfortunately you can't use a service account to directly schedule queries yet. Maybe a Googler will correct me, but the BigQuery docs implicitly state this:

    https://cloud.google.com/bigquery/docs/scheduling-queries#quotas

    A scheduled query is executed with the creator's credentials and project, as if you were executing the query yourself

    If you need to use a service account (which is great practice BTW), then there are a few workarounds listed here. I've raised a FR here for posterity.
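
    One common workaround along those lines is to skip the scheduler entirely and run the query yourself under service-account credentials, triggered by cron or Cloud Scheduler. A minimal sketch, assuming the `google-cloud-bigquery` client library; the key path and function name are illustrative, and the imports are kept inside the function so the library is only needed at call time:

```python
def run_etl_query(key_path, project, sql):
    """Run a query as a service account on your own schedule (e.g. cron).

    Hypothetical workaround sketch; requires google-cloud-bigquery at call time.
    """
    from google.cloud import bigquery
    from google.oauth2 import service_account

    creds = service_account.Credentials.from_service_account_file(key_path)
    client = bigquery.Client(project=project, credentials=creds)
    # Blocks until the query finishes and returns the result rows.
    return client.query(sql).result()
```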

  • 2021-01-14 06:12

    While it's not supported in the BigQuery UI, it's possible to create a transfer (including a scheduled query) using the Python GCP SDK for the Data Transfer Service (DTS), or from the bq CLI.

    The following is an example using Python SDK:

    r"""Example of creating TransferConfig using service account.
    
    Usage Example:
    1. Install GCP BQ python client library.
    2. If it has not been done, grant the P4 service account the
    iam.serviceAccounts.getAccessToken permission on your project:
      $ gcloud projects add-iam-policy-binding {user_project_id} \
       --member='serviceAccount:service-{user_project_number}@'\
       'gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com' \
       --role='roles/iam.serviceAccountTokenCreator'
    
       where {user_project_id} and {user_project_number} are the user project's
       project id and project number, respectively. E.g.,
      $ gcloud projects add-iam-policy-binding my-test-proj \
      --member='serviceAccount:service-123456789@'\
      'gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com'\
      --role='roles/iam.serviceAccountTokenCreator'
    
    3. Set environment var PROJECT_ID to your user project, and
    GOOGLE_APPLICATION_CREDENTIALS to the service account key path. E.g.,
       $ export PROJECT_ID='my_project_id'
       $ export GOOGLE_APPLICATION_CREDENTIALS='./serviceacct-creds.json'
    4. $ python3 ./create_transfer_config.py
    """
    
    import os
    from google.cloud import bigquery_datatransfer
    from google.oauth2 import service_account
    from google.protobuf.struct_pb2 import Struct
    
    PROJECT = os.environ["PROJECT_ID"]
    SA_KEY_PATH = os.environ["GOOGLE_APPLICATION_CREDENTIALS"]
    
    credentials = (
        service_account.Credentials.from_service_account_file(SA_KEY_PATH))
    
    client = bigquery_datatransfer.DataTransferServiceClient(
        credentials=credentials)
    # Get full path to project
    parent_base = client.project_path(PROJECT)
    
    params = Struct()
    params["query"] = "SELECT CURRENT_DATE() as date, RAND() as val"
    transfer_config = {
        "destination_dataset_id": "my_data_set",
        "display_name": "scheduled_query_test",
        "data_source_id": "scheduled_query",
        "params": params,
    }
    
    parent = parent_base + "/locations/us"
    
    response = client.create_transfer_config(parent, transfer_config)
    print(response)
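
    The answer also mentions the bq CLI but gives no example; a hedged sketch of the equivalent command, built here as a subprocess argument list so the flags are easy to see (the project, dataset, and service-account values are placeholders, and the actual invocation is commented out since it needs bq installed and authenticated):

```python
# Hypothetical bq CLI equivalent of the transfer config above.
# --service_account_name attaches a service account to the scheduled query.
bq_cmd = [
    "bq", "mk",
    "--transfer_config",
    "--project_id=my-test-proj",
    "--location=us",
    "--target_dataset=my_data_set",
    "--display_name=scheduled_query_test",
    "--data_source=scheduled_query",
    "--service_account_name=etl-sa@my-test-proj.iam.gserviceaccount.com",
    '--params={"query":"SELECT CURRENT_DATE() as date, RAND() as val"}',
]
# import subprocess; subprocess.run(bq_cmd, check=True)  # needs bq + auth
```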
    