Question
I'm using Cloud Build with the gcloud builder. I override the entrypoint to be bq so I can run some BigQuery SQL in my build step. Previously, I had the SQL embedded directly in the YAML config for Cloud Build. This works fine:
steps:
- name: gcr.io/cloud-builders/gcloud
entrypoint: 'bq'
args: ['query', '--use_legacy_sql=false', 'SELECT 1']
Now I'd like to refactor the SQL out of the YAML and into a file instead. According to here, you can cat the file or pipe it to bq. This works on the command line without any problems.
But I can't get it to work with Cloud Build. I've tried lots of different combinations, escaping characters, etc., but no matter what I try, the shell doesn't evaluate/execute the `cat my_query.sql` backticks, and instead thinks that it's the query itself.
Works fine on the command line:
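Something along these lines, assuming the query file is named my_query.sql:
bq query --use_legacy_sql=false "`cat my_query.sql`"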
But in Cloud Build it won't work:
steps:
- name: gcr.io/cloud-builders/gcloud
entrypoint: 'bq'
args: ['query', '--use_legacy_sql=false', '`cat my_query.sql`']
I also tried piping it instead of using cat, but I get the same error.
I must be missing something obvious here, but I can't see it. I could build a custom docker image, and wrap everything in a shell script, but I'd rather not have to do that if possible.
How do you use Cloud Build with shell evaluation inside a build step?
Answer 1:
You can create a custom Bash script, e.g.:
#!/bin/bash
# Bail out if no SQL file was supplied.
if [ $# -eq 0 ]; then
  echo "No arguments supplied"
  exit 1
fi
# Feed the SQL file to bq on stdin.
bq query --use_legacy_sql=false < "$1"
Name this run_query.sh, then define your steps as:
steps:
- name: gcr.io/cloud-builders/gcloud
entrypoint: 'bash'
args: ['run_query.sh', 'my_query.sql']
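Assuming run_query.sh and my_query.sql sit next to the build config, and the config itself is saved as cloudbuild.yaml (that file name is just an assumption here), you would submit the build with something like:
gcloud builds submit --config cloudbuild.yaml .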
Disclaimer: this is based on reading the docs, but I haven't actually used Cloud Build.
Answer 2:
I have done this:
- name: 'gcr.io/cloud-builders/gcloud'
entrypoint: 'bash'
dir: 'my/directory'
args: ['-c', 'bq --project_id=my-project-name query --use_legacy_sql=false < ./my_query.sql']
This works with gcloud builds submit ... and eliminates one file if you prefer. Because the entrypoint here is bash with -c, the shell evaluates the < redirection before bq runs; with bq as the entrypoint, the args go straight to the binary and never see a shell, which is why the backticks were treated as the query itself.
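An equivalent variant, if you'd rather pass the query as an argument instead of redirecting stdin, is a step along these lines (a sketch reusing the same hypothetical my_query.sql and project ID):
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  dir: 'my/directory'
  args: ['-c', 'bq --project_id=my-project-name query --use_legacy_sql=false "$(cat my_query.sql)"']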
Source: https://stackoverflow.com/questions/55349260/execute-a-bigquery-query-in-cloud-build-step