Question
I am new to Google Cloud services and I am trying to set up an automated build of my product that requires downloading a large file.
I would like to download a file from a dedicated Google Storage bucket inside the Docker build process. To do so, I have added the following line to my Dockerfile:
RUN curl https://storage.cloud.google.com/[bucketname]/[filename] -o [filename]
Since files from this bucket shouldn't be publicly accessible, I disabled object-level permissions and granted the member [ProjectID]@cloudbuild.gserviceaccount.com the Storage Object Viewer role.
But when the Dockerfile script runs, the downloaded file is empty:
Step 7/9 : RUN curl https://storage.cloud.google.com/[bucketname]/[filename] -o [filename]
---> Running in 5d1a5a1bbe87
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
Removing intermediate container 5d1a5a1bbe87
---> 42938a9cc8d1
Step 8/9 : RUN ls -l [filename]
---> Running in 34ac112051a1
-rw-r--r-- 1 root root 0 Jun 15 00:37 [filename]
This link works perfectly well if I log in to the Google Cloud Console and access it through my browser.
I tried changing the permission settings, and ended up granting the Cloud Build account the Storage Legacy Bucket Reader, Storage Legacy Object Reader, and Storage Object Viewer roles together, without much success.
I am obviously doing something wrong, but it's not clear to me whether:
- This link format is only valid in the console and I should use another URL to get this file
- The permission configuration is wrong
- I still have to perform some HTTP authorization through curl
- I am overlooking something else.
Thanks for your help :)
Answer 1:
After long research and trial and error, I managed to find a good way to do it. Here is the recipe for those who may need to reproduce a similar setup, and for my future self.
- Create a service account dedicated to this task in console.cloud.google.com under IAM & Admin (you can also use Cloud Shell: https://cloud.google.com/iam/docs/creating-managing-service-accounts)
- Generate a key for this service account; you can use the web console or run
gcloud iam service-accounts keys create ~/key.json --iam-account [SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com
in Cloud Shell (https://cloud.google.com/iam/docs/creating-managing-service-account-keys#iam-service-account-keys-create-gcloud)
- Add this key file to your repository.
- Install the Cloud SDK in your Docker image so that you can log in (yes, you are going to install a whole SDK just to log in...):
RUN curl https://sdk.cloud.google.com | bash > /dev/null
ENV PATH="${PATH}:/root/google-cloud-sdk/bin"
- Now, in a shell script run from your Dockerfile with
RUN ./myscript.sh
you will add the following steps:
- Activate your service account with
gcloud auth activate-service-account [ACCOUNT] --key-file=~/key.json
(https://cloud.google.com/sdk/gcloud/reference/auth/activate-service-account)
- Generate an authentication token associated with this account with
TOKEN=`gcloud auth print-access-token [ACCOUNT]`
(https://cloud.google.com/sdk/gcloud/reference/auth/print-access-token)
- Finally, add the curl command:
curl -L -H "Authorization: Bearer [TOKEN]" https://www.googleapis.com/storage/v1/b/[bucketname]/o/[objectname]?alt=media -o [filename]
I did not use
gsutil cp gs://bucketname/bucketfile ./
because python2 was not available in my Docker image and python3 wasn't supported by gsutil.
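The steps above (activate the service account, mint a token, fetch the object with curl) can be collected into a single script. This is a sketch, not the answer's exact script: the function name, file paths, and placeholder arguments are my own, and the object name must be URL-encoded if it contains slashes.

```shell
#!/usr/bin/env bash
# Sketch of a myscript.sh-style download helper, assuming gcloud is on
# PATH and the service-account key was copied into the image.
set -euo pipefail

download_from_gcs() {
  local sa_email="$1" key_file="$2" bucket="$3" object="$4" out="$5"
  # Authenticate gcloud as the service account using its JSON key
  gcloud auth activate-service-account "$sa_email" --key-file="$key_file"
  # Mint a short-lived OAuth2 access token for that account
  local token
  token="$(gcloud auth print-access-token "$sa_email")"
  # Fetch the object via the Cloud Storage JSON API; -f makes curl
  # exit non-zero on HTTP errors instead of saving the error body
  curl -fL -H "Authorization: Bearer ${token}" \
    "https://www.googleapis.com/storage/v1/b/${bucket}/o/${object}?alt=media" \
    -o "$out"
}

# Example invocation (all arguments are placeholders):
# download_from_gcs my-sa@my-project.iam.gserviceaccount.com ./key.json \
#   my-bucket my-file.bin ./my-file.bin
```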
Congratulate yourself with a chocolate cake or a sugary treat. (:
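For reference, the Dockerfile side of this setup could look like the sketch below; the base image, package commands, and file names are assumptions of mine, not part of the original answer:

```dockerfile
# Sketch only -- adapt the base image and paths to your project.
FROM debian:stable-slim

# curl is needed both for the SDK installer and for the download itself
RUN apt-get update && apt-get install -y curl && rm -rf /var/lib/apt/lists/*

# Install the Cloud SDK so gcloud is available during the build
RUN curl https://sdk.cloud.google.com | bash > /dev/null
ENV PATH="${PATH}:/root/google-cloud-sdk/bin"

# Copy the service-account key and the download script from the repository
COPY key.json myscript.sh ./

RUN ./myscript.sh
```

Note that baking key.json into an image layer means anyone with access to the image can read the key, so keep such images private.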
Bonus: if, like me, your Docker build times out, you have to add a cloudbuild.yaml
next to your Dockerfile. Here is a generic file I use for my builds:
steps:
- name: 'gcr.io/cloud-builders/docker'
args: [ 'build', '-t', 'gcr.io/$PROJECT_ID/$REPO_NAME:$BUILD_ID', '.' ]
images:
- 'gcr.io/$PROJECT_ID/$REPO_NAME:$BUILD_ID'
timeout: 900s
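If 900s is still not enough, the same config file can raise the timeout further and, as a variant I'll note here (not from the original answer), request a larger build machine via the options block; the values below are examples:

```yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'build', '-t', 'gcr.io/$PROJECT_ID/$REPO_NAME:$BUILD_ID', '.' ]
images:
- 'gcr.io/$PROJECT_ID/$REPO_NAME:$BUILD_ID'
options:
  machineType: 'N1_HIGHCPU_8'   # example: a larger build machine
timeout: 1800s                  # example: 30 minutes
```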
Source: https://stackoverflow.com/questions/56608777/give-access-to-a-google-storage-bucket-to-google-build-while-building-a-docker-i