Copy files from GCS into a Cloud Run docker container during build

Submitted by 白昼怎懂夜的黑 on 2020-12-26 12:13:55

Question


I am trying to use gsutil to copy a file from GCS into a Cloud Run container during the build step.

The steps I have tried:

RUN pip install gsutil
RUN gsutil -m cp -r gs://BUCKET_NAME $APP_HOME/artefacts

The error:

ServiceException: 401 Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.
CommandException: 1 file/object could not be transferred.
The command '/bin/sh -c gsutil -m cp -r gs://BUCKET_NAME $APP_HOME/artefacts' returned a non-zero code: 1
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/docker" failed: step exited with non-zero status: 1

The service accounts (the default Compute Engine and Cloud Build accounts) do have access to GCS, and I have also tried gsutil config -a and various other flags, with no success!

I am not sure on exactly how I should authenticate to successfully access the bucket.


Answer 1:


Here is my GitHub Actions job:

jobs:
  build:
    name: Build image
    runs-on: ubuntu-latest

    env:
      BRANCH: ${{ github.ref_name }}
      SERVICE_NAME: ${{ secrets.SERVICE_NAME }}
      PROJECT_ID: ${{ secrets.PROJECT_ID }}

    steps:
      - name: Checkout
        uses: actions/checkout@v2

      # Setup gcloud CLI
      - uses: google-github-actions/setup-gcloud@master
        with:
          service_account_key: ${{ secrets.SERVICE_ACCOUNT_KEY }}
          project_id: ${{ secrets.PROJECT_ID }}
          export_default_credentials: true

      # Download the file locally
      - name: Get_file
        run: |-
          gsutil cp gs://BUCKET_NAME/path/to/file .


      # Build docker image
      - name: Image_build
        run: |-
          docker build -t gcr.io/$PROJECT_ID/$SERVICE_NAME .

      # Configure docker to use the gcloud command-line tool as a credential helper
      - run: |
          gcloud auth configure-docker -q

      # Push image to Google Container Registry
      - name: Image_push
        run: |-
          docker push gcr.io/$PROJECT_ID/$SERVICE_NAME

You have to set 3 secrets:

  • SERVICE_ACCOUNT_KEY: the content of your service account key file (JSON)
  • SERVICE_NAME: the name of your container image
  • PROJECT_ID: the project to deploy your image to

Because the file is downloaded locally first, it is present in the Docker build context. Then simply COPY it in the Dockerfile and do what you want with it.
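For example, the Dockerfile could pick up the downloaded file like this (a minimal sketch; the base image, file name, and destination path are all assumptions, not from the original answer):

```dockerfile
# Minimal sketch: the file fetched by the Get_file step is in the build context root
FROM python:3.9-slim
WORKDIR /app
# Copy the artefact that gsutil downloaded before "docker build" ran
COPY file /app/artefacts/file
```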


UPDATE

If you want to do this inside the Docker build itself, you can achieve it with a multi-stage build like this:

Dockerfile

FROM google/cloud-sdk:alpine as gcloud
WORKDIR /app
ARG KEY_FILE_CONTENT
RUN echo "$KEY_FILE_CONTENT" | gcloud auth activate-service-account --key-file=- \
  && gsutil cp gs://BUCKET_NAME/path/to/file .

....
FROM <FINAL LAYER>
COPY --from=gcloud /app/<myFile> .
....

The Docker build command

docker build --build-arg KEY_FILE_CONTENT="YOUR_KEY_FILE_CONTENT" \
  -t gcr.io/$PROJECT_ID/$SERVICE_NAME .

YOUR_KEY_FILE_CONTENT depends on your environment. Here are some ways to inject it:

  • On Github Action: ${{ secrets.SERVICE_ACCOUNT_KEY }}
  • On your local environment: $(cat my_key.json)
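One caveat worth hedging: a raw JSON key contains quotes and newlines that shell quoting in --build-arg can mangle. A common workaround (sketched here with a dummy key, not the answer's original approach) is to base64-encode the key before passing it, then decode it inside the Dockerfile's RUN step:

```shell
# Sketch: round-trip a (dummy) key through base64 so it survives shell quoting.
# In a real build you would pass "$KEY_B64" via --build-arg and decode in a RUN step
# before piping it to "gcloud auth activate-service-account --key-file=-".
KEY_JSON='{"type":"service_account","project_id":"demo"}'   # dummy stand-in
KEY_B64=$(printf '%s' "$KEY_JSON" | base64 | tr -d '\n')    # safe to pass as a build arg
DECODED=$(printf '%s' "$KEY_B64" | base64 -d)               # what the RUN step would see
echo "$DECODED"
```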



Answer 2:


I see you tagged Cloud Build.

You can use steps like this in your build config:

steps:
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'gs://mybucket/results.zip', 'previous_results.zip']
# operations that use previous_results.zip and produce new_results.zip
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'new_results.zip', 'gs://mybucket/results.zip']

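Applied to the original question, a cloudbuild.yaml could first fetch the bucket contents into the workspace (which persists between steps) and then build the image, so the Dockerfile only needs a plain COPY instead of running gsutil itself. This is a sketch; the bucket name, target directory, and image name are placeholders:

```yaml
steps:
# 1. Fetch the artefacts into /workspace, which is shared across steps
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', '-r', 'gs://BUCKET_NAME', 'artefacts']
# 2. Build the image; 'artefacts' is now part of the build context
- name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/SERVICE_NAME', '.']
images:
- 'gcr.io/$PROJECT_ID/SERVICE_NAME'
```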

Source: https://stackoverflow.com/questions/64969644/copy-files-from-gcs-into-a-cloud-run-docker-container-during-build
