Question
I'm trying to use the standalone gsutil tool from within a container running in a GKE cluster, but I cannot get it to work. I believe the cluster has adequate permissions (see below). However, running
./gsutil ls gs://my-bucket/
yields
ServiceException: 401 Anonymous users does not have storage.objects.list access to bucket my-bucket.
Am I missing anything? I don't have a .boto file, as I believe it shouldn't be necessary, or is it? This is the list of scopes that the cluster and the node pool have:
- https://www.googleapis.com/auth/compute
- https://www.googleapis.com/auth/devstorage.full_control
- https://www.googleapis.com/auth/logging.write
- https://www.googleapis.com/auth/monitoring.write
- https://www.googleapis.com/auth/pubsub
- https://www.googleapis.com/auth/servicecontrol
- https://www.googleapis.com/auth/service.management.readonly
- https://www.googleapis.com/auth/trace.append
Answer 1:
Short answer:
Yes, you'll need some sort of boto file.
Long answer:
Generally, for GCE instances, you don't need a ~/.boto file because the /etc/boto.cfg file is already present -- the Boto library that gsutil uses knows to look for this by default. On Debian images, it contains these lines:
# This file is automatically created at boot time by the
# /usr/lib/python2.7/dist-packages/google_compute_engine/boto/boto_config.pyc
# script.
# Do not edit this file directly. If you need to add items to this file,
# create or edit /etc/boto.cfg.template instead and then re-run the
# script.
[GSUtil]
default_project_id = <PROJECT NUMBER HERE>
default_api_version = 2
[GoogleCompute]
service_account = default
[Plugin]
plugin_directory = /usr/lib/python2.7/dist-packages/google_compute_engine/boto
If you want to mimic this behavior in your GKE container, you'll have to install the google-compute-engine Python package, along with having a boto file that tells gsutil to load that plugin from wherever it was installed, as seen above. On GCE (and I'm assuming GKE as well, although I've not tested it), this plugin allows a VM to talk to its metadata server to obtain credentials for the specified service account.
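For reference, a minimal .boto sketch that enables the plugin might look like the following. The plugin_directory path is an assumption here: it depends on where pip actually installed the google-compute-engine package in your container image, so check with pip show google-compute-engine first.

```ini
# Minimal .boto sketch for using GCE/GKE metadata-server credentials.
# NOTE: the plugin_directory path is illustrative; adjust it to where
# the google-compute-engine package is installed in your image.
[GoogleCompute]
service_account = default

[Plugin]
plugin_directory = /usr/lib/python2.7/dist-packages/google_compute_engine/boto
```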
Answer 2:
You can use gsutil inside a docker container on GKE with a service account, or with your own credentials.
Service Account
1) Add the service-account.json file to your project.
2) Add a .boto file to your project pointing to the service-account.json file:
[Credentials]
gs_service_key_file = /path/to/service-account.json
3) In your Dockerfile, set the BOTO_CONFIG environment variable to point to this .boto file:
ENV BOTO_CONFIG=/path/to/.boto
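Putting the three steps together, a Dockerfile for the service-account variant might look like the sketch below. The base image and the /credentials paths are illustrative choices, not requirements; the pip install line follows the note at the end of this answer.

```dockerfile
# Sketch only: base image and paths are illustrative.
FROM python:2.7-slim

# Install the standalone gsutil tool
RUN pip install gsutil

# Bundle the service account key and the .boto file that references it
COPY service-account.json /credentials/service-account.json
COPY .boto /credentials/.boto

# Tell gsutil (via Boto) where to find its config
ENV BOTO_CONFIG=/credentials/.boto
```

With this in place, gs_service_key_file inside the copied .boto must point at /credentials/service-account.json, i.e. the path as seen inside the container, not on the host.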
Own Credentials
1) Locally, run gcloud auth login. A .boto file will be created at ~/.config/gcloud/legacy_credentials/your@account.com/.boto with the following structure:
[OAuth2]
client_id = <id>.apps.googleusercontent.com
client_secret = <secret>
[Credentials]
gs_oauth2_refresh_token = <token>
2) Copy this .boto file into your project.
3) In your Dockerfile, set the BOTO_CONFIG environment variable to point to this .boto file:
ENV BOTO_CONFIG=/path/to/.boto
I installed the standalone gsutil tool in the Docker container using pip install gsutil.
Source: https://stackoverflow.com/questions/44442354/using-standalone-gsutil-from-within-gke