Question
I am working on a Bitbucket pipeline that pushes an image to Google Container Registry. I have created a service account with the Storage Admin role (bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com):
gcloud auth activate-service-account --key-file key.json
gcloud config set project mgcp-xxxx
gcloud auth configure-docker --quiet
docker push eu.gcr.io/mgcp-xxxx/image-name
Although the login is successful, I get: Token exchange failed for project 'mgcp-xxxx'. Caller does not have permission 'storage.buckets.get'. To configure permissions, follow instructions at: https://cloud.google.com/container-registry/docs/access-control
Can anyone advise on what I am missing?
Thanks!
Answer 1:
In the past I had another service account with the same name and different permissions. After discovering that service account names are cached, I created a new service account with a different name, and now it pushes properly.
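A sketch of that workaround, assuming a hypothetical replacement name (bitbucket-push-v2) and the project ID from the question:

```shell
# Create a service account under a NEW, previously unused name
# (the old name may still resolve to cached, stale permissions).
gcloud iam service-accounts create bitbucket-push-v2 \
    --project mgcp-xxxx \
    --display-name "Bitbucket push (replacement)"

# Issue a fresh JSON key for the new account, then re-run the
# pipeline's activate-service-account step with this key.
gcloud iam service-accounts keys create key.json \
    --iam-account bitbucket-push-v2@mgcp-xxxx.iam.gserviceaccount.com
```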
Answer 2:
For anyone else coming across this: my issue was that I had not granted my service account the Storage Legacy Bucket Reader role; I'd only granted it Storage Object Viewer. Adding that legacy role fixed it. It seems Docker still uses a legacy method to access GCR.
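A sketch of granting that legacy role on the bucket that backs GCR. The bucket name follows the GCR convention `artifacts.<project-id>.appspot.com` (prefixed `eu.` for eu.gcr.io), and the member below is the account from the question; both are assumptions:

```shell
# Legacy bucket roles include storage.buckets.get, which plain
# Storage Object Viewer lacks. Grant it at the bucket level.
gsutil iam ch \
    serviceAccount:bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com:roles/storage.legacyBucketReader \
    gs://eu.artifacts.mgcp-xxxx.appspot.com
```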
Answer 3:
These are the step-by-step commands that got me pushing my first container to a private GCR repo:
# Names for the project, the service account, and its display name
export PROJECT=pacific-shelter-218
export KEY_NAME=key-name1
export KEY_DISPLAY_NAME='My Key Name'

# Create the service account and confirm it exists
sudo gcloud iam service-accounts create ${KEY_NAME} --display-name "${KEY_DISPLAY_NAME}"
sudo gcloud iam service-accounts list

# Download a JSON key for it and grant it Storage Admin on the project
sudo gcloud iam service-accounts keys create --iam-account ${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com key.json
sudo gcloud projects add-iam-policy-binding ${PROJECT} --member serviceAccount:${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com --role roles/storage.admin

# Log Docker in to GCR with the JSON key and push
sudo docker login -u _json_key -p "$(cat key.json)" https://gcr.io
sudo docker push gcr.io/pacific-shelter-218/mypcontainer:v2
Answer 4:
For anyone reading all the way here: the other suggestions did not help me, but I found that the Cloud Service Build Account role was also required. With it, the storage.buckets.get error disappears.
This is my minimal setup (two roles) for pushing Docker images:
The Cloud Service Build Account role, however, adds many more permissions than simply storage.buckets.get. The exact permissions can be found here.
Note: I am well aware the Cloud Service Build Account role also adds the storage.objects.get permission. However, adding roles/storage.objectViewer did not resolve my problem, regardless of the fact that it had the storage.objects.get permission.
If the above does not work you might have the wrong account active. This can be resolved with:
gcloud auth activate-service-account --key-file key.json
If that does not work you might need to set the docker credential helpers with:
gcloud auth configure-docker --project <project_name>
On one final note: there seemed to be some delay between setting a role and it taking effect via the gcloud tool. It was minimal, though; think less than a minute.
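A sketch of adding that extra role binding. The role ID roles/cloudbuild.builds.builder is my assumption for the "Cloud Service Build Account" role shown in the IAM console, and the member is the account from the question:

```shell
# Bind the Cloud Build service-account role (assumed ID) to the
# pushing service account alongside its storage role.
gcloud projects add-iam-policy-binding mgcp-xxxx \
    --member serviceAccount:bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com \
    --role roles/cloudbuild.builds.builder
```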
Cheers
Answer 5:
Here in the future, I've discovered that I no longer have any Legacy options. In this case I was forced to grant the full Storage Admin role. I'll open a ticket with Google about this; that's a bit extreme just to allow me to push an image. This might help someone else from the future.
Answer 6:
Adding these roles to the service account in Google Cloud IAM fixed it for me:
Editor
Storage Object Admin
Storage Object Viewer
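A sketch of granting those three roles from the CLI; the role IDs below are my assumed mappings for the console names listed above, and the member is the account from the question:

```shell
# Assumed role IDs: Editor, Storage Object Admin, Storage Object Viewer.
for ROLE in roles/editor roles/storage.objectAdmin roles/storage.objectViewer; do
  gcloud projects add-iam-policy-binding mgcp-xxxx \
      --member serviceAccount:bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com \
      --role "$ROLE"
done
```

Note that Editor is a very broad role; the earlier answers suggest a narrower storage-only grant may be enough.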
Answer 7:
GCR just uses GCS to store images. Check the permissions on your artifacts. bucket in GCS within the same project.
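To inspect those permissions, something like the following should work; the bucket name is assumed from the GCR convention `artifacts.<project-id>.appspot.com` (prefixed `eu.` for eu.gcr.io):

```shell
# Show the IAM policy on the GCS bucket that backs GCR for this project.
gsutil iam get gs://artifacts.mgcp-xxxx.appspot.com
```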
Answer 8:
I tried several things, but it seems you have to run gcloud auth configure-docker
Source: https://stackoverflow.com/questions/51873072/cant-push-image-to-google-container-registry-caller-does-not-have-permission