I am using google colab pro and the provided TPU. I need to upload a pre-trained model into the TPU.
I've been struggling with this scenario myself (although with the free version of Colab) and just got it to work. This specific use case doesn't appear to be well documented; the official documentation mostly covers setups with a Compute Engine VM rather than Colab's auto-assigned TPU. The process that worked for me was as follows:
!gcloud auth login
!gcloud config set project [Project ID of Storage Bucket]
and
import os
import tensorflow as tf
from google.colab import auth

# Authenticate so the runtime can read from the private bucket
auth.authenticate_user()

# Connect to the TPU that Colab assigned to this runtime
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu='grpc://' + os.environ['COLAB_TPU_ADDR'])
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.experimental.TPUStrategy(resolver)

# Load the saved model directly from Cloud Storage
model = tf.keras.models.load_model('gs://[Bucket name and path to saved model]')
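One detail worth adding: for the model's variables to actually be placed on the TPU, Keras generally expects the model to be built (or loaded) inside the strategy's scope. A minimal sketch, with a hypothetical bucket path:

# Loading inside strategy.scope() creates the variables under the TPU strategy
with strategy.scope():
    model = tf.keras.models.load_model('gs://my-bucket/saved_model')  # hypothetical path

# Saving back to the bucket also works once the permissions below are in place
model.save('gs://my-bucket/saved_model_copy')  # hypothetical path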
The load initially failed, but the error message included the service account of the TPU that was trying to access the bucket, and that is the address I granted access to, as described in the Cloud Storage docs. The address is in the
service-[PROJECT_NUMBER]@cloud-tpu.iam.gserviceaccount.com
format, but the project number is neither the Project ID of the project my bucket is in, nor a value I've been able to find anywhere else.
After I gave permissions to that service account (which I was only able to find in the error message), I was able to load and save models from my private bucket.
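For reference, the grant itself can be done from the notebook with gsutil. This is just a sketch: my-bucket is a placeholder, and roles/storage.objectAdmin is one reasonable choice since it covers both reading and writing objects:

# Grant the TPU's service account object read/write access on the bucket
!gsutil iam ch serviceAccount:service-[PROJECT_NUMBER]@cloud-tpu.iam.gserviceaccount.com:roles/storage.objectAdmin gs://my-bucket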
As stated in the public documentation, to find the service account of your Colab TPU you just need to substitute your project number into the following email address:
service-[PROJECT_NUMBER]@cloud-tpu.iam.gserviceaccount.com
You can find your project number in the dashboard of your Google Cloud project.

After doing this, set your bucket's access control to fine-grained and grant access to this account in the bucket's ACL, for example as sketched below.
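A minimal sketch of such an ACL grant with gsutil, assuming a hypothetical bucket named my-bucket (use WRITE instead of READ if you also need to save models):

# Grant the TPU service account READ on the bucket itself
!gsutil acl ch -u service-[PROJECT_NUMBER]@cloud-tpu.iam.gserviceaccount.com:READ gs://my-bucket

# Grant READ on the existing objects so the saved-model files are readable
!gsutil acl ch -u service-[PROJECT_NUMBER]@cloud-tpu.iam.gserviceaccount.com:READ gs://my-bucket/**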