Question:
Colab offers free TPUs. It's easy to see how many cores are given, but I was wondering if it's possible to see how much memory is available per core?
Answer 1:
As far as I know, we don't have a TensorFlow op or similar for accessing memory info, though in XRT we do. In the meantime, would something like the following snippet work?
import os
from tensorflow.python.profiler import profiler_client

# COLAB_TPU_ADDR points at the TPU's gRPC port (8470); the profiler service listens on 8466.
tpu_profile_service_address = os.environ['COLAB_TPU_ADDR'].replace('8470', '8466')
# Sample the profiler for 100 ms at detail level 2 and print the report.
print(profiler_client.monitor(tpu_profile_service_address, 100, 2))
Output looks like:
Timestamp: 22:23:03
TPU type: TPU v2
Utilization of TPU Matrix Units (higher is better): 0.000%
TPU v2 has 8 GB of HBM per core and TPU v3 has 16 GB of HBM per core (https://cloud.google.com/tpu).
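If you want this as a programmatic check, here is a minimal sketch (not part of the original answer) that parses the "TPU type" line from the profiler report and maps it to the per-core HBM figures above; the regex and the HBM_PER_CORE_GB dictionary are illustrative additions of mine, not an official API.

import os
import re
from tensorflow.python.profiler import profiler_client

# Profiler service listens on port 8466 instead of the TPU's gRPC port 8470.
tpu_profile_service_address = os.environ['COLAB_TPU_ADDR'].replace('8470', '8466')
report = profiler_client.monitor(tpu_profile_service_address, 100, 2)

# Per-core HBM by TPU generation, per https://cloud.google.com/tpu (assumed mapping).
HBM_PER_CORE_GB = {'v2': 8, 'v3': 16}

match = re.search(r'TPU type: TPU (v\d)', report)
if match:
    version = match.group(1)
    print(f'TPU {version}: about {HBM_PER_CORE_GB.get(version, "unknown")} GB HBM per core')
else:
    print('Could not find a "TPU type" line in the profiler report:')
    print(report)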
Source: https://stackoverflow.com/questions/63009227/in-google-colab-is-there-a-way-to-check-what-tpu-verison-is-running