Multiple TPUv2 devices in one training script?

Backend · Unanswered · 0 replies · 1624 views

旧时难觅i 2021-01-15 10:07

As part of the TensorFlow Research Cloud initiative, I have access to 100 TPU v2 machines, each with 8 TPU cores (TPU v2-8s).

I need to achieve data parallelism when training my model.
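For a single TPU v2-8, data parallelism across its 8 cores is what `tf.distribute.TPUStrategy` provides out of the box; coordinating all 100 separate v2-8 machines from one script is a different (multi-worker) problem. As a hedged sketch of the single-machine case, assuming TF2 and that the TPU name is supplied via the usual `TPU_NAME` environment variable (the fallback to the default strategy is only so the script also runs locally):

```python
import tensorflow as tf

# Sketch: connect to one TPU v2-8 and use TPUStrategy so each training
# batch is sharded across the 8 cores (data parallelism). Falls back to
# the default single-device strategy when no TPU is reachable.
try:
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver()  # reads TPU_NAME
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)
except (ValueError, tf.errors.NotFoundError):
    strategy = tf.distribute.get_strategy()  # CPU/GPU fallback for local runs

print("replicas:", strategy.num_replicas_in_sync)  # 8 on a v2-8

# Variables must be created inside the strategy scope so they are
# mirrored across replicas.
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer="sgd", loss="mse")

# Keras splits each batch across the replicas automatically.
x = tf.random.normal((32, 4))
y = tf.random.normal((32, 1))
model.fit(x, y, epochs=1, batch_size=8, verbose=0)
```

Using more than one of the 100 machines in the same script would instead require something like `MultiWorkerMirroredStrategy` or an external orchestration layer, since each v2-8 is an independent TPU system.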
