Ways to implement multi-GPU BN layers with synchronizing means and vars

盖世英雄少女心  2021-02-05 22:53

I'd like to know the possible ways to implement batch normalization layers that synchronize batch statistics (means and variances) across devices when training with multiple GPUs.


3 Answers
  •  再見小時候  2021-02-05 23:18

    A specialized Keras layer, SyncBatchNormalization, has been available since TF 2.2: https://www.tensorflow.org/api_docs/python/tf/keras/layers/experimental/SyncBatchNormalization
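    To illustrate what such a layer computes, here is a minimal pure-Python sketch (not the TF implementation) of the core idea: each replica contributes its count, sum, and sum of squares, these are combined (an all-reduce in a real multi-GPU setup), and every replica then normalizes with the same global mean and variance. The function names here are hypothetical, for illustration only.

    ```python
    import math

    def sync_batch_stats(per_gpu_batches):
        # Aggregate (count, sum, sum of squares) across replicas.
        # In a real sync-BN layer this aggregation is a single all-reduce.
        total_n = 0
        total_sum = 0.0
        total_sqsum = 0.0
        for batch in per_gpu_batches:
            total_n += len(batch)
            total_sum += sum(batch)
            total_sqsum += sum(x * x for x in batch)
        mean = total_sum / total_n
        var = total_sqsum / total_n - mean * mean  # E[x^2] - (E[x])^2
        return mean, var

    def sync_batch_norm(per_gpu_batches, eps=1e-5):
        # Every replica normalizes with the SAME global statistics,
        # which is what distinguishes sync BN from per-GPU BN.
        mean, var = sync_batch_stats(per_gpu_batches)
        inv_std = 1.0 / math.sqrt(var + eps)
        return [[(x - mean) * inv_std for x in batch]
                for batch in per_gpu_batches]
    ```

    With plain (unsynchronized) BN, each GPU would instead normalize with the mean/variance of its own sub-batch, which biases the statistics when per-GPU batch sizes are small.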
