Does DDP always split one model across multiple GPUs?

谎友^ · 2021-01-18 19:21

I'm new to PyTorch. I want to increase the batch size, but the model is too heavy, so I received feedback that using PyTorch DP (DataParallel) or DDP (DistributedDataParallel) could help.
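
For context, here is a minimal sketch of how I understand DDP is meant to be launched, one process per GPU. The ToyModel, the port, and the random data are placeholders I made up, not my real setup:

```python
# Minimal DDP launch sketch: one process per GPU via mp.spawn.
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(10, 1)

    def forward(self, x):
        return self.net(x)

def worker(rank, world_size):
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"  # placeholder port
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    # Each process holds a full copy of the model on its own GPU;
    # DDP synchronizes gradients across the copies during backward().
    model = DDP(ToyModel().to(rank), device_ids=[rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    # Each rank sees its own shard of the data, so the effective
    # batch size scales with the number of GPUs.
    x = torch.randn(8, 10, device=rank)
    y = torch.randn(8, 1, device=rank)

    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()   # gradients are all-reduced across ranks here
    optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = torch.cuda.device_count()
    mp.spawn(worker, args=(world_size,), nprocs=world_size)
```

If my understanding above is right, each GPU ends up holding the whole model, which is exactly why I'm asking whether DDP ever splits one model across GPUs.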
