Questions About BERT's Pre-Training

余生分开走 2020-12-09 15:34

Just to get a better understanding of how BERT works, I wanted to know:

  1. In Next Sentence Prediction (NSP) training, does the model get only 2 sentences (packed as in the sketch below)?
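To make the question concrete, here is a minimal sketch of how a sentence pair is packed into a single NSP input sequence. The use of the Hugging Face `transformers` library, the `bert-base-uncased` checkpoint, and the example sentences are my own choices for illustration, not something stated in the question.

```python
# Minimal sketch: NSP packs two sentences into ONE sequence: [CLS] A [SEP] B [SEP].
# Assumes the Hugging Face `transformers` library and the `bert-base-uncased` vocab.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Hypothetical sentence pair; during pre-training, sentence B is the true
# next sentence 50% of the time and a random sentence from the corpus otherwise.
sentence_a = "The cat sat on the mat."
sentence_b = "Then it fell asleep in the sun."

encoded = tokenizer(sentence_a, sentence_b)

# One combined token sequence; token_type_ids distinguishes segment A (0) from B (1).
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
print(encoded["token_type_ids"])
```

The `[CLS]` token's final hidden state is what the NSP head uses to classify the pair as IsNext vs. NotNext.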
