BERT Pre-training accuracy not increasing

Backend · Unresolved · 0 replies · 1392 views
小蘑菇 2021-01-06 08:49

I am trying to pretrain BERT on a dataset (wiki103) that contains 150k sentences. After 12 epochs, the NSP task gives an accuracy of around 0.76 (and the model overfits if I continue training for more epochs).
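Since the post doesn't include code, here is a minimal sketch (plain Python, with hypothetical names like `nsp_logits` that are not from the original post) of how NSP accuracy is typically computed: the pretraining head emits two logits per sentence pair, and the prediction is whichever logit is larger.

```python
# Hypothetical sketch: computing next-sentence-prediction (NSP) accuracy
# from per-pair logits. All names here are illustrative, not from the post.

def nsp_accuracy(nsp_logits, labels):
    """nsp_logits: list of [logit_is_next, logit_not_next] pairs.
    labels: 1 if sentence B actually follows sentence A, else 0."""
    correct = 0
    for (is_next, not_next), label in zip(nsp_logits, labels):
        pred = 1 if is_next > not_next else 0  # argmax over the two logits
        correct += (pred == label)
    return correct / len(labels)

# Toy example: 3 of 4 pairs predicted correctly -> accuracy 0.75
logits = [[2.1, -0.3], [-1.0, 0.5], [0.2, 0.9], [1.4, 0.1]]
labels = [1, 0, 1, 1]
print(nsp_accuracy(logits, labels))  # 0.75
```

Note that NSP is close to a binary coin flip at 0.5, so 0.76 may simply reflect how informative the sampled sentence pairs are, rather than a bug.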
