create_pretraining_data.py is writing 0 records to tf_examples.tfrecord while training custom BERT model

广开言路 2021-02-07 09:00

I am training a custom BERT model on my own corpus. I generated the vocab file using BertWordPieceTokenizer and am now running the code below:

!python create_pretraining_