Is there a way to use GPU instead of CPU for BERT tokenization?

面向向阳花 2021-01-23 16:51

I'm using a BERT tokenizer over a large dataset of sentences (2.3M lines, 6.53bn words):

# creating a BERT tokenizer
tokenizer = BertTokenizer.from_pretrained('b
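For context on the CPU-bound workload above: `BertTokenizer` runs on CPU, so a common workaround for multi-million-line corpora is to parallelize across cores rather than move to GPU. Below is a minimal stdlib sketch of that pattern; `tokenize_line`, `tokenize_corpus`, and the whitespace stand-in tokenizer are illustrative assumptions, not the actual `transformers` API — in practice `tokenize_line` would call the real tokenizer.

```python
from multiprocessing import Pool

def tokenize_line(line):
    # Stand-in for a real tokenizer call (e.g. tokenizer.tokenize(line));
    # WordPiece tokenization in transformers executes on CPU either way.
    return line.lower().split()

def tokenize_corpus(lines, workers=4):
    # chunksize batches lines per worker, keeping inter-process
    # communication overhead low on very large corpora.
    with Pool(workers) as pool:
        return pool.map(tokenize_line, lines, chunksize=1000)

if __name__ == "__main__":
    corpus = ["Hello World", "BERT tokenization is CPU bound"]
    print(tokenize_corpus(corpus, workers=2))
```

The same fan-out shape applies if each worker holds its own tokenizer instance; the worker count and chunk size here are arbitrary starting points, not tuned values.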