T5Tokenizer requires the SentencePiece library but it was not found in your environment

礼貌的吻别 2020-12-25 13:18

I am trying to explore T5.

This is the code:


!pip install -U transformers
!pip install sentencepiece

from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained('t5-small')

