KeyError: 'tokenized'

故里飘歌 2021-01-16 21:20

Here is my code. Please see the image for the error (`KeyError: 'tokenized'`).


    # prepare data
    from sklearn.feature_extraction.text import TfidfVectorizer
    vectorizer = TfidfVectorizer()
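
Since the full script and the error screenshot are not shown, here is a minimal sketch of how a `KeyError: 'tokenized'` typically arises in this setup: the DataFrame handed to the vectorizer has no column named `tokenized`. The names `df` and `text` below are assumptions for illustration, not taken from the original code.

    # Minimal sketch (assumption: df and its "text" column are hypothetical)
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer

    df = pd.DataFrame({"text": ["first document", "second document here"]})

    # df["tokenized"] raises KeyError: 'tokenized' if that column was never
    # created; check df.columns and build it before vectorizing, e.g.:
    df["tokenized"] = df["text"].str.lower()   # placeholder preprocessing step

    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(df["tokenized"])
    print(X.shape)

Printing `df.columns` right before the failing line should confirm which column names actually exist.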