Question
This is literally all the code that I am trying to run:
from transformers import AutoModelWithLMHead, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
I am getting this error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-14-aad2e7a08a74> in <module>
----> 1 from transformers import AutoModelWithLMHead, AutoTokenizer
2 import torch
3
4 tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
5 model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers' (c:\python38\lib\site-packages\transformers\__init__.py)
What do I do about it?
Answer 1:
I solved it! Apparently AutoModelWithLMHead has been removed in my version of transformers.
You now need to use AutoModelForCausalLM for causal language models, AutoModelForMaskedLM for masked language models, and AutoModelForSeq2SeqLM for encoder-decoder models.
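If you want to confirm which transformers release you are running before switching imports, a quick check (my own addition, not part of the original answer) is:
import transformers
print(transformers.__version__)  # the ImportError means this release no longer exposes AutoModelWithLMHead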
So in my case the code looks like this:
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
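Since this is DialoGPT, a minimal generation sketch built on the fixed imports could look like the following; the prompt text and generation parameters are my own additions for illustration, not part of the original answer:
# Encode a user message and append the end-of-sequence token, as DialoGPT expects
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token, return_tensors="pt")
# Generate a reply; pad_token_id is set to the EOS id to avoid the open-ended-generation warning
reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)
# Decode only the tokens generated after the prompt
print(tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))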
Source: https://stackoverflow.com/questions/63141267/importerror-cannot-import-name-automodelwithlmhead-from-transformers