PyTorch batching at char-level and token-level

Asked by 囚心锁ツ on 2020-11-22 03:06

I'm trying to train an NLP model that uses a character-level and a token-level LSTM to create and concatenate embeddings. The problem I'm having is batching the data so that
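The question is cut off, but the usual difficulty with this setup is that a batch needs two padded tensors: token ids of shape `[batch, tokens]` and character ids of shape `[batch, tokens, chars]`, padded along both the token and the character dimension. A minimal sketch of such a collate step (the function name `batch_char_token` and the pad ids are my own assumptions, not from the question):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

def batch_char_token(sentences, tok_pad=0, char_pad=0):
    """Collate a list of (token_ids, chars_per_token) pairs into
    a [B, T] token tensor and a [B, T, C] character tensor.
    `chars_per_token` is a list (one entry per token) of
    variable-length character-id lists. Pad ids are assumptions."""
    # Pad token sequences to the longest sentence -> [B, T]
    tok_batch = pad_sequence(
        [torch.tensor(toks) for toks, _ in sentences],
        batch_first=True, padding_value=tok_pad)
    max_t = tok_batch.size(1)
    # Longest word (in characters) across the whole batch
    max_c = max(len(c) for _, chars in sentences for c in chars)
    # Pre-fill with the char pad id, then copy real ids in -> [B, T, C]
    char_batch = torch.full((len(sentences), max_t, max_c),
                            char_pad, dtype=torch.long)
    for i, (_, chars) in enumerate(sentences):
        for j, c in enumerate(chars):
            char_batch[i, j, :len(c)] = torch.tensor(c)
    return tok_batch, char_batch
```

The `[B, T, C]` tensor can then be reshaped to `[B * T, C]` for the character LSTM, and its final hidden states reshaped back to `[B, T, H]` and concatenated with the token embeddings. A function like this would typically be passed as `collate_fn` to a `DataLoader`.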
