I've been trying to move on from word2vec and other implementations to build word embeddings from pretrained transformers, for example models such as BERT and GPT-2. My mai
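
For concreteness, here is a minimal sketch of the kind of thing I mean, assuming the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint (both just example choices, not a fixed requirement): instead of one static vector per word as in word2vec, each token gets a contextual vector taken from the model's last hidden layer.

```python
# Minimal sketch (assumes the Hugging Face transformers library and
# the bert-base-uncased checkpoint) for pulling contextual word
# embeddings out of a pretrained BERT model.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "The bank raised interest rates."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size),
# i.e. one 768-dimensional vector per sub-word token for this checkpoint.
token_embeddings = outputs.last_hidden_state[0]

# Pair each sub-word token with its contextual embedding.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, vector in zip(tokens, token_embeddings):
    print(token, vector.shape)
```

Note that BERT's tokenizer splits rare words into sub-word pieces, so getting a single vector per word usually means averaging (or otherwise pooling) the vectors of its pieces.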