I am new to TensorFlow and to word2vec. I just studied word2vec_basic.py, which trains the model using the Skip-Gram algorithm. Now I want to train using CBOW.
Basically, yes:
For the given text "the quick brown fox jumped over the lazy dog", the CBOW instances for window size 1 would be:
([the, brown], quick), ([quick, fox], brown), ([brown, jumped], fox), ...
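As a quick illustration, here is a minimal sketch (not the word2vec_basic.py code, just a hypothetical helper) that generates those (context, target) pairs for a given window size:

```python
# Sketch: build CBOW (context, target) instances from a token list.
def cbow_pairs(words, window=1):
    pairs = []
    for i, target in enumerate(words):
        # context words on both sides of the target, clipped at the edges
        context = words[max(0, i - window):i] + words[i + 1:i + 1 + window]
        # keep only instances with a full context, as in the example above
        if len(context) == 2 * window:
            pairs.append((context, target))
    return pairs

words = "the quick brown fox jumped over the lazy dog".split()
print(cbow_pairs(words)[:3])
# [(['the', 'brown'], 'quick'), (['quick', 'fox'], 'brown'), (['brown', 'jumped'], 'fox')]
```

Compared with Skip-Gram, the roles are swapped: the context window becomes the input and the center word becomes the label.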