What is a projection layer in the context of neural networks?

Asked by 暖寄归人 on 2021-01-30 02:14

I am currently trying to understand the architecture behind the word2vec neural-network learning algorithm, which represents words as vectors based on their context.

3 Answers
  •  Answered by 野的像风 on 2021-01-30 02:52

    The continuous bag-of-words (CBOW) model predicts a single word given its prior and future context words: the result is thus contextual.

    The inputs are the weight vectors looked up for the prior and future context words, and every position shares the same weights; the complexity / parameter count of this model is therefore much smaller than that of many other NN architectures.

    Regarding what the projection layer is, from the paper you cited:

    the non-linear hidden layer is removed and the projection layer is shared for all words (not just the projection matrix); thus, all words get projected into the same position (their vectors are averaged).

    So the projection layer is a single set of shared weights, and no activation function is indicated.

    Note that the weight matrix between the input and the projection layer is shared for all word positions in the same way as in the NNLM

    So the hidden layer is in fact represented by this single set of shared weights, which, as you correctly implied, is identical across all of the input nodes.
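
    The shared projection described above can be sketched in a few lines of NumPy. This is a minimal illustration, not word2vec itself: the vocabulary size, embedding dimension, and word ids are made up, and the matrix is random rather than trained. It shows the one operation the quoted passage describes: every context word is looked up in the same weight matrix and the resulting vectors are averaged, with no activation function.

    ```python
    import numpy as np

    # Hypothetical sizes: a vocabulary of 10 words, 4-dimensional embeddings.
    vocab_size, embed_dim = 10, 4
    rng = np.random.default_rng(0)

    # The projection layer is a single weight matrix shared by every
    # context position -- one embedding row per vocabulary word.
    projection = rng.normal(size=(vocab_size, embed_dim))

    def project_context(context_ids):
        """Look up each context word in the shared matrix and average:
        all words are 'projected into the same position'."""
        return projection[context_ids].mean(axis=0)

    # Four surrounding context words (ids are illustrative).
    hidden = project_context([2, 5, 7, 1])
    print(hidden.shape)  # (embed_dim,) -- no activation function is applied
    ```

    In the real model this averaged vector feeds a softmax (or a sampled approximation) that predicts the center word, and the rows of `projection` are the learned word vectors.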
