What does it mean to “break symmetry” in the context of neural network programming? [duplicate]
This question already has answers here: Why should weights of Neural Networks be initialized to random numbers? [closed] (9 answers). Closed last year.

I have heard a lot about "breaking the symmetry" in the context of neural network programming and initialization. Can somebody please explain what this means? As far as I can tell, it has something to do with neurons behaving identically during forward and backward propagation if the weight matrix is filled with identical values during initialization.
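To make the question concrete, here is a minimal NumPy sketch (my own illustration, not from any linked answer) of the symmetry problem I think is being described: with a constant weight initialization, every hidden unit computes the same activation and receives the same gradient, so gradient descent can never make them differ. The array shapes and the use of `np.full` for the constant init are my assumptions.

```python
# Minimal sketch (assumed setup): constant init keeps all hidden units identical.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))     # 4 samples, 3 input features (made-up data)
y = rng.normal(size=(4, 1))     # regression targets (made-up data)

W1 = np.full((3, 5), 0.5)       # constant init: all 5 hidden units start identical
W2 = np.full((5, 1), 0.5)

h = np.tanh(x @ W1)             # forward pass: every hidden column is equal
print(np.allclose(h, h[:, :1]))  # True -- all hidden activations match

# Backward pass for mean squared error.
err = (h @ W2) - y
grad_W1 = x.T @ ((err @ W2.T) * (1 - h**2)) / len(x)
print(np.allclose(grad_W1, grad_W1[:, :1]))  # True -- identical gradients too
```

If both prints show `True`, the hidden units receive the same update at every step and remain clones forever; presumably random initialization is what "breaks" this symmetry.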