Question
I recently asked a question because I was confused about the weights I was getting for a synthetic dataset I created. The answer I received was that the weights are being normalized; you can see the details here. I'm wondering why LogisticRegressionWithSGD gives normalized weights, while everything is fine with LBFGS in the same Spark implementation. Is it possible that the weights simply hadn't converged?
Weights I'm getting
[0.466521045342,0.699614292387,0.932673108363,0.464446310304,0.231458578991,0.464372487994,0.700369689073,0.928407671516,0.467131704168,0.231629845549,0.46465456877,0.700207596219,0.935570594833,0.465697758292,0.230127949916]
Weights I'm expecting (approximately)
[2,3,4,2,1,2,3,4,2,1,2,3,4,2,1]
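One quick sanity check (a sketch using only the two lists above, not the Spark code itself): if the SGD weights were merely a rescaled version of the expected ones, the element-wise ratio expected/received should be roughly constant. Computing it shows every ratio lands near 4.3, which is consistent with the "weights are being normalized" explanation rather than random non-convergence:

```python
# Weights returned by LogisticRegressionWithSGD (from the question above)
received = [0.466521045342, 0.699614292387, 0.932673108363, 0.464446310304,
            0.231458578991, 0.464372487994, 0.700369689073, 0.928407671516,
            0.467131704168, 0.231629845549, 0.46465456877, 0.700207596219,
            0.935570594833, 0.465697758292, 0.230127949916]

# Weights expected from the synthetic data-generating process
expected = [2, 3, 4, 2, 1, 2, 3, 4, 2, 1, 2, 3, 4, 2, 1]

# Element-wise scale factor between the two weight vectors
ratios = [e / r for e, r in zip(expected, received)]

print(min(ratios), max(ratios))  # all ratios cluster tightly around ~4.3
```

Because the ratios agree to within about 1.6%, the SGD solution points in essentially the same direction as the expected weight vector; only its magnitude differs. Note that logistic regression predictions depend on the weight vector only up to the decision boundary it induces, so a uniformly rescaled weight vector can still classify reasonably even though the raw coefficients look "wrong".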
Source: https://stackoverflow.com/questions/44422875/why-does-my-weights-get-normalized-when-i-perform-logistic-regression-with-sgd-i