How can I get the relative importance of features of a logistic regression for a particular prediction?

Submitted by 醉酒当歌 on 2019-12-03 15:14:34

If you want the importance of the features for a particular decision, why not reproduce the decision_function step by step? Since scikit-learn provides decision_function, you can check whether you get the same value. For linear classifiers, the decision function is simply:

intercept_ + coef_[0]*feature[0] + coef_[1]*feature[1] + ...

The importance of feature i is then just coef_[i]*feature[i]. This is, of course, similar to looking at the magnitude of the coefficients, but since each coefficient is multiplied by the actual feature value, and since this is exactly what happens under the hood, it is probably your best bet.
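Here is a minimal sketch of that idea, using a built-in scikit-learn dataset as a stand-in for your data (the dataset choice and the sample X[0] are just placeholders):

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
clf = LogisticRegression(max_iter=5000).fit(X, y)

x = X[0]  # the single sample whose prediction we want to explain

# Per-feature contribution to the decision function for this sample
contributions = clf.coef_[0] * x

# Reconstruct the decision function and compare with scikit-learn's value
manual_score = clf.intercept_[0] + contributions.sum()
sklearn_score = clf.decision_function(x.reshape(1, -1))[0]
print(np.isclose(manual_score, sklearn_score))  # True

# Features ranked by the magnitude of their contribution to this decision
ranking = np.argsort(np.abs(contributions))[::-1]
for i in ranking[:5]:
    print(f"feature {i}: contribution {contributions[i]:.3f}")
```

A positive contribution pushes this particular sample towards the positive class, a negative one towards the negative class.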

I suggest using eli5, which already has this kind of thing implemented.

As for your actual question: "What is the best way to interpret the importance of each feature, at the moment of a decision, with a linear classifier?"

I would say the answer is the show_weights() function from eli5.

Furthermore, this works with many other classifiers as well.
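A rough sketch of how eli5 is typically used for this, assuming the clf and x from the snippet above and eli5's show_weights / show_prediction helpers (show_prediction is the per-decision counterpart of show_weights):

```python
import eli5

# Global view: the learned coefficients (weights) of the classifier
eli5.show_weights(clf)  # renders as HTML in a Jupyter notebook

# Local view: per-feature contributions for one specific sample
eli5.show_prediction(clf, x)  # also an HTML display object
```

Outside a notebook, eli5.format_as_text(eli5.explain_prediction(clf, x)) should give a plain-text version of the same explanation.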

For more information, see the related question linked above.
