Question
I want to reproduce LightGBM's default loss function for regression as a custom loss function. This is what I tried:

    lgb.train(params=params, train_set=dtrain, num_boost_round=num_round, fobj=default_mse_obj)
With default_mse_obj being defined as:
    def default_mse_obj(y_true, y_pred):
        residual = y_true - y_pred.get_label()
        grad = -2.0 * residual
        hess = 2.0 + (residual * 0)  # constant Hessian of 2, broadcast to the residual's shape
        return grad, hess
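For reference, I took grad and hess from the plain squared error: for a loss of (prediction - label)^2, the gradient with respect to the prediction is 2*(prediction - label), i.e. -2*residual, and the second derivative is the constant 2, which is what the code above is meant to compute.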
However, the eval metrics are different for the default "regression" objective compared to the custom loss function defined above. I would like to know: what is the default loss function used by LightGBM for the "regression" objective?
Answer 1:
As you can see in the LightGBM source code, this is the default loss function for the regression task:
    import numpy as np

    def default_mse_obj(y_pred, dtrain):
        y_true = dtrain.get_label()
        grad = y_pred - y_true      # gradient of 0.5 * (y_pred - y_true)**2
        hess = np.ones(len(grad))   # constant Hessian of 1
        return grad, hess
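Here is a minimal sketch of how one might check this, training once with the built-in "regression" objective and once with the reimplementation above passed through fobj, then comparing the MSE on the training data. It assumes the fobj keyword from the question, which is available in LightGBM versions before 4.0 (in 4.x the callable is passed as the objective parameter instead); the toy data and settings below are made up for illustration.

    import numpy as np
    import lightgbm as lgb

    # Made-up toy regression data, for illustration only
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=500)

    params = {"learning_rate": 0.1, "verbose": -1}

    # Built-in squared-error objective
    booster_builtin = lgb.train(
        dict(params, objective="regression"),
        lgb.Dataset(X, label=y),
        num_boost_round=50,
    )

    # The same loss expressed as a custom objective (fobj exists in LightGBM < 4.0)
    booster_custom = lgb.train(
        params,
        lgb.Dataset(X, label=y),
        num_boost_round=50,
        fobj=default_mse_obj,
    )

    # The two boosters may still differ slightly: the built-in objective boosts
    # from the label mean by default, while a custom objective starts from a zero score.
    print("built-in MSE:", np.mean((booster_builtin.predict(X) - y) ** 2))
    print("custom   MSE:", np.mean((booster_custom.predict(X) - y) ** 2))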
Source: https://stackoverflow.com/questions/60290304/reproduce-lightgbm-custom-loss-function-for-regression