Multi-output regression in XGBoost

借酒劲吻你 2021-02-05 04:15

Is it possible to train a model in XGBoost that has multiple continuous outputs (multi-output regression)? What would the objective be for training such a model?

Thanks in advance

1 Answer
  • 2021-02-05 04:40

    My suggestion is to use sklearn.multioutput.MultiOutputRegressor as a wrapper around xgb.XGBRegressor. MultiOutputRegressor fits one regressor per target and only requires that the underlying estimator implement fit and predict, which xgboost's sklearn API does.

    import numpy as np
    import xgboost as xgb
    from sklearn.multioutput import MultiOutputRegressor

    # generate some noisy linear data: 1000 samples, 10 features, 3 targets
    X = np.random.random((1000, 10))
    a = np.random.random((10, 3))
    y = np.dot(X, a) + np.random.normal(0, 1e-3, (1000, 3))

    # fit one XGBRegressor per target
    # ('reg:squarederror' is the current name of the deprecated 'reg:linear' objective)
    multioutputregressor = MultiOutputRegressor(
        xgb.XGBRegressor(objective='reg:squarederror')).fit(X, y)

    # per-target mean squared error on the training data
    print(np.mean((multioutputregressor.predict(X) - y)**2, axis=0))  # ~ 0.004, 0.003, 0.005
    

    This is probably the easiest way to regress multi-dimensional targets with xgboost, as you would not need to change any other part of your code (if you were already using the sklearn API). For example, the wrapped model drops straight into the usual scikit-learn tooling, as sketched below.
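
    A minimal sketch (assuming the X, y and imports from the snippet above; the estimator__ prefix is how MultiOutputRegressor exposes the inner XGBRegressor's parameters, and the grid values are only illustrative):

    from sklearn.model_selection import GridSearchCV

    search = GridSearchCV(
        MultiOutputRegressor(xgb.XGBRegressor(objective='reg:squarederror')),
        param_grid={'estimator__max_depth': [3, 6],
                    'estimator__n_estimators': [100, 300]},
        scoring='neg_mean_squared_error',
        cv=3,
    )
    search.fit(X, y)
    print(search.best_params_)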

    However, this approach does not exploit any relationships between the targets, since each regressor is fitted independently. If you want to model such relationships, you could try designing a customized objective function; a rough sketch of what that looks like is given below.
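
    This is only a hypothetical starting point, assuming a reasonably recent xgboost whose sklearn API accepts a callable objective(y_true, y_pred) returning the gradient and hessian. The plain squared-error version below does not yet couple the targets; to do that you would also need a booster that predicts all targets jointly rather than one wrapped regressor per target.

    import numpy as np
    import xgboost as xgb

    def squared_error(y_true, y_pred):
        # gradient and hessian of 0.5 * (y_pred - y_true)**2 w.r.t. y_pred;
        # a coupled loss would mix the residuals of the different targets here
        grad = y_pred - y_true
        hess = np.ones_like(y_pred)
        return grad, hess

    reg = xgb.XGBRegressor(objective=squared_error, n_estimators=100)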
