I am using the GPU version of XGBoost in Python, and it crashes whenever I try to run .predict. It works on a smaller data set, but on my current problem it does not.
Saving the model, deleting the booster, and then loading the model again should release the memory held by the trained booster so that .predict can run.
    import xgboost as xgb
    import joblib

    # training
    bst = xgb.train(param, dtrain, num_round)

    # save model
    joblib.dump(bst, 'xgb_model.dat')

    # delete the booster to release the memory it holds
    del bst

    # load saved model and predict
    bst = joblib.load('xgb_model.dat')
    preds = bst.predict(dtest)
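If joblib serialization ever gives you trouble, the same save / delete / reload pattern can also be done with XGBoost's native model format. This is only a sketch of that alternative, not what the answer above uses; the 'xgb_model.json' file name is an arbitrary choice, and param, dtrain, num_round, and dtest are assumed to exist as in the snippet above:

    import xgboost as xgb

    # train as before
    bst = xgb.train(param, dtrain, num_round)

    # save with XGBoost's own serializer
    bst.save_model('xgb_model.json')

    # drop the trained booster so its memory can be released
    del bst

    # reload into a fresh Booster and predict
    bst = xgb.Booster()
    bst.load_model('xgb_model.json')
    preds = bst.predict(dtest)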