What if the root of such a tree gets pruned in xgboost?
data mining
machine learning
decision trees
xgboost
boosting
gradient boosted decision trees
2022-03-03 22:17:21
1 Answer
You will be left with single-node trees. The loss reduction of a split is penalized by gamma (the gain formula is sketched after the code below), but the root itself won't be pruned. This is easy to test:
import xgboost as xgb
import numpy as np
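# note: load_boston was removed in scikit-learn 1.2, so this snippet needs an older scikit-learn version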
from sklearn.datasets import load_boston
X, y = load_boston(return_X_y=True)
model = xgb.XGBRegressor(gamma=1e12) # outrageously large gamma
model.fit(X, y)
# model makes a single prediction for everything:
print(np.unique(model.predict(X)))
# out: [22.532211]
print(y.mean())
# out: 22.532806324110677
# Check out the trees more directly:
model.get_booster().trees_to_dataframe()
# out: frame, one row per tree; just the root node, which is a leaf
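For reference (this formula comes from the xgboost documentation rather than from the test above): a split is only kept if its gain

    Gain = 1/2 * [ G_L^2 / (H_L + lambda) + G_R^2 / (H_R + lambda) - (G_L + G_R)^2 / (H_L + H_R + lambda) ] - gamma

is positive, where G_L, G_R and H_L, H_R are the sums of gradients and hessians over the left and right children. With gamma = 1e12 no candidate split can clear that threshold, so every tree is pruned back to its root, which then acts as a single leaf; the leaf itself carries no gamma penalty.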
(The small difference between that single prediction and y.mean() is due to the learning rate shrinking each single-leaf tree; we are converging toward the mean.)
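A minimal sketch of that convergence, on hypothetical synthetic data (the random data, seed, and parameter values below are illustrative assumptions, not part of the original answer): with a huge gamma every round adds a single-leaf tree, and the constant prediction approaches y.mean() as n_estimators grows.

import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 3.0 + X @ rng.normal(size=5) + rng.normal(scale=0.5, size=500)

for n in (10, 100, 1000):
    # pin base_score so the starting prediction is not already the label mean
    # (newer xgboost versions may estimate base_score from the data)
    model = xgb.XGBRegressor(n_estimators=n, gamma=1e12,
                             learning_rate=0.1, base_score=0.5)
    model.fit(X, y)
    # every tree is a single leaf, so the model predicts one constant value
    print(n, np.unique(model.predict(X)), y.mean())
# the gap to y.mean() shrinks roughly like (1 - learning_rate)**n_estimators,
# since each single-leaf tree moves the prediction a learning_rate-sized step
# toward the mean residual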
