A question about the hyperparameter
Opened this issue · 1 comment
szhang1112 commented
How are the hyperparameters of XGBoost selected in the ensemble model?
jjanizek commented
Sorry for taking so long to respond to this -- great question! To pick the HPs for the individual XGBoost models in the ensemble, we used a single setting that had tended to work well across a variety of stratification settings and dataset splits in other experiments. Since we were training 100 models, we did not fine-tune the HPs for each one, in order to save time. This potentially makes the individual XGBoost models slightly less optimal, but that is compensated for by the ensembling procedure.
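For anyone landing here later, a minimal sketch of that kind of setup -- many XGBoost models sharing one fixed hyperparameter setting, with predictions averaged -- might look like the following. The specific hyperparameter values, the `train_xgb_ensemble` / `ensemble_predict` helpers, and the use of random train/test splits per model are illustrative assumptions, not the exact procedure or settings from the repo.

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Illustrative fixed hyperparameter setting shared by every model in the
# ensemble (placeholder values, not the settings used in the actual experiments).
FIXED_PARAMS = dict(
    n_estimators=500,
    max_depth=4,
    learning_rate=0.05,
    subsample=0.8,
    colsample_bytree=0.8,
)

def train_xgb_ensemble(X, y, n_models=100, seed=0):
    """Train n_models XGBoost regressors, all with the same HPs,
    each on a different random split, and return the fitted models."""
    models = []
    for i in range(n_models):
        X_tr, _, y_tr, _ = train_test_split(
            X, y, test_size=0.2, random_state=seed + i
        )
        model = xgb.XGBRegressor(random_state=seed + i, **FIXED_PARAMS)
        model.fit(X_tr, y_tr)
        models.append(model)
    return models

def ensemble_predict(models, X):
    """Average the predictions of the individual models."""
    return np.mean([m.predict(X) for m in models], axis=0)
```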