How are the hyperparameters of XGBoost selected in the ensemble model?

Sorry for taking so long to respond -- great question. To pick the hyperparameters for the individual XGBoost models in the ensemble, we used a single setting that had tended to work well across a variety of stratification settings and dataset splits in other experiments. Since we were training 100 models, we skipped per-model fine-tuning to save time. This may make each individual XGBoost model slightly suboptimal, but the ensembling procedure compensates for that.
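A minimal sketch of the procedure described above: train many boosted-tree models with one fixed, untuned hyperparameter setting and average their predictions. The thread does not say how the ensemble members differ, so bootstrap resampling plus per-model seeds is an assumption here, and scikit-learn's `GradientBoostingClassifier` stands in for XGBoost (the interface is analogous); `fixed_hps` and the data are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier  # stand-in for XGBoost

# Synthetic data for illustration only.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# One fixed hyperparameter setting reused for every ensemble member
# (no per-model tuning, as described in the reply).
fixed_hps = dict(n_estimators=50, max_depth=3, learning_rate=0.1)

rng = np.random.default_rng(0)
n_models = 10  # the thread mentions 100 models; kept small here
member_probs = []
for i in range(n_models):
    # Assumed diversification: bootstrap resample + per-model seed.
    idx = rng.choice(len(X), size=len(X), replace=True)
    model = GradientBoostingClassifier(random_state=i, **fixed_hps)
    model.fit(X[idx], y[idx])
    member_probs.append(model.predict_proba(X)[:, 1])

# Ensemble prediction: average the members' class-1 probabilities.
ensemble_prob = np.mean(member_probs, axis=0)
```

The averaging step is what offsets the slight suboptimality of each untuned member: errors that individual models make on their particular resamples tend to cancel out.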