RUMBoost is a Python package for estimating Random Utility Models (RUMs) with Gradient Boosted Decision Trees. More specifically, each parameter in the traditional utility function is replaced by an ensemble of regression trees, with appropriate constraints to: i) guarantee the monotonicity of marginal utilities; ii) incorporate alternative-specific attributes; and iii) provide an intrinsically interpretable non-linear utility function, learnt directly from the data.
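To illustrate the idea (not the actual RUMBoost API), the sketch below replaces each linear utility term with a hypothetical per-attribute step function standing in for a monotonically constrained tree ensemble, then computes MNL choice probabilities. All attribute values, split points, and leaf values are made up for illustration.

```python
import math

# Hypothetical stand-in for a boosted tree ensemble: a step function
# over one attribute with decreasing leaf values, mimicking a
# monotonicity-constrained ensemble (more time or cost never
# increases utility).
def step_ensemble(x, splits, values):
    """Return the leaf value for x; len(values) == len(splits) + 1."""
    for split, value in zip(splits, values):
        if x < split:
            return value
    return values[-1]

# Utility of one alternative: a sum of per-attribute ensembles,
# mirroring how each linear term beta_k * x_k is replaced by f_k(x_k).
def utility(attrs, ensembles):
    return sum(f(x) for x, f in zip(attrs, ensembles))

# Illustrative per-attribute ensembles for (travel time, cost).
time_f = lambda t: step_ensemble(t, [20, 40], [0.0, -0.5, -1.2])
cost_f = lambda c: step_ensemble(c, [5, 10], [0.0, -0.3, -0.8])

# Two made-up alternatives: car (30 min, cost 12), rail (45 min, cost 4).
v_car = utility((30, 12), (time_f, cost_f))   # -0.5 + -0.8 = -1.3
v_rail = utility((45, 4), (time_f, cost_f))   # -1.2 +  0.0 = -1.2

# MNL choice probabilities via softmax over the utilities.
denom = math.exp(v_car) + math.exp(v_rail)
p_car = math.exp(v_car) / denom
print(round(p_car, 3))  # → 0.475
```

Constraining each ensemble to a single attribute is what keeps the learnt utility directly plottable and interpretable, attribute by attribute.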
Currently RUMBoost can estimate the following RUMs:
- MNL
- Nested Logit
- Cross-Nested Logit
- An equivalent of the Mixed Effects model
For more details, you can refer to the preprint of our paper.
RUMBoost is available on PyPI. You can install it with the following command:
pip install rumboost
We recommend installing rumboost and its dependencies in a separate environment.
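For example, using the standard library `venv` module (Linux/macOS activation shown; on Windows use `rumboost-env\Scripts\activate`):

```shell
# Create a fresh virtual environment and install rumboost into it.
python -m venv rumboost-env
source rumboost-env/bin/activate
pip install rumboost
```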
The full documentation can be found here. In addition, you can find several examples of how to use RUMBoost in the example folder. Currently, there are nine example notebooks. We recommend working through them in this order:
- simple_rumboost: how to train and plot parameters of a simple RUMBoost model
- feature_interaction: how to include feature interactions for training and plotting
- shared_ensembles: how to train a RUMBoost model with one or more ensembles shared across alternatives
- functional_effect: how to train and plot a functional effect RUMBoost model
- nested: how to train a nested logit RUMBoost model
- cross-nested: how to train a cross-nested logit RUMBoost model
- smoothing_and_vot: how to smooth a RUMBoost output and plot the smoothed version, as well as how to compute and plot the Value of Time (VoT)
- bootstrap: how to test the model robustness
- GPU_and_batch_training: how to train the model with batches and how to compute the gradients on the GPU
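On the VoT computation mentioned above: the Value of Time is conventionally the ratio of the marginal utility of travel time to the marginal utility of cost. The sketch below uses a purely illustrative quadratic utility (not RUMBoost output) and finite differences; the coefficients are assumptions for the example.

```python
# Hypothetical smoothed utility of one alternative as a function of
# travel time (minutes) and cost (currency units); purely illustrative.
def smoothed_utility(time, cost):
    return -0.04 * time - 0.0002 * time**2 - 0.15 * cost

# Marginal utilities via central finite differences.
def marginal_time(f, t, c, h=1e-5):
    return (f(t + h, c) - f(t - h, c)) / (2 * h)

def marginal_cost(f, t, c, h=1e-5):
    return (f(t, c + h) - f(t, c - h)) / (2 * h)

t, c = 30.0, 10.0
du_dt = marginal_time(smoothed_utility, t, c)
du_dc = marginal_cost(smoothed_utility, t, c)

# VoT in currency units per minute: ratio of the marginal (dis)utilities.
vot_per_minute = du_dt / du_dc
print(round(vot_per_minute * 60, 2))  # currency units per hour
```

Smoothing the piecewise-constant ensemble output first (as the notebook does) matters because the derivative of a raw step function is zero almost everywhere, which makes this ratio undefined.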
If you encounter any issues or have ideas for new features, please open an issue. You can also contact us at nicolas.salvade.22@ucl.ac.uk
This project is licensed under the MIT License - see the LICENSE file for details.
Salvadé, N., & Hillel, T. (2024). RUMBoost: Gradient Boosted Random Utility Models. arXiv preprint arXiv:2401.11954