Create a new conda environment:

```bash
conda env create -f environment.yml
```

Install the project:

```bash
pip install -e .
```
Each experiment requires a config JSON, and there are many examples of config files in `config/`.
To use the config file `config/<name>.json`, run the following command from the root folder:

```bash
python main.py --config <name>
```
You can also include the `--bg` flag if you would like to redirect stderr and stdout to a separate file and save the outputs:

```bash
python main.py --config <name> --bg
```
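For orientation, a config might look like the following. Only the `"test_function"` key is referenced in this README; the other fields are hypothetical placeholders, so consult the examples in `config/` for the actual schema:

```json
{
  "test_function": "branin",
  "model": "exact_gp",
  "num_iterations": 50
}
```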
The Bayesian optimization loop is in `main.py`.
- `models`: the model code for each of the surrogate models we consider.
- `test_functions`: objective functions for benchmark problems.
Our library supports any test function that extends the `BaseTestProblem` class defined in BoTorch (documentation). This class requires an implementation of the `evaluate_true` method, which takes in X values and returns the value of the objective function at those values.
For example, to specify a simple quadratic objective function:

```python
from botorch.test_functions.base import BaseTestProblem
from torch import Tensor

class Toy(BaseTestProblem):
    dim = 1
    _bounds = [(-1.0, 1.0)]  # BaseTestProblem also expects bounds for each dimension

    def evaluate_true(self, X: Tensor) -> Tensor:
        return X.pow(2)
```
Many of the test functions we use in the library are defined in the `test_functions` folder or imported directly from BoTorch.
To add a new test function, modify `get_test_function` in `main.py`. This function parses the string specified under `"test_function"` in the config file and uses it to initialize the test function.
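The dispatch in `get_test_function` can be sketched as follows. This is a minimal, dependency-free stand-in, not the repo's actual code: the registry contents and the placeholder classes are assumptions, and the real registry would map names to `BaseTestProblem` subclasses (local or imported from BoTorch).

```python
# Placeholder test-function classes; the real ones extend BaseTestProblem.
class Branin:
    dim = 2

class Toy:
    dim = 1

# Hypothetical registry mapping config strings to test-function classes.
TEST_FUNCTIONS = {"branin": Branin, "toy": Toy}

def get_test_function(name: str):
    """Look up the "test_function" string from the config and instantiate it."""
    try:
        return TEST_FUNCTIONS[name.lower()]()
    except KeyError:
        raise ValueError(f"Unknown test function: {name!r}")
```

Adding a new test function then amounts to registering one more entry, so the config string alone selects the benchmark.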
The core logic for each model lies in the `model` folder. To add a new surrogate model implementation, add a new class that extends the `Model` class (`model.model`). This class extends the `botorch.models.model.Model` class (documentation).
The two most important functions to implement are `posterior`, which computes the posterior at the specified points (BoTorch documentation), and `fit_and_save`, which fits the model to the queried points.
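In outline, that interface looks like the sketch below. Only the two method names (`posterior`, `fit_and_save`) come from this README; the signatures and the toy subclass are illustrative, and the real classes extend BoTorch's `Model` and work with tensors and posterior objects rather than plain lists.

```python
from abc import ABC, abstractmethod

class Model(ABC):
    """Dependency-free sketch of the surrogate-model interface."""

    @abstractmethod
    def fit_and_save(self, train_X, train_Y):
        """Fit the model to the queried points (and persist its state)."""

    @abstractmethod
    def posterior(self, X):
        """Return the model's posterior prediction at the points X."""

class ConstantMeanModel(Model):
    # Toy surrogate: predicts the mean of all observed targets everywhere.
    def fit_and_save(self, train_X, train_Y):
        self.mean = sum(train_Y) / len(train_Y)

    def posterior(self, X):
        return [self.mean] * len(X)
```

The BO loop only ever talks to these two methods: it calls `fit_and_save` after each new query and `posterior` when scoring candidate points.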
To run the new surrogate model and compare it with other models, modify the `initialize_model` function in `main.py` so that it parses the config file and initializes the model appropriately.
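A sketch of that dispatch is below. The class names and the `"model"` config key are assumptions for illustration; the actual schema is defined by the examples under `config/`.

```python
import json

# Placeholder surrogate classes; the real ones live in the model folder.
class ExactGP: ...
class DeepEnsemble: ...

# Hypothetical registry mapping config strings to surrogate classes.
MODELS = {"exact_gp": ExactGP, "deep_ensemble": DeepEnsemble}

def initialize_model(config: dict):
    """Pick and construct a surrogate from a (hypothetical) "model" key."""
    name = config.get("model", "exact_gp")
    if name not in MODELS:
        raise ValueError(f"Unknown model: {name!r}")
    return MODELS[name]()

config = json.loads('{"model": "deep_ensemble"}')
model = initialize_model(config)
```

With this shape, comparing a new surrogate against existing ones is a one-line registry change plus a new config file.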