This is the code used for the paper Towards Learning Universal Hyperparameter Optimizers with Transformers (NeurIPS 2022).
All dependencies are listed in requirements.txt and can be installed with pip. The two main components are T5X and OSS Vizier.
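For example, from the repository root (ideally inside a fresh virtual environment):

```
pip install -r requirements.txt
```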
To use our pre-trained OptFormer (exactly as-is from the paper), follow these steps:
- Download the model checkpoint from [TODO].
- Load the model checkpoint into the `InferenceModel`, as shown in policies_test.py.
The `InferenceModel` will then be wrapped into the `OptFormerDesigner`, which follows the same API as a standard OSS Vizier `Designer` (see the sketch below).
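A minimal sketch of that flow, assuming hypothetical module paths, a hypothetical checkpoint loader, and an illustrative `OptFormerDesigner` constructor signature; policies_test.py is the authoritative reference, and the OSS Vizier calls follow the standard `pyvizier` API (which may differ slightly by version):

```python
# Hedged sketch only: the optformer module paths, the loading helper, and the
# OptFormerDesigner constructor arguments below are illustrative placeholders.
# See policies_test.py for the exact, supported usage.
from optformer import policies             # assumed location of OptFormerDesigner
from optformer.t5x import inference_utils  # assumed location of InferenceModel
from vizier import pyvizier as vz

# 1. Load the pre-trained checkpoint into an InferenceModel.
#    (Hypothetical loader; the real entry point is demonstrated in policies_test.py.)
inference_model = inference_utils.InferenceModel.load(
    checkpoint_dir='/path/to/downloaded/checkpoint')

# 2. Describe the optimization problem with standard OSS Vizier primitives.
problem = vz.ProblemStatement()
problem.search_space.root.add_float_param('learning_rate', 1e-5, 1e-1)
problem.metric_information.append(
    vz.MetricInformation(name='accuracy', goal=vz.ObjectiveMetricGoal.MAXIMIZE))

# 3. Wrap the model in the OptFormerDesigner and drive it like any Designer.
designer = policies.OptFormerDesigner(problem, model=inference_model)
suggestions = designer.suggest(count=1)
# Evaluate each suggestion, then report the completed trials back via
# designer.update(...), exactly as with any other OSS Vizier Designer.
```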
TODO
Disclaimer: This is not an officially supported Google product.