

OptFormer: Transformer-based framework for Hyperparameter Optimization

This is the code used for the paper Towards Learning Universal Hyperparameter Optimizers with Transformers (NeurIPS 2022).

Installation

All dependencies are listed in requirements.txt and can be installed with `pip install -r requirements.txt`. The two main components are T5X and OSS Vizier.

Usage

Pre-trained OptFormer as a Policy

To use our pre-trained OptFormer (exactly as in the paper), follow these steps:

  1. Download the model checkpoint from [TODO].
  2. Load the model checkpoint into the InferenceModel, as shown in policies_test.py.

The InferenceModel will then be wrapped into the OptFormerDesigner, which follows the same API as a standard OSS Vizier Designer.
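A Vizier-style Designer is driven by a suggest/update loop. The sketch below illustrates only the *shape* of that loop, not the real OptFormer code: the `Trial`/`Designer` classes here are illustrative stand-ins (the actual classes live in OSS Vizier and this repo's policies module), and the uniform random sampling is a placeholder for the Transformer-based InferenceModel query. See policies_test.py for the real usage.

```python
# Minimal sketch of a Designer-style suggest/evaluate/update loop,
# modeled on the OSS Vizier Designer API. All class and parameter
# names here are hypothetical stand-ins for illustration only.
import random
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Trial:
    """Stand-in for a Vizier trial: parameters plus a measured objective."""
    parameters: dict
    objective: Optional[float] = None


class Designer:
    """Illustrative stand-in; OptFormerDesigner follows the same interface."""

    def __init__(self, seed: int = 0):
        self._rng = random.Random(seed)
        self._history: List[Trial] = []

    def suggest(self, count: int = 1) -> List[Trial]:
        # A real OptFormerDesigner would query the pre-trained
        # Transformer (InferenceModel) here; we sample uniformly
        # as a placeholder.
        return [
            Trial({"learning_rate": self._rng.uniform(1e-4, 1e-1)})
            for _ in range(count)
        ]

    def update(self, completed: List[Trial]) -> None:
        # Feed completed trials back so future suggestions improve.
        self._history.extend(completed)


# Typical optimization loop a caller would run against the designer:
designer = Designer(seed=42)
for _ in range(5):
    (trial,) = designer.suggest(1)
    # Evaluate the suggestion (here: a toy objective peaked at lr=0.01).
    trial.objective = -abs(trial.parameters["learning_rate"] - 0.01)
    designer.update([trial])
print(len(designer._history))  # → 5
```

The key point is that the caller only sees `suggest` and `update`; swapping the placeholder sampler for the OptFormer InferenceModel changes the suggestions, not the loop.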

Training the OptFormer (Coming Soon!)

TODO

Disclaimer: This is not an officially supported Google product.