optuna-allennlp

🚀 A demonstration of hyperparameter optimization using Optuna for models implemented with AllenNLP.


[Figure: experimental result]

Optuna with AllenNLP

A demonstration of using Optuna through its AllenNLP integration.

Quick Start

Google Colab

Open in Colab

On your computer

# create virtual environment
python3 -m venv venv
. venv/bin/activate

# install libraries
pip install -r requirements.txt

# train a model using AllenNLP cli
allennlp train -s result/allennlp config/imdb_baseline.jsonnet

# run hyperparameter optimization
python optuna_train.py

# define-and-run style example
python optuna_train_custom_trainer.py --device 0 --target_metric accuracy --base_serialization_dir result

[New!!] Use allennlp-optuna

You can use allennlp-optuna, an AllenNLP plugin for hyperparameter optimization.

# Installation
pip install allennlp-optuna

# Register allennlp-optuna as an allennlp plugin via .allennlp_plugins
# (not required if .allennlp_plugins already exists in your working directory)
echo 'allennlp_optuna' >> .allennlp_plugins

# optimization
allennlp tune config/imdb_optuna.jsonnet config/hparams.json --serialization-dir result
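The search space passed as `config/hparams.json` uses allennlp-optuna's JSON format; a sketch (the parameter names and ranges here are illustrative, not taken from this repository):

```json
[
  {
    "type": "int",
    "attributes": {"name": "embedding_dim", "low": 64, "high": 256}
  },
  {
    "type": "float",
    "attributes": {"name": "dropout", "low": 0.0, "high": 0.5}
  }
]
```

The jsonnet config then reads each sampled value by name via `std.extVar`, e.g. `std.parseInt(std.extVar('embedding_dim'))`.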

Attention!

This demonstration uses a GPU. If you want to run the scripts in this repository without one, set cuda_device = -1 in the AllenNLP config and optuna_config.

Blog Articles