baudm/parseq

After running tune.py, how do I set the base lr?

Closed · 1 comment

Topdu commented

Hi @baudm, thank you for your great work.
I searched for the lr with Ray Tune (CUDA_VISIBLE_DEVICES=0,1 ./tune.py tune.num_samples=20) and got the result: Best hyperparameters found were: {'lr': 0.00021083783614390136}. However, I don't know how to set the base lr in the training config file. I simply replaced the PARSeq code with my custom code. Should the default lr (7e-4) be replaced directly with 0.0002108?

baudm commented

Hello, you can change the learning rate in two ways:

  1. Edit the value of the lr field in the YAML config of the model you want to train, e.g. configs/model/parseq.yaml (see the sketch after this list).
  2. Hydra override via the command line:
    ./train.py model.lr=<new LR>
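
For example, with the tuned value from the question above, option 1 amounts to a one-line edit of configs/model/parseq.yaml (a minimal sketch; only the lr line changes, and the exact neighboring fields depend on your config):

    # configs/model/parseq.yaml
    lr: 0.00021083783614390136  # was 7e-4; value found by Ray Tune

For option 2, the equivalent command-line override would be:

    ./train.py model.lr=0.00021083783614390136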