How to use wandb sweep for hyperparameter search when finetuning with llama2
Closed this issue · 1 comment
PurvangL commented
Describe the solution you'd like
I would like to do a hyperparameter search with wandb sweep. Right now I can't find an option to add a sweep config in the YAML file. How can I enable wandb sweep and fine-tune the llama2 model?
https://wandb.ai/aarora/Nvidia%20NeMO/reports/Train-Optimize-Analyze-Visualize-and-Deploy-Models-for-Automatic-Speech-Recognition-with-NVIDIA-s-NeMo--VmlldzoxNzI0ODEw
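For context, this is roughly how I would expect to drive the sweep from Python. The hyperparameter names (`lr`, `batch_size`) and the `launch_finetuning` wrapper below are placeholders for illustration only, not existing NeMo config keys or APIs:

```python
import wandb

# Sketch of the sweep I'd like to run. The hyperparameter names
# (lr, batch_size) and launch_finetuning() are placeholders, not
# actual NeMo config keys or APIs.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "lr": {"min": 1e-5, "max": 1e-3},
        "batch_size": {"values": [4, 8, 16]},
    },
}

def train():
    # Each agent run receives its hyperparameters via wandb.config.
    run = wandb.init()
    lr = run.config.lr
    batch_size = run.config.batch_size
    # launch_finetuning (hypothetical) would override the NeMo YAML/Hydra
    # config with these values, run fine-tuning, and log "val_loss" to wandb.
    # launch_finetuning(lr=lr, batch_size=batch_size)
    run.finish()

sweep_id = wandb.sweep(sweep_config, project="llama2-finetune")
wandb.agent(sweep_id, function=train, count=10)
```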
PurvangL commented
@okuchaiev Following up on this issue. Please let me know if any other information is needed.