How to load pretrained checkpoint
Mountchicken opened this issue · 2 comments
Mountchicken commented
Hi @PhoenixZ810
Thanks for the great work!
How should I load the model weights during finetuning? I have already downloaded the weights for mgllava/vicuna7b/iter_10805.pth, but I did not see an option to load the weights in xtuner/tools/train.py. Additionally, after completing the first stage of pretraining, how should I load the MLP Projector weights for the second stage of finetuning?
PhoenixZ810 commented
Thank you for your attention.
As I mentioned in the Before Train section, please update pretrained_pth in the config if you want to fine-tune MG-LLaVA.
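For reference, a minimal sketch of what that looks like, assuming a standard xtuner-style Python config where pretrained_pth is a top-level variable (the exact path and config filename below are illustrative):

```python
# Fine-tune config snippet (xtuner-style Python config).
# Point pretrained_pth at the stage-1 pretrain checkpoint; the MLP projector
# weights saved during pretraining should be restored from this same file.
pretrained_pth = './checkpoints/mgllava/vicuna7b/iter_10805.pth'  # illustrative path to the downloaded .pth

# Then launch fine-tuning as usual, e.g.:
#   python xtuner/tools/train.py path/to/your_finetune_config.py
```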
If you want to run evaluation, please set the model path via --checkpoint in test.sh.
Hope this helps! 😊
Mountchicken commented
That works. Thanks