training costs for fine-tuning LLaMA (CapsFus-LLaMA)
YoojLee opened this issue · 2 comments
YoojLee commented
Hi, thanks for such great work!
I would like to ask about the training cost for fine-tuning LLaMA2-13B on the caption fusion task. If possible, please let me know which GPUs you used and how many days (or hours) it took!
yqy2001 commented
Thank you for your interest. Fine-tuning takes about 1-2 days on 8 A800-80G GPUs using Alpaca's codebase, since only 2M samples are needed (2 epochs).
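In case it helps others, below is a rough sketch of what such a supervised fine-tuning run could look like with the HuggingFace Trainer. This is not the authors' actual script (they follow Alpaca's codebase); the model path, data file name, batch size, learning rate, and other hyperparameters are illustrative assumptions, and only the 2-epoch setting comes from the reply above.

```python
# Minimal sketch (assumed setup, not the CapsFus-LLaMA training script):
# fine-tune LLaMA-2-13B on a JSON file of caption-fusion examples.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_PATH = "meta-llama/Llama-2-13b-hf"   # assumed base checkpoint
DATA_PATH = "caption_fusion_2m.json"       # hypothetical fused-caption data

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH)

def tokenize(example):
    # Each record is assumed to hold the prompt (raw + synthetic captions)
    # and the target fused caption joined into a single "text" field.
    return tokenizer(example["text"], truncation=True, max_length=512)

dataset = load_dataset("json", data_files=DATA_PATH, split="train")
dataset = dataset.map(tokenize, remove_columns=dataset.column_names)

# Causal-LM collator copies input_ids into labels and masks padding tokens.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="capsfus-llama-13b",
    num_train_epochs=2,                # 2 epochs, as in the reply above
    per_device_train_batch_size=4,     # illustrative; tune to GPU memory
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    warmup_ratio=0.03,
    lr_scheduler_type="cosine",
    bf16=True,
    logging_steps=10,
    save_strategy="epoch",
)

Trainer(model=model, args=args, train_dataset=dataset,
        data_collator=collator).train()
```

For multi-GPU training (e.g. the 8x A800 setup mentioned above), the same script would typically be launched with `torchrun --nproc_per_node=8` plus FSDP or DeepSpeed options, as in Alpaca's README.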
YoojLee commented
Thanks for the quick reply!