Number of Epochs during Training

Hi, I wonder how many epochs you train PSALM for during the finetuning phase.

In your training script, num_train_epochs is set to 10, but your paper states that 56K iterations with batch_size 64 are used during training. With num_train_epochs=10, the learning rate barely decreases and the loss does not go down for a long time. Should we decrease num_train_epochs to a smaller number, for example 3 or 4?
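
For context, here is a minimal sketch of what I mean, assuming the script uses HuggingFace's TrainingArguments as LLaVA-style codebases typically do; everything here other than the 56K iterations and batch size 64 is a placeholder, not the actual PSALM config:

```python
from transformers import TrainingArguments

# Hypothetical sketch: pinning max_steps (which overrides num_train_epochs
# in HuggingFace's Trainer) ties the LR schedule to a fixed iteration count,
# so the decay finishes exactly at the intended number of updates.
args = TrainingArguments(
    output_dir="./psalm-finetune",      # placeholder path
    max_steps=56_000,                   # total iterations reported in the paper
    per_device_train_batch_size=64,     # effective batch size 64 (single device assumed)
    learning_rate=2e-5,                 # placeholder, not necessarily the paper's value
    lr_scheduler_type="cosine",         # assumed scheduler
    warmup_ratio=0.03,                  # placeholder warmup
)
```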

Thanks!

Hi @YunzeMan,
We did 56K iterations in total during training. If you train on segmentation data only, the iteration count for 10 epochs can be calculated as:

(~120,000 (COCO) + ~120,000 (RefCOCO) + ~120,000 (COCO-Interactive)) × 10 epochs / 64 (batch size) ≈ 56K
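
In code, the same back-of-the-envelope check (dataset sizes are approximate):

```python
# Approximate dataset sizes from above; the result matches the ~56K
# iterations reported in the paper.
coco, refcoco, interactive = 120_000, 120_000, 120_000
epochs, batch_size = 10, 64

steps = (coco + refcoco + interactive) * epochs // batch_size
print(steps)  # 56250 -> ~56K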

When adding the QA dataset, we keep this total number of iterations and take an average over these four tasks.
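
One possible reading of "an average over these four tasks" is uniform task sampling while the total step budget stays fixed; below is a hypothetical sketch of that idea (task names are made up, this is not the actual PSALM dataloader):

```python
import random

# Hypothetical: keep the 56K-step budget and draw each batch's task
# uniformly, so every task contributes roughly an equal share of steps.
TASKS = ["coco_panoptic", "refcoco", "coco_interactive", "vqa"]  # assumed names

def next_task(rng: random.Random) -> str:
    return rng.choice(TASKS)

rng = random.Random(0)
counts = {t: 0 for t in TASKS}
for _ in range(56_000):
    counts[next_task(rng)] += 1
print(counts)  # each task gets roughly 14K of the 56K steps
```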

If the loss behavior is confusing, you can send your loss curve for each loss term and maybe I can help you find the problem.