lixin4ever/BERT-E2E-ABSA

about training time

xingbw opened this issue · 3 comments

Hi, I really appreciate your work, and I'm wondering how long it takes to train.

For the models without introducing recurrent unit (i.e., BERT-Linear, BERT-SAN, BERT-TFM, BERT-CRF), one complete run of training on laptop14 will cost about 15-20 minutes (verified on NVIDIA GTX 1080, max_steps=1500, n_gpus=3, per_gpu_train_batch_size=32).
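That setup could be reproduced with a command along these lines. This is only a sketch: the script name and flag names here are assumptions modeled on HuggingFace-style training scripts, so check the repo's own argument parser for the actual interface.

```shell
# Hypothetical invocation -- script and flag names are assumptions,
# not verified against this repo's argparse setup
CUDA_VISIBLE_DEVICES=0,1,2 python main.py \
    --task_name laptop14 \
    --absa_type linear \
    --max_steps 1500 \
    --per_gpu_train_batch_size 32 \
    --do_train
```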

The training time becomes 27-30 minutes when the same configurations are applied to BERT-GRU.

OK, I see. Thanks very much!

Hi lixin4ever,

I am a newbie to the field of NLP and just stumbled upon your paper during my literature search for one of my current projects. I appreciate your work, and your model might be of help.

Is there a way to manually change the number of training epochs in the fast_run.py script? I would like to reduce the runtime for test cases, so I could train for just a single epoch and only increase the runtime if needed.
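If the script controls training length via max_steps rather than epochs (as the configuration quoted above suggests), a single epoch can be approximated by converting the dataset size and batch configuration into an equivalent step count. A minimal sketch; the function name and the example numbers are mine, not from the repo, and it assumes no gradient accumulation:

```python
import math


def steps_for_epochs(num_examples: int, per_gpu_batch_size: int,
                     n_gpus: int, epochs: int) -> int:
    """Convert a target number of epochs into an equivalent max_steps value.

    Assumes each optimization step consumes per_gpu_batch_size * n_gpus
    training examples (i.e. no gradient accumulation).
    """
    steps_per_epoch = math.ceil(num_examples / (per_gpu_batch_size * n_gpus))
    return steps_per_epoch * epochs


# e.g. a hypothetical 3000-example training set with the settings quoted above
print(steps_for_epochs(3000, 32, 3, epochs=1))  # 32 steps, roughly one epoch
```

With a value like this in hand, passing it as max_steps should give approximately the desired number of epochs.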

Thanks a lot in advance for helping me out.

Best regards