orobix/fwdgrad

Use FGD to fine-tune the transformer

cyl943123 opened this issue · 1 comment

Hi, Cool Work!

I'm curious about the performance of using FGD to fine-tune a transformer on the GLUE tasks.
Have you done this before?

Thanks!!

Hi @cyl943123, nope, I haven't tried. I think this will be difficult: even for simple MNIST, a subtle change in the hyperparameters (the learning rate, for example) led to instabilities. It would still be nice to see it generalize to other tasks. Have you tried anything in this regard?
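
For anyone who wants to experiment, here is a minimal sketch of a single forward-gradient step using PyTorch's `torch.func` (the function name `fwdgrad_step` and the generic `model`/`loss_fn` arguments are illustrative, not the fwdgrad API): sample a random tangent direction, take a JVP of the loss along it, and update the parameters with the tangent scaled by the resulting directional derivative. Fine-tuning a transformer on GLUE would amount to plugging in that model and its classification loss here.

```python
import torch
from torch.func import functional_call, jvp


def fwdgrad_step(model, loss_fn, inputs, targets, lr=1e-4):
    """One forward-gradient update: g ≈ (∇L · v) v for a random direction v."""
    names = [n for n, _ in model.named_parameters()]
    params = tuple(p.detach() for _, p in model.named_parameters())
    # Random tangent direction, one tensor per parameter.
    tangents = tuple(torch.randn_like(p) for p in params)

    def loss_of_params(*flat_params):
        # Run the model with the given parameters (stateless call).
        param_dict = dict(zip(names, flat_params))
        out = functional_call(model, param_dict, (inputs,))
        return loss_fn(out, targets)

    # Forward-mode AD: loss and its directional derivative along the tangents.
    loss, dir_deriv = jvp(loss_of_params, params, tangents)

    # Forward-gradient update, no backward pass needed.
    with torch.no_grad():
        for p, v in zip(model.parameters(), tangents):
            p -= lr * dir_deriv * v
    return loss.item()
```

As noted above, the learning rate is the delicate part: the variance of the forward-gradient estimate grows with the number of parameters, so a transformer would likely need a much smaller step size (or many tangent samples per step) than the MNIST models in this repo.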