Allen0307/AdapterBias

L0 regularization in AdapterBias

Opened this issue · 2 comments

Hello, after reading the paper I still can't understand the "L0 regularization in AdapterBias" section, and I can't find the corresponding code. Could you please share the code for L0 regularization in AdapterBias?

I'd also like to know.

Hi,
Our implementation of L0 regularization is the same as in diff pruning (https://arxiv.org/abs/2012.07463). The code can be found here: https://github.com/dguo98/DiffPruning/blob/main/examples/run_glue_diffpruning.py. However, applying L0 regularization to AdapterBias did not perform well, so we recommend fine-tuning AdapterBias directly.
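For anyone who just wants the general idea, here is a minimal sketch of L0 regularization via the hard-concrete relaxation that diff pruning builds on (Louizos et al., 2018). It is not the AdapterBias or DiffPruning code; the class and parameter names (`L0Bias`, `bias_diff`, `log_alpha`) are illustrative only.

```python
# Minimal sketch: an additive bias gated by hard-concrete samples, so that an
# expected-L0 penalty can drive unused entries to exactly zero.
# Hyperparameters beta/gamma/zeta follow the usual hard-concrete defaults.
import torch
import torch.nn as nn


class L0Bias(nn.Module):
    def __init__(self, dim, beta=2 / 3, gamma=-0.1, zeta=1.1):
        super().__init__()
        self.bias_diff = nn.Parameter(torch.zeros(dim))   # learned bias shift
        self.log_alpha = nn.Parameter(torch.zeros(dim))   # gate logits
        self.beta, self.gamma, self.zeta = beta, gamma, zeta

    def sample_gate(self):
        # Stretched, rectified hard-concrete sample (training-time gate in [0, 1]).
        u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
        s = torch.sigmoid((torch.log(u) - torch.log(1 - u) + self.log_alpha) / self.beta)
        s = s * (self.zeta - self.gamma) + self.gamma
        return s.clamp(0.0, 1.0)

    def forward(self, hidden):
        # Add the gated bias to the hidden states.
        return hidden + self.sample_gate() * self.bias_diff

    def expected_l0(self):
        # Expected number of non-zero gates; added (times a small weight)
        # to the task loss as the L0 penalty.
        return torch.sigmoid(
            self.log_alpha
            - self.beta * torch.log(torch.tensor(-self.gamma / self.zeta))
        ).sum()
```

During training, the total objective would look like `loss = task_loss + l0_weight * module.expected_l0()`, with `l0_weight` a small coefficient to be tuned; as noted above, this did not help AdapterBias in our experiments.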