thunlp/OpenDelta

What is the difference between OpenDelta and adapter-transformers?

fighterhit opened this issue · 1 comment

Hi team, I have recently been investigating methods for fine-tuning PTMs using an adapter (delta) model. I found that the functionality implemented by OpenDelta and adapter-transformers is similar. Is there any difference between them? Thanks!

Thanks for your interest. The key difference is that OpenDelta is a package that does not modify the pre-trained models' code, which means it can potentially be applied to any PyTorch model with a Transformer architecture, even newly emerging ones. In contrast, adapter-transformers modifies the code of Hugging Face Transformers in its implementation. For the key features, see Why OpenDelta.
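To make the "no code modification" point concrete, here is a minimal toy sketch of the underlying idea: small trainable adapter modules are attached to a frozen backbone by wrapping each layer's forward call from the outside, so the backbone's own classes are never edited. This is a hypothetical illustration in plain Python (the `Layer`, `Backbone`, `Adapter`, and `attach_adapters` names are invented for this example), not OpenDelta's actual API.

```python
class Layer:
    """Stand-in for a pre-trained layer: a fixed scale-and-shift."""
    def __init__(self, scale, shift):
        self.scale, self.shift = scale, shift  # "frozen" pre-trained weights

    def forward(self, x):
        return self.scale * x + self.shift


class Backbone:
    """Stand-in for a pre-trained model: two stacked layers."""
    def __init__(self):
        self.layers = [Layer(2.0, 1.0), Layer(0.5, -1.0)]

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x


class Adapter:
    """A small trainable module applied to a layer's output."""
    def __init__(self):
        self.delta = 0.0  # the only "trainable" parameter

    def __call__(self, x):
        return x + self.delta


def attach_adapters(model):
    """Wrap each layer's forward from the outside; no class is edited."""
    adapters = []
    for layer in model.layers:
        adapter = Adapter()
        original_forward = layer.forward
        # Bind the original forward and the adapter as defaults so each
        # wrapper closes over its own pair.
        layer.forward = lambda x, f=original_forward, a=adapter: a(f(x))
        adapters.append(adapter)
    return adapters  # only these would be updated during fine-tuning


model = Backbone()
print(model.forward(1.0))  # backbone alone: 0.5 * (2*1 + 1) - 1 = 0.5

adapters = attach_adapters(model)
adapters[0].delta = 2.0    # "train" the first adapter
print(model.forward(1.0))  # 0.5 * (2*1 + 1 + 2) - 1 = 1.5
```

Because the wrapping happens at runtime on layer instances, the same mechanism can be pointed at any backbone that exposes its layers, which is what lets a library like OpenDelta work across model implementations without forking their source.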