nlp-uoregon/trankit

Upgrade to Adapters library and recent Transformers versions

calpt opened this issue

Hey Trankit team,

First of all, thanks for this great toolkit!

I'm one of the authors and maintainers of AdapterHub, which you use for your adapter implementations. As you might already be aware, we've recently released Adapters as our new main library, succeeding adapter-transformers. Along with many new features, this new package brings one change in particular that might be interesting to your project: disentanglement from the Transformers codebase. This means the adapter implementations now live in a separate namespace, and Transformers can be installed independently.
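For illustration, here's a minimal sketch of what this disentanglement looks like in practice with the new Adapters package; the model name, adapter name, and config string are just example choices, not anything Trankit-specific:

```python
# Minimal sketch: Transformers is installed and used on its own,
# and Adapters attaches adapter support to the existing model object.
# "xlm-roberta-base", "example_adapter", and "seq_bn" are example choices.
import adapters
from transformers import AutoModel

model = AutoModel.from_pretrained("xlm-roberta-base")  # plain Transformers model

adapters.init(model)  # plug adapter support into the model after the fact
model.add_adapter("example_adapter", config="seq_bn")  # bottleneck adapter
model.set_active_adapters("example_adapter")  # activate it for forward passes
```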

Switching to Adapters might bring multiple benefits to this project:

  • Not having to maintain your own copy of adapter-transformers
  • Leveraging the newest Transformers features and bug fixes that are not present in your copy of adapter-transformers
  • Benefiting from our library's ongoing compatibility work with new Transformers, Torch, and Python versions
  • Full compatibility with pre-trained adapter weights (see the sketch after this list)
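On the last point, a minimal sketch of what loading pre-trained adapter weights could look like after the switch; "org/example-adapter" is a placeholder Hugging Face Hub identifier, not an actual Trankit artifact:

```python
# Sketch: pre-trained adapter weights load via the same load_adapter call
# on a model initialized with adapters.init(); the identifier is a placeholder.
adapter_name = model.load_adapter("org/example-adapter")
model.set_active_adapters(adapter_name)
```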

If you'd be interested in this transition, I'd be happy to open a first PR implementing the switch to Adapters. Also happy to discuss any further issues or topics!

Hi @calpt,
Thanks for the update and for letting us know.
Please make a PR and we will review it.
Thanks for your help!

Just a ping: this seems like a pretty important pull request; is there any plan to merge it? I've already merged it into my fork. For my use case, adapting Trankit to use a completely different model, it's probably possible with the bundled repo mentioned above (though not for me...), but it feels far more tractable and clean with the new version. I don't know for certain yet, but that's my impression.

Happy to help in case anything else needed to get this merged!

Hi everyone,
Thanks for your contributions.
I'm doing some testing to make sure there are no issues with pull request #78.
This will take a few days to finish.
Thanks!