# Metaseq
A codebase for working with Open Pre-trained Transformers.
## Using OPT with 🤗 Transformers
The OPT 125M–30B models are now available in Hugging Face Transformers.
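For example, the smallest released checkpoint can be loaded through the standard Transformers API. A minimal sketch (the model identifier `facebook/opt-125m` and the generation settings shown are illustrative defaults, not the only supported configuration):

```python
# Load an OPT checkpoint via Hugging Face Transformers and generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # smallest OPT release; larger sizes follow the same pattern
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Hello, I am a language model"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a short continuation; sampling parameters are left at defaults.
output_ids = model.generate(**inputs, max_new_tokens=20)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

The same `from_pretrained` call works for the larger checkpoints (e.g. `facebook/opt-1.3b`), at the cost of more memory and download time.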
## Getting Started in Metaseq
Follow the setup instructions here to get started.
### Documentation on workflows
### Background Info
## Support
If you have any questions, bug reports, or feature requests regarding either the codebase or the models released in the projects section, please don't hesitate to post on our GitHub Issues page.
Please remember to follow our Code of Conduct.
## Contributing
We welcome PRs from the community!
You can find information about contributing to metaseq in our Contributing document.
## The Team
Metaseq is currently maintained by the CODEOWNERS: Susan Zhang, Stephen Roller, Anjali Sridhar, Naman Goyal, Punit Singh Koura, Moya Chen, and Christopher Dewan.
## License
The majority of metaseq is licensed under the MIT license; however, portions of the project are available under separate license terms:
- Megatron-LM is licensed under the Megatron-LM license