EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
Python · Apache-2.0
Stargazers
- Abecid (UC Berkeley)
- AdilZouitine (IRT Saint-Exupéry)
- albusdemens (KRAFTON)
- Crowler70
- DomHudson
- edwios (ioStation Ltd.)
- fauzanardh (Jakarta)
- fractalhq
- fronay
- ganler (University of Illinois Urbana-Champaign)
- gegallego (@mt-upc)
- guillefix (Oxford)
- Hellisotherpeople (@Oracle)
- HHousen (Vocode)
- ii64
- itsnotashes (Virgo Cluster)
- joaomarcoscsilva (Universidade de São Paulo)
- jshearer (Rochester, NY)
- kylerchin (@catenarytransit)
- LexiconCode
- loretoparisi (@Musixmatchdev)
- minhlong94 (Ho-chi-minh-city)
- morri2
- niladell (Arctoris)
- nps1ngh (Saarbrücken)
- ochen1 (Canada)
- osipov (CounterFactual.AI)
- parasj (UC Berkeley)
- PDillis (Computer Vision Center)
- RameshArvind (Baltimore, MD)
- ShinEnki (Oxford, England)
- sixcluster (china)
- steven-mi (@GetYourGuide)
- TheDudeFromCI (Hobbyist)
- TMats (The University of Tokyo; @matsuolab @matsuolab-research)
- wexler