EleutherAI/gpt-neo

GPT-Neo 350M weights?

gangiswag opened this issue · 3 comments

Would it be possible to also share weights for the 350M parameter model, since I don't see it here: https://huggingface.co/EleutherAI

Asking since GPT-Neo 350M would be comparable to BERT-large (345M) in parameter count.

Unfortunately, we don't have a 350M parameter version of GPT-Neo. Would you be able to use a 350M parameter GPT-NeoX model? It's not compatible with the HuggingFace API yet, though.

@gangiswag https://huggingface.co/xhyi/PT_GPTNEO350_ATG/tree/main

I want to stress that this is not an EleutherAI-endorsed model. I have no idea what its contents or performance are.
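
If anyone wants to try that checkpoint anyway, here is a minimal loading sketch with the transformers library. This assumes the linked repo follows the standard GPT-Neo layout on the Hub; it's untested and compatibility is not guaranteed.

```python
# Minimal sketch (untested): load the community checkpoint linked above.
# Assumes the repo exposes a standard GPT-Neo config and tokenizer.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xhyi/PT_GPTNEO350_ATG"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Quick smoke test: generate a short continuation.
inputs = tokenizer("EleutherAI is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```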