The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Primary language: Python. License: Apache-2.0.
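
Since TinyLlama adopts the Llama architecture, its released checkpoints can be loaded with standard Llama-compatible tooling. The sketch below uses Hugging Face `transformers` as one such option; the model ID is an assumption, so substitute whichever TinyLlama checkpoint you actually intend to use from the Hub.

```python
# Minimal sketch: loading a TinyLlama checkpoint with Hugging Face transformers.
# The model ID below is an assumed example, not prescribed by this repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to confirm the model loads and runs.
inputs = tokenizer("The TinyLlama project is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```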