Lightning-AI/lit-llama
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
Python · Apache-2.0
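Since the description highlights LoRA fine-tuning, here is a minimal sketch of the LoRA idea in PyTorch: a frozen linear layer augmented with a trainable low-rank update. This is an illustrative sketch only; the class name `LoRALinear` and the `rank`/`alpha` parameters are assumptions for the example and are not taken from the lit-llama codebase.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (illustrative only)."""

    def __init__(self, in_features: int, out_features: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad = False  # base weights stay frozen during fine-tuning
        # Low-rank factors: A projects down to `rank`, B projects back up.
        self.lora_a = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Output = frozen base projection + scaled low-rank correction.
        return self.base(x) + self.scaling * (x @ self.lora_a.T @ self.lora_b.T)


if __name__ == "__main__":
    layer = LoRALinear(128, 256)
    out = layer(torch.randn(4, 128))
    print(out.shape)  # torch.Size([4, 256])
```

Because only `lora_a` and `lora_b` receive gradients, the number of trainable parameters stays small relative to the frozen base weights, which is the core trade-off LoRA fine-tuning exploits.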
Stargazers
- carmocca (Spain)
- jsyangg (South Korea)
- aog5 (New York, NY)
- luiscape (New York, NY)
- meet-cjli
- gwthompson
- hedrergudene
- stephenLee (Beijing)
- njneeteson (Calgary, AB, Canada)
- TheSeamau5 (Austin, TX)
- awaelchli (Switzerland)
- johnhenning (Cambridge, MA)
- HellenNamulinda (Uganda)
- Syzygianinfern0 (Earth, mostly)
- ajndkr (Amsterdam)
- ntkrnl
- jacobjkwu
- letsgitcracking (localhost)
- rpand002 (Cambridge, MA, US)
- utkd135
- pablovela5620 (Austin, Texas)
- felixdittrich92 (Germany)
- JoanFM (Barcelona, Spain)
- msadikm
- sujitojha1 (Bangalore)
- tzmartin (San Francisco, CA)
- owlwang (Beijing)
- samanyougarg (Entire Cosmos)
- baggiponte
- kristian-georgiev (Cambridge, MA)
- dkden7e (Las Palmas de G.C.)
- DanielFloresDiaz
- davidgovea (Brooklyn, NY)
- DACUS1995 (Bucharest)
- NikolasMarkou (EU - UK - Cyprus)
- the-exile-110