lit-llama

Simple (fast) transformer inference in PyTorch with torch.compile + lit-llama code

Primary language: Python
