eugenepentland/landmark-attention-qlora
Landmark Attention: Random-Access Infinite Context Length for Transformers QLoRA
Python · Apache-2.0
Stargazers
- 2tYfX8sgnrx
- adeelahmad (@XohoTech)
- AlpinDale
- Art9681
- bacoco (CACIB)
- balsulami
- bhavitsharma (RelicX)
- blumplan
- constasmile
- dustyatx
- eugenepentland
- fleszar1 (E.U.)
- grimulkan
- gulliverwaite
- hemanthkumarak
- hiyorijl
- howardslee
- ihendley (Cambridge, Massachusetts)
- jonpage0
- kmnis (University of Chicago)
- kunato (KUNANA AI)
- lgwacker (@mundipagg @stone-payments @pagarme)
- lightningRalf
- mleacraft
- muntasir2000
- ofirkris (Tap)
- RyanCraighead (Toronto, CA)
- saivig (New Delhi, India)
- shrikrishnaholla (Bengaluru, India)
- sibbl (we-do.ai)
- smiraldr (Bengaluru)
- spacemiqote (CaoNiMa Inc)
- TheSeamau5 (Entrepreneur)
- worthmining
- xnohat (Cybertizen .Inc)
- zacharyweiss (ReMo Energy, Inc.)