liltom-eth/llama2-webui
Run any Llama 2 model locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local Llama 2 backend for generative agents/apps.
Jupyter Notebook · MIT License
Stargazers
- abidlabs
- aespinoza (Structum, Inc.)
- antwgames
- bmilde
- brandnewx (Asia)
- brunocruzfranchi (Hospital Italiano de Buenos Aires)
- camillevitar (Buenos Aires, Argentina)
- Cfor3
- choltha
- cshnai
- dhgrs (Preferred Networks Inc.)
- DLRAM
- duringleaves (@adobe)
- greg-q10
- hdvrai
- JADGardner (University of York)
- jiawen5 (USC)
- jlb1504
- JonaRiley
- kittinan (Bangkok, Thailand)
- lab176344 (MoiiAi Inc)
- Mcxdcyy
- runelk
- Sandalots (Volcanak)
- shantanudev (Chicago)
- shuxiaokai (M78 Nebula)
- Starlight143
- T2Je
- taesiri (Planet Mars)
- TzRain (Southeast University)
- worthmining
- zeroxaa
- zhonglinxie (Peking University)
- zhongwei
- ziligy
- ZongZiWang (@harveyai)