microllm

My own implementation for running inference on local LLMs.
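
As an illustration of what local LLM inference involves, here is a minimal sketch using Hugging Face transformers as a stand-in. This is not microllm's actual API; the model name is an assumption and any locally cached causal language model would work.

```python
# Minimal local inference sketch (illustration only, not microllm's API).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical stand-in; substitute any local causal LM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "Local LLM inference means"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    # Greedy decoding of a short continuation
    output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```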

Primary language: Python

License: GNU Affero General Public License v3.0 (AGPL-3.0)
