
Evals is a framework for evaluating LLMs and LLM systems, and an open-source registry of benchmarks.
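For illustration only, here is a minimal sketch of the kind of exact-match evaluation such a framework automates. The `Sample` class, `evaluate` function, and stand-in model below are hypothetical and are not the Evals API.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    prompt: str
    ideal: str  # expected answer for exact-match grading

def evaluate(model_fn, samples):
    """Score a model on a benchmark by exact-match accuracy.

    model_fn: any callable mapping a prompt string to a completion string.
    """
    correct = 0
    for sample in samples:
        completion = model_fn(sample.prompt)
        if completion.strip() == sample.ideal.strip():
            correct += 1
    return correct / len(samples)

if __name__ == "__main__":
    # Stand-in "model" so the sketch runs without an API key.
    samples = [Sample("2 + 2 =", "4"), Sample("Capital of France?", "Paris")]
    print(evaluate(lambda prompt: "4", samples))  # 0.5
```

A real framework separates the benchmark data (the registry) from the grading logic, so new evals can be added without changing the evaluation loop.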

Primary language: Python · License: MIT
