llm-tools

Random collection of tools for interacting with LLMs.

  1. memocache.py caches the results of function calls, which is useful for projects that make bulk API calls (see the caching sketch after this list).
  2. parallel_processor.py makes API calls in parallel (see the thread-pool sketch after this list).
  3. equivalent_model_wrapper.py stores OpenAI model names that currently point to the same underlying model, which is useful when making parallel calls (see the alias sketch after this list).
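
As a rough idea of what the caching in memocache.py is for, the sketch below memoizes a function's results to disk so repeated calls with the same arguments are served from the cache instead of being re-sent to an API. The decorator name, cache directory, and `ask_model` placeholder are hypothetical, not memocache.py's actual interface.

```python
# Illustrative sketch only; names and cache layout are hypothetical.
import functools
import hashlib
import json
import os

CACHE_DIR = ".memocache"  # hypothetical on-disk cache location

def memoize_to_disk(func):
    """Cache a function's JSON-serializable results on disk, keyed by its arguments."""
    os.makedirs(CACHE_DIR, exist_ok=True)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        key_material = json.dumps([func.__name__, args, kwargs], sort_keys=True, default=str)
        key = hashlib.sha256(key_material.encode()).hexdigest()
        path = os.path.join(CACHE_DIR, key + ".json")
        if os.path.exists(path):            # cache hit: reuse the stored result
            with open(path) as f:
                return json.load(f)
        result = func(*args, **kwargs)      # cache miss: call through and store
        with open(path, "w") as f:
            json.dump(result, f)
        return result

    return wrapper

@memoize_to_disk
def ask_model(prompt: str) -> str:
    # Placeholder for a real API call; repeated prompts hit the cache instead.
    return f"response to: {prompt}"
```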
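
For the parallel calls that parallel_processor.py is meant for, a minimal sketch using Python's standard `concurrent.futures` thread pool is shown below. The function names are hypothetical and the real module may do more (retries, rate limiting); the core idea is fanning I/O-bound requests out over a pool of workers.

```python
# Illustrative sketch only; parallel_processor.py's real interface may differ.
from concurrent.futures import ThreadPoolExecutor

def ask_model(prompt: str) -> str:
    # Placeholder for a real (I/O-bound) API call.
    return f"response to: {prompt}"

def process_in_parallel(prompts, max_workers=8):
    """Fan prompts out over a thread pool and return responses in input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(ask_model, prompts))

if __name__ == "__main__":
    print(process_in_parallel(["What is 2 + 2?", "Name a prime number."]))
```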
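
The sketch below illustrates the idea behind equivalent_model_wrapper.py: keep a table of model names that resolve to the same underlying model and cycle through them, for example to spread parallel requests across per-name rate limits. The alias group shown and the helper are hypothetical; check OpenAI's documentation for which names are currently equivalent.

```python
# Illustrative sketch only; the alias groups and round-robin helper are
# hypothetical, not equivalent_model_wrapper.py's actual contents.
import itertools

# Example alias group: a base name and a dated snapshot that (at some point)
# resolved to the same underlying model. Verify before relying on any pairing.
EQUIVALENT_MODELS = {
    "gpt-4o": ["gpt-4o", "gpt-4o-2024-08-06"],
}

def round_robin_models(base_name: str):
    """Cycle through the names that point at the same model, e.g. to spread
    parallel requests across per-model rate limits."""
    return itertools.cycle(EQUIVALENT_MODELS.get(base_name, [base_name]))

models = round_robin_models("gpt-4o")
for prompt in ["first prompt", "second prompt", "third prompt"]:
    print(next(models), "->", prompt)
```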

TODO: add demo.