AbanteAI/spice

Spice Prompt Management

biobootloader opened this issue · 0 comments

Prompts are a pain point in AI engineering:

  • defining them inline in your code is awkward
  • you need to deal with textwrap.dedent, etc.
  • editing them and then having to fix line wrapping by hand is frustrating
  • storing them in separate files is better, but then you have to load them from disk yourself
  • sometimes you want to use string formatting, like f"blah blah {x + y} blah blah"
  • in some cases you might actually want multiple versions of prompts tuned for different models
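The first few pain points show up in even a tiny inline example (illustrative only, not spice code):

```python
import textwrap

def build_prompt(x, y):
    # An inline prompt has to be dedented to survive the function's
    # indentation, and any edit to the text means re-wrapping lines by hand.
    return textwrap.dedent(f"""\
        You are a helpful assistant.
        The answer so far is {x + y}.
        Respond concisely.""")
```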

I'm not aware of any solution that solves all of these pain points. A starting point might be a flow like this:

from spice import prompts

prompts.register_dir("path/to/prompts/directory")

# loads prompt from `path/to/prompts/directory/prompt_x.txt`
messages = [{"role": "user", "content": prompts.get_prompt('prompt_x')}]

Maybe the above methods should be part of the spice client?

Regardless, this will also help with future prompt iteration features, because spice will be aware of which parts of your messages came from prompts and could be edited.

A fuller solution could include a new file format for prompts?

I've seen people store prompts in .toml files and fill them in with Jinja templates: https://github.com/Codium-ai/AlphaCodium/blob/f608cb5479d878348c2ffa9b64e8515314366bc2/alpha_codium/settings/code_contests_prompts_fix_solution.toml
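For illustration, a prompt file in that style might look something like this (the table name, keys, and Jinja placeholders below are hypothetical, only loosely modeled on the linked AlphaCodium file):

```toml
[fix_solution]
temperature = 0.3
system = """\
You are given a competitive programming problem and a failing solution.
"""
user = """\
Problem description:
{{ problem_description }}

Failing solution:
{{ solution_code }}

Fix the solution so that it passes the provided tests.
"""
```

One nice property of this layout is that model parameters (like temperature) can live next to the prompt text they were tuned with, which also fits the "multiple versions per model" pain point above.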