(Studio2) Centralize and minimize prompt handling for LLMs
monorimet opened this issue · 0 comments
monorimet commented
We don't want to lose track of efforts to separate the UI from execution, so this issue exists to remind us to keep the two separate going forward. Currently, a small tweak to the user prompt lives outside the API (in the UI) because of how the UI receives yielded tokens/history from the LLM API.
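A minimal sketch of the intended separation, assuming a streaming chat API: every prompt tweak happens inside the API layer, and the UI only consumes yielded tokens. All names here (`format_prompt`, `chat`, `DEFAULT_SYSTEM`) are illustrative placeholders, not the actual Studio2 API.

```python
# Hypothetical sketch: centralize prompt handling in the API layer so the UI
# never edits prompt text. Names are illustrative, not the real Studio2 API.
from typing import Iterator, List, Tuple

DEFAULT_SYSTEM = "You are a helpful assistant."

def format_prompt(user_prompt: str, history: List[Tuple[str, str]]) -> str:
    """Apply every prompt tweak here, inside the API, not in the UI."""
    turns = [f"<s>[SYS] {DEFAULT_SYSTEM} [/SYS]"]
    for user, assistant in history:
        turns.append(f"[INST] {user} [/INST] {assistant}")
    turns.append(f"[INST] {user_prompt} [/INST]")
    return "\n".join(turns)

def chat(user_prompt: str, history: List[Tuple[str, str]]) -> Iterator[str]:
    """API entry point: yields tokens; the UI just displays them."""
    prompt = format_prompt(user_prompt, history)
    # Stand-in for the real model call; yields the formatted prompt word by word.
    for token in prompt.split():
        yield token
```

With this shape, the UI's only job is to iterate over `chat(...)` and render tokens, so there is no place left in the UI where a prompt tweak could accumulate.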