hackclub/sprig

Add instrumentation / logging to LLM backend

Closed this issue · 1 comments

Please ensure that we have robust logging that captures the full payloads sent to and from the LLM service (this logging should be LLM provider-agnostic). At minimum, we need to be able to report how many AI-assisted interactions we enabled. Ideally, we'd also save all requests and responses somewhere for future analysis and iteration/improvement.
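A minimal sketch of what provider-agnostic logging could look like, assuming a generic provider interface. All names here (`LlmProvider`, `withLogging`, etc.) are hypothetical and not taken from the actual hackclub/llm-api code:

```typescript
// Hypothetical sketch: wrap any LLM provider so every request/response
// payload is recorded and an interaction counter is kept. The interface
// and helper names are illustrative, not the real llm-api API.

interface LlmProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

interface LogEntry {
  provider: string;
  timestamp: string;
  request: string;
  response: string;
}

// In-memory store; a real deployment would persist these for analysis.
const entries: LogEntry[] = [];
let interactionCount = 0;

function withLogging(provider: LlmProvider): LlmProvider {
  return {
    name: provider.name,
    async complete(prompt: string): Promise<string> {
      const response = await provider.complete(prompt);
      interactionCount += 1; // "how many AI-assisted interactions"
      entries.push({
        provider: provider.name,
        timestamp: new Date().toISOString(),
        request: prompt,   // full request payload
        response,          // full response payload
      });
      return response;
    },
  };
}

// Usage with a stub provider:
const stub: LlmProvider = {
  name: "stub",
  complete: async (p) => `echo: ${p}`,
};

const logged = withLogging(stub);
logged.complete("hello").then(() => {
  console.log(interactionCount, entries[0].request);
});
```

Because the wrapper only depends on the `LlmProvider` interface, swapping the underlying provider (OpenAI, Anthropic, etc.) leaves the logging unchanged.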

Implemented since hackclub/llm-api#6