Standalone chat-memory / history prompt module
Closed this issue · 1 comment
Hi, I like your project; it's one of the first I've seen that takes the token budget into consideration. I have been looking into this for a while, outlining what I think could make a useful standalone library for chat memory. I'm curious whether you would be interested in working on this together?
I've outlined and started here: https://github.com/neural-loop/chat-memory/blob/main/SPEC.md
Here is the general overview I've been looking at:
- Multiple buffers
  - messages (a log of prompts, responses, timestamps, user IDs, and token counts)
  - medium term (a summarized version)
  - long term (a very condensed history)
  - the total buffer size is set via config (e.g. 4096 tokens), and a person could allocate, say, 30% to long term, 50% to medium term, and 20% to messages
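The percentage split above could be sketched roughly like this (the function name and return shape are my own assumptions, not from the spec):

```python
def split_budget(total_tokens, long_pct, medium_pct, messages_pct):
    """Allocate a fixed token budget (e.g. 4096) across the three memory tiers.

    Percentages must sum to 100; integer division means a token or two
    may be lost to rounding.
    """
    assert long_pct + medium_pct + messages_pct == 100
    return {
        "long_term": total_tokens * long_pct // 100,
        "medium_term": total_tokens * medium_pct // 100,
        "messages": total_tokens * messages_pct // 100,
    }

# The example split from the overview: 30% / 50% / 20% of 4096 tokens.
budget = split_budget(4096, 30, 50, 20)
# → {'long_term': 1228, 'medium_term': 2048, 'messages': 819}
```

Keeping the split in config (rather than hard-coding it) would let each deployment tune how much history detail it trades for summary depth.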
- Keyword tagging
  - messages are tagged with keywords
  - prompts containing those keywords pull the matching messages into the returned history prompt
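The keyword-recall idea could be a simple set intersection between words in the new prompt and each logged message's tags (a minimal sketch; the function and message shape are hypothetical):

```python
import re

def recall_by_keywords(prompt, tagged_messages):
    """Return logged messages whose keyword tags appear in the new prompt."""
    # Tokenize the prompt into lowercase words, ignoring punctuation.
    words = set(re.findall(r"\w+", prompt.lower()))
    return [
        m for m in tagged_messages
        if words & {k.lower() for k in m["keywords"]}
    ]

log = [
    {"text": "We deploy on Fridays", "keywords": ["deploy"]},
    {"text": "Lunch was good", "keywords": ["lunch"]},
]
hits = recall_by_keywords("When do we deploy?", log)
# hits contains only the message tagged "deploy"
```

A real implementation would also need to count the recalled messages against the `messages` share of the token budget before appending them.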
- Configurable storage options
  - localStorage
  - database / SQLite / cloud
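To make storage swappable, each backend could expose the same tiny key-value interface; here is a hypothetical SQLite-backed example (class and method names are my own, not from the spec):

```python
import json
import sqlite3

class SqliteStore:
    """One possible storage backend: a key-value table in SQLite.

    Other backends (localStorage in the browser, a cloud database)
    could implement the same put/get interface.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memory (key TEXT PRIMARY KEY, value TEXT)"
        )

    def put(self, key, value):
        # JSON-encode so buffers (lists of message dicts) round-trip cleanly.
        self.db.execute(
            "INSERT OR REPLACE INTO memory VALUES (?, ?)",
            (key, json.dumps(value)),
        )

    def get(self, key):
        row = self.db.execute(
            "SELECT value FROM memory WHERE key = ?", (key,)
        ).fetchone()
        return json.loads(row[0]) if row else None

store = SqliteStore()
store.put("messages", [{"text": "hi", "tokens": 2}])
```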
- Configurable summary & keywords
  - https://www.npmjs.com/package/node-summary
  - engines: https://www.oneai.com/blog/creating-meaningful-article-summaries-using-python
  - Python via API, using Sumy or Gensim
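As a stand-in for the Sumy/Gensim engines linked above, even a naive frequency-based extractive summarizer shows the pluggable shape (this is my own sketch, not code from any of those libraries):

```python
import re
from collections import Counter

def naive_summary(text, max_sentences=2):
    """Keep the sentences whose words are most frequent in the whole text.

    A crude extractive summary; a configurable engine (Sumy, Gensim,
    node-summary, a hosted API) would slot in behind the same signature.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    # Score each sentence by the total frequency of its words.
    top = sorted(
        sentences,
        key=lambda s: -sum(freq[w] for w in re.findall(r"\w+", s.lower())),
    )[:max_sentences]
    # Emit the kept sentences in their original order.
    return " ".join(s for s in sentences if s in top)
```

The medium-term buffer would run something like this periodically over the raw message log, and the long-term buffer would re-summarize the summaries.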
I have been playing around with some code for this in Python and JavaScript over the last week, but it's still in a conceptual space. It seems it could be useful to untether this from specific implementations, since multiple people are working on similar tasks and chatbot-type interfaces.
Hello, that sounds like a really fun thing to work on, but at the moment I have an important personal project, so I can't put in the time. I wish you luck! I really like the idea of splitting the tokens among different types of memory.