DefTruth/Awesome-LLM-Inference

Context compression methods?

liyucheng09 opened this issue · 2 comments

This is such a great collection! The best LLM reading list I've come across.

I think it would be a good idea to add context compression methods such as LLMLingua, Selective Context, and AutoCompressor.

Would you like to submit a PR to add these papers to this repo?

merged #7