mybigday/llama.rn

Prompt cache

jhen0409 opened this issue · 0 comments

https://github.com/ggerganov/llama.cpp/blob/92d0b751a77a089e650983e9f1564ef4d31b32b9/examples/main/main.cpp#L243

Support saving the prompt cache to a file so we can speed up context initialization + prompt processing.
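
For reference, at the linked commit llama.cpp exposes the prompt cache through its session-file API (`llama_load_session_file` / `llama_save_session_file`), which is what `examples/main` uses for `--prompt-cache`. Below is a minimal sketch of how the native layer could wire this up; the function names `try_load_prompt_cache` / `save_prompt_cache` and the `cache_path` argument are hypothetical, and it assumes an already-initialized `llama_context`:

```cpp
// Sketch only: assumes an initialized llama_context * ctx; cache_path is a
// hypothetical file location chosen by the binding (e.g. app cache directory).
#include <vector>
#include "llama.h"

// Try to restore the KV cache + evaluated tokens from a previous run.
// Returns false if no usable cache file exists.
static bool try_load_prompt_cache(llama_context * ctx, const char * cache_path,
                                  std::vector<llama_token> & session_tokens) {
    session_tokens.resize(llama_n_ctx(ctx));
    size_t n_loaded = 0;
    if (!llama_load_session_file(ctx, cache_path,
                                 session_tokens.data(), session_tokens.size(), &n_loaded)) {
        session_tokens.clear();
        return false;
    }
    session_tokens.resize(n_loaded);
    return true;
}

// Persist the evaluated tokens + KV cache so the next context
// initialization can skip re-processing the (matching) prompt prefix.
static void save_prompt_cache(llama_context * ctx, const char * cache_path,
                              const std::vector<llama_token> & tokens) {
    llama_save_session_file(ctx, cache_path, tokens.data(), tokens.size());
}
```

On the JS side this would presumably just surface as an optional cache-file path on context creation / completion, with the prefix-matching logic (reusing only the tokens that match the new prompt) handled natively as in the linked `main.cpp`.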