turboderp/exui

vllm backend?

ekg opened this issue · 1 comment

ekg commented

How hard would it be to use vLLM for the backend?

Working with OpenAI-API-compatible endpoints would be enough.
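
For context, "OpenAI-API-compatible" here means a server exposing the standard `/v1/chat/completions` route, which vLLM's server does. A minimal sketch of such a request, assuming a local vLLM server on its default port 8000 and a placeholder model name:

```python
# Minimal sketch of talking to an OpenAI-compatible backend such as vLLM.
# The URL and model name below are illustrative assumptions, not exui code.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",  # assumed local vLLM server
    json={
        "model": "my-model",  # hypothetical model name
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 64,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```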

turboderp commented

It would be difficult. It's written more as a single-user UI for ExLlamaV2 than as a generic frontend for OpenAI-compatible servers, and the client and server are very closely coupled.