Requests require updated implementation for new oobabooga blocking and non-blocking APIs
Closed this issue · 2 comments
nicksavarese commented
There are new APIs available for the text-generation-webui. I'd like to implement the non-blocking / streaming API so text streams into the text field directly as the LLM outputs it, but the current API offered by the UI does not accept most parameters (max token length, temperature, etc.), so control over responses can vary greatly.
Looking into providing a dedicated API for the text-gen-webui to work with this extension going forward.
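For context, a minimal sketch of what a blocking-API request might look like. The endpoint path and parameter names here are assumptions based on the webui's example API scripts, not a confirmed schema for this extension:

```python
import json

# Assumed blocking endpoint; verify against your local text-generation-webui install.
API_URL = "http://localhost:5000/api/v1/generate"

def build_payload(prompt, max_new_tokens=200, temperature=0.7):
    """Bundle the generation parameters the blocking API is assumed to accept."""
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "temperature": temperature,
    }

payload = build_payload("Hello, world")
# A real call would be something like: requests.post(API_URL, json=payload).json()
print(json.dumps(payload))
```

The point is that the client, not the UI, controls sampling parameters per request, which is what the current UI-side API doesn't allow.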
nicksavarese commented
The blocking API is now supported, but leaving this issue open until the streaming API is implemented.
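For the streaming side, the client would consume incremental events and append each text chunk as it arrives. This is a hedged sketch: the websocket endpoint (assumed `ws://localhost:5005/api/v1/stream`) and the `text_stream` / `stream_end` event names are assumptions drawn from the webui's example client, shown here against an in-memory message list:

```python
import json

def consume_events(messages):
    """Accumulate streamed text chunks until the (assumed) end-of-stream event."""
    text = []
    for raw in messages:
        event = json.loads(raw)
        if event.get("event") == "text_stream":
            text.append(event.get("text", ""))
        elif event.get("event") == "stream_end":
            break
    return "".join(text)

# Simulated stream; a real client would receive these over a websocket.
demo = [
    json.dumps({"event": "text_stream", "text": "Hel"}),
    json.dumps({"event": "text_stream", "text": "lo"}),
    json.dumps({"event": "stream_end"}),
]
print(consume_events(demo))  # prints "Hello"
```

In the extension, each `text_stream` chunk would be appended to the text field directly rather than buffered, which is the streaming behavior described above.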
nicksavarese commented
Shipped in 3e036bd