nicksavarese/allora-ios

Requests require updated implementation for new oobabooga blocking and non-blocking APIs

Closed this issue · 2 comments

There are new APIs available for text-generation-webui. I'd like to implement the non-blocking / streaming API so that text streams into the text field directly as the LLM generates output, but the current API offered by the UI does not accept most generation parameters (max token length, temperature, etc.), so responses can vary greatly without those controls.

Looking into providing a dedicated API for the text-gen-webui to work with this extension going forward.

The blocking API is now supported, but leaving this issue open until the streaming API is implemented.
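As a rough illustration of what the blocking call involves, here is a minimal sketch of building the request body for text-generation-webui's blocking endpoint. The endpoint path (`/api/v1/generate`), default port, and parameter names reflect the webui's api extension at the time and may differ across versions; treat all of them as assumptions, not a fixed contract.

```python
import json

def build_generate_payload(prompt, max_new_tokens=200, temperature=0.7):
    """Build the JSON body for a blocking generate request.

    Parameter names mirror the webui's generation settings
    (assumed; they have changed between webui versions).
    """
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,  # caps the response length
        "temperature": temperature,        # sampling temperature
        "do_sample": True,
        "stopping_strings": [],
    }

# A blocking call would POST this body to the webui, e.g.:
#   POST http://127.0.0.1:5000/api/v1/generate
# (host/port are assumptions; the streaming API instead uses a
# websocket endpoint and delivers tokens incrementally).
payload = build_generate_payload("Hello", max_new_tokens=64)
print(json.dumps(payload))
```

On iOS the same body would be serialized with `JSONEncoder` and sent via `URLSession`; the point is that the newer API accepts these per-request generation parameters, which the older UI-level API did not.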

Shipped in 3e036bd