AI assistants are transformational for programmers. However, ChatGPT 4 is also relatively slow, and streaming its responses greatly improves the user experience. These utilities attempt to bring these tools closer to the command line and editor while preserving streaming. There are three parts here:
- A Rust binary that reads a prompt from stdin and streams completion responses to stdout
- A shell script that builds a little REPL over that binary
- A Neovim Lua plug-in that brings this functionality into the editor
The Rust program can be built with `cargo build`. It expects an `OPENAI_API_KEY` environment variable. The Rust program can take two kinds of input, read from stdin:
- Raw input: in this case, a System prompt is provided in the compiled code.
- Transcript: the Rust program also accepts a homegrown "transcript" format in which transcript sections are delineated by lines that look like this:

  ```
  ===USER===
  ```
If a transcript does not start with a System section, then the default System prompt is used.
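For illustration, a transcript might look like the sketch below. Only the `===USER===` delimiter appears in this document; the `===SYSTEM===` and `===ASSISTANT===` section names are assumptions following the same pattern, so check the source for the actual names.

```
===SYSTEM===
You are a terse assistant for programmers.
===USER===
What does `cargo build` do?
===ASSISTANT===
It compiles the current package and its dependencies.
===USER===
And `cargo build --release`?
```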
The included Lua script can be copied to `.config/nvim/lua` and installed with something like:

```lua
vim.cmd("command! ChatGPT lua require'chatgpt'.chatgpt()")
```
This command locates the Rust binary through the `SHELLBOT` environment variable, which should be set to the absolute path of the Rust binary built in the step above.
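For example, assuming the repository lives at `~/src/shellbot` and the binary is also named `shellbot` (both hypothetical; use whatever path `cargo build` actually produces):

```shell
# Hypothetical paths -- point SHELLBOT at your own build output.
export SHELLBOT="$HOME/src/shellbot/target/debug/shellbot"
export OPENAI_API_KEY="..."   # your OpenAI API key
```

These can go in your shell profile so both the editor plugin and the command-line script find the binary.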
This plugin is optimized for streaming. It keeps new input in view by repositioning the cursor at the end of the buffer as new text is appended, and it takes care to keep working if the user switches away from the window where the response is coming in. To turn off the cursor movement while a response is streaming, hit "Enter" or "Space"; this frees the cursor for the rest of the response.
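The cursor-follow behavior described above could be sketched like this in Lua (the function and variable names here are hypothetical, not the plugin's actual API):

```lua
-- Sketch only: follow the end of the buffer while a response streams,
-- until the user presses <CR> or <Space> to release the cursor.
local follow = true

local function on_chunk(buf, lines)
  -- Append the newly streamed lines to the end of the buffer
  vim.api.nvim_buf_set_lines(buf, -1, -1, false, lines)
  if follow then
    -- Reposition the cursor in every window showing this buffer, so
    -- the behavior survives the user switching to another window.
    for _, win in ipairs(vim.fn.win_findbuf(buf)) do
      vim.api.nvim_win_set_cursor(win, { vim.api.nvim_buf_line_count(buf), 0 })
    end
  end
end

-- <CR> or <Space> frees the cursor for the rest of the response
vim.keymap.set("n", "<CR>", function() follow = false end, { buffer = true })
vim.keymap.set("n", "<Space>", function() follow = false end, { buffer = true })
```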
`shellbot.sh` can be used from the command line in cases where the editor isn't active. Because it uses `fold` for word wrap, it works best in a narrow window. The first prompt comes from `$EDITOR`; subsequent prompts are taken with `read`. Hitting Enter on a blank line submits the prompt.
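A minimal sketch of such a REPL loop, assuming the binary path is in `$SHELLBOT` (here falling back to `cat` so the sketch runs standalone):

```shell
#!/bin/sh
# Sketch of a shellbot.sh-style REPL, not the actual script. Lines are
# accumulated into a transcript; a blank line submits it, and the reply
# is word-wrapped with `fold` for narrow windows.
repl() {
  bot=${SHELLBOT:-cat}            # fall back to cat for demonstration
  transcript=$(mktemp)
  printf '===USER===\n' >"$transcript"
  while IFS= read -r line; do
    if [ -z "$line" ]; then
      # Blank line: submit the transcript and stream the response
      "$bot" <"$transcript" | fold -s -w "${COLUMNS:-80}"
    else
      printf '%s\n' "$line" >>"$transcript"
    fi
  done
  rm -f "$transcript"
}

# Demo: with the cat fallback, the "response" echoes the transcript back
printf 'hello\n\n' | repl
```

The real script also seeds the first prompt from `$EDITOR`, which is omitted here to keep the sketch non-interactive.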