aiui.nvim

A unified set of modules to interact with different LLM providers.

Why aiui.nvim?

Unify your development experience across different LLM providers with aiui.nvim's adaptable UI modules, allowing for easy model switching without changing your workflow.

Features

  • Unified LLM Interface: Swap LLM providers on the fly while keeping your workflow consistent (see the sketch after this list).

  • In-Editor Chat: Engage with LLMs in a familiar chat interface inside Neovim.

  • Single-Buffer Diff: Visualize LLM-suggested code changes directly within your buffer, akin to a git diff.

  • Fuzzy Search for Chats: Switch models and instances or resume past chats via fuzzy search.

  • Conversations as Files: Store chat logs as readable Markdown and session data as JSON for external access.
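
To make the first point concrete, here is a minimal sketch, closely mirroring the Getting Started snippet below, of what staying provider-agnostic looks like: register a client's models once, then open chats through the same Chat API. The display name and the model name "llama2" are placeholders; use a model actually returned by get_default_models().

local ModelCollection = require("aiui.ModelCollection")
local Chat = require("aiui.Chat")

-- Register the models exposed by the bundled Ollama client.
local ollama_client = require("models.clients.ollama.ollama_curl")
ModelCollection:add_models(ollama_client:get_default_models())
ModelCollection:add_agents({
  default_agent = "You are a chatbot, answer short and concise.",
})

-- Swapping providers only changes the `name` and `model` fields here;
-- "llama2" is a placeholder for a model returned by get_default_models().
Chat:new({
  name = "Local Llama",
  model = "llama2",
  context = {},
  agent = "default_agent",
})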

Check out the roadmap for upcoming features.

Getting Started

Assuming you are using lazy.nvim:

{
  "MLFlexer/aiui.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim",
  },

  init = function()
    -- Adds default keybindings and initializes aiui
    require("aiui").add_defaults()

    -- If NOT using the default setup:
    -- add your LLM provider
    -- local ModelCollection = require("aiui.ModelCollection")
    -- local ollama_client = require("models.clients.ollama.ollama_curl")
    -- ModelCollection:add_models(ollama_client:get_default_models())

    -- Add any agents you like
    -- ModelCollection:add_agents({
    -- 	default_agent = "You are a chatbot, answer short and concise.",
    -- })

    -- Initialize the Chat and set default keybinds and autocmds
    -- local Chat = require("aiui.Chat")
    -- Chat:new({
    --   name = "Mistral Tiny",
    --   model = "mistral-tiny",
    --   context = {},
    --   agent = "default_agent",
    -- })
    -- Chat:apply_default_keymaps()
    -- Chat:apply_autocmd()
  end,
}

Need help? Check out how the default setup is done in aiui/defaults.lua, or ask in the Discussions tab.

Adding your own LLM client

This section is unfinished; for now, implement the function annotations for the ModelClient. Need help? See the clients directory or ask in the Discussions tab.
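
Until then, the skeleton below shows the rough shape a client could take. Every method name and signature here is an assumption for illustration, not aiui's actual ModelClient annotations; mirror the real clients (e.g. the Ollama client in the clients directory) when implementing yours.

-- Hypothetical client skeleton: the method names and table shapes below
-- are illustrative assumptions, not aiui's real ModelClient interface.
local MyClient = {}

-- Models to hand to ModelCollection:add_models(); copy the table shape
-- that the bundled Ollama client returns.
function MyClient:get_default_models()
  return { { name = "my-model", client = MyClient } }
end

-- Hypothetical request method: send the prompt plus prior context to your
-- provider (e.g. by shelling out to curl) and pass the reply to the callback.
function MyClient:request(model, messages, agent_prompt, on_result)
  local reply = "stub reply from provider"
  on_result(reply)
end

return MyClient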

Roadmap

Chat Features

  • Highly customizable.
  • Support for concurrent chat instances.
  • Persisting and retrieving chat history.
  • Code reference shortcuts (like @some_function or /some_file) within chats.
  • New chat creation and retrieval via fuzzy search.
  • Real-time chat streaming.
  • Popup chat window.
  • Buffer chat window.

Inline Code Interactions

  • Integrated diff views for in-buffer modifications.
  • Quickly add comments, fix errors, etc. for a visual selection.
  • LSP interactions to fix errors or other LSP warnings.
  • Start a Chat window with the visual selection.