LSP-AI is an open-source language server that serves as a backend for performing completion with large language models and, soon, other AI-powered functionality. Because it is a language server, it works with any editor that has LSP support.
The goal of LSP-AI is to assist and empower software engineers by integrating with the tools they already know and love, not to replace them.
A few of the editors it works with:
- VS Code
- NeoVim
- Emacs
- Helix
- Sublime
It works with many more editors besides.
See the wiki for installation and configuration instructions.
LSP-AI can serve as an alternative to GitHub Copilot.
[Demo video: LSP-AI in VS Code and Helix]
On the left: VS Code using Mistral Codestral. On the right: Helix using stabilityai/stable-code-3b
Note that completion speed depends entirely on the backend being used. For the fastest completions, we recommend either a small local model or Groq.
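Under the hood, an editor talks to LSP-AI the same way it talks to any language server: JSON-RPC messages framed with a `Content-Length` header over stdio. Below is a minimal, stdlib-only sketch of framing a standard `textDocument/completion` request; the payload shown is generic LSP, not anything LSP-AI-specific.

```python
import json

def frame_lsp_message(payload: dict) -> bytes:
    """Frame a JSON-RPC payload with the LSP Content-Length header."""
    body = json.dumps(payload).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n" % len(body) + body

# A standard completion request, as any LSP client would send it.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "textDocument/completion",
    "params": {
        "textDocument": {"uri": "file:///example.py"},
        "position": {"line": 10, "character": 4},
    },
}

message = frame_lsp_message(request)
print(message.decode("utf-8").split("\r\n")[0])  # the Content-Length header line
```

Because LSP-AI speaks this wire protocol, an editor plugin only needs to point its existing LSP client at the `lsp-ai` binary; no AI-specific transport code is required.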
tl;dr: LSP-AI abstracts complex implementation details away from editor-specific plugin authors, centralizing open-source development work into one shareable backend.
Editor-integrated AI-powered assistants are here to stay. They are not perfect, but they are improving quickly, and early research already shows their benefits. While several companies have released advanced AI-powered editors like Cursor, the open-source community lacks a direct competitor.
LSP-AI aims to fill this gap by providing a language server that integrates AI-powered functionality into the editors we know and love. Here’s why we believe LSP-AI is necessary and beneficial:
- **Unified AI Features**: By centralizing AI features into a single backend, LSP-AI lets supported editors benefit from these advancements without redundant development effort.
- **Simplified Plugin Development**: LSP-AI abstracts away the complexities of setting up LLM backends, building complex prompts, and soon much more. Plugin developers can focus on enhancing the specific editor they are working on rather than dealing with backend intricacies.
- **Enhanced Collaboration**: A shared backend creates a collaborative platform where open-source developers can come together to add new functionality. This unified effort fosters innovation and reduces duplicated work.
- **Broad Compatibility**: LSP-AI supports any editor that adheres to the Language Server Protocol (LSP), so a wide range of editors can leverage its AI capabilities.
- **Flexible LLM Backend Support**: LSP-AI currently supports llama.cpp, OpenAI-compatible APIs, Anthropic-compatible APIs, and Mistral AI FIM-compatible APIs, giving developers the flexibility to choose their preferred backend. This list will continue to grow.
- **Future-Ready**: LSP-AI is committed to staying up to date with the latest advancements in LLM-driven software development.
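Backend selection is done through server configuration rather than plugin code. The sketch below expresses a plausible configuration as a Python dict (the shape an editor would pass as LSP `initializationOptions`); the specific key names here are assumptions for illustration — consult the wiki for the authoritative schema.

```python
import json

# Hypothetical LSP-AI initializationOptions. Key names are assumptions
# illustrating the idea of swappable backends; see the wiki for the real schema.
init_options = {
    "memory": {"file_store": {}},
    "models": {
        "completion_model": {
            "type": "open_ai",  # could instead target llama.cpp, Anthropic, Mistral FIM, ...
            "model": "gpt-4o",
            "auth_token_env_var_name": "OPENAI_API_KEY",
        }
    },
    "completion": {"model": "completion_model"},
}

print(json.dumps(init_options, indent=2))
```

Swapping backends then becomes a one-line config change in the editor, with no changes to the plugin itself.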
There is so much to do for this project, and incredible new research and tools are coming out every day. Below are some ideas for what we want to add next, but we welcome contributions and discussion around prioritizing new features.
- Implement semantic search-powered context building (This could be incredibly cool and powerful). Planning to use Tree-sitter to chunk code correctly.
- Support for additional backends
- Exploration of agent-based systems
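The semantic-search idea above hinges on chunking code along syntactic boundaries rather than fixed line windows, which is what Tree-sitter would provide language-agnostically. As a stdlib-only stand-in, the sketch below chunks Python source by top-level definitions using the `ast` module — an illustration of the idea, not LSP-AI's implementation:

```python
import ast

def chunk_by_definitions(source: str) -> list[str]:
    """Split Python source into one chunk per top-level function or class.

    Stand-in for Tree-sitter-based chunking: each chunk follows a syntactic
    boundary, so a semantic index stores whole definitions, not arbitrary slices.
    """
    tree = ast.parse(source)
    lines = source.splitlines()
    chunks = []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            # lineno is 1-based and end_lineno is inclusive.
            chunks.append("\n".join(lines[node.lineno - 1 : node.end_lineno]))
    return chunks

source = (
    "def add(a, b):\n"
    "    return a + b\n"
    "\n"
    "class Greeter:\n"
    "    def hi(self):\n"
    "        return 'hi'\n"
)
for chunk in chunk_by_definitions(source):
    print(chunk)
    print("---")
```

A Tree-sitter version would do the same walk over a language-agnostic parse tree, which is why it is the planned tool for this feature.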