A Logseq plugin that integrates with OpenAI-compatible LLM APIs to provide AI assistance directly in your notes.
- 🔌 Works with OpenAI-compatible APIs (LM Studio, Ollama, Google Gemini, x.ai Grok, and more)
- 🎯 Custom system prompts for specialized tasks
- ⚡️ Quick access with keyboard shortcuts
- 🎛️ Adjustable temperature and response length
- ⌨️ Customizable hotkeys
- 🔗 Page Content Fetching: If a block contains only a page link (e.g., `[[page name]]`), the plugin fetches the linked page's content and uses it as the context for the LLM, letting you work with page content as if it were block content. By default the plugin only works with the current block, so this is useful when you want to apply a prompt to several blocks at once: use the "Block to Page" plugin to turn those blocks into a page, then apply a logseq-copilot system prompt to a block that links to it.
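The "block contains only a page link" check can be sketched as follows. This is illustrative only (the plugin itself is written in ClojureScript, and the regex and function name here are made up):

```python
import re
from typing import Optional

# A block counts as "only a page link" when, after trimming whitespace,
# its entire content is a single [[page name]] reference.
PAGE_LINK_ONLY = re.compile(r"^\[\[([^\[\]]+)\]\]$")

def page_link_target(block_content: str) -> Optional[str]:
    """Return the linked page name if the block is nothing but [[page name]], else None."""
    match = PAGE_LINK_ONLY.match(block_content.strip())
    return match.group(1) if match else None
```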
- Download the plugin
- Enable it in Logseq Settings > Plugins
- Configure your API settings
- Verify your API connection
- API Endpoint: Your OpenAI-compatible API endpoint
- OpenAI: https://api.openai.com/v1 (Get API Key)
- Azure OpenAI: https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME (Get API Key)
- Google Gemini: https://generativelanguage.googleapis.com/v1beta/openai (Get API Key)
- x.ai Grok: https://api.x.ai/v1 (Get API Key)
- Anthropic Claude: https://api.anthropic.com/v1 (Get API Key)
- LM Studio (local): http://localhost:1234/v1 (Download)
- Ollama (local): http://localhost:11434/v1 (Download)
- API Key: Your API key
- Required for cloud providers (OpenAI, Google AI Studio, x.ai, Anthropic, etc.)
- Can be empty for local providers (LM Studio, Ollama)
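A quick way to sanity-check the endpoint and key together is the `/models` route, which OpenAI-compatible servers expose alongside `/chat/completions`. A minimal sketch in Python (not the plugin's own code; the helper name is made up):

```python
from urllib import request

def build_verify_request(base_url: str, api_key: str = "") -> request.Request:
    """Build a GET request for {base_url}/models, the standard
    OpenAI-compatible model-listing route. The Authorization header is
    attached only when a key is configured, since local servers such as
    LM Studio and Ollama accept keyless requests."""
    url = base_url.rstrip("/") + "/models"
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    return request.Request(url, headers=headers, method="GET")
```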
- Model: Your model name (case-sensitive, must match the provider's name exactly)
- OpenAI:
- 'gpt-4-turbo-preview'
- 'gpt-4'
- 'gpt-3.5-turbo'
- Read more models
- Azure OpenAI: Use your deployment name
- Google Gemini:
- 'gemini-1.5-pro'
- 'gemini-1.5-flash'
- 'gemini-1.5-flash-latest'
- Read more models
- x.ai Grok:
- 'grok-1'
- 'grok-beta'
- Read more models
- Anthropic:
- 'claude-3-opus'
- 'claude-3-sonnet'
- 'claude-3-haiku'
- Read more models
- LM Studio: Use the exact model name as shown in the UI
- 'mistral-7b-instruct'
- 'llama2-7b-chat'
- 'neural-chat'
- Read more models
- Ollama:
- 'mistral'
- 'llama2'
- 'codellama'
- 'neural-chat'
- Read more models
- Temperature: Control response randomness (0-1)
- Max Tokens: Set the maximum response length
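Taken together, these settings map onto the standard chat-completions request body. A hedged sketch of how such a payload is assembled (field names follow the OpenAI chat API; the function itself is illustrative, not the plugin's code):

```python
def build_request_body(model: str, prompt: str,
                       temperature: float = 0.7,
                       max_tokens: int = 1000) -> dict:
    """Assemble an OpenAI-compatible /chat/completions payload."""
    if not 0.0 <= temperature <= 1.0:
        raise ValueError("temperature must be in the 0-1 range")
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
```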
You can configure up to three custom system prompts for specialized tasks:
- Custom Prompt 1: Triggered by `/copilot1` or Ctrl+Shift+J
- Custom Prompt 2: Triggered by `/copilot2` or Ctrl+Shift+K
- Custom Prompt 3: Triggered by `/copilot3` or Ctrl+Shift+L
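Conceptually, a custom prompt is sent ahead of the selected block as a `system` message in the chat request. A minimal sketch (the helper is illustrative, using the standard OpenAI chat roles, not the plugin's actual code):

```python
def with_system_prompt(system_prompt: str, block_content: str) -> list:
    """Pair a configured custom prompt with the selected block's content."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": block_content},
    ]
```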
- Select any block in your notes
- Either:
  - Type `/copilot` and press Enter, or
  - Press Ctrl+Shift+H (default hotkey)
- The AI will respond to your block content
- First, configure your custom prompts in plugin settings
- Select a block in your notes
- Trigger the custom prompt either by:
  - Using slash commands: `/copilot1`, `/copilot2`, or `/copilot3`
  - Using hotkeys: Ctrl+Shift+J/K/L
Default shortcuts (can be customized in Settings > Shortcuts):
- Default Copilot: Ctrl+Shift+H
- Custom Prompt 1: Ctrl+Shift+J
- Custom Prompt 2: Ctrl+Shift+K
- Custom Prompt 3: Ctrl+Shift+L
- Test your API connection with the verify button before use
- Adjust temperature for more creative (higher) or focused (lower) responses
- Use custom prompts for frequently repeated tasks
- Customize hotkeys in Logseq's Settings > Shortcuts if you prefer different combinations
If you encounter any issues or have suggestions, please:
- Check if your API connection is verified
- Ensure you have selected a block before using commands
- Check the console for any error messages
We welcome contributions! Here's how you can help improve the Logseq Copilot Plugin.
- Prerequisites:
- Node.js (LTS version recommended)
- Yarn
- Babashka
- Java Development Kit (JDK) (for ClojureScript development)
- Clojure CLI
- Clone and Install Dependencies:

  ```shell
  git clone https://github.com/avelino/logseq-copilot.git
  cd logseq-copilot
  bb deps
  ```

- Development Commands: We use Babashka (bb) tasks for development. Here are the main commands:
- Start the development environment:

  ```shell
  bb dev
  ```
This command:
- Cleans the dist directory
- Copies resources
- Starts Shadow-CLJS watch process for hot reloading
- Check for dependency updates:

  ```shell
  bb deps
  ```
- Build the plugin:

  ```shell
  bb build
  ```

  This creates a production build in the `dist` directory.
- In Logseq, go to Settings → Developer mode → Turn it on
- Click "Load unpacked plugin"
- Select the `dist` directory from your development workspace
- The plugin will be loaded for development
For hot-reloading during development:
- Make changes to the code
- The Shadow-CLJS watch process will automatically rebuild
- In Logseq, click the reload (↻) button next to the plugin name
- Fork the repository
- Create a new branch for your feature/fix:

  ```shell
  git checkout -b feature/your-feature-name
  ```

- Make your changes
- Test your changes thoroughly
- Commit your changes with a clear message:

  ```shell
  git commit -m "feat: add new feature"  # or "fix: resolve issue"
  ```

- Push to your fork and submit a Pull Request
- Use `bb watch-cljs` for just the ClojureScript watch process
- The `bb portal` command opens the Portal UI for debugging
- Check the browser console for errors and debug information
- Test your changes in Logseq before submitting a PR
MIT License - See LICENSE