AI-powered Jupyter Notebook

Use local AI to generate and edit code cells, automatically fix errors, and chat with your data.

Vizly Notebook is a Jupyter alternative that integrates an AI copilot into your Jupyter Notebook editing experience.

Best of all, Vizly Notebook runs locally and can be used for free with Ollama or your own API key. To get started, install it with pip:

pip install vizly-notebook

Then launch it by running one of the following:

vizly-notebook

or

jupyter vizly-notebook

or

jupyter thread
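
If the launcher command is not found after installing, you can confirm the package itself is installed with a quick Python check (a minimal sketch; the distribution name is taken from the pip command above):

# Prints the installed version of the vizly-notebook distribution
import importlib.metadata
print(importlib.metadata.version("vizly-notebook"))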

Key features

1. Familiar Jupyter Notebook editing experience

2. Natural language code edits

3. Generate cells to answer natural language questions

4. Ask questions in a context-aware chat sidebar

5. Automatically explain or fix errors

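As an illustration of the error-fixing feature, consider a cell like the sketch below: it raises a NameError from a simple typo, which is exactly the kind of error the copilot can explain or repair (the variable names are purely illustrative):

# A typo ("numbrs" instead of "numbers") triggers a NameError when this cell runs
numbers = [1, 2, 3, 4]
total = sum(numbers)
print("Average:", total / len(numbrs))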

Demo

Demo video: ThreadDemo720.mp4

Feature Roadmap

These are some of the features we hope to launch in the next few months. If you have any suggestions or would like to see a feature added, please don't hesitate to open an issue or reach out to us via email or Discord.

  • Add copilot-style inline code suggestions
  • Data warehouse + SQL support
  • No-code data exploration
  • UI-based chart creation
  • Ability to collaborate on notebooks
  • Publish notebooks as shareable web apps
  • Add support for Jupyter Widgets
  • Add file preview for all file types

Cloud

Eventually, we hope to integrate Vizly Notebook into a cloud platform that supports collaboration features as well as hosting notebooks as web applications. If this sounds interesting to you, we are looking for enterprise design partners to work with and to customize the solution for. If you're interested, please reach out to us via email or join our waitlist.

Development instructions

To run the repo in development mode, you need to run two terminal commands: one runs the Jupyter server and the other runs the Next.js front end.

To begin, run:

yarn install

Then in one terminal, run:

sh ./run_dev.sh

And in another, run:

yarn dev

Navigate to localhost:3000/vizly-notebook and you should see your local version of Vizly Notebook running.

If you would like to develop with the AI features, navigate to the proxy folder and run:

yarn install

Then:

yarn dev --port 5001

Using Vizly Notebook with Ollama

You can use Ollama for a fully offline AI experience. To begin, install and run vizly-notebook using the commands above.
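
You will also need Ollama itself installed, with at least one model pulled locally; for example (llama3 is just an example model name, use whichever model you prefer):

ollama pull llama3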

Once Vizly Notebook is running, select the Settings icon in the bottom left.

Next, open Model Settings.

Select Ollama and enter your model details.

Use Ctrl / Cmd + K and try running a query to see how it looks!
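
For example, with a pandas DataFrame already loaded in your notebook, asking something like "plot a histogram of the age column" might produce a generated cell roughly along the lines of the sketch below (df and the age column are hypothetical placeholders, not part of Vizly Notebook):

import matplotlib.pyplot as plt

# df and "age" stand in for whatever data is loaded in the notebook
df["age"].hist(bins=20)
plt.xlabel("age")
plt.ylabel("count")
plt.title("Distribution of age")
plt.show()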