local-llm


Run a Local LLM from a Jupyter Notebook in VSCode!

Using: llama.cpp via llama-cpp-python

Setup Instructions

  1. Install VSCode: download it from code.visualstudio.com.

  2. Install Git: download it from git-scm.com.

  3. Clone the Repository:

    • Open VSCode.
    • Open the Command Palette (Ctrl + Shift + P or Cmd + Shift + P).
    • Type Git: Clone and enter:
      https://github.com/DrDavidL/local-llm.git
      
    • Select a folder to clone to.
    • Open the cloned repository in VSCode.
  4. Set Up the Python Virtual Environment:

    • Open the integrated terminal in VSCode (Ctrl + `).
    • Navigate to the project directory:
      cd local-llm
    • Create a virtual environment:
      python -m venv venv
    • Activate the virtual environment:
      • Windows: .\venv\Scripts\activate
      • macOS/Linux: source venv/bin/activate
    • Install required packages:
      pip install -r requirements.txt
  5. Open the IPython (Jupyter) notebook: Open the notebook file (local.ipynb).

  6. Select the Python Interpreter: Ensure VSCode is using the Python interpreter from your virtual environment. When you try to run a cell, you'll be prompted to choose.

    • Select the interpreter from the venv directory (e.g., ./venv/bin/python for macOS/Linux or .\venv\Scripts\python.exe for Windows).

If you have questions, see guides on cloning a GitHub repository 🛠 and setting up a Python virtual environment.
More info: the VSCode Python tutorial 💻 and virtual-environment best practices.

  7. Download one or two LLMs from HuggingFace

The suggested options will run even on machines that are a few years old. After downloading, replace the paths used in the notebook cells with the paths to your own GGUF-formatted files.
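Once a model is downloaded, loading it with llama-cpp-python generally follows this pattern. This is a minimal sketch, not the notebook's exact code: the model path is a placeholder for your own GGUF file, and the `n_ctx` and `max_tokens` values are illustrative.

```python
from pathlib import Path

try:
    from llama_cpp import Llama  # provided by the llama-cpp-python package
except ImportError:
    Llama = None  # package not installed yet


def build_prompt(question: str) -> str:
    """Wrap a user question in a simple Q/A completion prompt."""
    return f"Q: {question.strip()}\nA:"


# Placeholder path -- replace with the GGUF file you downloaded
MODEL_PATH = Path("models/your-model.Q4_K_M.gguf")

if Llama is not None and MODEL_PATH.exists():
    # n_ctx sets the context window; 2048 is an illustrative value
    llm = Llama(model_path=str(MODEL_PATH), n_ctx=2048, verbose=False)
    result = llm(build_prompt("What is a local LLM?"), max_tokens=64, stop=["Q:"])
    print(result["choices"][0]["text"].strip())
```

If the model file or package is missing, the sketch simply does nothing, so you can run it safely before finishing setup.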

  8. Run the Notebook cells
  • Open your output file, output.md, by right-clicking it and choosing preview. Drag the preview to a vertical pane to the right of your notebook so you can watch the output as it emerges!
  • Run the cells of your notebook.
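The live preview works because the notebook appends each generated chunk to output.md as it arrives. A sketch of that loop (the function name and streaming details are assumptions based on the description above, not the notebook's exact code):

```python
def stream_to_markdown(chunks, path="output.md"):
    """Append generated text chunks to a markdown file as they arrive."""
    with open(path, "a", encoding="utf-8") as f:
        for chunk in chunks:
            f.write(chunk)
            f.flush()  # flush each chunk so the VSCode preview updates live


# Stand-in generator in place of a real model's token stream
stream_to_markdown(iter(["Hello", ", ", "local LLM!"]))
```

In the notebook, the stand-in generator would be replaced by the model's streaming completion, so tokens appear in the preview pane as they are produced.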

Congrats! You're interacting with an AI model running locally on your device!