CodeLlama-Demo

Deploys the Code Llama 13B GGUF model on CPU with a Gradio interface.


Description

This project provides a Gradio interface for using Code Llama in your web browser as a code assistant.

Installation

  1. Install Dependencies: Run the following command to install all the dependencies required for the project:

    pip install -r requirements.txt
  2. Download Model: Run the provided download_model.sh script to fetch the model. Make the script executable first, then run it:

    chmod +x download_model.sh
    ./download_model.sh

Usage

After installing the dependencies and downloading the model, you can run the project using the following command:

python app.py
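The README does not show what app.py contains. As a rough sketch only (assuming llama-cpp-python as the GGUF runtime; the model filename, parameters, and function names below are assumptions, not the repo's actual code), an app like this might look as follows:

```python
# Hypothetical sketch of an app.py for this project; the repo's real code
# may differ. Assumes llama-cpp-python loads the GGUF file on CPU and
# Gradio serves the web UI.

def build_prompt(instruction: str) -> str:
    # Code Llama instruct models expect the request wrapped in [INST] tags.
    return f"[INST] {instruction.strip()} [/INST]"

def main() -> None:
    # Heavy optional dependencies are imported here so build_prompt
    # stays importable without them installed.
    import gradio as gr
    from llama_cpp import Llama

    # Assumed filename/quantization; adjust to whatever download_model.sh fetched.
    llm = Llama(model_path="codellama-13b.Q4_K_M.gguf", n_ctx=2048)

    def assist(instruction: str) -> str:
        result = llm(build_prompt(instruction), max_tokens=512)
        return result["choices"][0]["text"]

    gr.Interface(fn=assist, inputs="text", outputs="text",
                 title="Code Llama 13B (CPU)").launch()

# main()  # uncomment (with gradio and llama-cpp-python installed) to serve
```

The launch call is left commented so the prompt helper can be inspected or imported without the model or its dependencies present.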

License

This project is licensed under the MIT License - see the LICENSE file for details.