
Simple chat application for querying local and cloud LLM models.

Primary language: Pascal · License: MIT

ChatLLM

ChatLLM is a simple Delphi application for chatting with Large Language Models (LLMs). Its primary purpose is to act as a coding assistant.

Features

  • Supports both cloud-based LLM models (ChatGPT) and local models running under Ollama.
  • Supports both the legacy completions and the [chat/completions](https://platform.openai.com/docs/api-reference/chat) API endpoints.
  • The chat is organized around multiple topics.
  • Can save and restore the chat history and settings.
  • Streamlined user interface.
  • Syntax highlighting of code (Python and Pascal).
  • High-DPI awareness.
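The two endpoints differ mainly in the shape of the JSON request body. ChatLLM itself is written in Delphi, but the difference is easiest to see in a small Python sketch (field names follow the OpenAI API reference; model names and token limits are just examples):

```python
# Sketch of the two OpenAI payload shapes. The legacy endpoint takes a
# single prompt string; the chat endpoint takes role-tagged messages.

def completions_payload(prompt: str) -> dict:
    # Legacy /v1/completions: one free-form prompt string.
    return {
        "model": "gpt-3.5-turbo-instruct",  # example model name
        "prompt": prompt,
        "max_tokens": 1000,
    }

def chat_payload(system_prompt: str, question: str) -> dict:
    # /v1/chat/completions: a list of messages with roles.
    return {
        "model": "gpt-3.5-turbo",  # example model name
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
        "max_tokens": 1000,
    }
```

The chat format is what makes multi-turn conversations (and ChatLLM's topics) possible, since earlier questions and answers can be resent as additional messages.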

The application uses standard HTTP client and JSON components from the Delphi RTL and can easily be integrated into other Delphi applications.

Usage

Chat with OpenAI cloud-based models such as gpt-4

The screenshot below shows the settings for using gpt-3.5-turbo, which offers a good balance between cost and performance.

[screenshot]

Settings:

  • Endpoint: The base URL for accessing the OpenAI API.
  • Model: The OpenAI model you want to use.
  • API key: You need to get one from https://platform.openai.com/api-keys to use the OpenAI models.
  • Timeout: How long you are prepared to wait for an answer.
  • Maximum number of response tokens: An integer value that determines the maximum length of the response.
  • System prompt: A string providing context to the LLM, e.g. "You are my python coding assistant".
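To see how these settings fit together, here is a minimal Python sketch of the request they describe (the application itself uses Delphi RTL components; the endpoint and model are the documented OpenAI values, while the API key and other values are placeholders):

```python
import json
import urllib.request

# Placeholder values mirroring the settings above.
ENDPOINT = "https://api.openai.com/v1/chat/completions"
API_KEY = "sk-..."  # obtain yours from https://platform.openai.com/api-keys

def build_request(question: str) -> urllib.request.Request:
    body = {
        "model": "gpt-3.5-turbo",   # the Model setting
        "max_tokens": 1000,         # "Maximum number of response tokens"
        "messages": [
            {"role": "system", "content": "You are my python coding assistant"},
            {"role": "user", "content": question},
        ],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

# Sending the request (not done here, since it needs a real key) would be
# urllib.request.urlopen(build_request("..."), timeout=20), where the
# timeout argument corresponds to the Timeout setting.
```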

Chat with local models

You first need to download and run the Ollama installer. Ollama provides access to a large number of LLM models, such as codegemma from Google and codellama from Meta. To use a given model you need to install it locally. You do that from a command prompt by issuing the command:

ollama pull model_name

After that you are ready to use the local model in ChatLLM.

The screenshot below shows the settings for using codellama.

[screenshot]

You do not need an API key to use Ollama models and usage is free. The downside is that it may take a long time to get answers, depending on the question, the size of the model and the power of your CPU and GPU.
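Once a model is pulled, Ollama listens on its default local port (11434) and exposes an OpenAI-compatible chat endpoint, so the same request shape works without an API key. A small Python sketch, assuming Ollama is running and codellama has been pulled:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint on its default port; no API key needed.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_local_request(question: str, model: str = "codellama") -> urllib.request.Request:
    body = {
        "model": model,
        "messages": [{"role": "user", "content": question}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )

def ask_local(question: str, timeout: float = 300.0) -> str:
    # A generous timeout: answers from a local model can take minutes,
    # depending on the model size and the power of your CPU and GPU.
    with urllib.request.urlopen(build_local_request(question), timeout=timeout) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```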

Chat topics

The chat is organized around topics. You can create new topics and move back and forth between them using the next/previous buttons on the toolbar. When you save the chat, all topics are saved and then restored the next time you start the application. Questions within a topic are asked in the context of the previous questions and answers of that topic.
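In chat/completions terms, keeping per-topic context simply means resending each topic's earlier question/answer turns as messages alongside the new question. A minimal Python sketch of that idea (class and method names are illustrative, not ChatLLM's actual Delphi code):

```python
# Each topic keeps its own message history; a new question is sent
# together with that topic's earlier turns, so the model sees the context.

class Topic:
    def __init__(self, system_prompt: str):
        self.messages = [{"role": "system", "content": system_prompt}]

    def build_request_messages(self, question: str) -> list:
        # The new question, in the context of all previous Q&A of this topic.
        return self.messages + [{"role": "user", "content": question}]

    def record(self, question: str, answer: str) -> None:
        # After a reply arrives, append the turn to the topic's history.
        self.messages.append({"role": "user", "content": question})
        self.messages.append({"role": "assistant", "content": answer})

topic = Topic("You are my python coding assistant")
topic.record("What is a list comprehension?", "A concise way to build lists.")
msgs = topic.build_request_messages("Show me an example.")
# msgs now holds the system prompt, the earlier Q&A, and the new question.
```

Saving the chat then amounts to serializing each topic's message list, which is also what makes restoring the full context possible on the next start.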

Screenshot

[screenshot]

Compilation requirements