Hello-ChatGPT

Connect ChatGPT to a Slack bot, built with FastAPI.

Python 3.x

ChatGPT API with FastAPI

This repository contains an implementation that uses OpenAI's ChatGPT model. The basic structure is simple: when a message comes in via Slack, the app generates a response via the ChatGPT API and sends it back to the channel.
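
As a rough sketch of that flow (not the repository's actual code), the example below shows a FastAPI endpoint receiving a Slack event, asking the ChatGPT API for a reply, and posting it back to the channel. The route path, payload fields, and model name are illustrative assumptions.

import os

from fastapi import FastAPI, Request
from openai import OpenAI
from slack_sdk import WebClient

app = FastAPI()
openai_client = OpenAI(api_key=os.environ["openai_token"])
slack_client = WebClient(token=os.environ["slack_token"])

@app.post("/slack/events")  # illustrative route; the real one may differ
async def slack_events(request: Request):
    body = await request.json()

    # Slack sends a one-time URL verification challenge when the endpoint is registered.
    if body.get("type") == "url_verification":
        return {"challenge": body["challenge"]}

    event = body.get("event", {})
    if event.get("type") == "app_mention":
        # Ask ChatGPT for a reply to the incoming message text.
        completion = openai_client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": event.get("text", "")}],
        )
        # Post the generated answer back to the originating Slack channel.
        slack_client.chat_postMessage(
            channel=event["channel"],
            text=completion.choices[0].message.content,
        )
    return {"ok": True}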

All settings are configured via environment variables:

  • slack_token: A Slack bot token that begins with xoxb-.
  • openai_token: An OpenAI API key that begins with sk-.
  • number_of_messages_to_keep: How many previous messages of conversation history to keep.

Prerequisite

  • Docker
  • Redis

Before running the application, make sure that Docker and Redis are installed and running on your system.
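
Redis presumably holds the recent conversation history that number_of_messages_to_keep refers to. The snippet below is only a sketch of how such a history could be kept trimmed to the last N messages per channel; the key scheme and message format are assumptions for illustration, not the repository's actual storage layout.

import os

import redis

# Assumed local Redis defaults: localhost:6379.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

KEEP = int(os.environ.get("number_of_messages_to_keep", 5))

def remember(channel: str, role: str, text: str) -> None:
    """Append a message and trim the history to the most recent KEEP entries."""
    key = f"history:{channel}"   # hypothetical key scheme
    r.rpush(key, f"{role}: {text}")
    r.ltrim(key, -KEEP, -1)

def history(channel: str) -> list[str]:
    """Return the stored conversation history for a channel."""
    return r.lrange(f"history:{channel}", 0, -1)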

Important: all of the environment variables above are read and used in app/config/constants.py.
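
As a sketch of what reading these variables in app/config/constants.py might look like (the fallback default shown is an assumption):

import os

# Required tokens fail fast if they are missing from the environment.
slack_token = os.environ["slack_token"]
openai_token = os.environ["openai_token"]
# The history length falls back to an assumed default of 5 if unset.
number_of_messages_to_keep = int(os.environ.get("number_of_messages_to_keep", 5))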

Local Execution Guide

  1. To run this application in your local environment, first install the required libraries:
pip install -r requirements.txt
  2. Once the libraries are installed, run the application:
uvicorn app.main:app --reload

This command runs the application based on the app object in the main module of the app package. The --reload option automatically reloads the application when file changes are detected.

Installation

  1. Clone the repository:
git clone https://github.com/jybaek/Hello-ChatGPT.git
cd Hello-ChatGPT
  2. Build the Docker image:
docker build -t chatgpt-api .
  3. Run the Docker container (see the note after this list for passing the required environment variables):
docker run --rm -it -p 8000:8000 chatgpt-api
  4. Open your web browser and go to http://localhost:8000/docs to access the Swagger UI and test the API.
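
The container reads its settings from the environment, so the variables described above have to be passed to it as well, for example with Docker's -e flags (placeholder values shown):

docker run --rm -it -p 8000:8000 \
  -e slack_token=xoxb-... \
  -e openai_token=sk-... \
  -e number_of_messages_to_keep=5 \
  chatgpt-api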

API Documentation

The API documentation can be found at http://localhost:8000/docs once the Docker container is running.
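
Besides the Swagger UI, FastAPI also serves the raw OpenAPI schema at /openapi.json, which is a quick way to list the exposed routes programmatically. A small sketch using the requests library:

import requests

# FastAPI publishes the OpenAPI schema alongside the Swagger UI by default.
schema = requests.get("http://localhost:8000/openapi.json").json()
print(sorted(schema["paths"]))  # the routes the running service actually exposes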

License

This project is licensed under the terms of the Apache 2.0 license. See LICENSE for more information.