This FastAPI AI chat application is a RESTful API service that enables users to have AI-powered chat interactions. Built on the FastAPI framework, it integrates AI capabilities for a sophisticated human-machine interaction experience and is designed to be simple, scalable, and ready for production deployment.
## Features

- Create new AI chat interactions.
- Retrieve all interactions.
- Add messages to interactions.
- Fetch all messages within an interaction.
- Mock AI responses using `gpt4free`.
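The mock-response layer itself is not shown in this README. As a rough illustration only, a canned-response mocker could look like the sketch below; the function and constant names are hypothetical, and the real app delegates to `gpt4free` instead.

```python
import random

# Hypothetical canned replies; the actual app calls gpt4free rather than
# choosing from a fixed list.
CANNED_REPLIES = [
    "That's an interesting point. Could you tell me more?",
    "I see. Here's one way to think about it.",
    "Thanks for sharing! Let me elaborate on that.",
]

def mock_ai_reply(user_message: str) -> str:
    """Return a mock AI reply; the same input always yields the same reply."""
    if not user_message.strip():
        return "I didn't catch that. Could you rephrase?"
    # Seed the RNG with the message so responses are reproducible in tests.
    rng = random.Random(user_message)
    return rng.choice(CANNED_REPLIES)
```

Seeding on the message keeps the mock deterministic, which makes endpoint tests stable without a real model behind them.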
## Prerequisites

- Python 3.11
- Docker (for containerization)
- Docker Compose (for easy local deployment)
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/mhb8898/llm-mock.git
  ```

- Navigate to the project directory:

  ```bash
  cd fastapi-ai-chat-app
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
## Running with Docker

- Build and run the application:

  ```bash
  docker-compose up --build
  ```

- Access the application at `http://localhost:8000`.
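The repository ships its own Compose configuration; purely for orientation, a typical `docker-compose.yml` for a FastAPI service might look like the following (service name, build context, and command are assumptions, not the project's actual file):

```yaml
version: "3.8"
services:
  web:
    build: .
    ports:
      - "8000:8000"           # expose the API on localhost:8000
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000
```

Binding uvicorn to `0.0.0.0` inside the container is what makes the published port reachable from the host.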
## Running Locally

- Run the server:

  ```bash
  uvicorn app.main:app --reload
  ```

- Access the application at `http://127.0.0.1:8000`.
## API Endpoints

- `POST /interactions`: Create a new interaction.
- `GET /interactions`: Fetch all interactions.
- `POST /interactions/{id}/messages`: Add a message to an interaction.
- `GET /interactions/{id}/messages`: Fetch all messages within an interaction.
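To make the endpoint semantics concrete, here is a minimal in-memory sketch of the data flow behind them. This is plain Python for illustration, not the app's actual storage layer; all names are hypothetical.

```python
import uuid

# Hypothetical in-memory store; the real app's persistence may differ.
interactions: dict[str, dict] = {}

def create_interaction() -> dict:
    """POST /interactions — create and return a new interaction."""
    iid = str(uuid.uuid4())
    interactions[iid] = {"id": iid, "messages": []}
    return interactions[iid]

def list_interactions() -> list[dict]:
    """GET /interactions — fetch all interactions."""
    return list(interactions.values())

def add_message(iid: str, role: str, content: str) -> dict:
    """POST /interactions/{id}/messages — append a message to an interaction."""
    message = {"role": role, "content": content}
    interactions[iid]["messages"].append(message)
    return message

def get_messages(iid: str) -> list[dict]:
    """GET /interactions/{id}/messages — fetch an interaction's messages."""
    return interactions[iid]["messages"]
```

Each message lives inside its parent interaction, which is why the message routes are nested under `/interactions/{id}`.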
## Testing

Run the tests with:

```bash
pytest
```