FastAPI CRUD MCP

A minimal CRUD API for "items", built with FastAPI and exposed as MCP tools via FastAPI-MCP. Includes a scenario-driven client harness using PydanticAI and Rich.


🚀 Features

  • FastAPI: high-performance HTTP API
  • SQLAlchemy + Pydantic: ORM models + input/output schemas
  • FastAPI-MCP: auto-expose your endpoints as MCP tools (mounted at /mcp; see the wiring sketch after this list)
  • Rich CLI: beautiful, colored terminal output for scenario runs
  • Scenario Runner: client harness that drives and validates your API via PydanticAI agents
  • SQLite backend for demo; easily swap to PostgreSQL, MySQL, etc.
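
A minimal sketch of the wiring referenced above, assuming the FastApiMCP(...).mount() API of recent fastapi-mcp releases; the placeholder route is illustrative only, since the real CRUD endpoints live in backend/server/routes.py:

    # Illustrative sketch of backend/server/main.py, not the project's exact code.
    from fastapi import APIRouter, FastAPI
    from fastapi_mcp import FastApiMCP

    router = APIRouter()  # in the real project this comes from routes.py

    @router.get("/items", operation_id="list_items")
    def list_items() -> list[dict]:
        """Placeholder endpoint; the actual CRUD routes live in backend/server/routes.py."""
        return []

    app = FastAPI(title="FastAPI CRUD MCP")
    app.include_router(router)

    # Wrap the FastAPI app and expose its endpoints as MCP tools;
    # by default the MCP server is mounted under /mcp.
    mcp = FastApiMCP(app)
    mcp.mount()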

📦 Project Layout


.
├── backend
│   ├── server
│   │   ├── main.py            # FastAPI + FastAPI-MCP wiring
│   │   ├── models.py          # SQLAlchemy + Pydantic schemas
│   │   ├── routes.py          # CRUD endpoints
│   │   ├── crud.py            # DB operations
│   │   ├── db.py              # session & engine
│   │   └── logger.py          # stdlib logging setup
│   └── client
│       ├── scenarios.py       # Scenario definitions
│       └── main.py            # scenario-runner harness (entry point)
├── .env                       # environment variables (copied from .env.example)
├── pyproject.toml             # Project dependencies
└── README.md                  # this file

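For orientation, models.py pairs a SQLAlchemy table with Pydantic input/output schemas, roughly as sketched below; the Item field names are assumptions for illustration, not necessarily the project's actual columns:

    # Rough shape of backend/server/models.py; field names are assumed.
    from pydantic import BaseModel
    from sqlalchemy import Column, Integer, String
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Item(Base):                    # SQLAlchemy ORM model (table "items")
        __tablename__ = "items"
        id = Column(Integer, primary_key=True, index=True)
        name = Column(String, nullable=False)
        description = Column(String, nullable=True)

    class ItemCreate(BaseModel):         # input schema
        name: str
        description: str | None = None

    class ItemRead(ItemCreate):          # output schema
        id: int

        model_config = {"from_attributes": True}  # allow building from ORM objects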

βš™οΈ Installation & Setup

  1. Clone & enter directory

    git clone https://github.com/yourusername/fastapi-crud-mcp.git
    cd fastapi-crud-mcp
  2. Create & activate a virtualenv

    uv venv
    source .venv/bin/activate
  3. Install dependencies

    uv sync
  4. Configure environment variables. Copy the example file and adjust if needed:

    cp .env.example .env
    MCP_HOST_URL='http://127.0.0.1:8000/mcp'
    
    LLM_PROVIDER='openai'
    LLM_MODEL_NAME='gpt-4o-mini'
    LLM_MODEL=${LLM_PROVIDER}:${LLM_MODEL_NAME}
    
    OPENAI_API_KEY=sk-proj-your-api-key-here
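
The client reads these values at startup. A minimal sketch of how that can be done with python-dotenv (an assumption about the tooling; the project may use pydantic-settings or read os.environ directly):

    # Sketch: load .env and assemble the model identifier the client needs.
    import os

    from dotenv import load_dotenv  # assumes python-dotenv is installed

    load_dotenv()  # reads .env from the working directory

    MCP_HOST_URL = os.getenv("MCP_HOST_URL", "http://127.0.0.1:8000/mcp")
    LLM_MODEL = os.getenv(
        "LLM_MODEL",
        f"{os.getenv('LLM_PROVIDER', 'openai')}:{os.getenv('LLM_MODEL_NAME', 'gpt-4o-mini')}",
    )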

πŸƒ Running the Server

docker compose up -d --build
  • API docs → http://localhost:8000/docs
  • OpenAPI JSON → http://localhost:8000/openapi.json
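
To confirm the stack is up before driving it, a quick check against the OpenAPI document works well (this sketch assumes httpx is installed; plain curl does the same job):

    # Quick smoke test: fetch the OpenAPI schema from the running server.
    import httpx

    resp = httpx.get("http://localhost:8000/openapi.json", timeout=5.0)
    resp.raise_for_status()
    schema = resp.json()
    print(f"{schema['info']['title']} is up, exposing {len(schema['paths'])} paths")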

🤖 Running the Scenario Client

python3 -m backend.client.main

This harness will:

  1. Load your .env settings
  2. Spin up a PydanticAI agent against MCP_HOST_URL
  3. Execute each scenario (create/list/get/update/delete)
  4. Display rich panels for prompts & outputs
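
A stripped-down version of such a harness is sketched below. The PydanticAI MCP client API has changed across releases, so the names used here (MCPServerHTTP, mcp_servers, run_mcp_servers) are assumptions based on recent versions rather than the project's exact code, and the scenario prompts are invented placeholders:

    # Sketch: drive the MCP-exposed API with a PydanticAI agent and Rich output.
    import asyncio
    import os

    from pydantic_ai import Agent
    from pydantic_ai.mcp import MCPServerHTTP  # class name varies by pydantic-ai version
    from rich.console import Console
    from rich.panel import Panel

    console = Console()

    SCENARIOS = [  # hypothetical prompts; see backend/client/scenarios.py for the real ones
        "Create an item named 'demo', then list all items.",
        "Fetch the item you just created, rename it, then delete it.",
    ]

    async def main() -> None:
        server = MCPServerHTTP(url=os.environ["MCP_HOST_URL"])
        agent = Agent(os.environ["LLM_MODEL"], mcp_servers=[server])

        async with agent.run_mcp_servers():  # open/close MCP connections
            for prompt in SCENARIOS:
                console.print(Panel(prompt, title="Prompt", border_style="cyan"))
                result = await agent.run(prompt)
                # result.output in recent pydantic-ai releases (.data in older ones)
                console.print(Panel(str(result.output), title="Output", border_style="green"))

    if __name__ == "__main__":
        asyncio.run(main())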

🚨 Notes & Tips

  • Switch DB: edit backend/server/db.py to point at PostgreSQL or MySQL (see the sketch after this list).
  • Add auth: protect /mcp or /api via FastAPI dependencies.
  • Extend scenarios: drop new entries into backend/client/scenarios.py.
  • Production: add Alembic for migrations, and monitor with Prometheus.
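
Swapping the database usually comes down to the connection URL handed to the SQLAlchemy engine. A plausible sketch of backend/server/db.py (the DATABASE_URL variable and the items.db filename are assumptions):

    # Sketch of the engine/session setup in backend/server/db.py.
    import os

    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    # SQLite for the demo; point this at PostgreSQL or MySQL to switch backends,
    # e.g. postgresql+psycopg://user:password@localhost:5432/items
    DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./items.db")

    engine = create_engine(
        DATABASE_URL,
        # check_same_thread only applies to SQLite connections.
        connect_args={"check_same_thread": False} if DATABASE_URL.startswith("sqlite") else {},
    )
    SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False)

    def get_db():
        """FastAPI dependency: yield a session and always close it."""
        db = SessionLocal()
        try:
            yield db
        finally:
            db.close()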

🤝 Contributing

  1. Fork 🔱

  2. Create a feature branch:

    git checkout -b feature/my-feature
  3. Commit & push:

    git commit -am "Add awesome feature"
    git push origin feature/my-feature
  4. Open a PR and we'll review!


📄 License

This project is MIT-licensed; see the LICENSE file for details.