This is a template used by create_app to create a new FastAPI project with a PostgreSQL database and Docker Compose.
To create your new project from this template, simply run:
pip install create_app
python -m create_app create python_fastapi_with_database
- Project structure
- A FastAPI API with:
- Configuration through environment variables and environment file
- CORS
- Alembic database migrations
- Helpers for managing database sessions and running common queries
- Models and methods for endpoints with pagination
- Virtualenv
- Unit tests
- Docker Compose containerization
- Pre-commit Git hooks
- Makefile with useful commands
This template uses pre-commit to run Git hooks in your repo. This helps developers keep a consistent code style across the project.
To install the hooks in your repo, first install pre-commit on your system. Then run:
make install_git_hooks
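If you're curious about what a hook configuration looks like, a minimal .pre-commit-config.yaml follows this shape (the repos and revs below are illustrative examples, not necessarily what the template pins):

```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
      - id: black
```

Once installed, the hooks run automatically on every git commit, and you can run them on demand with pre-commit run --all-files.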
The project structure has been designed keeping in mind that you start with an API and a database, but may also need to add other services to your project.
By default, you get a FastAPI service, named after your project's package name, and a database service (a PostgreSQL instance). You can add as many services as you want to the docker-compose.yml file. If you need to build a new service, create a new folder at the root of the repo and put all its files in there. This keeps your repo organized and your services decoupled from each other.
Say you have a project named "my_project", and want to add a Redis service:
my_project/ (repo root)
│ ...
│
└───my_project/ (FastAPI service, included in this template)
│
└───database/ (PostgreSQL database, included in this template)
│
└───redis_cache/ (A new service you added)
│ │ ...
│ │ Dockerfile
│
│ docker-compose.yml
│
│ ...
Each Python service (although you may have services using other technologies) has the following structure:
service_a/ (service folder)
│
└───service_a/ (contains sources)
│ │
│ └───tests/
│ │
│ │ run.py
│
│ Dockerfile
│ Makefile
│ requirements.frozen
│ requirements.test.frozen
- The Dockerfile provides instructions to build the image
- The Makefile is not required, but it's useful for keeping everyday commands in one place
- Declare your dependencies in requirements.frozen and requirements.test.frozen (refer to requirements)
- run.py (or main.py) is the entry point to the service
- Put the unit tests in the tests package (refer to unit tests)
To build the images, go into the project repo and run:
docker compose build
And to run the containers:
docker compose up
After starting the containers, you can hit the API root with any browser or HTTP client. For example, with curl:
curl localhost:{api_port}
Check the API docs! http://localhost:{api_port}/docs
And the alternative API docs! http://localhost:{api_port}/redoc
It is recommended to keep your system's Python interpreter clean and install your project's dependencies in a virtual environment (venv). Among other advantages, this prevents dependency conflicts between the different projects on your system.
After you've installed venv on your system, go to the service folder and run the following to create the venv:
make create_virtualenv
Use the requirements.frozen file to declare the project's dependencies, and requirements.test.frozen to declare dependencies that are only required to run tests. As the filenames indicate, it is advised to pin dependencies to explicit versions (example: requests==2.28.1). This lets you control when to upgrade dependency versions, and will save you headaches when a new dependency version is released right as you're running a deployment pipeline.
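For instance, a pinned requirements.frozen might look like this (package names and versions are illustrative):

```text
fastapi==0.95.1
uvicorn==0.22.0
SQLAlchemy==1.4.48
alembic==1.10.4
psycopg2-binary==2.9.6
```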
To install the requirements in the venv, go to the service folder and run:
make install_requirements
To install the test requirements in the venv, run:
make install_test_requirements
To install requirements and test requirements with a single command, run:
make install_all_requirements
Add your unit tests to the tests package.
To run all unit tests, go to the service folder and run:
make run_unit_tests
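As a sketch, a pytest-style test module might look like this (the slugify helper is hypothetical and defined inline so the example is self-contained; in a real service you would import it from your source package):

```python
# tests/test_example.py
def slugify(name: str) -> str:
    """Hypothetical helper; in a real service, import it from your sources."""
    return name.strip().lower().replace(" ", "-")


def test_slugify():
    # Leading/trailing whitespace is stripped, case is lowered,
    # and inner spaces become hyphens.
    assert slugify("  Buy Milk ") == "buy-milk"
```

Test runners like pytest typically discover any test_*.py modules inside the tests package automatically.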
Let's walk through an example of building an API to manage a TODO list.
One way to create the new table is to declare our ORM model first, and from there generate a database migration that creates the table. Let's choose this path, as it's the simplest and most practical.
Add the SQLAlchemy model in database/models.py. This is the mapping for our new "todo" table.
from sqlalchemy import Column, Integer, Text
from database import Base
class Todo(Base):
    __tablename__ = "todo"

    id = Column(Integer, primary_key=True)
    name = Column(Text)
Now it's time to generate the database migration from the model we've just added. With the services running, do the following:
cd {project_package_name}
make generate_database_migration MESSAGE="Add 'todo' table"
This will create a new file in alembic/versions, named {migration_id}_add_todo_table.py. That's the Alembic migration to create the "todo" table.
Use this command to upgrade the database to the latest version. In our case, it will run the migration we've just generated:
cd {project_package_name}
make migrate_database
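For reference, the generated migration will look roughly like this (a sketch, not the exact generated file; revision identifiers are filled in by Alembic):

```python
"""Add 'todo' table"""
from alembic import op
import sqlalchemy as sa

# Revision identifiers (generated by Alembic; values here are placeholders).
revision = "{migration_id}"
down_revision = None


def upgrade():
    op.create_table(
        "todo",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("name", sa.Text(), nullable=True),
    )


def downgrade():
    op.drop_table("todo")
```

Reviewing the generated file before migrating is a good habit: autogeneration captures most schema changes, but not all of them.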
We'll create a couple of models to serialize our data and document its structure, so that anyone reading our docs knows what to expect from our endpoints.
Add a todo_models.py in the serialization package, with a couple of models:
from typing import List
from pydantic import BaseModel
from serialization.base_models import BasePaginatedList
class TodoModel(BaseModel):
    id: int
    name: str

    class Config:
        orm_mode = True


class TodoCreateOrEdit(BaseModel):
    name: str


class TodoPaginatedList(BasePaginatedList):
    results: List[TodoModel]
It's probably convenient to make our URI paths configurable in our API. You could just hardcode them, but say we want to be able to change them in our settings file (settings.env) with no impact on our code. In settings.py we'll add two new settings: one for the URI path (todos_route), and another to give the route a human-readable name in the API documentation (todos_tag). These are default values; if you change them in settings.env, the values in that file will be used instead.
from pydantic import BaseSettings
class Settings(BaseSettings):
    ...
    todos_route: str = "/todos"
    todos_tag: str = "Todos"
    ...
Then add a new todos.py module in routers, and add our new router with configurable path and tag:
from fastapi import APIRouter
from settings import settings
router = APIRouter(prefix=settings.todos_route, tags=[settings.todos_tag])
All that's left now is to register our router in the API, which is done by adding it to a list in routers/__init__.py:
from typing import List
from fastapi import APIRouter
from .todos import router as todos_router
# Add your APIRouters to this list
ALL_ROUTERS: List[APIRouter] = [todos_router]
Let's add a few endpoints to the router, in routers/todos.py.
This endpoint returns a paginated list of TODOs.
from fastapi import Depends
from settings import settings, ROOT_ROUTE
from database import Session, session_scope
from database.models import Todo
from serialization.model_serialization import paginate_list
from serialization.todo_models import TodoPaginatedList
@router.get(ROOT_ROUTE, response_model=TodoPaginatedList)
def list_todos(
    limit: int = settings.default_limit,
    offset: int = settings.default_offset,
    session: Session = Depends(session_scope),
):
    return paginate_list(session, Todo, offset, limit)
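Under the hood, pagination boils down to an OFFSET/LIMIT window plus a total count. A stdlib-only sketch of the same idea (the paginate helper below is hypothetical, not the template's paginate_list):

```python
def paginate(items, offset, limit):
    # A paginated response: a window of results plus the total count,
    # so clients can tell how many pages there are.
    return {
        "total": len(items),
        "offset": offset,
        "limit": limit,
        "results": items[offset : offset + limit],
    }


todos = [{"id": i, "name": f"todo {i}"} for i in range(1, 6)]
page = paginate(todos, offset=2, limit=2)
print([t["id"] for t in page["results"]])  # -> [3, 4]
```

In the real endpoint, the database applies the OFFSET/LIMIT in SQL instead of slicing an in-memory list, so only the requested window is ever loaded.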
This endpoint is to create a new TODO.
from fastapi import Depends, status
from settings import ROOT_ROUTE
from database import Session, session_scope
from database.models import Todo
from serialization.todo_models import TodoModel, TodoCreateOrEdit
@router.post(
    ROOT_ROUTE, response_model=TodoModel, status_code=status.HTTP_201_CREATED
)
def create_todo(
    todo: TodoCreateOrEdit, session: Session = Depends(session_scope)
):
    todo_orm = Todo(**todo.dict())
    session.add(todo_orm)
    session.flush()
    return todo_orm
This endpoint is to get an existing TODO. Returns 404 (Not found) if it does not exist.
from fastapi import Depends
from settings import IDENTIFIER_ROUTE
from database import Session, session_scope
from database.models import Todo
from serialization.model_serialization import get_or_raise
from serialization.todo_models import TodoModel


@router.get(IDENTIFIER_ROUTE, response_model=TodoModel)
def read_todo(identifier: int, session: Session = Depends(session_scope)):
    return get_or_raise(session, Todo, id=identifier)
This endpoint is to update an existing TODO. Returns 404 (Not found) if it does not exist.
from fastapi import Depends
from settings import IDENTIFIER_ROUTE
from database import Session, session_scope
from database.models import Todo
from serialization.model_serialization import get_or_raise
from serialization.todo_models import TodoModel, TodoCreateOrEdit


@router.put(IDENTIFIER_ROUTE, response_model=TodoModel)
def update_todo(
    identifier: int,
    todo: TodoCreateOrEdit,
    session: Session = Depends(session_scope),
):
    instance = get_or_raise(session, Todo, id=identifier)
    instance.name = todo.name
    session.add(instance)
    return instance
This endpoint is to delete an existing TODO. Returns 404 (Not found) if it does not exist.
from fastapi import Depends, status, Response
from settings import IDENTIFIER_ROUTE
from database import Session, session_scope
from database.models import Todo
from serialization.model_serialization import get_or_raise


@router.delete(IDENTIFIER_ROUTE, status_code=status.HTTP_204_NO_CONTENT)
def delete_todo(identifier: int, session: Session = Depends(session_scope)):
    instance = get_or_raise(session, Todo, id=identifier)
    session.delete(instance)
    return Response(status_code=status.HTTP_204_NO_CONTENT)
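With the services running, you can exercise these endpoints from the command line. For example (assuming the default /todos route and your configured api_port):

```shell
# Create a TODO
curl -X POST localhost:{api_port}/todos \
  -H "Content-Type: application/json" \
  -d '{"name": "Buy milk"}'

# List TODOs with pagination
curl "localhost:{api_port}/todos?limit=10&offset=0"

# Get, update, and delete a TODO by id
curl localhost:{api_port}/todos/1
curl -X PUT localhost:{api_port}/todos/1 \
  -H "Content-Type: application/json" \
  -d '{"name": "Buy oat milk"}'
curl -X DELETE localhost:{api_port}/todos/1
```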
You can add as many settings as you need to settings.py.
When adding settings, you can specify default values.
You can change the value of these settings in settings.env; a value set there overwrites the default one (if any).
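The precedence rule can be illustrated with a stdlib-only sketch (hypothetical names; in the template, Pydantic's settings management does the real work):

```python
import os

# Defaults declared in code (as in settings.py).
DEFAULTS = {"todos_route": "/todos", "todos_tag": "Todos"}


def resolve(name: str) -> str:
    # A value from the environment (e.g. loaded from settings.env)
    # wins over the default declared in code.
    return os.environ.get(name.upper(), DEFAULTS[name])


os.environ["TODOS_ROUTE"] = "/tasks"  # as if set in settings.env
print(resolve("todos_route"))  # -> /tasks (overridden)
print(resolve("todos_tag"))    # -> Todos (default kept)
```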