🚀 FastAPI Rocket Boilerplate to build an API in Python with its most modern technologies!
Also SQLModel, Pydantic, Alembic, Poetry, ...
- Infrastructure: the common services every backend needs, served locally by Docker Compose and in Google Cloud by Pulumi.
- Easy: all the common commands are ready in the Makefile.
- Fast: thanks to FastAPI and async programming.
- Async: Celery using RabbitMQ as broker and Redis as backend.
- ORM: a custom SQLModel-based ORM, in the style of the Django ORM and MongoEngine.
- Authentication: OAuth2 with access/refresh tokens.
- Admin dashboard: a custom Django-style admin dashboard built with SQLAdmin.
- Rock-solid reliability: CI, pre-commit, integrity testing, and unit-test coverage above 95%.
- Frontend friendly: auto-generation of a TypeScript SDK client.
- Python 3.11
- Docker
- Node, only for the frontend SDK generation
- Pulumi, only for deploying
- Clone the repo.
- Create a virtual environment:
```bash
python3.11 -m venv venv
```
- Install the requirements with Poetry for developing, testing and debugging purposes:
```bash
make install
```
- If you want to use pre-commit with the same style checks as the CI pipeline:
```bash
pre-commit install
```
ℹ️ You can test the pre-commit hooks without committing by running:
```bash
pre-commit run --all-files
```
Build and run the Docker services for local use:
```bash
make run
```
Congrats! The API is now working; you can check:
- Docs: http://localhost:8000/docs
- Admin: http://localhost:8000/admin
- RabbitMQ: http://localhost:15672/
For the admin panel, use:
```
ADMIN_USER=superuser
ADMIN_PASS=admin
```
For generating the frontend SDK client (the app should be running):
```bash
make generate_sdk
```
Run pytest with coverage for unit testing:
```bash
make test
```
You do not need to run it inside the Docker container.
The DB is replaced by an in-memory SQLite database.
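If you prefer to bypass the Makefile, here is a minimal sketch of an equivalent direct invocation, assuming `pytest-cov` is installed and the tests live under `tests/` (both assumptions, not confirmed by the repo):
```bash
# Sketch: run the unit tests with coverage directly via Poetry.
# Assumes pytest-cov is installed and the package is named "app".
poetry run pytest --cov=app tests/
```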
Use Alembic for DB migrations.
If you create a new model, import it in: app/core/db/migrations/models.py
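As a minimal sketch, that file only needs to import every model so Alembic's autogenerate can discover its metadata; the import paths below are illustrative, not the real module layout:
```python
# app/core/db/migrations/models.py
# Import every model here so Alembic's --autogenerate sees its metadata.
from app.services.user.models import User  # hypothetical existing model
from app.services.myservice.models import NewModel  # hypothetical new model
```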
After this, or after modifying an existing model, create the migration file:
```bash
docker-compose run app alembic revision --autogenerate -m "your commit"
```
If you are trying to do something complicated, you may need to fix the generated file manually.
The migration file should be created inside the Docker container because the DB URL references the Docker network domain.
Migrations run automatically on `docker compose up`, but you can also run them manually:
```bash
docker-compose run app alembic upgrade head
```
Basically, you will want to create new services that contain endpoints and models, and you will almost certainly need to add extra dependencies.
You can use the `user` service as a reference.
If you want to create a new model to be stored in the DB, you should follow these steps:
- Create a new class based on `ModelCore` with `table=True`:
```python
from app.core.base.models import ModelCore

class NewModel(ModelCore, table=True):
    unique_property: str
```
- Import the new class into the migration models file `app.core.db.migrations.models`.
- Create a new migration.
- Create an `AdminModel` in `app.services.admin.models`:
```python
from app.core.admin.models import ModelViewCore

class NewModelAdmin(ModelViewCore, model=NewModel):
    # You can add config settings here for the Admin panel.
    pass
```
- Append it to `admin_models` in `app.services.admin.config`, as shown in the sketch below.
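A minimal sketch of that registration, assuming `admin_models` is a plain list of admin views (the surrounding file contents are illustrative):
```python
# app/services/admin/config.py (sketch; the real file may contain more)
from app.services.admin.models import NewModelAdmin

admin_models = [
    # ...existing admin views...
    NewModelAdmin,  # register the new admin view
]
```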
If you want to create a new view protected by auth, you should include the `get_current_user` dependency.
Here is an example of a new service with a protected route:
```python
from fastapi import APIRouter, Depends

from app.core.auth.functions import get_current_user

router = APIRouter(
    prefix="/security",
    tags=["security"],
)

@router.get("/protected")
def protected_route(current_user: str = Depends(get_current_user)):
    """Endpoint for auth test."""
    return {"message": f"Hello, {current_user}! This is a protected URL and you are inside!"}
```
Then append the router to `routers` in `app.main`, as in the sketch below.
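A minimal sketch of that registration, assuming `routers` is a list that `app.main` iterates over with `include_router` (the import path of the new service is hypothetical):
```python
# app/main.py (sketch; the real file may differ)
from fastapi import FastAPI

from app.services.security.views import router as security_router  # hypothetical path

routers = [
    # ...existing routers...
    security_router,
]

app = FastAPI()
for router in routers:
    app.include_router(router)
```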
New users can register by themselves or be added via the Admin panel.
To add dependencies, use Poetry:
```bash
poetry add <new_dependency>
```
You should change the following env vars in `.env`:
- Password hash:
  - `SECRET_KEY`: run `openssl rand -base64 32` in the terminal to generate a new one.
- Admin superuser:
  - `ADMIN_USER`
  - `ADMIN_PASS`
You may also want to modify the expiry time of the access/refresh tokens.
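A minimal `.env` sketch with the variables above; the token-expiry variable name is an assumption, so check the project's settings module for the real one:
```
SECRET_KEY=<paste the output of: openssl rand -base64 32>
ADMIN_USER=superuser
ADMIN_PASS=admin
# Hypothetical name; verify against the project settings:
ACCESS_TOKEN_EXPIRE_MINUTES=30
```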
We use Pulumi for deploying.
The DB will be deployed with GCP Cloud SQL.
The rest of the services will be deployed on GCP GKE.
- Log in to Google Cloud:
```bash
gcloud auth application-default login
```
- Create a new project in the Google Cloud console.
- Set the project in gcloud:
```bash
gcloud config set project <YOUR_GCP_PROJECT_ID>
```
To find it, you can run:
```bash
gcloud projects list
```
- Modify `Pulumi-dev.yaml` with your GCP project and region.
- Modify the beginning of `__main__.py` with your variables.
- Run:
```bash
pulumi up
```
✅ It is done, your project is alive!
- Deployment with Kubernetes in Google Cloud
- Deployment with Kubernetes in AWS
- Deployment with Kubernetes in Azure
- Add logging
- Add Sentry
- Add Flower
- Integrity tests
- Reach 100% unit-test coverage
- Add mypy and pylint to the pre-commit hooks
- Use 100% async/await for routes and database connections
- Authentication client with Google
- Search events by model AND id
- Fix popup for reverse_delete
- Relationship of records into model details (performance)