Scalable backend template using FastAPI, PostgreSQL, SQLAlchemy, Alembic and Docker Swarm Mode

Create your backend easily using the latest technology.
This pre-configured template consists of three main components:

  1. API - built with:
  • FastAPI - async, high-performance Python web framework
  • SQLAlchemy - async SQL toolkit and ORM
  • Alembic - database migration tool
  2. database - PostgreSQL
  3. database tool available from your browser - pgAdmin

Scalability is ensured thanks to Docker Swarm Mode.

You can also take the API image and scale it with Kubernetes.
However, Kubernetes is not trivial, and it's not recommended for beginners.

Configuration

The most important configuration sits in the .env file.
You should also take a look at the docker-compose files.
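
For illustration only - the variable names below are assumptions, not necessarily the ones this template uses - a .env file for a stack like this typically holds the PostgreSQL and pgAdmin credentials:
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me
POSTGRES_DB=app
PGADMIN_DEFAULT_EMAIL=admin@example.com
PGADMIN_DEFAULT_PASSWORD=change-me
Check the template's own .env and docker-compose files for the exact names before changing anything.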

Local development

Open a terminal and type docker compose up. It will start all services.
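
If you prefer to get your terminal back, you can start the stack in the background and stop it later with:
docker compose up -d
docker compose down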

First usage

Run the above command, open another terminal and type:
docker exec $(docker ps -f name=api -q | head -n 1) alembic upgrade head
It will run the first migration and create a sample database.
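
Later, when you change your models, you can generate a new migration and apply it the same way (assuming Alembic's autogenerate support is set up in this template; the message is just an example):
docker exec $(docker ps -f name=api -q | head -n 1) alembic revision --autogenerate -m "describe your change"
docker exec $(docker ps -f name=api -q | head -n 1) alembic upgrade head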

To make development smooth and use the full power of your code editor, create a Python virtual environment:
python3 -m venv venv

Now activate it: . venv/bin/activate

Then install all required packages: pip3 install -r requirements.txt

Now you can play with your code freely!

Going to PRODUCTION

Preparations

To deploy the backend, we will use Docker Swarm Mode - a container orchestration system.

This means you need at least a single-node cluster running.

Don't have one? Don't worry!
You can create one easily, even on your own computer, with docker swarm init
With Swarm Mode enabled, you can easily add other servers to your cluster.
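
For reference, adding a worker goes roughly like this: print a join token on the manager, then run the printed command on the new server (the token and address below are placeholders):
docker swarm join-token worker
docker swarm join --token <TOKEN> <MANAGER_IP>:2377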

Next, on your manager node, build a Docker image of the API: docker build --tag api .

Now, make sure you have configured your environment variables in the .env file!
Load them on your manager node by typing export $(cat .env)
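
Note that plain export $(cat .env) assumes the file contains no comments or blank lines; if yours does, a slightly more defensive variant is:
export $(grep -v '^#' .env | xargs)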

The last step is to make PostgreSQL data persistent.
For that, you have to mark one of your nodes to always run the PostgreSQL database.
You can see all your nodes by running docker node ls
Then grab the ID of your chosen node and label it by typing
docker node update --label-add postgres-data-node=true ID_OF_YOUR_CHOSEN_NODE
on your manager node.
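
You can verify that the label was applied with docker node inspect:
docker node inspect --format '{{ .Spec.Labels }}' ID_OF_YOUR_CHOSEN_NODE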

Deploying backend

Simply run docker stack deploy -c docker-compose-prod.yaml backend

Yay! Your backend is working :) Don't forget to run the migrations!
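
One way to check the stack and apply the migrations in production - assuming the API service ends up named backend_api, as in the scaling example below, and that you run this on a node where an api replica is running - is:
docker stack services backend
docker exec $(docker ps -f name=backend_api -q | head -n 1) alembic upgrade head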

What next?

Now you can, for example, scale your API service with a command:
docker service scale backend_api=10
You will have 10 replicas of the API running behind the load balancer. Simple!
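
You can confirm how many replicas are running, and on which nodes, with:
docker service ls
docker service ps backend_api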