Using this template you can have a scalable, automated, and cheap API server running in 10 minutes! Based on FastAPI, Docker, and AWS Lambda (FDL).
This is my very opinionated starter template for FastAPI-based projects. It bundles the best practices I have learned over time, and it is the template I personally use. It includes:
- FastAPI as the API framework (a minimal sketch follows this list)
- A local API server and a Jupyter server, using docker-compose
- A dev and a production environment, separated by .envs files
  - so you can easily switch locally by selecting a different docker-compose file
- Infrastructure using the Serverless Framework and AWS Lambda
- Deployment using Docker, .sh scripts, and AWS ECR
  - this currently only works on Linux (because of the .sh scripts we use)
  - it should be straightforward to extend to macOS / Windows; pull requests are welcome
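As a taste of what the generated service looks like, a minimal FastAPI app with a health route might look like the sketch below. The route path and response shape are assumptions here, based on the health check shown later in this README, not necessarily what the template generates:

```python
from fastapi import FastAPI

app = FastAPI()


@app.get("/health")
def health() -> dict:
    # Mirrors the response body the deployed health URL returns below.
    return {"status": 200}
```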
- Python and cookiecutter have to be installed
- npm has to be installed locally to set up the Serverless Framework
- additionally you need:
  - an AWS account
  - a Serverless Framework account
  - (optional) a Postgres database reachable by the Lambda functions (preferably in the same VPC)
  - (optional but recommended) a Sentry account
- go to Serverless Framework and create an organization / app
  - the names have to match the project_name and serverless_app cookiecutter parameters
- run the cookiecutter, for example:

```bash
cookiecutter git@github.com:leanderloew/FPL-starter.git
```

- follow the command-line prompts to set up your input
- wait for the project to be generated
- now you have to set up the Elastic Container Registries; you can do that by running:
```bash
cd terraform/ecr
terraform init
terraform plan
terraform apply
cd ../..
```
- (optional) you can run similar Terraform code to create the production and development databases. Please make sure the code doesn't destroy existing databases; if in doubt, create the databases through the AWS console instead
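If you do provision a database, it can be worth verifying connectivity before deploying. Here is a minimal sketch, assuming SQLAlchemy is available and the connection string lives in a DATABASE_URL environment variable (both are assumptions, not something the template guarantees):

```python
import os

from sqlalchemy import create_engine, text

# DATABASE_URL is an assumed variable name, e.g.:
# postgresql://user:password@host:5432/dbname
engine = create_engine(os.environ["DATABASE_URL"])

with engine.connect() as conn:
    # "SELECT 1" round-trips to the server without touching any tables.
    print(conn.execute(text("SELECT 1")).scalar())  # prints 1 on success
```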
- next you have to set up the Serverless Framework; run:

```bash
cd serverless_config
npm i serverless
cd ../
```
- finally, you can run your first development deploy:

```bash
bash deploy_dev.sh
```
- you can then follow one of the health URLs; you should see this in your browser:

```json
{"status":200}
```
- to run the local API server and Jupyter server, simply run:

```bash
docker-compose -f local_dev.yml up
```
- you can run a local production server instead with:

```bash
docker-compose -f local_prod.yml up
```
- when you have a new change you want to deploy to development, run:

```bash
bash deploy_dev.sh
```
- similarly, when you want to deploy to production, run:

```bash
bash deploy_prod.sh
```
- if you set it up without a database, the costs should be basically 0; the AWS free tier covers 1 million requests and 400,000 GB-seconds per month.
- however, if you set up the hosted database, it will always cost you some amount.
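To sanity-check whether your expected traffic fits the free tier, here is a rough back-of-the-envelope calculation; all the traffic numbers below are made-up assumptions you should replace with your own:

```python
# Back-of-the-envelope Lambda free-tier check -- every number here is an
# assumption; plug in your own expected traffic.
requests_per_month = 500_000
avg_duration_s = 0.2   # average invocation duration in seconds
memory_gb = 0.5        # 512 MB of configured memory

gb_seconds = requests_per_month * avg_duration_s * memory_gb
print(f"{gb_seconds:,.0f} GB-seconds per month")  # 50,000
print("within free tier:",
      gb_seconds <= 400_000 and requests_per_month <= 1_000_000)
```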
- you can integrate the deployment script into a remote deployment, for example with GitHub Actions
  - the main pain will then be handling the environment variables (see the sketch after this list)
  - it is recommended not to push before setting up and running some tests
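One way to tame the environment-variable handling, both locally and in GitHub Actions, is to centralize it in a settings object. Here is a minimal sketch using pydantic's BaseSettings; the variable names are assumptions, not the template's actual settings (and in pydantic v2, BaseSettings moved to the pydantic-settings package):

```python
from typing import Optional

from pydantic import BaseSettings  # pydantic v1 style


class Settings(BaseSettings):
    # Values are read from the environment (or a local .env file), so the
    # same code works in local dev, in the Lambda, and in GitHub Actions.
    environment: str = "dev"
    database_url: Optional[str] = None
    sentry_dsn: Optional[str] = None

    class Config:
        env_file = ".env"


settings = Settings()
```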
Running this code has the potential to:
- destroy existing databases
  - only if you run the dev_db and prod_db Terraform scripts
- generate significant costs if you somehow receive a lot of requests. You are yourself liable for making sure your project is secure and that the costs stay within your budget. This setup is purely educational.
- Database setup
  - Public AWS
  - Private AWS with VPN
- A demo app integrated with Retool
- PyCharm + docker-compose setup for debugging