py-frameworks-bench

Another benchmark for some Python frameworks


Async Python Web Frameworks comparison

https://klen.github.io/py-frameworks-bench/

Updated: 2022-01-14



This is a simple benchmark for Python async frameworks. Almost all of the frameworks are ASGI-compatible (aiohttp and tornado are the exceptions at the moment).

The objective of the benchmark is not to test deployment options (uvicorn vs. hypercorn, etc.) or databases (ORMs, drivers), but the frameworks themselves. The benchmark checks request parsing (body, headers, form data, query strings), routing, and responses.


Methodology

The benchmark runs as a GitHub Action. According to the GitHub documentation, the hardware specification for the runners is:

  • 2-core vCPU (Intel® Xeon® Platinum 8272CL (Cascade Lake), Intel® Xeon® 8171M 2.1GHz (Skylake))
  • 7 GB of RAM memory
  • 14 GB of SSD disk space
  • OS Ubuntu 20.04

ASGI apps are run inside Docker using the following gunicorn/uvicorn command:

gunicorn -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8080 app:app

Applications' source code can be found here.
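The `app:app` target above refers to a module-level ASGI callable. As an illustration of the shape gunicorn/uvicorn expects (this is a minimal sketch, not the benchmark's actual application code):

```python
# Minimal plain-ASGI application matching the `app:app` target.
# Illustrative only; the benchmark apps use the frameworks under test.

async def app(scope, receive, send):
    """Reply to any HTTP request with a small HTML body."""
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/html")],
    })
    await send({"type": "http.response.body", "body": b"<h1>Hello</h1>"})
```

Each framework in the benchmark ultimately exposes such a callable (directly, or via an adapter like `UvicornWorker`).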

Results are collected with the wrk utility using the following parameters:

wrk -d15s -t4 -c64 [URL]
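That is, each run lasts 15 seconds (`-d15s`) with 4 threads (`-t4`) keeping 64 connections open (`-c64`). A hypothetical Python driver for the same invocation (assuming the `wrk` binary is on `PATH`; `wrk_command` and `run_wrk` are illustrative names, not part of this repository):

```python
import subprocess

def wrk_command(url, duration="15s", threads=4, connections=64):
    """Build the wrk invocation used by the benchmark."""
    return ["wrk", f"-d{duration}", f"-t{threads}", f"-c{connections}", url]

def run_wrk(url, **kwargs):
    """Run wrk against a URL and return its raw text report."""
    result = subprocess.run(wrk_command(url, **kwargs),
                            capture_output=True, text=True)
    return result.stdout
```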

The benchmark includes three kinds of tests:

  1. "Simple" test: accept a request and return an HTML response with a custom dynamic header. The test simulates a single HTML response.

  2. "API" test: check headers, parse path params, the query string and a JSON body, and return a JSON response. The test simulates a JSON REST API.

  3. "Upload" test: accept an uploaded file and store it on disk. The test simulates multipart form-data processing and working with files.
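To make the "API" test concrete, here is a framework-agnostic sketch of the work such a handler performs (stdlib only; `api_handler` and its parameters are hypothetical names for illustration, not the benchmark's actual code):

```python
import json

def api_handler(path_params, query, headers, body):
    """Illustrates what the "API" test exercises: inspect headers,
    read path params and the query string, parse a JSON body, and
    return a JSON response."""
    payload = json.loads(body)
    response = {
        "params": path_params,
        "query": query,
        "data": payload,
        "authorized": headers.get("authorization") is not None,
    }
    return json.dumps(response)
```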

The Results (2022-01-14)

Accept a request and return HTML response with a custom dynamic header

The test simulates just a single HTML response.

Sorted by max req/s

| Framework | Requests/sec | Latency 50% (ms) | Latency 75% (ms) | Latency Avg (ms) |
| --- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.2 | 20378 | 2.80 | 3.93 | 3.10 |
| falcon 3.0.1 | 17495 | 3.13 | 4.67 | 3.62 |
| muffin 0.86.3 | 16659 | 3.14 | 5.07 | 3.81 |
| baize 0.14.1 | 14165 | 3.66 | 6.00 | 4.49 |
| starlette 0.17.1 | 13692 | 3.77 | 6.23 | 4.64 |
| emmett 2.3.2 | 13292 | 3.88 | 6.46 | 4.81 |
| fastapi 0.70.1 | 10745 | 4.78 | 7.89 | 5.92 |
| aiohttp 3.8.1 | 7356 | 8.66 | 8.77 | 8.72 |
| quart 0.16.2 | 3709 | 17.56 | 18.44 | 17.32 |
| tornado 6.1 | 3220 | 19.77 | 20.09 | 19.89 |
| sanic 21.12.0 | 1675 | 30.48 | 54.10 | 38.14 |
| django 4.0 | 964 | 60.47 | 72.25 | 66.29 |

Parse path params, query string, JSON body and return a JSON response

The test simulates a simple JSON REST API endpoint.

Sorted by max req/s

| Framework | Requests/sec | Latency 50% (ms) | Latency 75% (ms) | Latency Avg (ms) |
| --- | ---: | ---: | ---: | ---: |
| muffin 0.86.3 | 10355 | 4.94 | 8.43 | 6.15 |
| blacksheep 1.2.2 | 10209 | 5.07 | 8.51 | 6.24 |
| falcon 3.0.1 | 10042 | 5.11 | 8.70 | 6.34 |
| starlette 0.17.1 | 8345 | 6.44 | 9.67 | 7.64 |
| emmett 2.3.2 | 7058 | 7.04 | 12.31 | 9.23 |
| baize 0.14.1 | 6422 | 10.16 | 10.55 | 9.95 |
| fastapi 0.70.1 | 6214 | 8.12 | 14.47 | 10.27 |
| aiohttp 3.8.1 | 4555 | 13.98 | 14.21 | 14.05 |
| tornado 6.1 | 2769 | 23.09 | 23.29 | 23.12 |
| quart 0.16.2 | 2032 | 30.84 | 32.28 | 31.47 |
| sanic 21.12.0 | 1592 | 32.09 | 56.21 | 40.13 |
| django 4.0 | 853 | 69.17 | 81.53 | 74.94 |

Parse uploaded file, store it on disk and return a text response

The test simulates multipart form-data processing and working with files.

Sorted by max req/s

| Framework | Requests/sec | Latency 50% (ms) | Latency 75% (ms) | Latency Avg (ms) |
| --- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.2 | 5821 | 8.99 | 14.55 | 11.07 |
| muffin 0.86.3 | 4678 | 11.11 | 18.15 | 13.65 |
| falcon 3.0.1 | 3365 | 15.11 | 25.96 | 19.18 |
| baize 0.14.1 | 2631 | 23.36 | 26.21 | 24.33 |
| starlette 0.17.1 | 2434 | 20.68 | 36.69 | 26.25 |
| fastapi 0.70.1 | 2290 | 22.44 | 37.93 | 27.92 |
| aiohttp 3.8.1 | 2256 | 28.81 | 29.12 | 28.35 |
| tornado 6.1 | 2087 | 30.62 | 31.06 | 30.67 |
| quart 0.16.2 | 1676 | 37.79 | 39.51 | 38.13 |
| emmett 2.3.2 | 1323 | 44.97 | 52.28 | 48.28 |
| sanic 21.12.0 | 1319 | 40.01 | 63.05 | 48.45 |
| django 4.0 | 674 | 88.07 | 95.81 | 94.63 |

Composite stats

Combined benchmark results

Sorted by completed requests

| Framework | Requests completed | Avg Latency 50% (ms) | Avg Latency 75% (ms) | Avg Latency (ms) |
| --- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.2 | 546120 | 5.62 | 9.0 | 6.8 |
| muffin 0.86.3 | 475380 | 6.4 | 10.55 | 7.87 |
| falcon 3.0.1 | 463530 | 7.78 | 13.11 | 9.71 |
| starlette 0.17.1 | 367065 | 10.3 | 17.53 | 12.84 |
| baize 0.14.1 | 348270 | 12.39 | 14.25 | 12.92 |
| emmett 2.3.2 | 325095 | 18.63 | 23.68 | 20.77 |
| fastapi 0.70.1 | 288735 | 11.78 | 20.1 | 14.7 |
| aiohttp 3.8.1 | 212505 | 17.15 | 17.37 | 17.04 |
| tornado 6.1 | 121140 | 24.49 | 24.81 | 24.56 |
| quart 0.16.2 | 111255 | 28.73 | 30.08 | 28.97 |
| sanic 21.12.0 | 68790 | 34.19 | 57.79 | 42.24 |
| django 4.0 | 37365 | 72.57 | 83.2 | 78.62 |
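The "Requests completed" column appears to be the sum of the three per-test req/s figures multiplied by the 15-second wrk duration (`-d15s`). This is an observation from the published numbers, not a documented formula; checking it for two frameworks:

```python
# Check: completed requests ~= (simple + api + upload req/s) * 15 s.
DURATION_S = 15

per_test_rps = {
    # framework: (simple, api, upload) req/s from the tables above
    "blacksheep": (20378, 10209, 5821),
    "muffin": (16659, 10355, 4678),
}

completed = {name: sum(rates) * DURATION_S
             for name, rates in per_test_rps.items()}
# Matches the composite table: blacksheep 546120, muffin 475380.
```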

Conclusion

Nothing here, just some measurements for you.

License

Licensed under the MIT license (see the LICENSE file)