This is a test project, using Node streams to deliver on-demand CSV files to clients.
Packages used:

- `express` as the HTTP server
- `@faker-js/faker` to generate fake data
- `csv-stringify` to convert objects to CSV
It also uses `Readable.from` to create a Readable stream from a generator, and `AbortController` to stop the download if the client cancels the request.
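A minimal sketch of how these pieces could fit together (the route, field names, and per-row delay below are assumptions for illustration, not taken from the project):

```js
const express = require('express');
const { Readable } = require('node:stream');
const { pipeline } = require('node:stream/promises');
const { setTimeout: sleep } = require('node:timers/promises');
const { stringify } = require('csv-stringify');
const { faker } = require('@faker-js/faker');

const app = express();

// Async generator: one fake record per iteration, with a tiny artificial
// delay to simulate an upstream request per row.
async function* generateRows(size) {
  for (let i = 0; i < size; i++) {
    yield {
      id: i + 1,
      name: faker.person.fullName(), // hypothetical columns
      email: faker.internet.email(),
    };
    await sleep(1); // simulated asynchronous work
  }
}

app.get('/download-csv', async (req, res) => {
  const size = Number(req.query.size) || 10_000;

  // Abort the whole stream chain when the client disconnects.
  const controller = new AbortController();
  req.on('close', () => controller.abort());

  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="file.csv"');

  try {
    // Readable.from turns the generator into a stream; pipeline wires it
    // through csv-stringify into the response with backpressure, and tears
    // everything down when the signal aborts.
    await pipeline(
      Readable.from(generateRows(size)),
      stringify({ header: true }),
      res,
      { signal: controller.signal }
    );
  } catch (err) {
    // An abort is expected when the client cancels; log anything else.
    if (err.code !== 'ABORT_ERR') console.error(err);
  }
});

app.listen(3000);
```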
This project uses Docker Compose. To start the services:

```bash
docker compose up -d
```

or

```bash
npm run docker:up
```
The startup sequence is:
```mermaid
stateDiagram-v2
    direction LR
    nginx: Load Balancer
    db: Postgres
    seed: Seed
    app: Application
    [*] --> db
    db --> seed
    seed --> app
    state app {
        direction LR
        app1
        app2
    }
    app --> nginx
    nginx --> [*]
```
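The compose file is not shown here, but a configuration matching that startup order could look roughly like this (image names, healthchecks, and build details are assumptions):

```yaml
services:
  db:
    image: postgres:16            # assumed image/tag
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s

  seed:
    build: .
    command: ["node", "seed.js"]  # hypothetical seed script
    depends_on:
      db:
        condition: service_healthy

  app1:
    build: .
    depends_on:
      seed:
        condition: service_completed_successfully

  app2:
    build: .
    depends_on:
      seed:
        condition: service_completed_successfully

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    depends_on:
      - app1
      - app2
```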
To destroy the services, including volumes and images:
```bash
docker compose down --volumes --rmi all
```

or

```bash
npm run docker:down
```
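The npm scripts are presumably thin wrappers around those compose commands, along these lines (assumed, not copied from the project's package.json):

```json
{
  "scripts": {
    "docker:up": "docker compose up -d",
    "docker:down": "docker compose down --volumes --rmi all"
  }
}
```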
The URL http://localhost/download-csv will download a 10,000-line CSV file by default. The endpoint accepts query params:

- `size`: the number of lines to generate

For example, http://localhost/download-csv/?size=50000 will download a CSV with 50,000 lines.
From the command line, the URL and query params are the same:
```bash
# watch the lines build up in the terminal
curl http://localhost/download-csv?size=500

# download the file and watch the progress
curl http://localhost/download-csv?size=500 -o ./file.csv
```
Because it uses Node Streams and an asynchronous delay (which simulates per-row requests), this example can handle many requests at once without blocking the event loop or running out of memory: the stream pipeline applies backpressure, so rows are only generated as fast as each client consumes them.
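One way to observe this under load (autocannon is not part of the project; this is just a suggested check):

```bash
# 100 concurrent connections for 10 seconds against the streaming endpoint
npx autocannon -c 100 -d 10 'http://localhost/download-csv?size=1000'
```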