This repository demonstrates background processing with message queues using BullMQ, a Redis-based queue for Node.js. In this example we create message queues in Redis, write jobs into those queues, and attach workers to process those jobs. There are three container types in play:
- A "redis" container that persists its data to disk (in `redis-data/`) so that jobs survive container teardowns and restarts.
- A "service" container running an Express server with endpoints that create jobs and place them onto a message queue.
- A "worker" container that attaches to a message queue and processes messages.
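
The three containers above might be wired together along these lines. This is a hypothetical sketch, not the repository's actual `docker-compose.yml`; the image tags, build context, and worker entrypoint are assumptions:

```yaml
# Hypothetical docker-compose.yml sketch; the real file in this repo may differ.
services:
  redis:
    image: redis:7
    command: ["redis-server", "--appendonly", "yes"]   # persist jobs to disk
    volumes:
      - ./redis-data:/data                             # survives teardowns/restarts

  service:
    build: .
    ports:
      - "8080:8080"                                    # Express endpoints
    depends_on:
      - redis

  rate-limited-queue-worker:
    build: .
    command: ["node", "worker.js"]                     # hypothetical entrypoint
    depends_on:
      - redis
```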
These instructions assume you already have Docker and Docker Compose installed. The steps are as follows:
- Run the service, which also spins up the redis container and creates the message queue within Redis (if it does not already exist).

  ```
  docker compose up service
  ```
- Hit the `/rate-limited-queue/bulk-add` endpoint to queue up dummy jobs that are ready for processing.

  ```
  curl -X POST http://localhost:8080/rate-limited-queue/bulk-add
  ```
- Open the bull-board dashboard and see that the "waiting" tab is populated with the jobs you just created. They will remain there until we start the worker and they get processed.
- Start the worker in another tab / terminal.

  ```
  docker compose up rate-limited-queue-worker
  ```
- Observe that jobs are picked up from "waiting", moved to "active", progressed over time, and then moved to "completed". You will also see appropriate logging in both the service's and the worker's output. The jobs demonstrate encountering rate limiting from an external API: when that happens, the worker tells the queue to pause job processing for an allotted amount of time.
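
The rate-limiting behaviour above follows BullMQ's manual rate-limit pattern: the worker calls `worker.rateLimit(ms)` to pause dispatch for the whole queue, then throws `Worker.RateLimitError()` so the current job returns to "waiting" instead of being marked failed. The sketch below illustrates the idea; the queue name matches the endpoint path, but `callExternalApi`, the job count, the Redis host, and the header handling are assumptions, not code from this repository:

```javascript
// Sketch of the producer/worker pair for this demo (assumptions noted inline).
const express = require('express');
const { Queue, Worker } = require('bullmq');

const connection = { host: 'redis', port: 6379 }; // assumed compose hostname

// Producer: the bulk-add endpoint enqueues a batch of dummy jobs in one call.
const queue = new Queue('rate-limited-queue', { connection });
const app = express();

app.post('/rate-limited-queue/bulk-add', async (req, res) => {
  const jobs = Array.from({ length: 50 }, (_, i) => ({
    name: 'dummy-job',
    data: { index: i },
  }));
  await queue.addBulk(jobs); // single round trip for the whole batch
  res.json({ queued: jobs.length });
});
app.listen(8080);

// Worker: on a 429 from the external API, pause the queue and retry later.
const worker = new Worker(
  'rate-limited-queue',
  async (job) => {
    const response = await callExternalApi(job.data); // hypothetical API call
    if (response.status === 429) {
      const waitMs = Number(response.headers['retry-after'] ?? 60) * 1000;
      await worker.rateLimit(waitMs); // pause dispatch for the allotted time
      throw Worker.RateLimitError();  // job goes back to "waiting", not "failed"
    }
    await job.updateProgress(100);    // visible in bull-board's "active" view
  },
  { connection },
);
```

Throwing `Worker.RateLimitError()` rather than a plain error is what distinguishes "back off and retry" from a genuine job failure in bull-board's "failed" tab.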