bid-it

Scalable Bidding System With Microservices Architecture

An example of a frontend application using this service is bid-it-app.

Architecture

All three main applications live in a single NestJS monorepo:

  1. REST API (source code)
  2. Bid Engine (source code)
  3. WebSocket server (source code)
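
To illustrate how the pieces fit together, here is a minimal sketch of a NestJS WebSocket gateway along the lines of what the WebSocket server could use to push bid updates to clients, assuming the default socket.io adapter. The class, event names, and payload shape are illustrative assumptions, not the actual source.

    // Illustrative only: event names and payload shape are assumptions,
    // not taken from the actual WebSocket server source.
    import {
      ConnectedSocket,
      MessageBody,
      SubscribeMessage,
      WebSocketGateway,
      WebSocketServer,
    } from '@nestjs/websockets';
    import { Server, Socket } from 'socket.io';

    // Hypothetical payload shape; the real deal/bid models live in the monorepo.
    interface BidUpdate {
      dealId: string;
      currentPrice: number;
      userId: string;
    }

    @WebSocketGateway({ cors: true })
    export class BidUpdatesGateway {
      @WebSocketServer()
      server: Server;

      // Clients join a room per deal so they only receive updates for deals they watch.
      @SubscribeMessage('join-deal')
      handleJoinDeal(@ConnectedSocket() client: Socket, @MessageBody() dealId: string): void {
        client.join(dealId);
      }

      // Called when the bid engine reports a processed bid (e.g. via Redis pub/sub).
      broadcastBid(update: BidUpdate): void {
        this.server.to(update.dealId).emit('bid-placed', update);
      }
    }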

Installation

There are two ways to run this application:

  1. Docker (recommended)
  2. Manual

1. Docker

docker-compose --compatibility up -d

This starts three REST API services and three WebSocket servers, with Nginx as a load balancer in front of them.

You can access the REST API Swagger UI at http://localhost:3000/api

You can shut down the services with the command docker-compose down
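
As an optional smoke test, you can verify the REST API is reachable through Nginx from a small script. The /api-json path is assumed from the default @nestjs/swagger setup and may differ in this project.

    // Quick smoke test: confirm the REST API (behind Nginx) is reachable.
    // Assumes Node 18+ (global fetch) and the default @nestjs/swagger JSON path.
    async function checkApi(): Promise<void> {
      const res = await fetch('http://localhost:3000/api-json');
      if (!res.ok) {
        throw new Error(`REST API not reachable: HTTP ${res.status}`);
      }
      const spec = (await res.json()) as { info?: { title?: string; version?: string } };
      console.log(`OpenAPI spec loaded: ${spec.info?.title ?? 'unknown'} v${spec.info?.version ?? '?'}`);
    }

    checkApi().catch((err) => {
      console.error(err);
      process.exit(1);
    });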

2. Manual

Prerequisites

  1. MongoDB
  2. Redis

Running The Services

  1. Install dependencies.

    yarn install
  2. Add a .env file at the root of the project with the following content (see the config sketch after this list):

    DEALS_DB_URL=mongodb://localhost:27017/deal
    USERS_DB_URL=mongodb://localhost:27017/user
    REPORTING_DB_URL=mongodb://localhost:27017/report
    REDIS_URL=redis://localhost:6379
  3. Start all the services.

    yarn start
  4. You can access the REST API Swagger UI at http://localhost:3000/api
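
For reference, the sketch below shows one way these environment variables could be wired into a NestJS module using @nestjs/config and @nestjs/mongoose. The actual module structure in the monorepo may differ.

    // Sketch of how the .env values above might be consumed in a NestJS module.
    // Uses @nestjs/config and @nestjs/mongoose; the real wiring may be different.
    import { Module } from '@nestjs/common';
    import { ConfigModule, ConfigService } from '@nestjs/config';
    import { MongooseModule } from '@nestjs/mongoose';

    @Module({
      imports: [
        // Loads the root .env file and makes ConfigService available everywhere.
        ConfigModule.forRoot({ isGlobal: true }),

        // Connects to the deals database using DEALS_DB_URL from the .env file.
        MongooseModule.forRootAsync({
          inject: [ConfigService],
          useFactory: (config: ConfigService) => ({
            uri: config.get<string>('DEALS_DB_URL'),
          }),
        }),
      ],
    })
    export class DealsAppModule {}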

Reports

We use a reporting server (source) that listens to Redis events and persists them to MongoDB.

Reports can then be generated with generate-report.ts.
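
The sketch below illustrates the general listen-and-persist pattern using ioredis and mongoose directly; the channel name and event shape are assumptions, not the reporting server's actual implementation.

    // Illustrative reporting listener: subscribes to a Redis channel and persists
    // each event to MongoDB. Channel name and event shape are assumptions.
    import Redis from 'ioredis';
    import mongoose, { Schema } from 'mongoose';

    const BidEvent = mongoose.model(
      'BidEvent',
      new Schema({ dealId: String, userId: String, price: Number, at: Date }),
    );

    async function main(): Promise<void> {
      await mongoose.connect(process.env.REPORTING_DB_URL ?? 'mongodb://localhost:27017/report');

      const subscriber = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
      await subscriber.subscribe('bid-events'); // hypothetical channel name

      subscriber.on('message', async (_channel, message) => {
        const event = JSON.parse(message);
        await BidEvent.create({ ...event, at: new Date() });
      });
    }

    main().catch(console.error);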

Simulations

With Docker in a Unix environment

  1. Start all the services:

    docker-compose --compatibility up -d
  2. Generate test data, simulate 200 concurrent clients performing bidding, and print a report to the console (a sketch of a single simulated client follows this list).

    ./run-simulation.sh
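
For a rough idea of what one simulated client could look like, here is a sketch that connects with socket.io-client and places a few bids. The port, event names, and payload shape are assumptions for illustration; the real simulation drives around 200 such clients concurrently.

    // Minimal sketch of a single simulated bidding client using socket.io-client.
    // Port, event names, and payloads are illustrative assumptions, not the real protocol.
    import { io } from 'socket.io-client';

    function runClient(userId: string, dealId: string): void {
      const socket = io('http://localhost:3001'); // assumed WebSocket server address

      socket.on('connect', () => {
        socket.emit('join-deal', dealId);

        // Place a few bids at random intervals, each slightly higher than the last.
        let price = 100;
        const timer = setInterval(() => {
          price += Math.ceil(Math.random() * 10);
          socket.emit('place-bid', { userId, dealId, price });
        }, 500 + Math.random() * 1000);

        // Stop after a short while and disconnect.
        setTimeout(() => {
          clearInterval(timer);
          socket.disconnect();
        }, 10_000);
      });

      socket.on('bid-placed', (update: { dealId: string; currentPrice: number }) => {
        console.log(`[${userId}] saw bid on ${update.dealId}: ${update.currentPrice}`);
      });
    }

    // The real simulation spins up ~200 of these concurrently.
    for (let i = 0; i < 5; i++) {
      runClient(`user-${i}`, 'deal-1');
    }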

Others

  1. Start all the services:

    yarn start
  2. (First time only) Generate all the deals:

    yarn setup
  3. Simulate many clients performing bidding:

    yarn simulate
  4. Generate reports:

    yarn report

Additional Considerations/Improvements

Scalability

  1. Both the REST API and WebSocket servers can easily be scaled horizontally.
  2. At the moment, a single queue is used to process all bids. This singleton design is intentional to avoid race conditions between bids.
  3. The singleton queue design may hurt performance under very high load. One possible improvement to explore is distributing bid processing across multiple queues while ensuring that bids for a particular deal always go to the same queue (see the sketch below).
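
As a rough sketch of that idea, the snippet below hashes the deal ID to pick one of N queues, so bids for the same deal are processed in order by the same queue while different deals can be processed in parallel. The use of BullMQ and the queue naming are assumptions for illustration, not the project's current implementation.

    // Sketch of deal-based queue partitioning: bids for the same deal always land
    // on the same queue, preserving ordering per deal while allowing parallelism
    // across deals. BullMQ and the queue names are illustrative assumptions.
    import { createHash } from 'crypto';
    import { Queue } from 'bullmq';

    const QUEUE_COUNT = 4;
    const connection = { host: 'localhost', port: 6379 };

    // One queue per partition, e.g. bids-0 .. bids-3.
    const queues = Array.from(
      { length: QUEUE_COUNT },
      (_, i) => new Queue(`bids-${i}`, { connection }),
    );

    // Stable hash of the deal ID -> partition index.
    function partitionFor(dealId: string): number {
      const digest = createHash('sha1').update(dealId).digest();
      return digest.readUInt32BE(0) % QUEUE_COUNT;
    }

    export async function enqueueBid(bid: { dealId: string; userId: string; price: number }) {
      const queue = queues[partitionFor(bid.dealId)];
      await queue.add('process-bid', bid);
    }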

Built With