Node.js Benchmarking Working Group

Mandate

The Benchmarking Working Group's purpose is to reach consensus on an agreed set of benchmarks that can be used to:

  1. Track and evangelize performance gains made between Node releases
  2. Avoid performance regressions between releases

Its responsibilities are:

  1. Identify one or more benchmarks that reflect customer usage. More than one will likely be needed to cover typical Node.js use cases, including low latency and high concurrency
  2. Work to get community consensus on the list chosen
  3. Add regular execution of the chosen benchmarks to Node.js builds
  4. Track and publicize performance between builds and releases (a sketch of comparing two builds follows this list)
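
As a rough illustration of items 3 and 4, the sketch below shows one way two Node.js builds could be compared using the compare.js and compare.R helpers that ship in the benchmark directory of the nodejs/node repository. The binary paths, the http benchmark category, and the run count are assumptions made for the example, not part of this group's tooling.

  #!/bin/sh
  # Sketch: compare the http benchmarks of two Node.js builds.
  # Run from a checkout of the nodejs/node repository; requires R for compare.R.

  OLD_NODE=./node-old/out/Release/node   # previously released build (assumed path)
  NEW_NODE=./node-new/out/Release/node   # candidate build (assumed path)

  # Run the http benchmark category against both binaries and collect CSV output.
  "$NEW_NODE" benchmark/compare.js \
    --old "$OLD_NODE" \
    --new "$NEW_NODE" \
    --runs 30 \
    http > compare-http.csv

  # Summarize the results (significance and % change) with the bundled R script.
  Rscript benchmark/compare.R < compare-http.csv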

For details on the path forward and the infrastructure in place so far, see: https://github.com/nodejs/benchmarking/blob/master/benchmarks/README.md

Logistics

Meetings

Meetings of the working group typically occur every third Tuesday, as shown on the Node.js project calendar. A few days before each meeting, an issue will be created with the date and time of the meeting. The issue will provide schedule logistics as well as an agenda, links to meeting minutes, and information about how to join as a participant or a viewer.

Current Project Team Members

  • Michael Dawson (@mhdawson) Facilitator
  • Uttam Pawar (@uttampawar)
  • Michael Paulson (@michaelbpaulson)
  • Gareth Ellis (@gareth-ellis)
  • Kunal Pathak (@kunalspathak)
  • Jamie Davis (@davisjam)

Emeritus Project Team Members

  • Trevor Norris (@trevnorris)
  • Ali Sheikh (@ofrobots)
  • Yosuke Furukawa (@yosuke-furukawa)
  • Yunong Xiao (@yunong)
  • Mark Leitch (@m-leitch)
  • Surya V Duggirala (@suryadu)
  • Wayne Andrews (@CurryKitten)
  • Kyle Farnung (@kfarnung)
  • Benedikt Meurer (@bmeurer)
  • Sathvik Laxminarayan (@sathvikl)