k6-performance-test

A performance testing framework based on k6.




An automated performance testing framework based on k6, with InfluxDB and a Grafana dashboard for better visualization, and a CI process integrated via CircleCI, Azure Pipelines, and GitHub Actions.


Changelogs


⭐ 07/11/2021 : Add Postman and JMeter converters

⭐ 06/11/2021 : Update README and k6-html-reporter

⭐ 01/11/2021 : Add Azure Pipelines

⭐ 29/10/2021 : Add k6-reporter as second option for reporting

⭐ 28/10/2021 : Add GitHub Actions and CircleCI

⭐ 24/10/2021 : InfluxDB with Grafana Dashboard for better report visualization

Features

⚙️ Support many performance testing types (Load, Soak, Stress, Spike, ...)

⚙️ Cloud execution with different load zones (Asia, EU, US, Canada, ...)

⚙️ Multiple reports exported (JSON, HTML, XML)

⚙️ CI integrated (CircleCI, GitHub Actions, GitLab CI, Bitbucket Pipelines)

⚙️ InfluxDB + Grafana Dashboard using Docker Compose

⚙️ Visual regression testing supported

⚙️ And other functions inherited from k6

Installation

  • Head to k6 Installation to install k6

  • Use npm to install the dependencies (if any)

	npm install

Basic Usage

Run test locally

  • To run any test file (.js), simply use:
	k6 run <path to test file>

Run test on cloud

  • To begin, you must first register a k6 Cloud account and then log into your account via the CLI.
	k6 login cloud
  • Then, you only have to pass your existing script to the k6 cloud command.
	k6 cloud <path to test file>

Run test with options

  • Specify 10 VUs (virtual users) and a duration of 30s, passed as command-line flags:
	k6 run --vus 10 --duration 30s script.js
  • Set up output files for the results:
	k6 run --out json=full.json --summary-export=summary.json script.js

Write Test

Four stages of the test life cycle

  • To begin, you need to know the four distinct life cycle stages of a k6 test: "init", "setup", "VU", and "teardown"
	// 1. init code

	export function setup() {
	  // 2. setup code
	}

	export default function (data) {
	  // 3. VU code
	}

	export function teardown(data) {
	  // 4. teardown code
	}

1️⃣ Init code - VU level: outside of the default function() and run once per VU

2️⃣ Setup code - Test-wide level: setup() is only called once per test, at the beginning, after the init stage but before the VU stage (default function).

3️⃣ VU code - VU level: inside the default function() and run over and over for as long as the test is running. A VU executes the default function from start to end in sequence; once it reaches the end, it loops back to the start and executes the code all over again.

4️⃣ Teardown code - Test-wide level: teardown() is only called once per test, at the end, after the VU stage (default function).
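To see how data flows between the stages, note that setup() can return a value that k6 passes as the data argument to both the default function and teardown(). A minimal sketch (the token value is a placeholder, and this runs under the k6 runtime):

```javascript
export function setup() {
  // Runs once per test: e.g. log in and share a token with every VU
  return { token: "example-token" }; // placeholder value
}

export default function (data) {
  // Runs repeatedly per VU: data.token is whatever setup() returned
  console.log(`VU iteration using token: ${data.token}`);
}

export function teardown(data) {
  // Runs once per test: the same data object is available for cleanup
  console.log(`Cleaning up session for token: ${data.token}`);
}
```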

Example test

An example making HTTP requests and using stages in k6 (ramp-up and ramp-down):

  • The stage configuration is set inside options; there are 3 stages described in the example
  • The default function is where we write the VU code. In this example, we make an HTTP GET request to http://test.loadimpact.com
  • check is a built-in k6 method to validate results. We check that the status was 200 and the transaction time was < 200ms
  • sleep() simulates a break between each iteration of a VU
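The referenced script lives in the repo; a minimal sketch consistent with the description above (the stage durations and VU targets here are assumptions, and the script runs under the k6 runtime) might look like:

```javascript
import http from "k6/http";
import { check, sleep } from "k6";

export const options = {
  stages: [
    { duration: "30s", target: 20 }, // ramp up to 20 VUs
    { duration: "1m", target: 20 },  // hold steady
    { duration: "30s", target: 0 },  // ramp down to 0
  ],
};

export default function () {
  const res = http.get("http://test.loadimpact.com");
  check(res, {
    "status was 200": (r) => r.status === 200,
    "transaction time < 200ms": (r) => r.timings.duration < 200,
  });
  sleep(1); // simulate a break between iterations
}
```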

An example with cloud execution (custom load zones) and thresholds:

  • In line 5, we used Rate, one of the four custom metric types provided by k6. A Rate keeps track of the percentage of added values that are non-zero. We put this Rate ("failed requests") into the thresholds to check that the fail rate stayed < 10%
  • Thresholds are pass/fail criteria used to specify performance expectations. In this example, we defined http_req_duration with p(95) < 250, meaning 95% of request durations must be less than 250ms
  • From line 20 onwards is where we set up the load zones for the cloud test: 60% of the traffic distributed on AWS Ashburn, 40% on AWS Dublin
  • On line 22, we can set the projectID, which links to a project created on k6 Cloud for reporting
  • On lines 37 and 39, values of true (1) and false (0) are added to the Rate "failed requests"
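Again, the full script is in the repo; a sketch of the pieces described above (the projectID is a placeholder and the load zone labels are assumptions; it runs under the k6 runtime):

```javascript
import http from "k6/http";
import { Rate } from "k6/metrics";

// Custom Rate metric: percentage of added values that are non-zero
const failedRequests = new Rate("failed requests");

export const options = {
  thresholds: {
    "failed requests": ["rate<0.1"],  // fail rate must stay below 10%
    http_req_duration: ["p(95)<250"], // 95% of requests under 250ms
  },
  ext: {
    loadimpact: {
      projectID: 1234567, // placeholder: the project created on k6 Cloud
      distribution: {
        ashburn: { loadZone: "amazon:us:ashburn", percent: 60 },
        dublin: { loadZone: "amazon:ie:dublin", percent: 40 },
      },
    },
  },
};

export default function () {
  const res = http.get("http://test.loadimpact.com");
  // true (1) on failure, false (0) on success
  failedRequests.add(res.status !== 200);
}
```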

Using k6-reporter by benc-uk

  • Add the line below in the init section to import the reporter:
	import { htmlReport } from "https://raw.githubusercontent.com/benc-uk/k6-reporter/main/dist/bundle.js";
  • Then add this function to the test file, which is implicitly called by k6 at the end of every test
	export function handleSummary(data) {
  	  return {
            "summary.html": htmlReport(data),
  	  };
	}

Using multiple reporters

  • Import jUnit and textSummary from the k6 jslib:
	import { htmlReport } from "https://raw.githubusercontent.com/benc-uk/k6-reporter/main/dist/bundle.js";
	import { jUnit, textSummary } from "https://jslib.k6.io/k6-summary/0.0.1/index.js";
  • Add more options for reports
	export function handleSummary(data) {
    	  return {
	    "./results/html-result.html": htmlReport(data),
	    stdout: textSummary(data, { indent: " ", enableColors: true }),
	    './results/junit-result.xml': jUnit(data), // but also transform it and save it as a JUnit XML...
	    './results/json-result.json': JSON.stringify(data), // and a JSON with all the details...
    	  };
	}

Using k6-html-reporter

  • Specify the path to the JSON report and the path to the output directory in html-report.js:
	const reporter = require('k6-html-reporter');

	const options = {
		jsonFile: <path-to-json-report>,
		output: <path-to-output-directory>,
	    };

	reporter.generateSummaryReport(options);
  • Run a test that already has the handleSummary function specified; take a look at animal-soak-test.js:
	k6 run ./tests/atwt/animal-soak-test.js
  • Run the reporter script to generate the HTML report from the JSON report:
	node ./src/utils/html-reporter.js
  • The exported report "report.html" will be located at "path-to-output-directory"
  • For more info: k6-html-reporter

Calculating RPS with k6

	Request Rate = (VU * R) / T

	T = (R * http_req_duration) + 1s
  • Request Rate: measured by the number of requests per second (RPS)
  • VU: the number of virtual users
  • R: the number of requests per VU iteration
  • T: a value larger than the time needed to complete a VU iteration
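The formula can be sanity-checked with a few lines of plain JavaScript (the sample numbers below are illustrative, not taken from the repo):

```javascript
// Request Rate = (VU * R) / T, where T = (R * http_req_duration) + 1s
function requestRate(vus, requestsPerIteration, httpReqDurationSec) {
  // T: iteration time, including a 1s sleep between iterations
  const t = requestsPerIteration * httpReqDurationSec + 1;
  return (vus * requestsPerIteration) / t;
}

// e.g. 10 VUs, 1 request per iteration, 0.5s average request duration
console.log(requestRate(10, 1, 0.5)); // ≈ 6.67 requests per second
```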

For more info: Generate constant request rate in k6

More testing type examples

More information

InfluxDB And Grafana Dashboard

Definition

  • By adding InfluxDB and Grafana, k6 gains a very powerful visualisation of the load test as it runs
  • InfluxDB: a fast time-series database, supported by k6 as an output target for real-time monitoring of a test. While k6 is running, it streams run statistics to InfluxDB
  • Grafana: a browser UI for data visualisation, which supports InfluxDB as a data source
  • Docker: a platform for containers. Docker Compose adds the ability to bundle multiple containers together into complex integrated applications.

Set up to run

docker-compose.yml

  • The Compose file defines 3 services and two networks:
    • Grafana web server for visualisation, running in the background on port 3000
    • InfluxDB database, running in the background on port 8086
    • k6, run on an ad-hoc basis to execute a load test script
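The repo's docker-compose.yml is not reproduced here; a minimal sketch of such a file (image tags, network names, database name, and volume paths are assumptions) could be:

```yaml
version: "3"
networks:
  k6:
  grafana:
services:
  influxdb:
    image: influxdb:1.8
    networks: [k6, grafana]
    ports: ["8086:8086"]
    environment:
      - INFLUXDB_DB=k6
  grafana:
    image: grafana/grafana:latest
    networks: [grafana]
    ports: ["3000:3000"]
  k6:
    image: grafana/k6:latest
    networks: [k6]
    # stream run statistics to InfluxDB over the local Docker network
    environment:
      - K6_OUT=influxdb=http://influxdb:8086/k6
    volumes:
      - ./tests:/tests
```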

External files

  • grafana-datasource.yaml: configures Grafana to use InfluxDB as a data source, connecting to the database over the local Docker network on port 8086

  • grafana-dashboard.yaml: configures Grafana to load a k6 dashboard from the /var/lib/grafana/dashboards directory

  • dashboard/k6-load-testing-results_rev3.json: a JSON configuration of a k6/InfluxDB dashboard with a few modifications
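For reference, a Grafana datasource provisioning file of this kind typically looks like the sketch below (values are assumptions, not copied from the repo):

```yaml
apiVersion: 1
datasources:
  - name: myinfluxdb
    type: influxdb
    access: proxy
    # InfluxDB service reachable over the local Docker network
    url: http://influxdb:8086
    database: k6
    isDefault: true
```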

How to run

  • Running a load test requires that the InfluxDB and Grafana services are already running in the background:
	docker-compose up -d influxdb grafana
  • Run docker-compose to perform a K6 run on a test script:
	docker-compose run k6 run /tests/threshold.js

I also wrote a shell script for faster usage.
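The script itself is in the repo; a sketch of what such a wrapper might look like (the script name and default test path are assumptions):

```shell
#!/bin/sh
# run-load-test.sh (hypothetical name): bring up the monitoring stack, then run a test
set -e
docker-compose up -d influxdb grafana
docker-compose run k6 run "/tests/${1:-threshold.js}"
```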

Convert Other Tests To k6

Postman collection to k6 test

  • Install via npm, optionally with -g to install globally:
	npm install --save-dev postman-to-k6
  • To convert an exported collection to k6 script:
	postman-to-k6 <path to collection json> -o <path to output k6 test js>

JMeter file to k6 test

  • Install via npm, optionally with -g to install globally:
	npm install --save-dev jmeter-to-k6
  • To convert an exported jmx file to k6 script:
	jmeter-to-k6 <path to jmx> -o <output directory>
  • This will create the specified output directory with a file called test.js and a sub-directory called libs
  • Then we can run that converted k6 test file as usual
  • For more info: https://github.com/grafana/jmeter-to-k6

CI Builds

CI | Build status | Config file | Description
CircleCI | CircleCI | config.yml | Test local.js with standard output files stored in CircleCI (using Orbs)
CircleCI - AWS set up | Not activated | config-aws-firewall.yml | Load test behind the firewall
CircleCI - Basic | Not activated | config-basic.yml | Basic run
CircleCI - Cloud | Not activated | config-cloud.yml | Cloud execution
CircleCI - Docker with result output | Not activated | config-result-docker.yml | Cloud execution
Azure Pipelines | Azure Pipelines | azure-pipelines.yaml | Branch synced and pipelines built automatically on Azure Pipelines
Azure Pipelines - Docker | Not activated | azure-pipelines.docker.yaml | Azure Pipelines with Docker images
Azure Pipelines - Manual Installation | Not activated | azure-pipelines.manual.yaml | Azure Pipelines with manual installation of k6
GitHub Actions - k6 | GitHub - k6 | k6.yml | GitHub Actions with a local test run
GitHub Actions - Docker | GitHub - Docker | k6-docker.yaml | GitHub Actions with Docker
GitHub Actions - Windows | GitHub - Windows | k6-wins.yaml | GitHub Actions with manual installation on Windows
GitHub Actions - Mac | GitHub - Mac | k6-mac.yaml | GitHub Actions with manual installation on Mac

Author

Tuyen Nguyen - Senior QA Automation Engineer

License

Copyright 2021 Tuyen Nguyen

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.