Custom metrics

Custom metrics to use with WebPageTest agents.

Adding a new custom metric

HTTP Archive uses WebPageTest (WPT) to collect information about how web pages are built. WPT is able to run arbitrary JavaScript at the end of a test to collect specific data, known as custom metrics. See the WPT custom metrics documentation for more info.

To add a new custom metric to HTTP Archive:

  1. Select the appropriate .js file. Some custom metrics are small and single-purpose, while others return many metrics for a given topic, like media.js and almanac.js (a sketch of the single-purpose style follows this list). Create a new file if you're not sure where your script belongs.

  2. For scripts that return a JSON object, each key should be named according to what it measures. For example, meta-nodes returns an array of all <meta> nodes and their attributes:

return JSON.stringify({
  'meta-nodes': (() => {
    // Returns a JSON array of meta nodes and their key/value attributes.
    // parseNodes() is a helper defined elsewhere in the same file.
    var nodes = document.querySelectorAll('head meta');
    var metaNodes = parseNodes(nodes);

    return metaNodes;
  })(),

  // Check if there is any <picture> tag containing an <img> tag.
  'has_picture_img': document.querySelectorAll('picture img').length > 0
});
  3. Test your changes on WPT using the workflow below.

  4. Submit a pull request. Include one or more links to test results in your PR description to verify that the script is working.
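
By contrast with the multi-metric example in step 2 above, a small single-purpose metric (step 1) can be as short as a single return statement. The script below is an illustrative sketch, not a file from this repository:

// Hypothetical small metric: the page's declared language (empty string if none).
return document.documentElement.getAttribute('lang') || '';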

Testing

Manual testing using the webpagetest.org website

To test a custom metric, for example doctype.js, you can enter the script directly on webpagetest.org under the "Custom" tab.


Note that all WPT custom metrics must start with a [metricName] line. This line is excluded from the files in this repository because it is generated automatically from the file name, so when testing manually you will need to add it yourself.
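
For example, to test doctype.js manually you would paste something like the following into the "Custom" tab. The [doctype] header is the metric name; the body shown here is only illustrative, not the actual contents of the file:

[doctype]
// Paste the contents of doctype.js below the [doctype] header.
return document.doctype ? document.doctype.name : '';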

If you include the debug=1 parameter on the WPT home page, for example https://webpagetest.org?debug=1, the test results will include a raw debug log from the agent, including the devtools commands used to run the custom metrics (and any handled exceptions). The log output can be found on the main results page, to the left of the waterfall. For each run there will be a link to the "debug log" (next to the timeline and trace links).

To see the custom metric results, select a run, click on "Details", and then click the "Custom Metrics" link in the top right corner.


For complex metrics like almanac.js, you can explore the results more easily by copying the JSON into your browser console and parsing it there.
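
For example, pasting something like the following into the DevTools console turns the raw string into an object you can expand and inspect (the JSON here is a shortened, illustrative sample rather than real results):

// Paste the raw custom metric value between the backticks, then inspect the parsed object.
const almanac = JSON.parse(`{"meta-nodes": [{"name": "viewport", "content": "width=device-width"}], "has_picture_img": true}`);
console.log(almanac['meta-nodes']);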

Automated WPT test runs

  1. WPT tests are run using the WPT API wrapper.
  2. Test runs use a private WPT instance, configured via the WPT_HOST environment variable.
  3. By default, the Web Almanac website is used for testing in every PR.
  4. The PR author can define a list of additional websites to test by adding a markdown list to the PR description, as shown in the PR template (see the sketch after this list).
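
As a rough illustration of that markdown list (the exact heading and wording come from the PR template; these URLs are placeholders):

Test websites:
- https://example.com/
- https://en.wikipedia.org/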

Unit tests

  1. Unit tests use the Jest testing framework.
  2. Open the unit-tests.test.js file and add test cases for your custom metrics (a sketch of a test case follows this list).
  3. The wpt_data variable is an object containing the custom metric values parsed from the WPT response.
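
A minimal sketch of what a new test case might look like, assuming wpt_data has been populated as described above; the metric key and expectation are illustrative:

describe('almanac', () => {
  test('has_picture_img is a boolean', () => {
    // wpt_data holds the parsed custom metric results for the test page (see above).
    expect(typeof wpt_data['has_picture_img']).toBe('boolean');
  });
});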

Linting

When a pull request is opened, we run some basic linting of the JavaScript using ESLint, via the GitHub Super-Linter.

You can run this locally with the following commands:

docker pull github/super-linter:slim-latest
docker run -e RUN_LOCAL=true \
  -e VALIDATE_JAVASCRIPT_ES=true \
  -e VALIDATE_MARKDOWN=true \
  -e USE_FIND_ALGORITHM=true \
  -v $PWD/custom_metrics:/tmp/lint \
  github/super-linter:slim-latest