hendriknielaender/zBench

Support passing parameters to benchmark functions

Closed this issue · 2 comments

Is your feature request related to a problem? Please describe.
Currently, benchmark functions take only a single parameter, with no way to receive inputs from the benchmark runner. This prevents any form of parameterized benchmarking or resource bootstrapping.

This would be especially helpful for benchmarking some media processing functions I'm writing, where I'd like to run the same algorithm with different radii or modes and compare run times. The way zBench works now, I'd need to create multiple benchmark functions instead of writing one that accepts a parameter.

Additionally, my media processing functions need test data fed to them as an array so they have something to operate on. If I allocate that data inside the benchmark function, the benchmark results include the time spent allocating and populating memory, which is not what I want.
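
For illustration, here is roughly what this forces today. The `blur` routine, the buffer size, and the function names are made up for the sketch, and the single allocator parameter is only an assumption about what the one parameter currently is:

```zig
const std = @import("std");

// Stand-in for one of the media processing routines being benchmarked.
fn blur(data: []u8, radius: usize) void {
    _ = data;
    _ = radius;
}

// Without parameter support, each radius needs its own benchmark function,
// and the test data has to be allocated inside the timed body.
fn benchBlurRadius2(allocator: std.mem.Allocator) void {
    const data = allocator.alloc(u8, 1 << 20) catch @panic("OOM"); // allocation is timed
    defer allocator.free(data);
    blur(data, 2);
}

fn benchBlurRadius8(allocator: std.mem.Allocator) void {
    const data = allocator.alloc(u8, 1 << 20) catch @panic("OOM"); // allocation is timed
    defer allocator.free(data);
    blur(data, 8);
}
```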

Describe the solution you'd like
A mechanism for passing parameters to the benchmark function. A likely place to implement this would be the run function, since it could then execute multiple runs with different parameters.
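
To make the request concrete, a parameterized setup could look roughly like the sketch below. Everything here is hypothetical: `BlurCase`, the two-argument benchmark signature, and the `addParam` registration call illustrate the idea and are not existing zBench API.

```zig
const std = @import("std");

// Stand-in for the media processing routine under test.
fn blur(data: []const u8, radius: usize) void {
    _ = data;
    _ = radius;
}

// Hypothetical context carrying the parameters and pre-built test data,
// so allocation happens once, outside the timed region.
const BlurCase = struct {
    radius: usize,
    data: []const u8,
};

// Hypothetical signature: the runner hands the registered context to the
// benchmark function alongside the allocator.
fn benchBlur(ctx: BlurCase, allocator: std.mem.Allocator) void {
    _ = allocator;
    blur(ctx.data, ctx.radius);
}

// Registration could then reuse a single function for every case, e.g.:
//   try bench.addParam("blur r=2", benchBlur, BlurCase{ .radius = 2, .data = input });
//   try bench.addParam("blur r=8", benchBlur, BlurCase{ .radius = 8, .data = input });
```

Whether the context is attached at registration time or passed to the run function is an open design question; either way, one function would cover all radii and modes.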

Describe alternatives you've considered
None.

Additional context
None.

Thank you for bringing this to our attention. Supporting parameterized benchmark functions makes sense, and being able to pass parameters in would clearly make benchmarking more flexible.

I'll look into implementing a mechanism for passing parameters to the benchmark function in the next release.

That would be wonderful, thank you very much!