hugoduncan/criterium

Standard deviation formatting error

kittylyst opened this issue · 1 comment

When benchmarking a function that executes in nanoseconds, the standard deviation always appears to be displayed in microseconds, even though the actual value is in nanoseconds. For example:

user=> (quick-bench (+ 1 2))
WARNING: Final GC required 46.4255478301965 % of runtime
Evaluation count : 4852350
Execution time mean : 131.554401 ns 95.0% CI: (131.397982 ns, 131.699486 ns)
Execution time std-deviation : 6.948727 us 95.0% CI: (6.929592 us, 6.956560 us)
Execution time lower ci : 124.562738 ns 95.0% CI: (124.562738 ns, 124.562738 ns)
Execution time upper ci : 139.214504 ns 95.0% CI: (139.214504 ns, 139.906643 ns)
nil
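
In the output above, the mean and both confidence bounds are reported in ns, while the std-deviation is reported in us despite having the same order of magnitude. A minimal sketch (not criterium's actual code; `format-time` is a hypothetical helper) of how a duration in seconds can be scaled to a consistent unit; the bug suggests the std-deviation path was scaled with a different factor than the mean:

```clojure
(defn format-time
  "Format a duration given in seconds with an appropriate unit,
  chosen from the magnitude of the value itself."
  [s]
  (cond
    (>= s 1)    (format "%f s" (double s))
    (>= s 1e-3) (format "%f ms" (* s 1e3))
    (>= s 1e-6) (format "%f us" (* s 1e6))
    :else       (format "%f ns" (* s 1e9))))

;; Both values are on the ns scale, so both should be formatted in ns:
(format-time 131.554401e-9) ;; => "131.554401 ns"
(format-time 6.948727e-9)   ;; should print in ns, not us
```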

I assume you are using 0.2.0; could you try 0.2.1-SNAPSHOT? I believe this is already fixed there.