Standard deviation is not calculated for scale.factor
LoganEvans opened this issue · 2 comments
The call to `nls` in `usl.solve.nls` has 3 fitted parameters: `X1`, `sigma`, and `kappa`. The value `X1` corresponds to the `scale.factor`. The `usl` function computes the standard deviation for both `sigma` and `kappa` and stores those values in `coef.std.err`, but it does not compute this value for `scale.factor`.
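For illustration, here is a minimal sketch (not the package's internal code) of fitting the same three parameters directly with `nls`, so that `summary()` reports a standard error for `X1` as well. The `raytracer` example data shipped with the package and the starting values are assumptions made for the sketch:

```r
# Minimal sketch: fit the USL model
#   throughput = X1 * p / (1 + sigma * (p - 1) + kappa * p * (p - 1))
# with nls() so that all three parameters get a standard error.
library(usl)        # only for the bundled raytracer example data
data(raytracer)     # columns: processors, throughput

fit <- nls(
  throughput ~ X1 * processors /
    (1 + sigma * (processors - 1) + kappa * processors * (processors - 1)),
  data  = raytracer,
  start = list(X1    = max(raytracer$throughput / raytracer$processors),
               sigma = 0.01,
               kappa = 0.0001)
)

# Estimate, std. error, t value and p value for X1, sigma and kappa
summary(fit)$coefficients
```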
After reading through this blog post, it appears that the residuals are expected to be larger for a higher number of processors, so unfortunately, the correct variance can't be computed directly from the `residuals` vector.
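To check whether that concern holds for a given data set, one could look at the residual spread against the processor count (continuing with the hypothetical `fit` object from the sketch above):

```r
# If the residual spread grows with the number of processors, a pooled
# residual variance (and a single standard error derived from it) would
# understate the uncertainty at high processor counts.
plot(raytracer$processors, residuals(fit),
     xlab = "processors", ylab = "residual")
abline(h = 0, lty = 2)
```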
The implemented model treats the scale factor to a large extent as a constant. This comes from the original algorithm, which needs this measurement as input and therefore does not leave any freedom in choosing it. The output of the model consequently focuses on `sigma` and `kappa`.
I implemented the currently available standard errors after looking at the numbers coming from other tools: http://perfdynamics.blogspot.de/2010/11/reporting-standard-errors-for-usl.html
The original performance measurements probably contain so much uncertainty that an additional standard error would only create a false sense of accuracy anyway.
Closing this issue as it won't be changed in the code.