Irregular precision length for at least the .sub method
SeaOfCrystalGreens opened this issue · 3 comments
The `.sub` method seems to calculate to a precision one less than the precision set for the Decimal object:
```js
Decimal.set({ precision: 3, rounding: 5 })
let n1 = new Decimal("1")
let n2 = new Decimal("2")
let n3 = new Decimal("3")
console.log(n1.div(n3).toString()) // 0.333
console.log(n2.sub(n1.div(n3)).toString()) // 1.67
console.log(n2.sub(n2.sub(n1.div(n3))).toString()) // 0.33
let a = n1.div(n3)
let b = n2.sub(n2.sub(n1.div(n3)))
console.log(a.equals(b)) // false
```
After changing
Line 1296 in 7f01abd
to
`pr = Ctor.precision + 1;`
in
Line 1264 in 7f01abd
it seems to start calculating correctly:
```js
Decimal.set({ precision: 3, rounding: 5 })
let n1 = new Decimal("1")
let n2 = new Decimal("2")
let n3 = new Decimal("3")
console.log(n1.div(n3).toString()) // 0.333
console.log(n2.sub(n1.div(n3)).toString()) // 1.667
console.log(n2.sub(n2.sub(n1.div(n3))).toString()) // 0.333
let a = n1.div(n3)
let b = n2.sub(n2.sub(n1.div(n3)))
console.log(a.equals(b)) // true
```
Although maybe this is intended behavior.
This is an ephemeral fix, intended only to highlight the issue.
If unintended, what parts of the code might cause the shortening of precision?
To what extent might it affect other parts of the code?
Sorry, which calculation do you think is incorrect?
You are not going to find any errors here in basic arithmetic operations. This is a reliable, time-tested library.
Note that precision refers to the number of significant digits, not decimal places.
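The distinction can be illustrated with plain JavaScript's native `Number` methods (just an illustrative sketch, not decimal.js itself): `toPrecision` counts significant digits, while `toFixed` counts decimal places.

```javascript
// Significant digits vs decimal places, shown with native Number methods.
// decimal.js's `precision` setting counts significant digits, like toPrecision.
console.log((1.667).toPrecision(3));   // "1.67"   -> 3 significant digits
console.log((1.667).toFixed(3));       // "1.667"  -> 3 decimal places
console.log((0.04519).toPrecision(3)); // "0.0452" -> leading zeros don't count
```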
Comparison of `(2 - (2 - (1/3))) === (1/3)`:
```js
Decimal.set({ precision: 3, rounding: 5 })
let n1 = new Decimal("1")
let n2 = new Decimal("2")
let n3 = new Decimal("3")
let a = n1.div(n3)
let b = n2.sub(n2.sub(n1.div(n3)))
console.log(a.equals(b))
Before the code change this results in false; after it, the equality holds.
It looks like it holds once the decimal places are of equal length (an erroneously increased precision).
If it all works as intended then I do apologize for opening the issue.
I'll try to understand it more carefully.
`1 / 3` to a precision of 3 significant digits is `0.333`.
`2 - 0.333` is exactly `1.667`, but rounded to a precision of 3 significant digits it is `1.67`.
`2 - 1.67 = 0.33` exactly.
`0.333 != 0.33`, as expected.
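That step-by-step rounding can be mimicked in plain JavaScript by forcing every intermediate result to 3 significant digits (a sketch using native `toPrecision` in place of decimal.js, ignoring the tiny binary rounding of doubles):

```javascript
// Round every intermediate result to 3 significant digits, mirroring
// what decimal.js does when precision is set to 3.
const p3 = x => Number(x.toPrecision(3));

const a = p3(1 / 3); // 0.333
const t = p3(2 - a); // 2 - 0.333 = 1.667, which rounds to 1.67
const b = p3(2 - t); // 2 - 1.67  = 0.33 exactly
console.log(a, t, b); // 0.333 1.67 0.33
console.log(a === b); // false, because t lost a digit when rounded
```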
Normally, a much greater precision is used for calculations (for example, the default precision is 20) and then `toSignificantDigits`, `toDecimalPlaces`, `toFixed`, etc. is used to round the result to the desired final precision.
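A sketch of that workflow in plain JavaScript, using native doubles as the high-precision stage and `toPrecision` for the final rounding (with decimal.js the same shape would use the default precision of 20 and a final `toSignificantDigits(3)`):

```javascript
// Keep full working precision for intermediates; round only at the end.
const a = 1 / 3;       // full double precision, no early rounding
const b = 2 - (2 - a); // intermediates also keep full precision

// Both agree once rounded to the desired 3 significant digits.
console.log(a.toPrecision(3)); // "0.333"
console.log(b.toPrecision(3)); // "0.333"
```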