Conversion of high-precision numbers to double loses precision; should use BigDecimal
GoogleCodeExporter opened this issue · 1 comment
GoogleCodeExporter commented
What steps will reproduce the problem?
1. In YAML provide a high-precision value such as 1.234567890123456789
2. Load the YAML
3. Dump the value
What is the expected output? What do you see instead?
I expect new BigDecimal("1.234567890123456789").
I get 1.2345678901234567d (the trailing "89" is lost).
What version of SnakeYAML are you using? On what Java version?
SnakeYAML 1.14, JDK 1.8.0_31.
Please provide any additional information below. (Often a failing test is
the best way to describe the problem.)
I'm unsure what havoc it would wreak for SnakeYAML to check large int/float
values and "upgrade" int->BigInteger and float->BigDecimal when a value has
too many bits to fit into an int/double.
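Since a failing test often describes the problem best, the loss can be shown in plain Java without SnakeYAML at all: a double carries only about 17 significant decimal digits, so the parse itself discards the tail.

```java
import java.math.BigDecimal;

// Minimal demonstration of the reported loss: parsing the scalar as a
// double rounds it, while BigDecimal keeps every digit.
public class PrecisionLoss {
    public static void main(String[] args) {
        String text = "1.234567890123456789";
        double asDouble = Double.parseDouble(text); // rounds to nearest double
        BigDecimal asBig = new BigDecimal(text);    // keeps full precision
        System.out.println(asDouble); // 1.2345678901234567
        System.out.println(asBig);    // 1.234567890123456789
    }
}
```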
Original issue reported on code.google.com by b.k.ox...@gmail.com
on 24 Jan 2015 at 10:43
GoogleCodeExporter commented
Well, BigInteger and BigDecimal are hardly ever used. "Upgrading" all int and
double values to their Big counterparts would be a major change that breaks
backwards compatibility.
If you need it in your application, you can always customize SnakeYAML to
create anything you wish: check ConstructYamlFloat, create your own
implementation, and inject it into SafeConstructor.
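The suggested customization might look roughly like the sketch below, written against the SnakeYAML 1.x API (the class name BigDecimalConstructor is made up for illustration; SafeConstructor, Tag.FLOAT, and AbstractConstruct are real 1.x types, but verify the exact signatures against your version):

```java
import java.math.BigDecimal;

import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.AbstractConstruct;
import org.yaml.snakeyaml.constructor.SafeConstructor;
import org.yaml.snakeyaml.nodes.Node;
import org.yaml.snakeyaml.nodes.ScalarNode;
import org.yaml.snakeyaml.nodes.Tag;

// Sketch: replace the default float construct so that !!float scalars
// are constructed as BigDecimal instead of Double.
public class BigDecimalConstructor extends SafeConstructor {
    public BigDecimalConstructor() {
        this.yamlConstructors.put(Tag.FLOAT, new ConstructBigDecimal());
    }

    private class ConstructBigDecimal extends AbstractConstruct {
        public Object construct(Node node) {
            String value = constructScalar((ScalarNode) node).toString();
            // YAML 1.1 allows "_" digit separators; strip them like
            // the stock ConstructYamlFloat does.
            return new BigDecimal(value.replaceAll("_", ""));
        }
    }
}
```

With this in place, loading would go through the custom constructor, e.g. `new Yaml(new BigDecimalConstructor()).load("1.234567890123456789")` would yield a BigDecimal with all digits intact. (Special values such as `.inf` and `.nan` have no BigDecimal representation and would need separate handling.)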
Original comment by py4fun@gmail.com
on 26 Jan 2015 at 2:05
- Added labels: Type-Enhancement
- Removed labels: Type-Defect