Large number precision is lost when converting JSON to CSV
ronakg11 opened this issue · 1 comment
ronakg11 commented
Example JSON:
```json
{
  "data": [
    [
      6068511140000001,
      {
        "coordinates": [
          [-118.2252993, 34.0864107],
          [-118.224834, 34.085999]
        ],
        "type": "LineString"
      }
    ],
    [
      10292320860000001,
      {
        "coordinates": [
          [-118.183504462242, 34.0576910972595],
          [-118.183576643467, 34.057457447052],
          [-118.183595955372, 34.0572071075439]
        ],
        "type": "LineString"
      }
    ]
  ]
}
```
Converted CSV:
```
6068511140000001 | -118.2252993 | 34.0864107 | -118.224834 | 34.085999 | LineString | |
10292320860000000 | -118.183504462242 | 34.0576910972595 | -118.183576643467 | 34.057457447052 | LineString | -118.183595955372 | 34.0572071075439
```
You'll notice that 10292320860000001 in the JSON became 10292320860000000 in the converted CSV.
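This looks like standard floating-point rounding rather than a CSV-writing bug: if the converter runs on a JavaScript engine (an assumption on my part), `JSON.parse` maps every number to an IEEE-754 double, which can only represent integers exactly up to `Number.MAX_SAFE_INTEGER` (2^53 - 1 = 9007199254740991). A minimal sketch:

```typescript
// 6068511140000001 is below Number.MAX_SAFE_INTEGER, so it parses exactly;
// 10292320860000001 is above it and rounds to the nearest representable double.
console.log(Number.MAX_SAFE_INTEGER);         // 9007199254740991
console.log(JSON.parse("6068511140000001"));  // 6068511140000001
console.log(JSON.parse("10292320860000001")); // 10292320860000000
```

That would also explain why the first id (6068511140000001) survives intact while the second one does not.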
Can you fix this please? :)
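In the meantime, one possible workaround is to quote long integer literals before parsing, so the ids reach the CSV writer as exact strings. This is only a sketch under the assumption above; `parsePreservingBigInts` and its regex are my own illustration, not the converter's API:

```typescript
// Wrap bare integers of 16+ digits in quotes before JSON.parse sees them.
// Caveat: this naive regex could also match digit runs inside string values.
function parsePreservingBigInts(json: string): unknown {
  const quoted = json.replace(/(?<=[:,\[\s])(-?\d{16,})(?=[\s,\]}])/g, '"$1"');
  return JSON.parse(quoted);
}

const input = '{"data": [[10292320860000001, {"type": "LineString"}]]}';
console.log(JSON.stringify(parsePreservingBigInts(input)));
// {"data":[["10292320860000001",{"type":"LineString"}]]}
```

Libraries such as json-bigint on npm implement a more robust version of the same idea.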
amtiongson commented
I have the same issue. Is there any update on whether this has been fixed? Thanks.