not converting numbers with leading zeros
avilella opened this issue · 3 comments
Hi, I have a case where a .csv contains numbers like 032907365 that get converted to 32907365 when doing:
avilella@avilellaM710t:~$ echo -e "col\n032907365\n"
col
032907365
avilella@avilellaM710t:~$ echo -e "col\n032907365\n" | q -O -H -d ',' "SELECT col FROM - " 2>/dev/null
col
32907365
avilella@avilellaM710t:~$ echo -e "col\n032907365\n" | q -O -H -d ',' "SELECT CAST(col as TEXT) FROM - " 2>/dev/null
CAST(col as TEXT)
32907365
How can I switch off that conversion process?
It seems you need to use at least version 1.6.0 of q, which supports an --as-text option. If you also have columns which should actually be interpreted as ints, that would make things more complicated, but at least a workaround using casts should work then.
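For example (a minimal sketch only, with made-up column names id and n), a file that mixes a leading-zero identifier with a genuinely numeric column could be loaded entirely as text and the numeric column cast back inside the query:
echo -e "id,n\n032907365,7" | q --as-text -O -H -d ',' "SELECT id, CAST(n AS INTEGER)+1 FROM -" 2>/dev/null
Here id keeps its leading zeros while n can still be used in arithmetic.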
If you have a very simple query, you could also consider the pragmatic (but technically wrong) approach of leaving out the -O and -H parameters, thereby forcing a text interpretation of all columns by q (assuming your header column names are all text). Of course, you would then need to use c1 etc. to reference columns in the query, as in the sketch below.
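A sketch of that approach on the example above:
echo -e "col\n032907365" | q -d ',' "SELECT c1 FROM -" 2>/dev/null
Since there is no -H, the header line "col" comes back as an ordinary data row, which is why this is technically wrong; but because the column then contains text, the value keeps its leading zeros.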
Resolved with q version 1.7.x and:
echo -e "col\n032907365\n" | q --as-text -O -H -d ',' "SELECT col FROM - " 2>/dev/null
col
032907365