Clean up CSV files for import into PostgreSQL.
Usage: csvimport [--clean] [--merge] file.csv...
  --clean          Drop tables before recreating them
  -h, --help       Print this help and exit
  --merge string   Attempt to merge all imported data into this table
  --version        Print version information and exit
For each CSV file given on the command line, csvimport creates a SQL script that creates a matching table and imports the data into it. It also creates alltables.sql, which uses psql's \i command to include all the other generated scripts.
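The exact SQL that csvimport emits isn't shown here, but a per-file script of this kind might plausibly pair a CREATE TABLE with a psql \copy. A minimal Python sketch (the function name make_table_script and the generated SQL shape are hypothetical, not taken from csvimport itself):

```python
import csv
import io

def make_table_script(csv_path, csv_text):
    """Hypothetical sketch: build a CREATE TABLE + \\copy script for one CSV.
    Every column falls back to text here; csvimport's real output may differ."""
    table = csv_path.rsplit(".", 1)[0]
    header = next(csv.reader(io.StringIO(csv_text)))
    cols = ",\n  ".join(f"{name} text" for name in header)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n);\n"
        f"\\copy {table} FROM '{csv_path}' WITH (FORMAT csv, HEADER true);\n"
    )

print(make_table_script("my_file.csv", "id,name\n1,alice\n"))
```

A script in this shape runs under psql -f, since \copy is a psql meta-command rather than server-side SQL.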
If --clean is given, the generated SQL scripts will attempt to drop each existing table before creating it.

--merge specifies a table name. If set, alltables.sql will attempt to merge the contents of all the other tables created into that table. Unless the CSV files being imported are identical in structure, this will probably fail.
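One plausible way for alltables.sql to express that merge is a series of INSERT ... SELECT statements, which is also why mismatched CSV structures make it fail: INSERT INTO t SELECT * only succeeds when the column lists line up. A hedged sketch (merge_statements and the exact SQL are hypothetical, not csvimport's confirmed output):

```python
def merge_statements(merge_table, tables):
    """Hypothetical sketch of the merge SQL alltables.sql might emit.
    Each statement copies one imported table into the merge target;
    it errors out if column counts or types don't match."""
    return "\n".join(
        f"INSERT INTO {merge_table} SELECT * FROM {t};" for t in tables
    )

print(merge_statements("all_data", ["jan_sales", "feb_sales"]))
```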
For example:

  csvimport --clean my_file.csv
  psql -f my_file.sql
If a file.json is passed, csvimport will attempt to parse it as an array of objects, then treat it as it would a CSV file containing the same data. This is even more of a hack than everything else, but occasionally useful.
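Treating a JSON array of objects like a CSV amounts to taking the object keys as the header row and the values as data rows. A minimal sketch of that idea, assuming the first object's keys define the columns (json_to_rows is a hypothetical helper, not part of csvimport):

```python
import json

def json_to_rows(text):
    """Hypothetical sketch: flatten a JSON array of objects into CSV-style
    rows. The first object's keys become the header; missing keys in later
    objects become empty strings."""
    objs = json.loads(text)
    header = list(objs[0].keys())
    rows = [[str(o.get(k, "")) for k in header] for o in objs]
    return [header] + rows

rows = json_to_rows('[{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]')
```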
This is a fairly quick hack for my own use rather than production-grade code. It falls back to text types for anything it doesn't understand. Patches or pull requests welcome.
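"Falls back to text types" suggests a simple inference pass: try the narrower PostgreSQL types first and default to text when nothing fits. A sketch of that fallback behaviour, assuming an integer-then-numeric-then-text ordering (infer_type is hypothetical; csvimport's actual inference may differ):

```python
def infer_type(values):
    """Hypothetical sketch of type inference with a text fallback:
    try integer, then numeric, otherwise give up and use text."""
    def all_parse(cast):
        for v in values:
            try:
                cast(v)
            except ValueError:
                return False
        return True

    if all_parse(int):
        return "integer"
    if all_parse(float):
        return "numeric"
    return "text"
```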