bug: postgres driver tries to insert slices and arrays as records, and fails (reproduction included)
sleddev opened this issue · 3 comments
GORM Playground Link
Description
Explain your use case and expected results
When inserting Go arrays or slices through a GORM model, the generated SQL treats them as records, so the insert fails.
For example:
```go
type Test struct {
	Data []float64 `gorm:"type:float8[]"`
}

test := Test{Data: []float64{8, 4, 2, 1, 0.5}}
DB.Create(&test)
```
gives the SQL:

```sql
INSERT INTO "tests" ("data") VALUES ((8,4,2,1,0.5))
```

and I get the error `ERROR: column "data" is of type double precision[] but expression is of type record (SQLSTATE 42804)`.
Instead, the SQL should be:

```sql
INSERT INTO "tests" ("data") VALUES ('{8,4,2,1,0.5}')
```

or

```sql
INSERT INTO "tests" ("data") VALUES (ARRAY[8,4,2,1,0.5])
```
If I use pgx directly and do:

```go
conn, err := pgx.Connect(...)
arr := []float64{8, 4, 2, 1, 0.5}
_, err = conn.Exec(context.Background(), "INSERT INTO tests (data) VALUES ($1)", arr)
```

it works as expected, generating SQL with the `'{...}'` syntax.
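For comparison, the array literal pgx produces can be reproduced with a short stdlib-only helper. This is a sketch for illustration; the function name `float8ArrayLiteral` is mine, not part of pgx's API:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// float8ArrayLiteral renders a Go slice in the Postgres array literal
// syntax (e.g. {8,4,2,1,0.5}) that a float8[] column accepts.
func float8ArrayLiteral(a []float64) string {
	parts := make([]string, len(a))
	for i, v := range a {
		parts[i] = strconv.FormatFloat(v, 'f', -1, 64)
	}
	return "{" + strings.Join(parts, ",") + "}"
}

func main() {
	fmt.Println(float8ArrayLiteral([]float64{8, 4, 2, 1, 0.5})) // {8,4,2,1,0.5}
}
```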
@carlcris I ended up making type aliases and implementing my own Scan and Value functions on those types.
Here's my current implementation: https://gist.github.com/sleddev/17ddd0e5f2a037ecf7bb6527367cb916
There may be edge cases I didn't account for, but in my testing and usage it works nicely.
To use it, just swap `[]float64` for `pg.Float8Array` in your model.
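The general shape of that approach can be sketched with a self-contained `Float8Array` type implementing `driver.Valuer` and `sql.Scanner`. This is illustrative only, not the Gist's actual code, and it skips edge cases like NULL elements and quoted strings:

```go
package main

import (
	"database/sql/driver"
	"fmt"
	"strconv"
	"strings"
)

// Float8Array is a slice alias that knows how to (de)serialize itself
// as a Postgres float8[] literal.
type Float8Array []float64

// Value implements driver.Valuer, producing the '{...}' literal that
// Postgres expects for float8[] columns.
func (a Float8Array) Value() (driver.Value, error) {
	parts := make([]string, len(a))
	for i, v := range a {
		parts[i] = strconv.FormatFloat(v, 'f', -1, 64)
	}
	return "{" + strings.Join(parts, ",") + "}", nil
}

// Scan implements sql.Scanner, parsing the literal back into a slice.
func (a *Float8Array) Scan(src interface{}) error {
	var s string
	switch v := src.(type) {
	case string:
		s = v
	case []byte:
		s = string(v)
	default:
		return fmt.Errorf("Float8Array: unsupported source type %T", src)
	}
	s = strings.Trim(s, "{}")
	if s == "" {
		*a = Float8Array{}
		return nil
	}
	fields := strings.Split(s, ",")
	out := make(Float8Array, len(fields))
	for i, f := range fields {
		v, err := strconv.ParseFloat(f, 64)
		if err != nil {
			return err
		}
		out[i] = v
	}
	*a = out
	return nil
}

func main() {
	v, _ := Float8Array{8, 4, 2, 1, 0.5}.Value()
	fmt.Println(v) // {8,4,2,1,0.5}

	var round Float8Array
	_ = round.Scan("{8,4,2,1,0.5}")
	fmt.Println(round) // [8 4 2 1 0.5]
}
```

Because GORM delegates to `database/sql`, any type implementing these two interfaces is serialized via `Value` on insert and populated via `Scan` on query, which sidesteps the record-syntax bug entirely.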