gocarina/gocsv

Discard nested struct fields from CSV with MarshalFile

KinaneD opened this issue · 2 comments

Hello,

I am trying to write a slice of structs to a CSV, but some unexpected columns are added and the nested struct field names are appended to the column titles.

The slice is retrieved from SQL, where some columns default to NULL.

Struct:

type Users struct {
   ID             string          `gorm:"type:uuid;index;" csv:"id" fake:"{uuid}"`
   AudienceFemale sql.NullFloat64 `gorm:"type:double precision;index;" csv:"audience_female" fake:"{float64range:20,100} default:null"`
   Gender         sql.NullString  `gorm:"type:string;default:null;size:1;index" csv:"gender" fake:"{randomstring:[m,f]}"`
}

Based on the above struct, I expect the CSV to contain only the following columns: id, audience_female and gender.

gocsv.MarshalFile(&users, csvFile) successfully writes the CSV, but with the following columns: id, audience_female.Float64, audience_female.Valid, gender.String and gender.Valid.

  1. Is there a way to prevent this "validation" from happening and limit the columns to the ones in the struct?
    1.1. Could there be some sort of struct tag to prevent this behaviour? I tried csv:"audience_female,omitempty" and csv:"audience_female,default=0" but neither worked.
    1.2. What are the supported tags? See, for instance, how the GORM docs document theirs; it would be great if these could be added to the README.
  2. Is there a way to preserve NULL, or leave the cell empty for that matter, rather than converting the value to its default, i.e. 0 in the case of float64?

Please see the attached CSV; here's a snippet:

id,audience_female.Float64,audience_female.Valid,gender.String,gender.Valid
762757704,9.4523454,TRUE,f,TRUE
7776373907,15.232,TRUE,f,TRUE
4117194142,7.3452,TRUE,f,TRUE
3217110349,0,FALSE,m,TRUE
3605425415,12.546774,TRUE,m,TRUE
4860605584,16.34534535,TRUE,f,TRUE
97148322,24.45245635,TRUE,f,TRUE
8155265787,19.54763757,TRUE,f,TRUE
159925593,17.5676326,TRUE,f,TRUE
1762808979,53.46745675,TRUE,m,TRUE
7344275957,4.674548747,TRUE,f,TRUE
1054673756,0,FALSE,m,TRUE
2547861680,5.7847,TRUE,m,TRUE
1119723173,31.34524747,TRUE,f,TRUE

gocsv_duplciate_validation_columns.csv

Thank you.

As it turns out, this behaviour is related to the field type sql.NullFloat64, which is itself a struct composed of a Float64 and a Valid field.
Related to #95

I think it is common practice to export SQL table results to a CSV, so it would make sense to add support for the sql Null types to this package.
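
For reference, sql.NullFloat64 is declared in database/sql roughly like this, which is why gocsv walks into it and emits one column per exported field:

// Paraphrased from the standard library's database/sql package.
type NullFloat64 struct {
	Float64 float64
	Valid   bool // Valid is true if Float64 is not NULL
}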

I finally worked around this by defining a custom type that wraps sql.NullFloat64 and implements gocsv's MarshalCSV method.

type NullFloat64 struct {
	sql.NullFloat64
}

// MarshalCSV renders the wrapped value as a single CSV cell.
func (s *NullFloat64) MarshalCSV() (string, error) {
	v, err := s.Value()

	// Value() returns nil when the column is NULL; fall back to 0
	// so the cell does not end up as the string "<nil>".
	if v == nil {
		v = 0
	}

	return fmt.Sprint(v), err
}

I have to manually check whether the value is nil and replace it with 0; otherwise it would be written to the CSV as the string <nil>.
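
If leaving the cell empty for NULL is preferred (question 2 above), a variant of the same wrapper could look roughly like this. This is only a sketch; it assumes fmt and strconv are imported:

// Sketch: leave the cell empty when the value is NULL instead of writing 0.
func (s *NullFloat64) MarshalCSV() (string, error) {
	if !s.Valid {
		return "", nil
	}
	return fmt.Sprint(s.Float64), nil
}

// Matching UnmarshalCSV sketch so the same file can be read back:
// an empty cell becomes NULL, anything else is parsed as a float.
func (s *NullFloat64) UnmarshalCSV(field string) error {
	if field == "" {
		s.Valid = false
		return nil
	}
	f, err := strconv.ParseFloat(field, 64)
	if err != nil {
		return err
	}
	s.Float64, s.Valid = f, true
	return nil
}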

And I changed the model's struct definition to use the new types instead:

type Users struct {
   ID             string      `gorm:"type:uuid;index;" csv:"id" fake:"{uuid}"`
   AudienceFemale NullFloat64 `gorm:"type:double precision;index;" csv:"audience_female" fake:"{float64range:20,100} default:null"`
   Gender         NullString  `gorm:"type:string;default:null;size:1;index" csv:"gender" fake:"{randomstring:[m,f]}"`
}

On the other hand, large values have been written to the CSV in scientific notation, e.g. 1.1259076377306024e+308; I guess that should not be an issue.
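
If the scientific notation ever does matter: fmt.Sprint uses the shortest representation for a float64, which switches to exponent form for very large or small values. A sketch using strconv.FormatFloat keeps plain decimal output (assumes strconv is imported):

// Sketch: 'f' format avoids the exponent; -1 precision uses the
// smallest number of digits that still round-trips the value.
func (s *NullFloat64) MarshalCSV() (string, error) {
	if !s.Valid {
		return "0", nil
	}
	return strconv.FormatFloat(s.Float64, 'f', -1, 64), nil
}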

Did the same for sql.NullString.
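
For completeness, the sql.NullString wrapper follows the same pattern; a rough sketch that writes an empty cell for NULL strings:

type NullString struct {
	sql.NullString
}

// MarshalCSV writes the wrapped string, or an empty cell when NULL.
func (s *NullString) MarshalCSV() (string, error) {
	if !s.Valid {
		return "", nil
	}
	return s.String, nil
}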

Hope this helps.