DarioBalinzo/kafka-connect-elasticsearch-source

org.apache.kafka.connect.errors.DataException: Invalid Java object for schema with type FLOAT64: class java.lang.Long for field:

lucasenabit opened this issue · 3 comments

Hi, I ran into a problem with one field. In my Elasticsearch index the mapping declares this field as long, and I have other fields of the same type, but the error happens only for that one field.

Example ES mapping:

},
"redirectEnd" : {
  "type" : "long"
},
"redirectStart" : {
  "type" : "long"
},
"requestStart" : {
  "type" : "float"
},
"resourceType" : {
  "type" : "keyword"
},
"responseEnd" : {
  "type" : "float"
},
"responseStart" : {
  "type" : "float"
},
"secureConnectionStart" : {
  "type" : "float"
},
"startTime" : {
  "type" : "float"
},
"transferSize" : {
  "type" : "long"
},
"workerStart" : {
  "type" : "long"
}

Example ES values:

{
  "connectEnd" : 4898.4000000059605,
  "connectStart" : 4898.4000000059605,
  "encodedBodySize" : 21504,
  "decodedBodySize" : 140288,
  "transferSize" : 8192,
  "domainLookupEnd" : 4898.4000000059605,
  "domainLookupStart" : 4898.4000000059605,
  "duration" : 3224,
  "entryType" : "resource",
  "fetchStart" : 4898.4000000059605,
  "initiatorType" : "script",
  "resourceType" : "js",

Thanks.

Hi, thanks for reporting this.

The connector guesses the schema by looking at the values in the JSON, and this problem is caused by a field that contains numbers both with and without decimal digits.

I will schedule a fix for this: the connector should use the ES mapping as the source of truth when deciding data types, instead of just looking at the values of the data.
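To make the failure mode concrete, here is a minimal sketch (not the connector's own code) that reproduces the same DataException with Kafka Connect's data API. The field name duration and the values come from the sample document above; the FLOAT64 schema stands in for what the connector would infer from an earlier document where the field happened to carry decimals:

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.errors.DataException;

public class FloatLongMismatch {
    public static void main(String[] args) {
        // Schema inferred from a document where "duration" had decimals,
        // e.g. "duration" : 3224.5 -> FLOAT64
        Schema schema = SchemaBuilder.struct()
                .field("duration", Schema.OPTIONAL_FLOAT64_SCHEMA)
                .build();

        // A later document carries "duration" : 3224; a JSON parser typically
        // maps it to java.lang.Long, which fails validation against FLOAT64.
        try {
            new Struct(schema).put("duration", 3224L);
        } catch (DataException e) {
            // Prints a message of the same shape as the one in the report:
            // Invalid Java object for schema ... FLOAT64: class java.lang.Long ...
            System.out.println(e.getMessage());
        }
    }
}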

Hi Dario, thanks for the answer.

So, at the moment, is there any workaround for this problem?

As a workaround, try one of the following:

  • use the JSON converter instead of Avro (see the config sketch below)
  • index the data with a .0 decimal digit, even when the values are integers
  • blacklist the field in order to avoid the error
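A minimal sketch of the first workaround, assuming a standard Kafka Connect setup (these are stock Kafka Connect converter properties, not specific to this connector):

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

With schemas disabled, the converter emits plain JSON without an embedded schema, sidestepping Avro's strict type checking for the mixed integer/decimal field.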