logstash-plugins/logstash-input-kafka

decorate_events does not capture message size

TomonoriSoejima opened this issue · 0 comments

  • Version: 8.1.1
  • Operating System: OS X 10.14.2
  • Config File:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["test"]
    decorate_events => true
    codec => plain
  }
}

filter {
  json {
    source => "message"
  }
  mutate {
    add_field => {
      "kafka" => "%{[@metadata][kafka]}"
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
  }
}
  • Steps to Reproduce:

I tried to capture the message size with the config above, but all I end up with in Elasticsearch is the document below.

{
  "_index": "logstash-2019.01.17",
  "_type": "doc",
  "_id": "c6K9WWgBM1pL-G5k3dHw",
  "_score": 1,
  "_source": {
    "@timestamp": "2019-01-17T02:57:21.702Z",
    "kafka": """{"topic":"test","consumer_group":"logstash","partition":0,"offset":18,"key":null,"timestamp":1547693840696}""",
    "name": "Sam",
    "message": """{"name":"Sam"}""",
    "@version": "1"
  }
}

The doc for decorate_events says "Option to add Kafka metadata like topic, message size to the event."

Looking at https://github.com/logstash-plugins/logstash-input-kafka/blob/master/lib/logstash/inputs/kafka.rb#L258, it appears that the message size is not being captured, and I wonder whether that is a limitation of Kafka or not.
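
For reference, judging from the [@metadata][kafka] fields that show up in the indexed document above, the block around that line appears to set roughly the following (my paraphrase, not a verbatim copy of kafka.rb):

if @decorate_events
  # These fields match what appears under [@metadata][kafka] in my output.
  event.set("[@metadata][kafka][topic]", record.topic)
  event.set("[@metadata][kafka][consumer_group]", @group_id)
  event.set("[@metadata][kafka][partition]", record.partition)
  event.set("[@metadata][kafka][offset]", record.offset)
  event.set("[@metadata][kafka][key]", record.key)
  event.set("[@metadata][kafka][timestamp]", record.timestamp)
  # Nothing size-related is set here.
end

If the underlying consumer record exposes a serialized size, one more event.set call here would presumably be enough to surface it.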

Neither kafka-console-producer nor kafka-console-consumer mentions size in its help output at all.
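
In the meantime, a possible stopgap is a ruby filter that records the byte size of the decoded payload. This only measures the plain-codec message string after decoding, not the serialized Kafka record, and the message_size field name is just illustrative:

filter {
  ruby {
    # Stopgap: approximate size from the decoded message string, not the
    # serialized Kafka record. "message_size" is an arbitrary field name.
    code => "event.set('message_size', event.get('message').to_s.bytesize)"
  }
}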