logstash-plugins/logstash-input-kafka

Kafka Input Plugin: Max Partition size error should be a WARN

clintonjimmie opened this issue · 0 comments

  • Version: Logstash 5.6.10 / logstash-input-kafka 5.1.11 (per the plugin path in the stack trace below)
  • Operating System:

Ran into an issue today where Logstash stopped reading from some Kafka topics that had been working.

After quite a bit of troubleshooting and setting the log level to DEBUG on a different Logstash 5.6.10 instance, we saw the following:

```
Exception in thread "Ruby-0-Thread-20: /Downloads/logstash-5.6.10/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.1.11/lib/logstash/inputs/kafka.rb:229" org.apache.kafka.common.errors.RecordTooLargeException: There are some messages at [Partition=Offset]: {esb.activity.processed-2=40674602} whose size is larger than the fetch size 1048576 and hence cannot be ever returned. Increase the fetch size on the client (using max.partition.fetch.bytes), or decrease the maximum message size the broker will allow (using message.max.bytes).
```

Adding `max_partition_fetch_bytes => "10000000"` to the config resolved the issue, thankfully. A couple of hours of troubleshooting led up to that point, though.
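For reference, a minimal sketch of the kafka input block with that setting applied; the broker address is a placeholder, not our actual value:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"       # placeholder broker address
    topics => ["esb.activity.processed"]        # topic named in the error above
    max_partition_fetch_bytes => "10000000"     # raise the per-partition fetch limit (default 1048576)
  }
}
```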

If this is a point where data flow stops entirely, I believe this error should be bubbled up as a WARN and not a DEBUG.
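To illustrate the kind of handling being requested, here is a rough JRuby sketch of a consumer poll loop that rescues the Java exception and logs it at WARN. This is not the plugin's actual code at kafka.rb:229; the names `consumer` and `poll_timeout_ms` are assumptions for the sketch:

```ruby
begin
  # Poll the Kafka consumer as the plugin's run loop does.
  records = consumer.poll(poll_timeout_ms)
  # ... hand records off to the Logstash event queue ...
rescue org.apache.kafka.common.errors.RecordTooLargeException => e
  # Consumption from the affected partition is stalled until the fetch
  # size is raised, so surface it at WARN rather than letting the
  # consumer thread die with only DEBUG visibility.
  @logger.warn("Record larger than max_partition_fetch_bytes; partition consumption is stalled",
               :exception => e.message)
end
```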

Let me know if there is more data or clarification I can provide.