logstash-plugins/logstash-filter-syslog_pri

Priority value too big for a long crashes the pipeline with RangeError

andsel opened this issue · 1 comment

andsel commented

Logstash information:

Please include the following information:

  1. Logstash version (e.g. bin/logstash --version): any
  2. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker)
  3. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes): indifferent
  4. How was the Logstash Plugin installed: already packaged with Logstash

JVM (e.g. java -version):

If the affected version of Logstash is 7.9 (or earlier), or if it is NOT using the bundled JDK or using the 'no-jdk' version in 7.10 (or higher), please provide the following information:

  1. JVM version (java -version)
  2. JVM installation source (e.g. from the Operating System's package manager, from source, etc).
  3. Value of the JAVA_HOME environment variable if set.

OS version (uname -a if on a Unix-like system): any

Description of the problem including expected versus actual behavior:
A value too big for the priority field crashes the pipeline instead of being handled as invalid data.

Steps to reproduce:

Please include a minimal but complete recreation of the problem,
including (e.g.) pipeline definition(s), settings, locale, etc. The easier
you make it for us to reproduce it, the more likely that somebody will take
the time to look at it.

  1. Run the pipeline:
input {
 generator {
   message => '{"syslog_test_pri": 29999999999999999999999, "raw_data": "<240>Dec 12 11:22:33 mymachine failed for myuser on /dev/pts/0"}'
   codec => json
   count => 1
 }
}

filter {
 syslog_pri {
   ecs_compatibility => "v1"
   syslog_pri_field_name => "syslog_test_pri"
 }
}

output {
 stdout {
   codec => rubydebug
 }
}

It fails with

(RangeError) bignum too big to convert into `long'
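The failing value does not fit in a 64-bit signed Java long, which is why JRuby raises a RangeError when it tries to use the Bignum as an array index (a quick check, not taken from the plugin's code):

```ruby
# The priority value from the reproduction pipeline above.
value = 29_999_999_999_999_999_999_999

# Maximum value a Java long can hold (2**63 - 1); JRuby must convert
# a Ruby Integer to a long before using it as a Java array index.
java_long_max = 2**63 - 1

puts value > java_long_max  # the value overflows a long, hence the error
```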

Provide logs (if relevant):

[2022-12-12T15:45:14,310][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-12-12T15:45:14,331][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-12-12T15:45:14,449][ERROR][logstash.javapipeline    ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"(RangeError) bignum too big to convert into `long'", :exception=>Java::OrgJrubyExceptions::RangeError, :backtrace=>["org.jruby.RubyArray.[](org/jruby/RubyArray.java:1545)", "RUBY.parse_pri(/home/andrea/workspace/logstash_andsel/vendor/bundle/jruby/2.6.0/gems/logstash-filter-syslog_pri-3.1.1/lib/logstash/filters/syslog_pri.rb:108)", "RUBY.filter(/home/andrea/workspace/logstash_andsel/vendor/bundle/jruby/2.6.0/gems/logstash-filter-syslog_pri-3.1.1/lib/logstash/filters/syslog_pri.rb:81)", "RUBY.do_filter(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/filters/base.rb:159)", "RUBY.multi_filter(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/filters/base.rb:178)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1865)", "RUBY.multi_filter(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/filters/base.rb:175)", "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.multi_filter(org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134)", "RUBY.start_workers(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/java_pipeline.rb:301)"], :thread=>"#<Thread:0x39f36dc9@/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/java_pipeline.rb:131 sleep>"}
[2022-12-12T15:45:14,451][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-12-12T15:45:14,836][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}

Tracelog:

org.jruby.RubyArray.[](org/jruby/RubyArray.java:1545)
RUBY.parse_pri(/home/andrea/workspace/logstash_andsel/vendor/bundle/jruby/2.6.0/gems/logstash-filter-syslog_pri-3.1.1/lib/logstash/filters/syslog_pri.rb:108)
RUBY.filter(/home/andrea/workspace/logstash_andsel/vendor/bundle/jruby/2.6.0/gems/logstash-filter-syslog_pri-3.1.1/lib/logstash/filters/syslog_pri.rb:81)
RUBY.do_filter(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/filters/base.rb:159)
RUBY.multi_filter(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/filters/base.rb:178)
org.jruby.RubyArray.each(org/jruby/RubyArray.java:1865)
RUBY.multi_filter(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/filters/base.rb:175)
org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.multi_filter(org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134)
RUBY.start_workers(/home/andrea/workspace/logstash_andsel/logstash-core/lib/logstash/java_pipeline.rb:301)
andsel commented

The syslog RFC implicitly defines a maximum priority value of 999, which means the maximum facility_code is 124. However, the specification only defines 23 facility values ([0..22]), so any value outside that range should be treated as a specification violation, making the event eligible to be tagged with an error.
This RangeError comes from bad data passed to the plugin.
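A minimal sketch of the kind of range check that would let the filter tag bad events instead of raising (hypothetical helper name and constants, not the plugin's actual code):

```ruby
# Implicit RFC maximum: PRI is at most three digits.
MAX_PRI = 999

# Returns [facility_code, severity], or nil when the value violates the
# specification (non-numeric, negative, or larger than MAX_PRI).
def parse_pri_safely(value)
  pri = begin
    Integer(value)
  rescue ArgumentError, TypeError
    nil
  end
  return nil if pri.nil? || pri.negative? || pri > MAX_PRI

  [pri / 8, pri % 8] # facility is the high bits, severity the low 3 bits
end
```

With a guard like this, the caller can add a parse-failure tag to the event when `nil` is returned, rather than letting the RangeError terminate the pipeline worker.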