logstash-plugins/logstash-filter-grok

Unable to capture string from logstash input s3 prefix

niraj8241 opened this issue · 1 comment

I am ingesting CloudTrail data into Elasticsearch using the Logstash s3 input plugin, with a grok filter that captures the AWS account name for use in the index name. But when I run Logstash, the captured name never makes it into the index setting. This works perfectly fine when I use the file input plugin and capture the string from the "path" field, so I am sure my regex is configured correctly.
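For comparison, the working file-input variant described above matches against the event's `path` field, roughly like this (the log path shown is illustrative, not from the original setup):

```conf
input {
  file {
    # "path" is automatically set on every event produced by the file input
    path => "/var/log/cloudtrail/*/*.json"
  }
}

filter {
  grok {
    # hypothetical path layout; the capture group mirrors the one used below
    match => { "path" => "^/var/log/cloudtrail/(?<tstmp>[^/]+)/" }
  }
}
```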

Logstash version: 5.5.0
OS: Ubuntu 14.04
Elasticsearch: 5.0.0

Configuration

input {
  s3 {
    type => "cloudtrail"
    bucket => "xxxxxxxxxxxxxxxx"
    prefix => "AWSLogs/xxxxxxxxxxxxx/CloudTrail/us-east-1/2017/02/21/"
    backup_to_dir => "/etc/s3backup/"
    add_field => { "source" => "gzfiles" }
    codec => cloudtrail {}
    region => "us-east-1"
    access_key_id => "xxxxxxxxxxxxxxxxxxxx"
    secret_access_key => "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    sincedb_path => "/etc/s3backup/sincedb"
  }
}

filter {
   grok {
       match => { "prefix" => "^AWSLogs/(?<tstmp>[^/]+)/" }
   }
 }
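The capture group itself is fine: grok named captures use Oniguruma syntax, which is also valid Ruby regex syntax, so the pattern can be checked in isolation (the account ID below is a placeholder, since the real one is redacted above):

```ruby
# Verify the grok-style named capture against a sample S3 prefix.
pattern = %r{^AWSLogs/(?<tstmp>[^/]+)/}
prefix  = "AWSLogs/123456789012/CloudTrail/us-east-1/2017/02/21/"

m = pattern.match(prefix)
puts m[:tstmp]  # => "123456789012"
```

So if the pattern matches the prefix string on its own, the question is whether the event actually carries a `prefix` field for grok to run against.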

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "%{[tstmp]}-%{+YYYY-MM}"
    hosts => ["xxxxxxxxxxxxx:9200"]
  }
}

The index pattern I get in the output is the literal %{[tstmp]}-2017-02, i.e. the capture is never substituted.
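An unresolved sprintf reference like %{[tstmp]} in the index name means the event has no `tstmp` field, which in turn suggests grok had nothing to match: `prefix` is an option of the s3 input, not a field the input copies onto events, so the filter would tag each event with `_grokparsefailure` (visible in the rubydebug output). One possible workaround, sketched here and untested, is to set the value explicitly via `add_field` (which every Logstash input supports) and reference that field in the index name instead:

```conf
input {
  s3 {
    # ... existing options as above ...
    # the account ID is redacted in the original report, so it is left as-is here
    add_field => { "tstmp" => "xxxxxxxxxxxxx" }
  }
}

output {
  elasticsearch {
    index => "%{[tstmp]}-%{+YYYY-MM}"
    hosts => ["xxxxxxxxxxxxx:9200"]
  }
}
```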