Unable to capture string from logstash input s3 prefix
niraj8241 opened this issue · 1 comments
niraj8241 commented
I am ingesting CloudTrail data into Elasticsearch using the Logstash s3 input plugin, with a grok filter that captures the AWS account name so it can be used in the index name. However, when I run Logstash, the captured name never makes it into the index setting. The same approach works perfectly when I use the file input plugin and capture the string from the "path" field, so I am confident my regex is correct.
Logstash version: 5.5.0
OS: Ubuntu 14.04
ES: 5.0.0
Configuration
input {
  s3 {
    type => "cloudtrail"
    bucket => "xxxxxxxxxxxxxxxx"
    prefix => "AWSLogs/xxxxxxxxxxxxx/CloudTrail/us-east-1/2017/02/21/"
    backup_to_dir => "/etc/s3backup/"
    add_field => { "source" => "gzfiles" }
    codec => cloudtrail {}
    region => "us-east-1"
    access_key_id => "xxxxxxxxxxxxxxxxxxxx"
    secret_access_key => "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    sincedb_path => "/etc/s3backup/sincedb"
  }
}

filter {
  grok {
    match => { "prefix" => "^AWSLogs/(?<tstmp>[^/]+)/" }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "%{[tstmp]}-%{+YYYY-MM}"
    hosts => ["xxxxxxxxxxxxx:9200"]
  }
}
The index name I end up with is %{[tstmp]}-2017-02, i.e. the tstmp capture is never substituted.
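For what it's worth, the regex pattern itself does capture the segment after AWSLogs/ when run against a literal prefix string. A quick sketch in Python (the account id here is made up for illustration) shows the capture working, which suggests the problem is not the pattern but that grok only sees fields present on the event, and the s3 input's prefix is a plugin option rather than an event field:

```python
import re

# Hypothetical prefix value mirroring the config (account id is invented)
prefix = "AWSLogs/123456789012/CloudTrail/us-east-1/2017/02/21/"

# Same pattern as the grok filter: capture everything between
# "AWSLogs/" and the next "/" into a group named "tstmp"
m = re.match(r"^AWSLogs/(?P<tstmp>[^/]+)/", prefix)
print(m.group("tstmp"))  # → 123456789012
```

An unsubstituted %{[tstmp]} in the index name is the usual symptom of the referenced field not existing on the event at output time.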
magnusbaeck commented
Looks like a duplicate of https://discuss.elastic.co/t/how-to-capture-text-from-a-path-or-s3-prefix/95998.