A failed timestamp parse will leave a non-LogStash::Timestamp value in @timestamp
jsvd opened this issue · 1 comment
jsvd commented
After a JSON document is parsed, a check is performed on event.timestamp: if it's a String, it's converted into a LogStash::Timestamp. However, if the conversion fails, the event will be tagged with _jsonparsefailure but still carry a String in @timestamp. Example:
% echo '{"@timestamp": "07/sep/2015:07:50:54 UTC" }}' | bin/logstash -e 'input { stdin { } } filter { json { source => "message"} } output { stdout { codec => rubydebug } }'
Logstash startup completed
Trouble parsing json {:source=>"message", :raw=>"{\"@timestamp\": \"07/sep/2015:07:50:54 UTC\" }}", :exception=>#<LogStash::TimestampParserError: invalid timestamp string "07/sep/2015:07:50:54 UTC", error=java.lang.IllegalArgumentException: Invalid format: "07/sep/2015:07:50:54 UTC" is malformed at "/sep/2015:07:50:54 UTC">, :level=>:warn}
{
       "message" => "{\"@timestamp\": \"07/sep/2015:07:50:54 UTC\" }}",
      "@version" => "1",
    "@timestamp" => "07/sep/2015:07:50:54 UTC",
          "host" => "Joaos-MBP.lan",
          "tags" => [
        [0] "_jsonparsefailure"
    ]
}
Logstash shutdown completed
Some plugins will no longer work when handling such an event, since they expect @timestamp to always be an instance of LogStash::Timestamp.
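The failure mode can be sketched in plain Ruby. This is a hypothetical illustration, not the actual logstash-filter-json source: `coerce_timestamp` and the use of `Time.iso8601` as a stand-in for LogStash::Timestamp parsing are both assumptions. It shows how a rescue clause that only tags the event leaves the unparsed String in @timestamp, and what a safer fallback would look like.

```ruby
require "time"

# Illustrative sketch of the filter's timestamp coercion step.
# Names and parsing method are assumptions, not the real plugin code.
def coerce_timestamp(event)
  ts = event["@timestamp"]
  return event unless ts.is_a?(String)
  begin
    # Stand-in for LogStash::Timestamp parsing
    event["@timestamp"] = Time.iso8601(ts)
  rescue ArgumentError
    # Tag the failure, as the plugin does
    (event["tags"] ||= []) << "_jsonparsefailure"
    # The bug: stopping here leaves the raw String in @timestamp.
    # A safer fallback substitutes a real timestamp object so that
    # downstream plugins never see a String:
    event["@timestamp"] = Time.now.utc
  end
  event
end

event = coerce_timestamp({ "@timestamp" => "07/sep/2015:07:50:54 UTC" })
event["@timestamp"].is_a?(Time)  # true, thanks to the fallback
```

With the fallback in place, an unparseable value still gets tagged, but @timestamp remains a timestamp object rather than a String.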
jsvd commented
This no longer happens since 1.0.1; please upgrade to the most recent version of logstash-filter-json to avoid the issue.
This was fixed in 2.0.3, but there is no patch for 1.x.