fluentd collecting k8s containerd logs does not use the real timestamp from the log line
arthas3014 opened this issue · 2 comments
Describe the bug
I use fluentd-kubernetes-daemonset:v1.11.5-debian-elasticsearch7-1.1 to collect Kubernetes containerd logs into Elasticsearch, and I find that the @timestamp in Elasticsearch is not consistent with the time in the log file on the Kubernetes node: it is more than ten seconds later than the real time. I suspect that fluentd fails to parse the log time and pass it as the timestamp to Elasticsearch.
To Reproduce
Deploy fluentd-kubernetes-daemonset:v1.11.5-debian-elasticsearch7-1.1 with the configuration below, let it ship containerd logs to Elasticsearch, then compare the @timestamp of a document in Elasticsearch with the timestamp at the start of the corresponding line in /var/log/containers/*.log on the node. The Elasticsearch timestamp is more than ten seconds later.
Expected behavior
The @timestamp stored in Elasticsearch should match the timestamp at the beginning of the containerd log line, not the time the record was ingested.
Your Environment
- Tag of fluentd-kubernetes-daemonset: v1.11.5-debian-elasticsearch7-1.1
- Relevant environment variables:
FLUENTD_SYSTEMD_CONF: disable
FLUENT_ELASTICSEARCH_SSL_VERIFY: false
FLUENT_ELASTICSEARCH_RELOAD_CONNECTIONS: false
FLUENT_ELASTICSEARCH_INCLUDE_TIMESTAMP: true
Your Configuration
@include "#{ENV['FLUENTD_SYSTEMD_CONF'] || 'systemd'}.conf"
@include "#{ENV['FLUENTD_PROMETHEUS_CONF'] || 'prometheus'}.conf"
@include kubernetes.conf
@include conf.d/*.conf
#<match kubernetes.**>
# @type stdout
#</match>
<match kubernetes.**>
@type elasticsearch_dynamic
@id kubernetes_elasticsearch
@log_level info
include_tag_key true
host "#{ENV['FLUENT_ELASTICSEARCH_HOST']}"
port "#{ENV['FLUENT_ELASTICSEARCH_PORT']}"
path "#{ENV['FLUENT_ELASTICSEARCH_PATH']}"
scheme "#{ENV['FLUENT_ELASTICSEARCH_SCHEME'] || 'http'}"
ssl_verify "#{ENV['FLUENT_ELASTICSEARCH_SSL_VERIFY'] || 'true'}"
ssl_version "#{ENV['FLUENT_ELASTICSEARCH_SSL_VERSION'] || 'TLSv1_2'}"
user "#{ENV['FLUENT_ELASTICSEARCH_USER'] || use_default}"
password "#{ENV['FLUENT_ELASTICSEARCH_PASSWORD'] || use_default}"
reload_connections "#{ENV['FLUENT_ELASTICSEARCH_RELOAD_CONNECTIONS'] || 'false'}"
reconnect_on_error "#{ENV['FLUENT_ELASTICSEARCH_RECONNECT_ON_ERROR'] || 'true'}"
reload_on_failure "#{ENV['FLUENT_ELASTICSEARCH_RELOAD_ON_FAILURE'] || 'true'}"
log_es_400_reason "#{ENV['FLUENT_ELASTICSEARCH_LOG_ES_400_REASON'] || 'false'}"
logstash_prefix "#{ENV['FLUENT_ELASTICSEARCH_INDEX_PREFIX'] || 'logstash'}-${record['kubernetes']['namespace_name']}"
logstash_dateformat "#{ENV['FLUENT_ELASTICSEARCH_LOGSTASH_DATEFORMAT'] || '%Y-%m-%d'}"
logstash_format "#{ENV['FLUENT_ELASTICSEARCH_LOGSTASH_FORMAT'] || 'true'}"
index_name "#{ENV['FLUENT_ELASTICSEARCH_LOGSTASH_INDEX_NAME'] || 'logstash'}"
target_index_key "#{ENV['FLUENT_ELASTICSEARCH_TARGET_INDEX_KEY'] || use_nil}"
type_name "#{ENV['FLUENT_ELASTICSEARCH_LOGSTASH_TYPE_NAME'] || 'fluentd'}"
include_timestamp "#{ENV['FLUENT_ELASTICSEARCH_INCLUDE_TIMESTAMP'] || 'false'}"
#time_key logtime
template_name "#{ENV['FLUENT_ELASTICSEARCH_TEMPLATE_NAME'] || use_nil}"
template_file "#{ENV['FLUENT_ELASTICSEARCH_TEMPLATE_FILE'] || use_nil}"
template_overwrite "#{ENV['FLUENT_ELASTICSEARCH_TEMPLATE_OVERWRITE'] || use_default}"
sniffer_class_name "#{ENV['FLUENT_SNIFFER_CLASS_NAME'] || 'Fluent::Plugin::ElasticsearchSimpleSniffer'}"
request_timeout "#{ENV['FLUENT_ELASTICSEARCH_REQUEST_TIMEOUT'] || '5s'}"
suppress_type_name "#{ENV['FLUENT_ELASTICSEARCH_SUPPRESS_TYPE_NAME'] || 'true'}"
enable_ilm "#{ENV['FLUENT_ELASTICSEARCH_ENABLE_ILM'] || 'false'}"
ilm_policy_id "#{ENV['FLUENT_ELASTICSEARCH_ILM_POLICY_ID'] || use_default}"
ilm_policy "#{ENV['FLUENT_ELASTICSEARCH_ILM_POLICY'] || use_default}"
ilm_policy_overwrite "#{ENV['FLUENT_ELASTICSEARCH_ILM_POLICY_OVERWRITE'] || 'false'}"
<buffer>
flush_thread_count "#{ENV['FLUENT_ELASTICSEARCH_BUFFER_FLUSH_THREAD_COUNT'] || '8'}"
flush_interval "#{ENV['FLUENT_ELASTICSEARCH_BUFFER_FLUSH_INTERVAL'] || '5s'}"
chunk_limit_size "#{ENV['FLUENT_ELASTICSEARCH_BUFFER_CHUNK_LIMIT_SIZE'] || '10M'}"
queue_limit_length "#{ENV['FLUENT_ELASTICSEARCH_BUFFER_QUEUE_LIMIT_LENGTH'] || '32'}"
retry_max_interval "#{ENV['FLUENT_ELASTICSEARCH_BUFFER_RETRY_MAX_INTERVAL'] || '30'}"
retry_forever true
</buffer>
</match>
kubernetes.conf:
----
<label @FLUENT_LOG>
<match fluent.**>
@type null
@id ignore_fluent_logs
</match>
</label>
<source>
@id fluentd-containers.log
@type tail
path /var/log/containers/*.log
pos_file /var/log/es-containers.log.pos
exclude_path /var/log/containers/fluentd-es*.log
tag raw.kubernetes.*
read_from_head true
<parse>
@type multi_format
<pattern>
format json
time_key time
time_format %Y-%m-%dT%H:%M:%S.%NZ
</pattern>
<pattern>
format /^(?<time>.+?) (?<stream>stdout|stderr) [^ ]* (?<log>.*)$/
#format /^(?<time>[^ ]+) (?<stream>stdout|stderr) [^ ]* (?<log>.*)$/
time_key time
time_format %Y-%m-%dT%H:%M:%S.%N%:z
#time_format %Y-%m-%dT%H:%M:%S.%L%z
</pattern>
</parse>
</source>
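As a quick sanity check (in Python, outside fluentd), the second `<pattern>` above can be replayed against a containerd CRI log line to confirm that the regex captures the fields and that the leading timestamp is parseable. The sample line is taken from the "Additional context" below; the nanosecond-trimming step is an assumption needed because Python's `%f` only handles microseconds, whereas fluentd's `%N` handles nanoseconds:

```python
import re
from datetime import datetime, timezone

line = ("2024-05-29T15:10:06.516919109+08:00 stderr F "
        "panic: logger sync failed: sync /dev/stderr: invalid argument")

# Same structure as the fluentd pattern: <time> <stream> <P|F flag> <log>
m = re.match(r'^(?P<time>[^ ]+) (?P<stream>stdout|stderr) [^ ]* (?P<log>.*)$', line)
assert m is not None

# Trim nanoseconds to microseconds so strptime's %f can handle the fraction
ts = re.sub(r'(\.\d{6})\d+', r'\1', m.group('time'))
parsed = datetime.strptime(ts, '%Y-%m-%dT%H:%M:%S.%f%z')

print(parsed.astimezone(timezone.utc))  # 2024-05-29 07:10:06.516919+00:00
```

So the pattern itself does extract the event time correctly; 15:10:06+08:00 is 07:10:06 in UTC.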
<match raw.kubernetes.**>
@id raw.kubernetes
@type detect_exceptions
remove_tag_prefix raw
message log
stream stream
multiline_flush_interval 5
max_bytes 500000
max_lines 1000
</match>
<filter kubernetes.**>
@type record_transformer
<record>
logtime ${time}
</record>
</filter>
<filter **>
@id filter_concat
@type concat
key message
multiline_end_regexp /\n$/
separator ""
</filter>
<filter kubernetes.**>
@id filter_kubernetes_metadata
@type kubernetes_metadata
</filter>
<filter kubernetes.**>
@id filter_parser
@type parser
key_name log
reserve_data true
remove_key_name_field true
<parse>
@type multi_format
<pattern>
format json
</pattern>
<pattern>
format none
</pattern>
</parse>
</filter>
<filter kubernetes.**>
@type grep
<exclude>
key message
pattern /.*\/healthz/
</exclude>
</filter>
Your Error Log
There is no error log.
Additional context
No response
The time in the container log file on the node is:
2024-05-29T15:10:06.516919109+08:00 stderr F panic: logger sync failed: sync /dev/stderr: invalid argument
but the custom logtime parsed out is:
"logtime": "2024-05-29 07:10:06 +0000"
and the Elasticsearch @timestamp is: 2024-05-29T07:10:26.326419109+00:00
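Note that the parsed logtime actually agrees with the log file: 15:10:06+08:00 and 07:10:06+00:00 are the same instant, just rendered in UTC. Only the Elasticsearch @timestamp is off, by roughly 20 seconds, which suggests it is being stamped around buffer-flush/ingest time rather than taken from the parsed event time. A small check of that arithmetic (the three values are copied from above):

```python
from datetime import datetime

node_time = datetime.fromisoformat('2024-05-29T15:10:06+08:00')
logtime   = datetime.fromisoformat('2024-05-29T07:10:06+00:00')
es_time   = datetime.fromisoformat('2024-05-29T07:10:26.326419+00:00')

print(node_time == logtime)         # True: same instant, different timezone rendering
print((es_time - logtime).seconds)  # 20: the gap seen in Elasticsearch
```

If that reading is right, the parse in kubernetes.conf is fine and the problem is on the output side (how @timestamp is generated when include_timestamp is enabled), not a failure to parse the log time.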