logstash-plugins/logstash-output-kafka

kafka output message_key field doesn't work

Closed this issue · 1 comment

cgyim commented

Logstash version: 5.2
Kafka version: 0.11.0

Logstash is running on Ubuntu 14.04.
Kafka is running on a MacBook with macOS 10.12 Sierra.

And here are the issue details:
Logstash instance 1 (Kafka producer) config:
```
output {
  if "bond_contract_query" in [tags] {
    kafka {
      topic_id => "bond_contract_query"
      message_key => "bond_contract_query"
      bootstrap_servers => "192.168.33.1:9092,192.168.33.1:9093,192.168.33.1:9094"
    }
  } else if "band_order_deal" in [tags] {
    kafka {
      message_key => "band_order_deal"
      topic_id => "band_order_deal"
      bootstrap_servers => "192.168.33.1:9092,192.168.33.1:9093,192.168.33.1:9094"
    }
  } else if "data_import" in [tags] {
    kafka {
      message_key => "data_import"
      topic_id => "data_import"
      bootstrap_servers => "192.168.33.1:9092,192.168.33.1:9093,192.168.33.1:9094"
    }
  } else if "bank_interface" in [tags] {
    kafka {
      message_key => "bank_interface"
      topic_id => "bank_interface"
      bootstrap_servers => "192.168.33.1:9092,192.168.33.1:9093,192.168.33.1:9094"
    }
  }
  kafka {
    topic_id => "elasticsearch"
    bootstrap_servers => "192.168.33.1:9092,192.168.33.1:9093,192.168.33.1:9094"
  }
}
```
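
For reference, here is a minimal sketch of setting a per-event key instead of a fixed literal. The field name `order_id` is purely illustrative, and this assumes `message_key` is sprintf-expanded against the event in the same way `topic_id` is:

```
output {
  kafka {
    # Hypothetical: derive the Kafka message key from an event field
    # (assumes message_key accepts %{...} sprintf references, like topic_id).
    message_key => "%{[order_id]}"
    topic_id => "bond_contract_query"
    bootstrap_servers => "192.168.33.1:9092"
  }
}
```

One way to confirm whether the key actually reaches the broker is to read the topic with `kafka-console-consumer` and pass `--property print.key=true`, which prints each record's key next to its value.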

Logstash instance 2 (Kafka consumer) config:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092,localhost:9093,localhost:9094"
    auto_offset_reset => "earliest"
    group_id => "mac"
    topics => ["elasticsearch"]
    consumer_threads => 8
    decorate_events => true
  }
}
```
[screenshot wechatimg23: the decorated kafka key field in the consumed event is nil]

As you can see, that field is nil. I guess it's either a bug, or I made a mistake in my configuration or installation.
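
For what it's worth, here is a minimal sketch of how the decorated Kafka metadata could be surfaced on the consumer side, added to the pipeline above. It assumes the input plugin places the decorated fields under `[@metadata][kafka]` (older plugin versions may use a different path, e.g. a top-level `kafka` field), and that `@metadata` is not printed by most outputs unless it is copied or explicitly enabled:

```
filter {
  mutate {
    # Assumption: decorated metadata lives at [@metadata][kafka];
    # copy the key into a visible field so it survives to the output.
    add_field => { "kafka_key" => "%{[@metadata][kafka][key]}" }
  }
}

output {
  # rubydebug can also print @metadata directly for debugging.
  stdout { codec => rubydebug { metadata => true } }
}
```

If `kafka_key` still comes out empty, the key is either absent from the record itself or the metadata path differs in this plugin version.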

cgyim commented

Moreover, `topics_pattern` also seems to fail to work. I have changed my Kafka and Scala versions from 0.11.0 and 2.12 to 0.10.0.1 and 2.11.0, and the problem is still the same.
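
For context, a minimal sketch of how `topics_pattern` would replace the fixed `topics` list on the input side; the regex here is only an illustration, and this assumes `topics_pattern` subscribes to every topic matching the pattern and is used instead of (not alongside) `topics`:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092,localhost:9093,localhost:9094"
    group_id => "mac"
    # Hypothetical pattern covering the producer topics above;
    # used in place of a fixed topics list.
    topics_pattern => "bond_contract_query|band_order_deal|data_import|bank_interface"
    decorate_events => true
  }
}
```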