logstash-plugins/logstash-input-elasticsearch

elasticsearch hosts in IPv6 format not supported


When using an IPv6 address in hosts, I get an error when starting Logstash:

[2019-03-04T09:18:57,606][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::Elasticsearch id=>"elasticsearch_input_index_task", hosts=>["[fd95:ff55:7fb8:f1e5:f816:3eff:feed:ce5b]:9201"], index=>".kibana", query=>"{\"query\":{\"match_all\":{}}}", size=>5000, scroll=>"1m", add_field=>{"[@metadata][enrichment]"=>"true", "[@metadata][type]"=>"task_enrichment"}, enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_3360181f-c73d-41df-a4c6-4e284aba296e", enable_metric=>true, charset=>"UTF-8">, docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>", :error=>"bad URI(is not URI?): http://[fd95:0", :thread=>"#<Thread:0x27dbe229 run>"}
[2019-03-04T09:18:57,720][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<URI::InvalidURIError: bad URI(is not URI?): http://[fd95:0>, :backtrace=>["uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/uri/rfc3986_parser.rb:67:in split'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/uri/rfc3986_parser.rb:73:in parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/uri/common.rb:227:in parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/uri/common.rb:714:in URI'", "org/jruby/RubyMethod.java:127:in call'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/utils.rb:258:in URI'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/connection.rb:309:in url_prefix='", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/connection.rb:77:in initialize'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/http/faraday.rb:38:in __build_connection'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:138:in block in __build_connections'", "org/jruby/RubyArray.java:2486:in map'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:130:in __build_connections'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:40:in initialize'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/client.rb:114:in initialize'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport.rb:26:in new'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/vendor/bundle/jruby/2.3.0/gems/logstash-input-elasticsearch-4.2.1/lib/logstash/inputs/elasticsearch.rb:177:in register'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:340:in register_plugin'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:351:in block in register_plugins'", "org/jruby/RubyArray.java:1734:in each'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:351:in register_plugins'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:498:in start_inputs'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:392:in start_workers'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:288:in run'", "C:/Users/seko0716/Desktop/otrk_local/logstash-6.3.2/logstash-core/lib/logstash/pipeline.rb:248:in block in start'"], :thread=>"#<Thread:0x27dbe229 run>"}
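The truncated URL in the error (http://[fd95:0) suggests the host string is being split on ":" somewhere in the transport layer, which breaks a bracketed IPv6 literal. A minimal Ruby reproduction of that failure mode (the exact location of the split is my assumption, inferred from the backtrace):

require 'uri'

host = "[fd95:ff55:7fb8:f1e5:f816:3eff:feed:ce5b]:9201"

# Assumed failure mode: naive "host:port" splitting ignores the IPv6 brackets,
# so everything after the first ":" is treated as the port.
addr, port = host.split(':', 2)
url = "http://#{addr}:#{port.to_i}"   # => "http://[fd95:0"

# URI then rejects the malformed result, matching the error above.
URI.parse(url)  # raises URI::InvalidURIError: bad URI(is not URI?): http://[fd95:0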

  • Version: 6.3.2
  • Operating System: Linux/Windows
  • Config File (if you have sensitive info, please remove it):

input {
  elasticsearch {
    id => "elasticsearch_input_index_task"
    hosts => ["[fd95:ff55:7fb8:f1e5:f816:3eff:feed:ce5b]:9201"]
    index => "task"
    query => '....'
    size => 5000
    scroll => "1m"
    schedule => "*/10 * * * *"
    add_field => {
      "[@metadata][enrichment]" => "true"
      "[@metadata][type]" => "task_enrichment"
    }
  }
}

output {
  if ([@metadata][type] == "task_enrichment") {
    elasticsearch {
      id => "output_elasticsearch_enrichment_task"
      hosts => ["[fd95:ff55:7fb8:f1e5:f816:3eff:feed:ce5b]:9201"]
      index => "task"
      document_id => "%{id}"
      action => "update"
      doc_as_upsert => true
      retry_on_conflict => 6
      retry_max_interval => 320
      retry_initial_interval => 5
    }
  }
}

I have fixed it in a pull request.
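For illustration only, this is a sketch of the kind of parsing change needed, not the contents of that pull request: host strings could be split without breaking bracketed IPv6 literals as below (the parse_host helper name and the 9200 default port are my own assumptions).

require 'uri'

# Hypothetical helper (not the plugin's actual code): split "host:port"
# while keeping bracketed IPv6 literals intact.
def parse_host(host_string)
  if host_string =~ /\A\[([^\]]+)\](?::(\d+))?\z/
    # Bracketed IPv6 literal, optionally followed by ":port".
    { host: "[#{Regexp.last_match(1)}]", port: (Regexp.last_match(2) || 9200).to_i }
  else
    # Plain hostname or IPv4 address, optionally followed by ":port".
    host, port = host_string.split(':', 2)
    { host: host, port: (port || 9200).to_i }  # 9200 default is an assumption
  end
end

h = parse_host("[fd95:ff55:7fb8:f1e5:f816:3eff:feed:ce5b]:9201")
# => {:host=>"[fd95:ff55:7fb8:f1e5:f816:3eff:feed:ce5b]", :port=>9201}

URI("http://#{h[:host]}:#{h[:port]}")
# Parses cleanly, because the brackets around the IPv6 address are preserved.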