logstash-plugins/logstash-input-jdbc

Failure mechanism in Logstash


Hi Folks,

I am pushing data from Oracle to ES, and that part works. My problem is what happens when ES goes down while data is being pushed. I am using the last_run_metadata_path property to track the last fetched value, and Logstash keeps updating that file even while ES is down.
When ES comes back up (say, after 10 minutes), Logstash imports new data from Oracle to ES again, but the rows that were added while ES was down are skipped, because sql_last_value has already moved past them.
How can I import the rows that were inserted while ES was down?
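For context, the file written to last_run_metadata_path is a small YAML document holding the last tracked value. With a numeric tracking column like id, its contents look roughly like this (the value 1000 is purely illustrative):

--- 1000

One manual workaround, assuming nothing else writes to the index in the meantime: stop Logstash, edit this value back down to the last id that actually reached ES, and restart; the next scheduled run of the configured statement will then re-fetch everything with id greater than that value.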

OS: Windows 10 Enterprise
Logstash: 7.2.0
Conf file:

input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_driver_library => ""
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "myconnectionstring"
    jdbc_user => "username"
    jdbc_password => "password"
    statement => "SELECT * FROM auto_increment_tb where id>:sql_last_value"
    use_column_value => true
    tracking_column => "id"    # must be a quoted string, not a bareword
    schedule => "* * * * * *"
    last_run_metadata_path => "C:\Pramod\rnd\ElasticSearch\logstash-7.2.0\.logstash_jdbc_last_run"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "auto_increment_tb"
  }
  stdout {
    codec => rubydebug
  }
}
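A possible mitigation, sketched here rather than taken from the plugin docs for this exact scenario: events the jdbc input has already fetched normally sit only in Logstash's in-memory queue, so they are lost if they cannot be delivered before Logstash stops. Enabling the persistent queue in logstash.yml buffers fetched events on disk and replays them to Elasticsearch once it is reachable again. The path and size below are illustrative, not defaults you must use:

# logstash.yml -- illustrative values, adjust for your install
queue.type: persisted
path.queue: "C:/Pramod/rnd/ElasticSearch/logstash-7.2.0/data/queue"
queue.max_bytes: 1gb

Note this only protects events the input has actually fetched; it does not rewind sql_last_value, so rows that were never fetched still require the manual reset of the metadata file described above.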