logstash-plugins/logstash-input-jdbc

JDBC input does not work with PostgreSQL numeric column.


  • Version: 4.3.19 with logstash 7.5.0

  • Operating System: ALL

  • Config File (if you have sensitive info, please remove it): the important part is:
    input {
      jdbc {
        jdbc_driver_class => "org.postgresql.Driver"
        jdbc_connection_string => "jdbc:postgresql://host:5432/schema"
        jdbc_user => "user"
        jdbc_password => "pass"
        statement => "select
          id as id,
          log_record as log_record,
          creation_time as creation_time,
          to_char(creation_time, 'YYYY.MM') as log_date
          from sz_audit_log
          where
            id > :sql_last_value AND
            id < ((select MAX(id) from sz_audit_log where id > :sql_last_value) - 100)
          order by id
        "
        use_column_value => true
        tracking_column => "id"
        tracking_column_type => "numeric"
        jdbc_paging_enabled => true
        jdbc_page_size => "1000000"
        schedule => "3,18,33,48 * * * *"
        last_run_metadata_path => "/srv/logstash-data/.sz_audit_log_jdbc_last_run"
      }
    }

  • Sample Data: NA

  • Steps to Reproduce:
    The user has the configuration shown above.

The tracking column is defined in Postgres as "id | numeric(19,0)".
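
A minimal table definition for reproduction might look like the sketch below; only the id column's type is confirmed above, while the other columns and their types are guesses inferred from the SELECT statement in the config:

    -- Hypothetical minimal schema; only the id column type is confirmed,
    -- the other columns are inferred from the SELECT statement above.
    CREATE TABLE sz_audit_log (
        id            numeric(19,0) PRIMARY KEY,
        log_record    text,
        creation_time timestamp
    );
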
When they run the pipeline, the last value is stored in last_run_metadata_path as follows:

--- !ruby/object:BigDecimal '0:0.4136727923e10'

This causes a problem when the pipeline runs again: Logstash refuses to use this value and starts fetching rows from the first id again.
There is no error in the logs; Logstash simply does not use the value stored in last_run_metadata_path.
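
For illustration, here is a minimal Ruby sketch of one plausible mechanism; this is an assumption about the plugin's behavior, not its confirmed code. Psych serializes a BigDecimal with the !ruby/object:BigDecimal tag seen above, and reading the file back with YAML.safe_load rejects that tag unless BigDecimal is explicitly permitted. If the plugin rescues that error, it would silently fall back to the default start value:

    # Sketch (assumption, not the plugin's actual code) of how a BigDecimal
    # last value could be written and then rejected on re-read.
    require 'yaml'
    require 'bigdecimal'

    last_value = BigDecimal('4136727923')

    # Psych tags BigDecimal values as !ruby/object:BigDecimal, producing a
    # line like the one stored in last_run_metadata_path above.
    serialized = YAML.dump(last_value)
    puts serialized

    # YAML.safe_load rejects that tag by default and raises
    # Psych::DisallowedClass; rescuing it and falling back to the default
    # start value would match the silent reset described above.
    begin
      YAML.safe_load(serialized)
    rescue Psych::DisallowedClass => e
      puts "rejected: #{e.message}"
    end

    # Explicitly permitting BigDecimal restores the round-trip
    # (permitted_classes: is the Psych >= 3.1 keyword).
    restored = YAML.safe_load(serialized, permitted_classes: [BigDecimal])
    puts restored == last_value   # => true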

This works correctly with plugin version 4.3.13.