logstash-plugins/logstash-output-google_bigquery

Logstash crashes and Google BigQuery feature upgrade

Closed this issue · 1 comment

Hi

  1. Running Logstash, I get the following error over and over again:
IOError: closed stream
flush at org/jruby/RubyIO.java:2199
size at org/jruby/RubyFile.java:1108
method_missing at /opt/logstash/lib/logstash/outputs/google_bigquery.rb:565
receive at /opt/logstash/lib/logstash/outputs/google_bigquery.rb:194
handle at /opt/logstash/lib/logstash/outputs/base.rb:86
worker_setup at /opt/logstash/lib/logstash/outputs/base.rb:78

I'm trying to understand what I'm doing wrong. I've tested my configuration and everything seems fine, yet the behavior is intermittent: sometimes it works and sometimes it fails for no apparent reason.
2. Google BigQuery output plugin
I think it would be a good idea to add a threshold attribute for the number of allowed errors within a BigQuery upload job; you can see an example of this setting in the BigQuery portal.
3. Possible bug in temp-file handling within the Google BigQuery plugin
Within the plugin there are two keys you can set:
deleter_interval_secs - the interval, in seconds, at which temp files are deleted from disk
uploader_interval_secs - the interval, in seconds, between collecting data and uploading it to BigQuery
I've noticed that even when a file has been fully written and uploaded, once it is deleted on the deleter_interval_secs schedule Logstash crashes because it can no longer find the file (see the configuration sketch below).
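For reference, here is a minimal sketch of an output block showing how the two interval settings relate. The option names and values around them (project_id, dataset, table_prefix, key_path, service_account, csv_schema) are illustrative placeholders for a pre-4.0 configuration and may differ in your plugin version:

output {
  google_bigquery {
    project_id             => "my-gcp-project"                     # placeholder project
    dataset                => "logstash"                           # placeholder dataset
    table_prefix           => "logs"
    key_path               => "/path/to/service-account-key.p12"   # placeholder credentials
    service_account        => "logstash@my-gcp-project.iam.gserviceaccount.com"
    csv_schema             => "timestamp:TIMESTAMP,host:STRING,message:STRING"
    uploader_interval_secs => 60    # collect events for 60s, then upload the temp file
    deleter_interval_secs  => 120   # delete temp files only well after the upload window
  }
}

Keeping deleter_interval_secs comfortably larger than uploader_interval_secs is only a possible mitigation for the crash described above, not a confirmed fix.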

migrated from: https://logstash.jira.com/browse/LOGSTASH-2247

The fix for this has been released as part of logstash-output-google_bigquery version 4.0.0.

WARNING: 4.0.0 has breaking changes, so read the changelog before upgrading to understand how they'll affect you.

The plugin can be updated with the following command:

bin/logstash-plugin update logstash-output-google_bigquery