logstash-plugins/logstash-output-google_bigquery

Google BigQuery table not created dynamically

Closed this issue · 8 comments

I am able to run the load job successfully, and the insert job reports success, but the data is not visible in BigQuery. Can anyone tell me whether we need to create the table in BigQuery before shipping logs from Logstash, and if so, what the table ID format is?

Also, as far as I know we need to specify a schema. I just wanted to know if the schema can be created dynamically.

Any answer here? I'm having the same issue.

Same issue here.

I believe I've found the cause of this. It is likely that the rows being inserted are malformed and an error is being returned by BigQuery, but the plugin is swallowing the exception rather than raising it. This is because the plugin is looking here:

It should be looking here:

response_body['status'].has_key?("errorResult")

Based on these docs:

https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.load
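A minimal sketch of the corrected check, based on the response shape in the linked docs: a failed load job reports its error under `status.errorResult` in the `jobs.get` response. The helper name and sample responses below are hypothetical, purely to illustrate the key the plugin should be inspecting.

```ruby
require 'json'

# Hypothetical helper illustrating the fix: BigQuery reports load-job
# failures under status.errorResult in the jobs.get response body,
# so that is the key to check when deciding whether the load succeeded.
def job_failed?(response_body)
  status = response_body['status'] || {}
  status.has_key?('errorResult')
end

# Example jobs.get response bodies (shapes per the linked docs).
FAILED_JOB = JSON.parse(<<~JSON_BODY)
  {"status": {"state": "DONE",
              "errorResult": {"reason": "invalid", "message": "malformed row"}}}
JSON_BODY
SUCCEEDED_JOB = JSON.parse('{"status": {"state": "DONE"}}')
```

Note that a job can reach `"state": "DONE"` and still have failed, which is why checking the state alone is not enough.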

It looks like the plugin is looking for errorResult, but only in the deleter:

if job_status.has_key?("errorResult")

If the load failed, it just leaves the file there.
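A hedged sketch of how that check could surface the failure rather than silently leaving the file behind. The method `check_job_status!` and its signature are hypothetical, not the plugin's actual code; it assumes `job_status` is the parsed `status` object from the `jobs.get` response.

```ruby
# Hypothetical sketch: raise on a failed load so the error is not swallowed,
# assuming job_status is the parsed "status" object from jobs.get.
def check_job_status!(job_status, file_name)
  if job_status.has_key?('errorResult')
    # Surfacing the error makes malformed-row failures visible instead of
    # silently leaving the uploaded temp file in place with no explanation.
    raise "BigQuery load failed for #{file_name}: #{job_status['errorResult']}"
  end
  job_status['state'] == 'DONE'
end
```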

The merged PR fixes this. Please close the issue.

@suyograo I think you can close this issue.

lol that was fast... I think I commented as you were closing. :)

Haha, no, I happened to be in the Slack channel where I got this notification :)