# Kafka Connect BigQuery Storage Write Connector
A Kafka Connect sink connector that writes records from Apache Kafka to Google BigQuery using the BigQuery Storage Write API.
## Configuration
| name | type | required | default | description |
|---|---|---|---|---|
| project | string | true | | The BigQuery project name to write to |
| dataset | string | true | | The BigQuery dataset name to write to |
| table | string | true | | The BigQuery table name to write to |
| keyfile | string | true | | The filepath of a JSON key with BigQuery service account credentials |
| write.mode | enum (`committed` or `pending`) | false | committed | The Storage Write API stream mode (see the BigQuery documentation) |
| buffer.size | int | false | 1000 | The number of records kept in the buffer before transport |
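Only the first four settings are required. The two optional ones can be overridden alongside them in the connector's `config` map; a sketch with illustrative values:

```json
{
  "write.mode": "pending",
  "buffer.size": "5000"
}
```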
## Example
```json
{
  "name": "bigquery-sink",
  "config": {
    "connector.class": "com.reproio.kafka.connect.bigquery.BigqueryStorageWriteSinkConnector",
    "tasks.max": "4",
    "topics": "sample-topics",
    "project": "sample-project",
    "dataset": "sample_dataset",
    "table": "sample_table",
    "keyfile": "/opt/host/gcp_key.json"
  }
}
```
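A configuration like the one above is submitted to the Kafka Connect REST API. The sketch below saves the config to a file, sanity-checks that it is valid JSON, and shows the registration call (the worker address `localhost:8083` and the filename are assumptions; adjust them to your deployment):

```shell
# Write the connector configuration to a file.
cat > bigquery-sink.json <<'EOF'
{
  "name": "bigquery-sink",
  "config": {
    "connector.class": "com.reproio.kafka.connect.bigquery.BigqueryStorageWriteSinkConnector",
    "tasks.max": "4",
    "topics": "sample-topics",
    "project": "sample-project",
    "dataset": "sample_dataset",
    "table": "sample_table",
    "keyfile": "/opt/host/gcp_key.json"
  }
}
EOF

# Sanity-check the JSON before submitting (exits nonzero on a syntax error).
python3 -m json.tool bigquery-sink.json > /dev/null && echo "config OK"

# Register the connector with a running Connect worker (address is an
# assumption -- replace with your worker's host and port):
# curl -X POST -H "Content-Type: application/json" \
#   --data @bigquery-sink.json http://localhost:8083/connectors
```

After registration, `GET /connectors/bigquery-sink/status` on the same worker reports the connector and task states.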