This project walks through how to set up a change data capture (CDC) pipeline on Azure using Azure Event Hubs (via its Kafka endpoint), SQL Server, and the Debezium and JDBC connectors. It uses the Debezium SQL Server source connector to stream database changes from SQL Server into Kafka topics in Event Hubs, and the JDBC sink connector to write those changes to a target database.
- Azure subscription
- A Linux machine or VM
- A Kafka release, downloaded from kafka.apache.org
The following connect-standalone.properties sample illustrates how to configure Kafka Connect to authenticate and communicate with the Kafka endpoint on Event Hubs:
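A minimal standalone worker configuration might look like the sketch below. The namespace name and connection string are placeholders: the Event Hubs Kafka endpoint listens on port 9093 and requires SASL_SSL with the PLAIN mechanism, with the literal username `$ConnectionString` and the Event Hubs connection string as the password.

```properties
bootstrap.servers=mynamespace.servicebus.windows.net:9093

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# standalone mode keeps connector offsets in a local file
offset.storage.file.filename=/tmp/connect.offsets

# Event Hubs Kafka endpoint: SASL_SSL + PLAIN, connection string as the password
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>";

# the producers and consumers embedded in connectors need the same settings
producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>";
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>";
```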
Connectors are packaged as Kafka Connect plugins. Kafka Connect isolates each plugin so that the plugin libraries do not conflict with each other.
To manually install a connector:
- Find your connector on Confluent Hub and download the connector ZIP file.
- Extract the ZIP file contents and copy the contents to the desired location.
- Add the directory containing the extracted plugin to the plugin.path setting in your Connect worker properties file.
The following plugin.path sample references two plugins, debezium-2.1 and jdbc-10.7.1:
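As a sketch, assuming both connectors were extracted under /usr/local/share/kafka/plugins (the path is an assumption; use whatever location you chose in the previous step):

```properties
# Kafka Connect loads every immediate subdirectory of plugin.path as one
# isolated plugin, e.g.:
#   /usr/local/share/kafka/plugins/debezium-connector-sqlserver-2.1/  (Debezium jars)
#   /usr/local/share/kafka/plugins/kafka-connect-jdbc-10.7.1/         (JDBC connector jars)
plugin.path=/usr/local/share/kafka/plugins
```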
Then, create a configuration file connect-debezium-mssql.properties for the Debezium SQL Server source connector and connect-jdbc-mssql-sink.properties for the JDBC sink connector.
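Minimal sketches of the two files follow; the host names, credentials, database, and table names are placeholder assumptions. In Debezium 2.x the SQL Server source connector class is io.debezium.connector.sqlserver.SqlServerConnector, and change topics are named `<topic.prefix>.<database>.<schema>.<table>`.

```properties
# connect-debezium-mssql.properties (source)
name=debezium-mssql-source
connector.class=io.debezium.connector.sqlserver.SqlServerConnector
# placeholder connection details
database.hostname=mssql.example.com
database.port=1433
database.user=cdc_user
database.password=cdc_password
database.names=testDB
topic.prefix=server1
table.include.list=dbo.Customers
# Debezium stores its schema history in a Kafka topic; when that broker is
# Event Hubs, it also needs the same SASL settings as the worker above
schema.history.internal.kafka.bootstrap.servers=mynamespace.servicebus.windows.net:9093
schema.history.internal.kafka.topic=schema-history
```

```properties
# connect-jdbc-mssql-sink.properties (sink)
name=jdbc-mssql-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
# topic name follows <topic.prefix>.<database>.<schema>.<table> from the source
topics=server1.testDB.dbo.Customers
connection.url=jdbc:sqlserver://target.example.com:1433;databaseName=targetDB
connection.user=sink_user
connection.password=sink_password
insert.mode=upsert
pk.mode=record_key
auto.create=true
# flatten Debezium's change-event envelope into a plain row for the JDBC sink
transforms=unwrap
transforms.unwrap.type=io.debezium.transforms.ExtractNewRecordState
```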
> cd kafka_2.13-3.4.0/
> bin/connect-standalone.sh config/connect-standalone.properties config/connect-debezium-mssql.properties config/connect-jdbc-mssql-sink.properties
NOTE: List all installed Kafka Connect plugins:
> curl -s -X GET http://localhost:8083/connector-plugins | jq '.[].class'
NOTE: List all connectors running on the Connect worker:
> curl -H "Accept:application/json" localhost:8083/connectors