Playing around with Change Data Capture using Debezium
This repository contains two Spring Boot applications that demonstrate the use of the Debezium Postgres and JDBC Sink connectors to stream data from a Postgres database to Kafka, using both JSON and Avro converters.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
- Docker
- Docker Compose
- Java 8 or higher
- CDC-JSON_CONVERTER branch: demonstrates the use of the JSON converter for data streaming.
- CDC-AVRO_CONVERTER branch: demonstrates the use of the Avro converter for data streaming.
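Both branches wire a Debezium Postgres source connector into Kafka Connect. For orientation, here is a sketch of a source-connector registration payload; the connector name, host names, database name, credentials, and topic prefix below are assumptions to adapt to the docker-compose files of the branch you check out, and Kafka Connect is assumed to listen on localhost:8083.

```shell
# Sketch only: register a Debezium Postgres source connector through the
# Kafka Connect REST API. All names and credentials below are placeholders.
cat > /tmp/debezium-source.json <<'EOF'
{
  "name": "postgres-source-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "postgres",
    "database.dbname": "upstream_db",
    "topic.prefix": "cdc"
  }
}
EOF
# Post the config to Kafka Connect (assumed URL):
# curl -X POST -H "Content-Type: application/json" \
#      --data @/tmp/debezium-source.json http://localhost:8083/connectors
```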
- Clone the repository.
git clone https://github.com/Cherni-Oussama/kafka-debezium-data-streaming.git
- Check out the specific branch you want to test.
git checkout CDC-JSON_CONVERTER
or
git checkout CDC-AVRO_CONVERTER
- Build and start the project.
docker compose up
- Verify that all containers are up.
docker ps
or check them from Docker Desktop.
- Navigate to the Upstream application's Swagger page and try to create a new Company or Employee: http://localhost:8010/swagger-ui/index.html#/employee-controller/addEmployee
- Verify that data is being streamed from the Postgres database to Kafka.
- Connect to the Kafka container:
docker exec -it kafka bash
- List the topics created:
kafka-topics --bootstrap-server localhost:9092 --list
- Consume the messages from the created topic:
kafka-console-consumer --bootstrap-server localhost:9092 --topic topicName --from-beginning
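Each record on a CDC topic is a Debezium change event with a before/after/op envelope. As a reference point, here is a minimal sketch of what the console consumer prints for an insert; the table name and field values are illustrative assumptions, and the exact shape depends on the converter and schema settings of the branch you run.

```shell
# Illustrative Debezium change-event envelope (values are made up for the sketch).
cat > /tmp/sample-event.json <<'EOF'
{
  "before": null,
  "after": { "id": 1, "name": "ACME" },
  "source": { "connector": "postgresql", "table": "company" },
  "op": "c",
  "ts_ms": 1700000000000
}
EOF
# "op" is "c" for create, "u" for update, "d" for delete;
# for an insert, "before" is null and "after" holds the new row.
grep -o '"op": "c"' /tmp/sample-event.json
```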
- Verify that data is being streamed from Kafka to the Postgres sink database.
- Connect to the Postgres sink container:
docker exec -it postgres-sink-db bash
- Authenticate with psql:
psql -U postgres
- List the created tables:
\dt
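The sink side of the pipeline is driven by a JDBC Sink connector. For completeness, here is a sketch of a sink-connector registration payload; the connector name, JDBC URL, topic name, and credentials are assumptions to adapt to the docker-compose setup, and Kafka Connect is again assumed at localhost:8083.

```shell
# Sketch only: register a JDBC Sink connector that writes Kafka records
# into the sink Postgres database. All names below are placeholders.
cat > /tmp/jdbc-sink.json <<'EOF'
{
  "name": "jdbc-sink-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://postgres-sink-db:5432/sink_db",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "topics": "cdc.public.employee",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_key"
  }
}
EOF
# Post the config to Kafka Connect (assumed URL):
# curl -X POST -H "Content-Type: application/json" \
#      --data @/tmp/jdbc-sink.json http://localhost:8083/connectors
```

With auto.create enabled, the sink connector creates the target table itself, which is why `\dt` above should show it once messages start flowing.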
- Debezium Postgres Connector: https://debezium.io/documentation/reference/connectors/postgresql.html
- JDBC Sink Connector: https://docs.confluent.io/platform/current/connect/kafka-connect-jdbc/sink-connector/index.html