a-kafka-story

Kafka ecosystem ... but step by step!


Please check out these awesome references:

  1. http://developer.confluent.io/
  2. https://kafka-tutorials.confluent.io/

And if you'd rather learn another way, just follow these steps.

Make Docker and Maven do their thing once and for all by running ./fetch.sh

Then jump into the Kafka Story!

  1. One ZooKeeper, one Kafka broker
  2. One ZooKeeper, many Kafka brokers
  3. Java consumer, Java producer (a minimal sketch follows this list)
  4. Let's add data with Telegraf
  5. Let's set up better defaults
  6. Enter Kafka Streams (see the Streams sketch after this list)
  7. Capture JMX metrics
  8. Grafana
  9. Kafka Connect
  10. Kafka Connect and Schema Registry
  11. Change Data Capture
  12. Change Data Capture and Schema Registry
  13. Change Data Capture and Schema Registry, with export to S3
  14. KSQL
  15. KSQL server and UI
  16. Change Data Capture, Schema Registry, and KSQL
  17. Change Data Capture, JSON, KSQL, and joins
  18. Random producer and complex joins
  19. Sync a random producer with MySQL, capture the CDC diff, and push it to Telegraf
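To give you a taste of step 3, here is a minimal Java producer/consumer sketch. It assumes a broker reachable on localhost:9092 and a topic named hello; those names are illustrative, and the real chapter code lives in the step's own directory.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class HelloKafka {
    public static void main(String[] args) {
        // Producer: send a single record to the "hello" topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("hello", "key", "hello, kafka!"));
        } // close() flushes pending sends

        // Consumer: read the record back from the beginning of the topic.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "hello-group");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("hello"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```

Bring up a broker from step 1 or 2 first, then run this class: it writes one record and reads it straight back.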
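And for step 6, a minimal Kafka Streams topology sketch: it reads from a topic called input, uppercases each value, and writes to output. Again, the topic names and application id are assumptions for illustration, not the chapter's actual code.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Topology: input topic -> uppercase each value -> output topic.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("input");
        source.mapValues(value -> value.toUpperCase()).to("output");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```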

Don't like Docker? Please download the Confluent Platform here: https://www.confluent.io/download/

Also, please take a look at

  1. https://github.com/confluentinc/cp-demo
  2. https://github.com/confluentinc/demo-scene
  3. https://github.com/confluentinc/examples
  4. https://github.com/confluentinc/kafka-streams-examples
  5. https://www.confluent.io/stream-processing-cookbook/