# KoffeeTable

KoffeeTable is a containerized Kafka visualization tool that lets developers connect to a running Kafka cluster and visualize both static and live data. A vast array of metrics can be monitored to keep Kafka data pipelines efficient, but it can be difficult to know which metrics matter and how to visualize them intuitively. KoffeeTable highlights four key metrics for Kafka cluster topics: partition replicas, partition offsets, average lag time, and message velocity. It also displays messages in real time as they are consumed.
- Setup
- Connecting your cluster
- Using KoffeeTable
- Helpful documentation
- Open Source Information
- Changelog
- Contributor Information
- License Information
## Setup

To use KoffeeTable, you first need your Kafka cluster running in a Docker container. Then, in a terminal, run the KoffeeTable Docker image:
- If you're using a Mac with Apple silicon:
docker run -p 3000:3000 -p 3001:3001 koffeetable/kafka-visualizer:dev
- Otherwise, run:
docker run -p 3000:3000 -p 3001:3001 koffeetable/kafka-visualizer:dev-amd64
Note: if you'd like to connect a MongoDB database, pass your connection string to the container as an environment variable:
docker run -p 3000:3000 -p 3001:3001 --env URI=<YOUR MONGODB URI> koffeetable/kafka-visualizer:dev
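Equivalently, the broker and KoffeeTable containers can be managed together with Docker Compose. A minimal sketch; the service names, the broker image, and the port mappings other than KoffeeTable's own are assumptions, so adjust them to your setup:

```yaml
# Hypothetical docker-compose.yml sketch, not an official KoffeeTable file.
services:
  kafka:
    image: bitnami/kafka:latest   # any containerized Kafka broker works
    ports:
      - "9092:9092"
  koffeetable:
    image: koffeetable/kafka-visualizer:dev
    ports:
      - "3000:3000"
      - "3001:3001"
    environment:
      - URI=${MONGODB_URI}        # optional MongoDB connection string
```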
## Connecting your cluster

- Navigate to localhost:3000 to view the application.
- Click "Connect" and enter your Kafka cluster's client ID, host name, and port to connect.
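Those three form fields are exactly what a KafkaJS client (see the documentation linked below) needs to reach a cluster. A minimal sketch of how they might be combined; the helper name is ours, not KoffeeTable's:

```javascript
// Hypothetical helper: turns the connection form fields into a KafkaJS
// client config. KoffeeTable's actual internals may differ.
function buildKafkaConfig(clientId, host, port) {
  return { clientId, brokers: [`${host}:${port}`] };
}

// With KafkaJS, this config would be used roughly as:
//   const { Kafka } = require('kafkajs');
//   const kafka = new Kafka(buildKafkaConfig('my-app', 'localhost', 9092));
//   const admin = kafka.admin();
//   await admin.connect(); // cluster metadata is fetched from here
```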
## Using KoffeeTable

- After connecting, click "Kafka Cluster Overview" to view metadata for the currently connected cluster.
- Under "Topics":
- Click the dropdown arrow to view partition data for the selected topic.
- Click the link of a topic name to view metrics visualizations for the selected topic.
- You can also add topics to your cluster by clicking the "Add Topic" button, and remove existing topics by clicking the delete icon next to each topic.
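The per-topic metrics mentioned above can be derived from partition offsets and message counts alone. A rough sketch of two of them; the function and field names are our own, not KoffeeTable's:

```javascript
// Hypothetical metric helpers; KoffeeTable's real calculations may differ.

// Consumer lag for one partition: how far the committed consumer offset
// trails the partition's latest (high-water) offset.
function partitionLag(highWaterOffset, committedOffset) {
  return Math.max(0, highWaterOffset - committedOffset);
}

// Average lag across all partitions of a topic.
function averageLag(partitions) {
  if (partitions.length === 0) return 0;
  const total = partitions.reduce(
    (sum, p) => sum + partitionLag(p.highWater, p.committed), 0);
  return total / partitions.length;
}

// Message velocity: messages observed per second over a sampling window.
function messageVelocity(messageCount, windowMs) {
  return (messageCount / windowMs) * 1000;
}
```

For example, two partitions with lags of 10 and 20 messages give an average lag of 15, and 500 messages observed in a one-second window give a velocity of 500 messages per second.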
- Navigate to "Live Messages" and select a topic to view messages consumed per partition.
- Navigate to "Test" to send a sample set of data to your cluster.
- Entering values in "how many inputs" and "delay in ms" and clicking "Start data flow" sends the specified number of inputs to your selected topic at the specified interval.
- Clicking "Start listening" will begin listening for keyboard key inputs and send keyboard input data to your selected topic.
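The "Start data flow" behavior described above amounts to a timed producer loop. A sketch under the assumption that sending one message is a single async call; the names are ours, not KoffeeTable's:

```javascript
// Hypothetical sketch of "Start data flow": send `count` messages to a
// topic, one every `delayMs` milliseconds. `send` stands in for a real
// producer call such as KafkaJS's producer.send().
async function startDataFlow(send, topic, count, delayMs) {
  for (let i = 0; i < count; i++) {
    await send(topic, { value: `test-message-${i}` });
    if (i < count - 1) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

Passing a stub `send` that just records its arguments makes the loop easy to exercise without a running broker.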
## Helpful documentation

- Apache Kafka documentation: https://kafka.apache.org/documentation/
- KafkaJS getting started guide: https://kafka.js.org/docs/getting-started
- Docker documentation: https://docs.docker.com/
- Running the application in dev mode:
npm run dev
- Running tests:
npm run test
## Open Source Information

- We encourage you to submit issues for bugs or ideas for enhancements. Please feel free to fork this repo and submit pull requests to contribute as well. You can also follow KoffeeTable on LinkedIn for updates. Some ideas for future contributions:
- Migrating codebase to TypeScript
- Incorporating additional metrics from monitoring systems like Grafana and Prometheus
- More thorough unit and integration testing
## Changelog

- v1.0: initial release, May 3, 2023
## Contributor Information

- Joe Ostrow https://github.com/JSTRO
- Jonas Gantar https://github.com/TJonasT
- Jonathan Valdes https://github.com/jonathanvaldes57
- Matthew Lee https://github.com/Mattholee
- Gavin Briggs-Perez https://github.com/gavinBP
## License Information

This project is licensed under the MIT License. See the LICENSE.md file for details.