Python data types for the Apache Kafka® Protocol.

- Exposes immutable dataclass entities for all protocol messages, generated from the same source as used internally in Apache Kafka®.
- Message classes are light-weight data containers and do not inherit anything or expose any methods beyond those of a vanilla dataclass. Encoding and decoding are enabled by making all the necessary details about Kafka encoding introspectable.
- Supports encoding and decoding of messages through `IO[bytes]`.
- Test suite with focus on roundtrip property tests using Hypothesis.
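The `IO[bytes]` encode/decode support can be illustrated with a minimal sketch. This is not kio's actual API — the message class, field layout, and helper functions below are hypothetical, using only the standard library — but it shows the pattern of an immutable dataclass written to and read back from a byte stream:

```python
import io
import struct
from dataclasses import dataclass
from typing import IO


# Hypothetical message entity. kio's generated classes additionally carry
# introspectable Kafka encoding metadata on each field.
@dataclass(frozen=True)
class ApiVersionsRequest:
    client_software_name: str
    client_software_version: str


def write_string(buffer: IO[bytes], value: str) -> None:
    # Length-prefixed UTF-8 string (big-endian int16 length).
    encoded = value.encode("utf-8")
    buffer.write(struct.pack(">h", len(encoded)))
    buffer.write(encoded)


def read_string(buffer: IO[bytes]) -> str:
    (length,) = struct.unpack(">h", buffer.read(2))
    return buffer.read(length).decode("utf-8")


def write_request(buffer: IO[bytes], message: ApiVersionsRequest) -> None:
    write_string(buffer, message.client_software_name)
    write_string(buffer, message.client_software_version)


def read_request(buffer: IO[bytes]) -> ApiVersionsRequest:
    return ApiVersionsRequest(
        client_software_name=read_string(buffer),
        client_software_version=read_string(buffer),
    )


request = ApiVersionsRequest("kio-example", "0.0.0")
stream = io.BytesIO()
write_request(stream, request)
stream.seek(0)
assert read_request(stream) == request
```

Because the entities are frozen dataclasses, equality is structural, which is what makes roundtrip assertions like the last line straightforward.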
$ pip install --require-virtualenv kio
Install development requirements.
$ pip install --require-virtualenv -e .[all]
The test suite contains integration tests that expect to be able to connect to an Apache Kafka® instance running on 127.0.0.1:9092. There is a Docker Compose file in `container/compose.yml` that you can use to conveniently start up an instance.
$ docker compose --file=container/compose.yml up -d
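Before running the integration tests, you can check that the broker is accepting connections on the address mentioned above. The helper below is just an illustration (the host and port come from this README; the function itself is not part of kio), using only the standard library:

```python
import socket


def broker_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    # Attempt a plain TCP connection. The Kafka protocol handshake happens
    # later, so a successful connect is enough to confirm the port is open.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    print("Kafka reachable:", broker_reachable("127.0.0.1", 9092))
```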
Run tests.
$ python3 -X dev -m pytest --cov
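The roundtrip property tests mentioned earlier follow the pattern sketched below. This stdlib-only version substitutes seeded `random` sampling for Hypothesis strategies, and the int32 codec is a hypothetical stand-in for kio's real serializers:

```python
import io
import random
import struct


def encode_int32(buffer: io.BytesIO, value: int) -> None:
    # Big-endian signed 32-bit integer, as in the Kafka wire format.
    buffer.write(struct.pack(">i", value))


def decode_int32(buffer: io.BytesIO) -> int:
    (value,) = struct.unpack(">i", buffer.read(4))
    return value


def test_int32_roundtrip(trials: int = 1000) -> None:
    # Property: decoding an encoded value always yields the original value.
    rng = random.Random(0)  # seeded for reproducibility
    for _ in range(trials):
        value = rng.randint(-(2**31), 2**31 - 1)
        buffer = io.BytesIO()
        encode_int32(buffer, value)
        buffer.seek(0)
        assert decode_int32(buffer) == value


test_int32_roundtrip()
```

Hypothesis generalizes this loop: it generates the inputs, shrinks failing cases to minimal counterexamples, and replays known failures, which is why the suite favors it over hand-rolled sampling like the above.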
Set up pre-commit to run on push.
$ pre-commit install -t pre-push
Warning

Building the schema will delete the `src/kio/schema` directory and recreate it, hence all of the files under this directory will be deleted. Make sure not to put unrelated files there and accidentally wipe out your own work.
Fetch, generate, and format schema.
$ make build-schema