A quick and easy way to get Kafka up and running:
- starts a single Kafka message broker (including ZooKeeper)
- runs with minimal memory requirements (even AWS micro instances can host it)
Use cases:
- if you are a developer and want to experiment with a messaging platform / Kafka broker in no time,
- or if you are a startup and want to run Kafka on AWS without breaking the bank.
Starting a Kafka instance is simple:
$ docker run --name a-kafka -p 9092:9092 -d paperlib/kafka
... where a-kafka is the name you want to assign to your container.
When you start the kafka image, you can adjust the configuration of the Kafka and ZooKeeper daemons by passing one or more environment variables on the docker run command line. All of them are optional.
Adjust the Kafka daemon's advertised host. This is the hostname used in the advertised.listeners property of the server.properties file. Defaults to ADVERTISED_HOST=127.0.0.1, which results in:

advertised.listeners=PLAINTEXT://${ADVERTISED_HOST}:9092
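For example, to advertise a hostname that other machines can reach (kafka.example.com below is only a placeholder; use your own hostname or IP):

$ docker run -e ADVERTISED_HOST=kafka.example.com --name a-kafka -p 9092:9092 -d paperlib/kafka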
Adjust the Java heap available for Kafka. Defaults to KAFKA_HEAP_OPTS="-Xmx256M -Xms256M". For example:
$ docker run -e KAFKA_HEAP_OPTS="-Xmx1024M -Xms1024M" --name a-kafka -p 9092:9092 -d paperlib/kafka
Adjust the Java heap available for ZooKeeper. Defaults to ZOOKEEPER_HEAP_OPTS="-Xmx128M -Xms128M".
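For example, to give ZooKeeper more headroom (the 256M figure below is only illustrative):

$ docker run -e ZOOKEEPER_HEAP_OPTS="-Xmx256M -Xms256M" --name a-kafka -p 9092:9092 -d paperlib/kafka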
The following examples are in Python and use the kafka-python library, so we first have to install it:
$ sudo -H pip install kafka-python
and create the topic over which we are going to send our messages:
from kafka.admin import KafkaAdminClient, NewTopic

# connect to the broker we started above
kafka = KafkaAdminClient(bootstrap_servers="localhost:9092")
# a single-broker setup can only support replication_factor=1
topics = [NewTopic(name="example", num_partitions=1, replication_factor=1)]
kafka.create_topics(topics)
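To double-check that the topic exists, you can list the broker's topics; a quick sketch using KafkaConsumer:

from kafka import KafkaConsumer

# topics() fetches the set of topics visible on the broker; it should include 'example'
consumer = KafkaConsumer(bootstrap_servers="localhost:9092")
print(consumer.topics())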
Next we send a few messages over a topic named example with the following script:
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='localhost:9092')
# values are plain bytes; the optional key determines which partition a message lands on
producer.send('example', b'Hello, World!')
producer.send('example', key=b'message-two', value=b'This is Kafka-Python')
# block until every buffered message has been sent to the broker
producer.flush()
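send() is asynchronous; if you want to confirm that a message actually reached the broker, you can block on the future it returns. A small sketch reusing the producer and topic above:

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='localhost:9092')
# .get() blocks until the broker acknowledges the message (or the timeout expires)
metadata = producer.send('example', b'one more message').get(timeout=10)
print(metadata.topic, metadata.partition, metadata.offset)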
and then we read these messages back with a receiver script:
from kafka import KafkaConsumer

# auto_offset_reset='earliest' replays the topic from the beginning for a new consumer group
consumer = KafkaConsumer('example', group_id='a_readers_group', auto_offset_reset='earliest')
# this loop blocks and prints messages as they arrive, until interrupted
for message in consumer:
    print(message)
et voilà! 🙂