Tools for writing ts-http-types to Kafka in Node.js, powered by kafkajs.
The library is a wrapper around kafkajs, so kafkajs must be installed as a peer dependency.
$ yarn add kafkajs http-types-kafka
# or
$ npm i kafkajs http-types-kafka
First create the topic you're writing to:
$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --create --partitions 3 --replication-factor 1
Note that you may need to change the script name depending on how you installed Kafka.
Create an HttpTypesKafkaProducer and connect to Kafka:
import { CompressionTypes, KafkaConfig, ProducerConfig } from "kafkajs";
import { HttpTypesKafkaProducer } from "http-types-kafka";
// Create a `KafkaConfig` instance (from kafkajs)
const brokers = ["localhost:9092"];
const kafkaConfig: KafkaConfig = {
  clientId: "client-id",
  brokers,
};
const producerConfig: ProducerConfig = { idempotent: false };
// Specify the topic
const kafkaTopic = "express_recordings";
// Create the producer
const producer = HttpTypesKafkaProducer.create({
  compressionType: CompressionTypes.GZIP,
  kafkaConfig,
  producerConfig,
  topic: kafkaTopic,
});
// Connect to Kafka
await producer.connect();
Send a single HttpExchange to Kafka:
const exchange: HttpExchange = ...;
await producer.send(exchange);
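For reference, an exchange follows the http-types format: a request object paired with a response object. The sketch below is illustrative only; the field names follow the http-types specification, the import path is an assumption, and all values are made up, so adjust it to your setup.
import { HttpExchange } from "http-types"; // assumed import path for the HttpExchange type

// Hypothetical exchange: a GET request and its JSON response.
const exchange: HttpExchange = {
  request: {
    method: "get",
    protocol: "https",
    host: "api.example.com",
    pathname: "/v1/users",
    query: { page: "1" },
    headers: { accept: "application/json" },
    timestamp: new Date().toISOString(),
  },
  response: {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ users: [] }),
    timestamp: new Date().toISOString(),
  },
};

await producer.send(exchange);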
Send multiple HttpExchanges:
const exchanges: HttpExchange[] = ...;
await producer.sendMany(exchanges);
Send recordings from a JSON lines file, where every line is a JSON-encoded HttpExchange:
await producer.sendFromFile("recordings.jsonl");
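If you need to create such a file yourself, a minimal sketch (not part of the library) is to serialize each exchange onto its own line with Node's fs module; `exchanges` here is the HttpExchange[] from the previous example:
import * as fs from "fs";

// Write one JSON-encoded exchange per line (JSON lines format).
fs.writeFileSync(
  "recordings.jsonl",
  exchanges.map((exchange) => JSON.stringify(exchange)).join("\n") + "\n"
);

await producer.sendFromFile("recordings.jsonl");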
Finally, disconnect:
await producer.disconnect();
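Putting the steps together, a small script might look like the sketch below, using try/finally so the producer is disconnected even if sending fails:
async function main(): Promise<void> {
  await producer.connect();
  try {
    await producer.sendFromFile("recordings.jsonl");
  } finally {
    // Always release the Kafka connection, even when sending throws.
    await producer.disconnect();
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});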
Delete the topic if you're done:
$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --delete
See available commands:
$ http-types-kafka
First create the destination topic in Kafka.
To send recordings from recordings.jsonl to Kafka, run:
$ http-types-kafka producer --file=recordings.jsonl --topic=my_recordings
Install dependencies:
$ yarn
Build the package into lib:
$ yarn compile
Run tests:
$ ./docker-start.sh # Start Kafka and zookeeper
$ yarn test
$ ./docker-stop.sh # Once you're done
Package for npm:
$ npm pack
Publish to npm:
$ yarn publish --access public
yarn publish creates a commit bumping the version and a tag for the new version. Push both:
$ git push
$ TAG=v`cat package.json | grep version | awk 'BEGIN { FS = "\"" } { print $4 }'`
$ git push origin $TAG
First start Kafka and ZooKeeper:
# See `docker-compose.yml`
docker-compose up
Create a topic called http_types_kafka_test:
docker exec kafka1 kafka-topics --bootstrap-server kafka1:9092 --topic http_types_kafka_test --create --partitions 3 --replication-factor 1
Check the topic exists:
docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --list
Describe the topic:
docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --describe --topic http_types_kafka_test
Using kafkacat
List topics:
kafkacat -b localhost:9092 -L
Push data to the topic from a file with snappy compression:
tail -f tests/resources/recordings.jsonl | kafkacat -b localhost:9092 -t http_types_kafka_test -z snappy
Consume messages from topic to console:
kafkacat -b localhost:9092 -t http_types_kafka_test -C