Error when processing message with key 1. capacity < 0: (-29 < 0)
Ahmad44452 opened this issue · 2 comments
I have used the default docker compose file to deploy KMS along with Kafka, Zookeeper and Prometheus. I currently have two topics in Kafka: schedules and quickstart-events. I am using KafkaJS to append messages to the schedules topic. The following is my code for this producer:
const { Kafka } = require('kafkajs');
const uuidv4 = require('uuid').v4;

const kafka = new Kafka({
  clientId: 'my-app',
  brokers: ['localhost:9093'],
});

const producer = kafka.producer();

producer.connect().then(() => {
  producer.send({
    topic: 'schedules',
    messages: [
      {
        key: uuidv4().toString(),
        value: JSON.stringify({
          time: 1697565708394,
          topic: "quickstart-events",
          key: uuidv4().toString(),
          value: "HELLO. This is a SCHEDULED MESSAGE."
        })
      }
    ],
  }).then(() => {
    producer.disconnect()
  })
});
The producer is working completely fine, and the added messages can also be clearly seen in Offset Explorer connected to this deployment. But when I add messages to the schedules topic, the following logs containing the error appear in the skyuk/kafka-message-scheduler container. Any help would be highly appreciated.
Hi @Ahmad44452, from your code snippet I can see that you are encoding the message value as JSON, whereas it should be encoded in Avro binary format according to the Schema, as per Readme line 20.
The error you are seeing is thrown when KMS tries to decode a ConsumerRecord value with an Avro decoder.
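For illustration, here is a minimal sketch of producing an Avro-encoded schedule value with KafkaJS and the avsc library. The record definition below is a hypothetical stand-in built from the fields in the JSON payload above (time, topic, key, value); verify it against the actual schedule schema referenced in the Readme before using it.

// A minimal sketch, not the authoritative KMS schema.
// Assumes: npm install kafkajs avsc uuid
const { Kafka } = require('kafkajs');
const avro = require('avsc');
const uuidv4 = require('uuid').v4;

// Hypothetical Avro record mirroring the fields used in the issue;
// replace with the schedule schema shipped with KMS.
const scheduleType = avro.Type.forSchema({
  type: 'record',
  name: 'Schedule',
  fields: [
    { name: 'time', type: 'long' },
    { name: 'topic', type: 'string' },
    { name: 'key', type: 'bytes' },
    { name: 'value', type: 'bytes' },
  ],
});

const kafka = new Kafka({ clientId: 'my-app', brokers: ['localhost:9093'] });
const producer = kafka.producer();

producer.connect().then(() =>
  producer.send({
    topic: 'schedules',
    messages: [
      {
        key: uuidv4().toString(),
        // Encode the value as Avro binary instead of a JSON string.
        value: scheduleType.toBuffer({
          time: 1697565708394,
          topic: 'quickstart-events',
          key: Buffer.from(uuidv4().toString()),
          value: Buffer.from('HELLO. This is a SCHEDULED MESSAGE.'),
        }),
      },
    ],
  })
).then(() => producer.disconnect());

With the value produced as the Buffer returned by toBuffer, the scheduler should be able to decode the ConsumerRecord value with its Avro decoder, provided the schema actually matches the one KMS expects.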
Ah, got it. Thank you for the help, appreciate it a lot.