Does the gcs-connector-for-apache-kafka connector support batch processing?
Hi. Thanks for your connector. Does it support, or will it at some point support, batch processing of Avro messages?
Hi @gaspromobooking
Could you please elaborate? This connector, like the Kafka Connect framework itself, is batch-oriented.
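To illustrate what I mean by batch-oriented: the framework hands records to a sink task in batches, roughly like the sketch below (simplified, not the connector's actual code).

```java
import java.util.Collection;
import java.util.Map;

import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

// Sketch only: Kafka Connect delivers records to a sink task a batch at a time,
// so batching already happens at the framework level.
public class ExampleSinkTask extends SinkTask {
    @Override
    public void start(Map<String, String> props) { }

    @Override
    public void put(Collection<SinkRecord> records) {
        // The framework passes a whole batch of records here; a GCS-style sink
        // buffers them and eventually writes them out as files.
    }

    @Override
    public void stop() { }

    @Override
    public String version() {
        return "example";
    }
}
```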
Hi. @ivanyu thanks for the quick response.
I'll try to be more precise:
Right now I've set up your connector with the Avro converter io.confluent.connect.avro.AvroConverter.
I'm producing Kafka messages that each contain a single Avro record, prefixing the value with the magic byte and the schema ID so the converter can deserialize it.
I'd like to produce one Kafka message with several records in it, but I'm not sure how the converter would figure out how many records the message contains in order to deserialize them correctly, or whether I should put any additional byte(s) before every record.
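To make it concrete, my current per-record setup looks roughly like this (simplified; the topic, schema, and addresses are just placeholders, and I rely on Confluent's KafkaAvroSerializer to prepend the magic byte and schema ID):

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SingleRecordProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // KafkaAvroSerializer writes the magic byte and the 4-byte schema ID in front of
        // the Avro-encoded payload, which is what AvroConverter expects on the sink side.
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");
        GenericRecord value = new GenericData.Record(schema);
        value.put("id", "42");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // One logical record per physical Kafka message.
            producer.send(new ProducerRecord<>("events", "key-1", value));
        }
    }
}
```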
Thanks.
AvroConverter most certainly won't be able to handle records batched in this way.
The model of "one logical entity = one physical message" is very natural to Kafka, though, and batching is implemented at that level. You know your case better, but is anything stopping you from following this approach?
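If the motivation is throughput, note that the producer already groups individual messages per partition into batches before sending them, so you can keep one record per message. For example (values are illustrative only):

```java
import java.util.Properties;

// Illustrative values only: the producer batches messages per partition on the wire,
// so each Kafka message can remain a single logical Avro record.
Properties props = new Properties();
props.put("batch.size", "65536");        // send once ~64 KiB of records accumulate for a partition
props.put("linger.ms", "50");            // or after waiting up to 50 ms for more records
props.put("compression.type", "snappy"); // compression is applied to the whole batch
```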
I'm not sure yet, as this is a proof of concept and production behaviour could differ, so I'm trying to study different approaches. Nevertheless, thank you for your replies.