confluentinc/kafka-connect-datagen

Connector silently fails to generate messages when schema is too long

olivd44 opened this issue · 3 comments

Hi,

When setting a large JSON schema either in schema.string or in a schema.filename file, the connector does not send any messages.

No errors are raised; the messages are simply not generated.

Is there a known size limit on the schemas that the datagen connector can handle?
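
For reference, the connector is submitted with a configuration along these lines (the topic name and the inline schema here are trimmed-down placeholders; the real schema.string value is much longer, and using schema.filename instead behaves the same way):

        {
            "name": "datagen-large-schema",
            "config": {
                "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
                "tasks.max": "1",
                "kafka.topic": "test-topic",
                "max.interval": "1000",
                "schema.string": "{\"type\": \"record\", \"name\": \"myrecord\", \"fields\": [{\"name\": \"id\", \"type\": \"string\"}]}"
            }
        }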

This used to work with 0.3.3 but no longer works from 0.4.x onward.

Kind regards,
Olivier

I can reproduce this issue, but it is not related to schema size. It happens when a field uses "regex": "[a-zA-Z0-9]{x}" with x > 15:

        {
            "name": "MY_STRING",
            "type": {
                "type": "string",
                "arg.properties": {
                    "regex": "[a-zA-Z0-9]{16}"
                }
            },
            "default": ""
        }
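
Based on that, keeping the quantifier at 15 or below appears to avoid the problem on the affected 0.4.x versions (at the cost of shorter generated strings), for example:

        {
            "name": "MY_STRING",
            "type": {
                "type": "string",
                "arg.properties": {
                    "regex": "[a-zA-Z0-9]{15}"
                }
            },
            "default": ""
        }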

@vdesabou Is there any workaround for this?

@guitcastro This was fixed by #99 and is available in the 0.5.3 release.

This issue can be closed.