streamthoughts/kafka-connect-file-pulse

Off into space

miramar-labs opened this issue · 3 comments

Hello
I've got a bunch of files in a bucket (V2 storage) up in Azure, and I'm trying to get the plugin to give me a metadata listing, but it's having issues. The only thing of concern I can see in the logs is:

```
java.lang.NoSuchMethodError: 'int io.netty.util.internal.StringUtil.decodeHexNibble(byte)'
```
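(For context: a `NoSuchMethodError` on a Netty internal like `StringUtil.decodeHexNibble(byte)` usually means two different Netty versions are colliding on the Connect worker's classpath, e.g. the Netty pulled in by the Azure SDK inside the FilePulse plugin versus an older one shipped by another plugin or the worker itself. A quick, speculative way to check is to list every `netty-common` jar visible to the worker. The directory layout and version numbers below are invented purely for illustration; substitute your own `plugin.path`.)

```shell
# Demo: build a throwaway "plugin directory" containing two mismatched
# netty-common jars, then show how `find` surfaces the conflict.
# In a real setup you would point `find` at your Connect plugin.path
# instead of a temp directory.
dir=$(mktemp -d)
mkdir -p "$dir/filepulse" "$dir/other-plugin"
# Hypothetical versions, for illustration only:
touch "$dir/filepulse/netty-common-4.1.79.Final.jar"
touch "$dir/other-plugin/netty-common-4.1.34.Final.jar"

# Two different versions in the output = a likely classpath conflict.
find "$dir" -name 'netty-common-*.jar' | sort

rm -rf "$dir"
```

If more than one version shows up, isolating the plugins under separate `plugin.path` entries (or aligning the Netty versions) is the usual fix.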

I'm running Confluent Cloud 7.2.0

FilePulse 2.12.0

config:

```json
{
  "name": "FilePulseSrc",
  "config": {
    "connector.class": "io.streamthoughts.kafka.connect.filepulse.source.FilePulseSourceConnector",
    "fs.cleanup.policy.class": "io.streamthoughts.kafka.connect.filepulse.fs.clean.LogCleanupPolicy",
    "fs.listing.class": "io.streamthoughts.kafka.connect.filepulse.fs.AzureBlobStorageFileSystemListing",
    "azure.storage.connection.string": "",
    "azure.storage.account.name": "",
    "azure.storage.account.key": "",
    "azure.storage.container.name": "mycontainer",
    "fs.listing.interval.ms": "1000",
    "topic": "connect-fp-myaccount-mycontainer",
    "tasks.reader.class": "io.streamthoughts.kafka.connect.filepulse.fs.reader.AzureBlobStorageMetadataFileInputReader",
    "tasks.file.status.storage.bootstrap.servers": "kafka:9092",
    "tasks.max": "4",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schemaregistry:8081",
    "name": "FilePulseSrc"
  },
  "tasks": [],
  "type": "source"
}
```

Any ideas?
Is there a complete example for Azure Blob Storage? I couldn't find anything.
Thanks

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

irux commented

I am having the same problem with the `decodeHexNibble` error. Any ideas?
