Policy class interface is not a subclass
macteja opened this issue · 7 comments
Hello Team,
I have successfully set up the connector with the config below:
{
  "connector.class": "com.github.mmolimar.kafka.connect.fs.FsSourceConnector",
  "fs.uris": "file:////var/kafka/connect/log",
  "policy.regexp": "connect-worker.log",
  "tasks.max": "2",
  "policy.class": "com.github.mmolimar.kafka.connect.fs.policy.Policy",
  "name": "POC_FsSourceConnector",
  "topic": "kafka.connectfilepulse.source.poc",
  "file_reader.class": "com.github.mmolimar.kafka.connect.fs.file.reader.FileReader",
  "policy.recursive": "true"
}
After that, when I checked the status, the connector was running but the tasks had failed with the error below:
"trace": "org.apache.kafka.connect.errors.ConnectException: Couldn't start FsSourceTask due to configuration error: Policy class interface com.github.mmolimar.kafka.connect.fs.policy.Policy is not a subclass of interface com.github.mmolimar.kafka.connect.fs.policy.Policy
Could you please look into the error and suggest a fix?
Thanks
Mac
Hi.
The issue here is the policy class you set: com.github.mmolimar.kafka.connect.fs.policy.Policy is an interface, so you have to set a concrete class. Currently, there are 4 policies: com.github.mmolimar.kafka.connect.fs.policy.SimplePolicy, com.github.mmolimar.kafka.connect.fs.policy.SleepyPolicy, com.github.mmolimar.kafka.connect.fs.policy.CronPolicy and com.github.mmolimar.kafka.connect.fs.policy.HdfsFileWatcherPolicy. You can read more about them here.
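For example, a minimal sketch of your config with SimplePolicy swapped in (URI, regexp and topic are taken from your config above; TextFileReader is one of the bundled readers, and you should pick the one that matches your file format):

```json
{
  "connector.class": "com.github.mmolimar.kafka.connect.fs.FsSourceConnector",
  "fs.uris": "file:///var/kafka/connect/log",
  "policy.class": "com.github.mmolimar.kafka.connect.fs.policy.SimplePolicy",
  "policy.regexp": "connect-worker.log",
  "policy.recursive": "true",
  "file_reader.class": "com.github.mmolimar.kafka.connect.fs.file.reader.TextFileReader",
  "tasks.max": "2",
  "name": "POC_FsSourceConnector",
  "topic": "kafka.connectfilepulse.source.poc"
}
```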
Thank you @mmolimar ,
Now I am able to get the data into the topic, but I have one question:
Is there any way to fetch logs from a remote host/VM into a Kafka topic?
Thanks,
Mac
If the connector can read those files, you could use it. If not, maybe you should use another type of connector to ingest that data.
Hi @mmolimar , thank you..
Does kafka-connect-fs support copying data from a remote host's file system to a topic?
Thanks,
Mac
Hello @mmolimar, good day. I have used the config below and checked the FAQs, but the logs are still not being polled into the topic:
{
  "name": "FsSourceConnectors",
  "config": {
    "connector.class": "com.github.mmolimar.kafka.connect.fs.FsSourceConnector",
    "tasks.max": "2",
    "fs.uris": "file:///var/connect/log",
    "topic": "test-topic",
    "policy.class": "com.github.mmolimar.kafka.connect.fs.policy.SimplePolicy",
    "policy.regexp": ".*connect-worker\\.log$",
    "policy.recursive": "false",
    "file_reader.class": "com.github.mmolimar.kafka.connect.fs.file.reader.TextFileReader",
    "file_reader.text.encoding": "UTF-8",
    "file_reader.json.record_per_line": "true",
    "poll.interval.ms": "1000"
  }
}
Any suggestions on how to get the logs flowing into the topic?
Thanks,
mac
The connector uses the FileSystem abstraction from Hadoop to connect to the available implementations (FTP, HDFS, GCS, S3, Azure, local...), so you should set the URI whose scheme matches your file system type.
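For example, the scheme in fs.uris selects the Hadoop FileSystem implementation (the host names, ports, bucket and path below are placeholders, not values from your setup):

```
fs.uris=file:///var/connect/log         # local file system
fs.uris=hdfs://namenode:8020/logs       # HDFS
fs.uris=s3a://my-bucket/logs            # S3 via the s3a implementation
fs.uris=ftp://user:pass@ftphost/logs    # FTP
```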
On the other hand, the connector config looks fine. I don't know whether the files in your FS actually match the regexp, but maybe you should use another policy (Cron or Sleepy) to keep iterating over that directory looking for files.
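As a quick sanity check on the regexp, you can test it against the file names in your directory. A sketch in Python (whose `re` semantics agree with Java's for this simple pattern; the file names below are hypothetical) shows that a `$`-anchored pattern will not match rotated files such as connect-worker.log.1:

```python
import re

# The regexp from the connector config; the policy matches it
# against the files it finds under fs.uris.
pattern = re.compile(r".*connect-worker\.log$")

candidates = ["connect-worker.log", "connect-worker.log.1", "other.log"]
matches = [name for name in candidates if pattern.match(name)]
print(matches)  # ['connect-worker.log']
```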