dibbhatt/kafka-spark-consumer

Failure occurred when setting spark.streaming.concurrentJobs from 1 to 2

AndyRao opened this issue · 5 comments

I got an exception like this:

02 DEBUG zookeeper.ClientCnxn: Reading reply sessionid:0x3557739d57e0613, packet:: clientPath:null serverPath:null finished:false header:: 305,4 replyHeader:: 305,21475155982,-101 request:: '/brokers/ids/-1,F response::
16/06/22 19:03:02 ERROR kafka.DynamicBrokersReader: Node /brokers/ids/-1 does not exist
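(The broker id `-1` in the path `/brokers/ids/-1` suggests the consumer failed to resolve a leader for a partition and fell back to an invalid id. As a sanity check, you can inspect what broker ids are actually registered in ZooKeeper — a sketch using the standard `zkCli.sh` shipped with ZooKeeper, assuming ZooKeeper listens on localhost:2181 and the default broker path:)

```shell
# Connect to ZooKeeper and list the registered broker ids.
# Healthy output is a list of non-negative ids, e.g. [0, 1, 2].
bin/zkCli.sh -server localhost:2181 ls /brokers/ids

# Inspect one broker's registration (replace 0 with an id from the list above).
bin/zkCli.sh -server localhost:2181 get /brokers/ids/0
```

If `/brokers/ids` is empty or the path does not exist, the brokers are not registered under the ZooKeeper chroot the consumer is pointed at.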

What version of Kafka are you using? What version of ZooKeeper are you using? Can you share the properties you used for this consumer?

The version of Kafka is 0.9.0.0, and the version of ZooKeeper is 3.4.8.

Can you share the properties you specified, like:

zookeeper.hosts=
zookeeper.port=
zookeeper.broker.path=
kafka.consumer.id=
zookeeper.consumer.connection=
zookeeper.consumer.path=
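(For reference, a minimal sketch of how these properties are typically filled in for this consumer — the host names, port, and consumer id below are hypothetical placeholders, not values from this issue:)

```properties
# ZooKeeper ensemble the Kafka brokers register with (hypothetical hosts)
zookeeper.hosts=zkhost1,zkhost2,zkhost3
zookeeper.port=2181
# Path under which brokers register; must match the brokers' chroot
zookeeper.broker.path=/brokers
# Arbitrary id identifying this consumer group (hypothetical)
kafka.consumer.id=my-spark-consumer
# ZooKeeper connection used to store consumer offsets
zookeeper.consumer.connection=zkhost1:2181
zookeeper.consumer.path=/spark-kafka
```

A common source of the `/brokers/ids/-1` error is a mismatch between `zookeeper.broker.path` and the chroot the brokers actually registered under.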

It seems that it's a Kafka server problem. I will close this issue, thanks.

Cool. Thanks