Consumer not able to read kafka topic and write to HDFS
sbellary300s opened this issue · 7 comments
The kafka-spark-consumer (v1.0.6) job was reading a Kafka topic and writing to Couchbase successfully on a CDH 5.5.1 cluster, but after we upgraded to CDH 5.9.0 it no longer reads or writes any data. We are also trying to write to HDFS in addition to Couchbase. The job just creates the specified directories in HDFS and keeps running forever, but never reads or writes anything.
Specs:
kafka version: 0.8.2.0
Zookeeper version: zookeeper-3.4.5+cdh5.9.0+98
spark version: spark-1.6.0+cdh5.9.0+229
pom.xml:
<dependency>
  <groupId>dibbhatt</groupId>
  <artifactId>kafka-spark-consumer</artifactId>
  <version>1.0.6</version>
</dependency>
We also tried the latest version, v1.0.7, but it didn't help.
Thanks
Srikanth
Can you please share the logs of the consumer.
What happened when you restarted the job? Is it fetching zero messages?
Is there any difference in the Kafka or Spark versions between the CDH upgrades?
If I look at the logs, I should be able to tell something about the issue.
Hello, thanks for your prompt response. I have just shared the logs via email to dibyendu_bhattacharya@yahoo.com.
Kafka and zookeeper version did not change after the upgrade. Spark version changed from 1.5.0 to 1.6.0
Additionally, please check our spark-submit command in case it helps you address this issue:
spark-submit \
  --class com.comp.consumer.kafka.client.Consumer \
  --conf spark.driver.maxResultSize=4G \
  --conf spark.driver.extraJavaOptions=-Denv=${ENV} \
  --conf spark.yarn.executor.memoryOverhead=3000 \
  --conf spark.akka.frameSize=1024 \
  --executor-memory 6G \
  --driver-memory 16G \
  --master yarn-cluster \
  --driver-cores 12 \
  --num-executors 27 \
  --executor-cores 5 \
  $JAR $DATE "$CONSUMER" &
@dibbhatt We are observing that the offsets are not getting committed to ZooKeeper. We are not seeing the consumer ID in zkCli.
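For reference, here is a sketch of how one might inspect the consumer state in ZooKeeper with zkCli. The host, port, consumer ID, topic, and znode paths below are placeholders, and the exact offset path depends on the consumer path configured for the job:

```shell
# List consumer groups registered in ZooKeeper (host/port are placeholders):
zkCli.sh -server zk-host:2181 ls /consumers

# Inspect the committed offset for one partition. The path layout here is an
# assumption based on the usual /consumers/<id>/offsets/<topic>/<partition>
# convention; substitute the consumer path configured for your job:
zkCli.sh -server zk-host:2181 get /consumers/<consumer-id>/offsets/<topic>/0
```

If the consumer ID node is missing entirely, the consumer never registered, which is consistent with it not consuming at all.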
Did you try running the Kafka console consumer from a Spark executor node to see if it is able to consume messages? Not seeing offsets in ZK means the consumer is not consuming. Can you share the properties file with all the consumer-specific properties you specified?
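A quick way to run that check, assuming Kafka 0.8.x where the console consumer reads via ZooKeeper (the host, port, and topic name are placeholders for your environment):

```shell
# Run this from a node that hosts a Spark executor to rule out
# network/firewall issues between the executors and the brokers.
# On Kafka 0.8.x the console consumer connects through ZooKeeper:
kafka-console-consumer.sh --zookeeper zk-host:2181 \
  --topic your-topic --from-beginning
```

If this prints messages but the Spark job still fetches nothing, the problem is likely in the consumer configuration rather than connectivity.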
Are you still seeing the issue? If not, you can close the ticket.
Hi @sbellary300s @krishna383, if the issue is fixed, please close this ticket.
Closing this as I have not heard anything yet @sbellary300s. If you are still having the issue, please let me know more details about it.