dibbhatt/kafka-spark-consumer

exception on simple example

fabiofumarola opened this issue · 7 comments

 WARN [run-main-0] NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[error] (run-main-0) java.lang.NoClassDefFoundError: kafka/api/OffsetRequest
java.lang.NoClassDefFoundError: kafka/api/OffsetRequest
    at consumer.kafka.KafkaConfig.<init>(KafkaConfig.java:38)
    at consumer.kafka.ReceiverLauncher.createStream(ReceiverLauncher.java:88)
    at consumer.kafka.ReceiverLauncher.launch(ReceiverLauncher.java:66)
    at it.dtk.KafkaConsumerTest$.main(KafkaConsumerTest.scala:48)
    at it.dtk.KafkaConsumerTest.main(KafkaConsumerTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
Caused by: java.lang.ClassNotFoundException: kafka.api.OffsetRequest
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at consumer.kafka.KafkaConfig.<init>(KafkaConfig.java:38)
    at consumer.kafka.ReceiverLauncher.createStream(ReceiverLauncher.java:88)
    at consumer.kafka.ReceiverLauncher.launch(ReceiverLauncher.java:66)
    at it.dtk.KafkaConsumerTest$.main(KafkaConsumerTest.scala:48)
    at it.dtk.KafkaConsumerTest.main(KafkaConsumerTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)

Can you let me know what your pom.xml looks like? In the consumer's pom, the Kafka and Spark dependencies are marked as provided, so you need to declare the Kafka and Spark versions explicitly in your own pom.xml.
Or, if you are running the example directly, you need to modify the consumer's pom.xml: remove the provided scope and update the Kafka and Spark versions to the ones you are actually using.
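For sbt users, a rough equivalent of that advice would look like the lines below (the 1.4.1 and 0.8.2.1 versions are only placeholders; use whatever Spark and Kafka versions you actually run). The key point is that the Kafka core jar, which contains kafka.api.OffsetRequest, has to come from your own build, because the consumer declares it as provided:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.8.2.1"  // provides kafka.api.OffsetRequest
libraryDependencies += "dibbhatt" % "kafka-spark-consumer" % "1.0.6"

In a Maven build the same coordinates go into your pom.xml as explicit dependency entries, without the provided scope.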

Thanks, I'll try it this evening

Are you still having the same issue?

jekey commented

I got the error too. This is my sbt file:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
libraryDependencies += "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
libraryDependencies += "dibbhatt" % "kafka-spark-consumer" % "1.0.6"


Oh, I fixed it by adding:
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.4.1"
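For anyone hitting the same thing: spark-streaming-kafka_2.10 depends on the kafka_2.10 jar, so adding it pulls in kafka.api.OffsetRequest transitively. The full dependency list from this thread then looks roughly like this (versions as used above; adjust to your own Spark/Kafka setup):

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.4.1"
libraryDependencies += "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
libraryDependencies += "dibbhatt" % "kafka-spark-consumer" % "1.0.6"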

Ok cool

Closing this, since it is related to a missing dependency in your pom/sbt.

Thanks, that fixed the issue.