sbt/sbt-avro

cannot find symbol when compiling

Naitreey opened this issue · 1 comment

Hi, when compiling the auto-generated Java sources, the following error is produced:

sbt:Data Analytics> compile
[info] Compiling 7 Scala sources and 37 Java sources to /home/naitree/Desktop/projectname/target/scala-2.12/classes ...
[error] /home/naitree/Desktop/projectname/target/scala-2.12/src_managed/main/compiled_avro/com/company/projectname/schemaname/Detection.java:113:1: cannot find symbol
[error]   symbol:   class TimeConversion
[error]   location: class org.apache.avro.data.TimeConversions
[error]   protected static final org.apache.avro.data.TimeConversions.TimeConversion TIME_CONVERSION = new org.apache.avro.data.TimeConversions.TimeConversion();
[error] /home/naitree/Desktop/projectname/target/scala-2.12/src_managed/main/compiled_avro/com/company/projectname/schemaname/Detection.java:114:1: cannot find symbol
...

It looks like the classpath is not set correctly. How can I fix this?

My build.sbt:

// ------- sbt-avro configurations --------
AvroConfig / useNamespace := true
AvroConfig / version := "1.8.2"

// ------- resolvers -------
resolvers += "confluent" at "http://packages.confluent.io/maven/"

// ------- root project configurations --------
lazy val root = (project in file("."))
    .settings(
        name := "Data Analytics",
        version := "0.1.0",
        libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "3.2.1" % "provided",
        libraryDependencies += "org.apache.hadoop" % "hadoop-aliyun" % "3.2.1" % "provided",
        libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided",
        libraryDependencies += "org.apache.spark" % "spark-streaming_2.12" % "2.4.4" % "provided",
        libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.12" % "2.4.4",
        libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.4.0",
        libraryDependencies += "log4j" % "log4j" % "1.2.17" % "provided",
        libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.16" % "provided",
        libraryDependencies += "org.slf4j" % "slf4j-log4j12" % "1.7.16" % "provided",
        libraryDependencies += "org.apache.avro" % "avro" % "1.8.2" % "provided",
        libraryDependencies += "org.apache.spark" % "spark-avro_2.12" % "2.4.4",
        libraryDependencies += "za.co.absa" % "abris_2.12" % "3.1.1",
    )

Strangely, the unresolved symbols can be found and used from my Scala code; it seems that only the generated Java code fails to resolve these classes.
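One quick way to check which Avro version sbt actually put on the compile classpath (diagnostic sketch; `dependencyTree` is built into sbt 1.4+, older versions need the sbt-dependency-graph plugin):

```shell
# List dependencies that lost a version conflict and were evicted
sbt evicted

# Show the full dependency tree to see who pulls in which Avro version
sbt dependencyTree

# Or inspect just the resolved Avro artifact on the compile classpath
sbt 'show Compile/dependencyClasspath' | grep avro
```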

OK, sorted out. Although I specified 1.8.2 as the Avro version, sbt resolved 1.9.1 because abris_2.12 depends on it, while the plugin is still sbt-avro-1-8. The code generated for Avro 1.8 references org.apache.avro.data.TimeConversions.TimeConversion, which no longer exists under that name in Avro 1.9 (the nested conversion classes were renamed, e.g. to TimeMillisConversion), hence the "cannot find symbol" errors.
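For anyone hitting the same mismatch: either move to a plugin release that generates code for Avro 1.9, or force the runtime Avro back to 1.8.2 so it matches the generated code. A minimal build.sbt sketch of the second option (both settings are standard sbt; pick one):

```scala
// Option 1: force sbt to resolve Avro 1.8.2 even though abris pulls in 1.9.1,
// so the library on the classpath matches what sbt-avro-1-8 generates.
dependencyOverrides += "org.apache.avro" % "avro" % "1.8.2"

// Option 2: exclude the transitive Avro from abris and rely on the
// explicit avro 1.8.2 dependency already declared in this build.
libraryDependencies += "za.co.absa" % "abris_2.12" % "3.1.1" exclude("org.apache.avro", "avro")
```

Note that `dependencyOverrides` only steers version conflict resolution; if the 1.8 and 1.9 APIs are both needed somewhere, upgrading the plugin is the cleaner fix.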