NoClassDefFoundError
XiaWuSharve opened this issue · 0 comments
version
- Java: 1.8.0_392
- Hadoop: 3.2
- Scala: 2.12.10
- Spark: 3.0.0
- magellan: 1.0.5-s_2.11
reproduce
Run the "reading data" example from README.md:
```shell
spark-3.0.0-bin-hadoop3.2$ bin/spark-shell --packages harsha2010:magellan:1.0.5-s_2.11
24/02/08 09:36:24 WARN Utils: Your hostname, $HOSTNAME resolves to a loopback address: 127.0.1.1; using 192.168.2.5 instead (on interface wlp1s0)
24/02/08 09:36:24 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Ivy Default Cache set to: /home/$HOSTNAME/.ivy2/cache
The jars for the packages stored in: /home/$HOSTNAME/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-3.0.0-bin-hadoop3.2/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
harsha2010#magellan added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-9f1a17b3-3111-417d-9732-5e084a4fd8a1;1.0
confs: [default]
found harsha2010#magellan;1.0.5-s_2.11 in local-m2-cache
found commons-io#commons-io;2.4 in local-m2-cache
downloading file:/home/commie/.m2/repository/harsha2010/magellan/1.0.5-s_2.11/magellan-1.0.5-s_2.11.jar ...
[SUCCESSFUL ] harsha2010#magellan;1.0.5-s_2.11!magellan.jar (1ms)
downloading file:/home/commie/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar ...
[SUCCESSFUL ] commons-io#commons-io;2.4!commons-io.jar (1ms)
:: resolution report :: resolve 5682ms :: artifacts dl 5ms
:: modules in use:
commons-io#commons-io;2.4 from local-m2-cache in [default]
harsha2010#magellan;1.0.5-s_2.11 from local-m2-cache in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 2 | 2 | 2 | 0 || 2 | 2 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-9f1a17b3-3111-417d-9732-5e084a4fd8a1
confs: [default]
2 artifacts copied, 0 already retrieved (647kB/7ms)
24/02/08 09:36:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.2.5:4040
Spark context available as 'sc' (master = local[*], app id = local-1707356194428).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0
      /_/
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_392)
Type in expressions to have them evaluated.
Type :help for more information.
scala> spark.read.format("magellan").load("/home/$HOSTNAME/Downloads/gis_osm_pois_free_1.shp")
java.lang.NoClassDefFoundError: scala/Product$class
at magellan.BoundingBox.<init>(BoundingBox.scala:32)
at magellan.index.ZOrderCurveIndexer.<init>(ZOrderCurveIndexer.scala:28)
at magellan.SpatialRelation$class.$init$(SpatialRelation.scala:35)
at magellan.ShapeFileRelation.<init>(ShapefileRelation.scala:37)
at magellan.DefaultSource.createRelation(DefaultSource.scala:37)
at magellan.DefaultSource.createRelation(DefaultSource.scala:30)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:339)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:279)
at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:268)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:268)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:214)
... 47 elided
```
I ran into this java.lang.NoClassDefFoundError while running the code examples from README.md and am not sure what is happening; the full log is above. If you have any ideas about this issue, please let me know. Thanks a lot!
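In case it is relevant: as far as I can tell, `scala/Product$class` is a trait implementation class that exists in Scala 2.11 but was removed in 2.12 (traits are compiled differently there), so this error usually means a jar built for Scala 2.11 was loaded on a 2.12 runtime. A minimal sketch of that check, using the coordinates copied from the resolution log above (the comparison itself is my assumption, not part of the log):

```shell
# The "_2.11" suffix in the resolved artifact name records the Scala
# binary version it was compiled against; Spark 3.0.0 bundles Scala 2.12.
artifact="magellan-1.0.5-s_2.11"   # from the Ivy resolution log above
spark_scala="2.12"                 # Scala version shipped with Spark 3.0.0

# Compare the artifact's Scala suffix with Spark's Scala version.
case "$artifact" in
  *_"$spark_scala"*) result="match" ;;
  *)                 result="mismatch" ;;
esac
echo "Scala binary versions: $result"
```

If that reasoning is right, a Scala 2.12 build of Magellan (or running on a Spark built for 2.11) would be needed for the example to load.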