Distributed Spark client app issue
Closed this issue · 2 comments
NicoLaval commented
```
java.io.InvalidClassException: scala.collection.mutable.WrappedArray$ofRef; local class incompatible: stream classdesc serialVersionUID = 1028182004549731694, local class serialVersionUID = 3456489343829468865
	at java.base/java.io.ObjectStreamClass.initNonProxy(Unknown Source)
	at java.base/java.io.ObjectInputStream.readNonProxyDesc(Unknown Source)
	at java.base/java.io.ObjectInputStream.readClassDesc(Unknown Source)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
```
The executors throw these errors.
See Trevas Spark Hadoop potential conflicts?
hadrienk commented
Looks like the Scala `WrappedArray#ofRef` class doesn't have the same serialVersionUID on both sides. This can happen when the driver and the executors run different Scala versions, I think.
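To confirm that hypothesis, one could compare what the driver and the executors each see at runtime. Below is a minimal diagnostic sketch (not part of any fix in this thread): it prints the Scala runtime version and the `serialVersionUID` of `WrappedArray$ofRef` from the driver JVM and from each executor JVM. The object and app names are illustrative, and it assumes a Spark 2.x/3.x cluster on Scala 2.12, where `WrappedArray` still exists.

```scala
import java.io.ObjectStreamClass
import org.apache.spark.sql.SparkSession

object VersionCheck {
  // serialVersionUID of WrappedArray$ofRef as computed by the *current* JVM's
  // Scala library; a driver/executor mismatch reproduces the exception above.
  def serialUid: Long =
    ObjectStreamClass
      .lookup(classOf[scala.collection.mutable.WrappedArray.ofRef[AnyRef]])
      .getSerialVersionUID

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("version-check").getOrCreate()
    val sc = spark.sparkContext

    // Driver side
    println(s"driver   scala=${scala.util.Properties.versionString} uid=$serialUid")

    // Executor side: each task reports the values from its own JVM
    sc.parallelize(1 to sc.defaultParallelism)
      .map(_ => (scala.util.Properties.versionString, serialUid))
      .distinct()
      .collect()
      .foreach { case (v, uid) => println(s"executor scala=$v uid=$uid") }

    spark.stop()
  }
}
```

If the printed `uid` values differ between driver and executors, the Scala library versions on the two sides are incompatible and the classpaths (or the executor image) need to be aligned.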
NicoLaval commented
Fixed in the executor runtime image: https://github.com/InseeFrLab/Trevas-Spark-Hadoop