apache/incubator-hugegraph-toolchain

[Bug] Spark loader meets exception: Class is not registered

JackyYangPassion opened this issue · 7 comments

Bug Type (问题类型)

others (please comment below)

Before submit

  • I had searched in the issues and found no similar issues.

Environment (环境信息)

  • Server Version: v1.0.0
  • Toolchain Version: v1.0.0
  • Spark Version: spark-3.3.1-hadoop3
  • Java Version: 8

Expected & Actual behavior (期望与实际表现)

CMD:
sh bin/hugegraph-spark-loader.sh --master local --name spark-hugegraph-loader --file example/spark/struct.json --username admin --token admin --host 127.0.0.1 --port 8080 --graph graph-test

java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:593)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
23/01/01 14:17:42 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) (192.168.1.5 executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:593)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

23/01/01 14:17:42 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
23/01/01 14:17:42 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
23/01/01 14:17:42 INFO TaskSchedulerImpl: Cancelling stage 0
23/01/01 14:17:42 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage cancelled
23/01/01 14:17:42 INFO DAGScheduler: ResultStage 0 (json at HugeGraphSparkLoader.java:234) failed in 1.370 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (192.168.1.5 executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:593)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Driver stacktrace:
23/01/01 14:17:42 INFO DAGScheduler: Job 0 failed: json at HugeGraphSparkLoader.java:234, took 1.420781 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (192.168.1.5 executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:593)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2672)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2608)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2607)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2607)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1182)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1182)
	at scala.Option.foreach(Option.scala:407)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1182)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2860)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2802)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2791)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:952)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2228)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2323)
	at org.apache.spark.sql.catalyst.json.JsonInferSchema.infer(JsonInferSchema.scala:116)
	at org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$.$anonfun$inferFromDataset$5(JsonDataSource.scala:110)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
	at org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$.inferFromDataset(JsonDataSource.scala:110)
	at org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$.infer(JsonDataSource.scala:99)
	at org.apache.spark.sql.execution.datasources.json.JsonDataSource.inferSchema(JsonDataSource.scala:65)
	at org.apache.spark.sql.execution.datasources.json.JsonFileFormat.inferSchema(JsonFileFormat.scala:59)
	at org.apache.spark.sql.execution.datasources.DataSource.$anonfun$getOrInferFileFormatSchema$11(DataSource.scala:210)
	at scala.Option.orElse(Option.scala:447)
	at org.apache.spark.sql.execution.datasources.DataSource.getOrInferFileFormatSchema(DataSource.scala:207)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:411)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:228)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:210)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:210)
	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:361)
	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:340)
	at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.read(HugeGraphSparkLoader.java:234)
	at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.load(HugeGraphSparkLoader.java:135)
	at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.main(HugeGraphSparkLoader.java:86)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:593)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

But I find that org.apache.spark.sql.types.StructType has already been registered in the source code:

conf.registerKryoClasses(new Class[]{
                    org.apache.hadoop.hbase.io.ImmutableBytesWritable.class,
                    org.apache.hadoop.hbase.KeyValue.class,
                    org.apache.spark.sql.types.StructType.class,
                    StructField[].class,
                    StructField.class,
                    org.apache.spark.sql.types.LongType$.class,
                    org.apache.spark.sql.types.Metadata.class,
                    org.apache.spark.sql.types.StringType$.class,
                    Class.forName(
                            "org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage"),
                    Class.forName("scala.reflect.ClassTag$$anon$1"),
                    Class.forName("scala.collection.immutable.Set$EmptySet$"),
                    Class.forName("org.apache.spark.sql.types.DoubleType$")
 });

With Java 11 I get the same error.

@simon824 could you help check this when you are free?
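
For reference, Kryo only throws "Class is not registered" when strict registration is enabled (spark.kryo.registrationRequired=true), so the loader's Spark config presumably turns it on. A minimal workaround sketch, assuming you control the SparkConf yourself (this is not the project's fix, and RegistrationWorkaroundSketch is a hypothetical class), is to relax strict registration while the registration list is being repaired:

import org.apache.spark.SparkConf;

public class RegistrationWorkaroundSketch {
    public static SparkConf buildConf() {
        return new SparkConf()
                .setAppName("spark-hugegraph-loader")
                .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                // With strict registration disabled, Kryo writes full class names
                // for unregistered classes instead of throwing IllegalArgumentException.
                .set("spark.kryo.registrationRequired", "false");
    }
}

The same effect can usually be achieved without code changes by passing --conf spark.kryo.registrationRequired=false to spark-submit.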

@JackyYangPassion can you try Spark 3.2.2?

Hi @JackyYangPassion, did you solve this problem? I have the same problem.

It happens when I run ./bin/hugegraph-spark-loader.sh --master local[*] --name spark-hugegraph-loader --file ./example/spark/struct.json --host 192.168.34.164 --port 18080 --graph graph-test to load data with Spark:

23/05/16 23:12:35 INFO CodeGenerator: Code generated in 124.547366 ms
23/05/16 23:12:35 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
23/05/16 23:12:35 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) (192.168.34.164 executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)

23/05/16 23:12:35 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
23/05/16 23:12:35 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
23/05/16 23:12:35 INFO TaskSchedulerImpl: Cancelling stage 0
23/05/16 23:12:35 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage cancelled
23/05/16 23:12:35 INFO DAGScheduler: ResultStage 0 (json at HugeGraphSparkLoader.java:232) failed in 0.408 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (192.168.34.164 executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)

Driver stacktrace:
23/05/16 23:12:35 INFO DAGScheduler: Job 0 failed: json at HugeGraphSparkLoader.java:232, took 0.434906 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (192.168.34.164 executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2454)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2403)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2402)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2402)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1160)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1160)
        at scala.Option.foreach(Option.scala:407)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1160)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2642)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2584)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2573)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:938)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2214)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2309)
        at org.apache.spark.sql.catalyst.json.JsonInferSchema.infer(JsonInferSchema.scala:93)
        at org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$.$anonfun$inferFromDataset$5(JsonDataSource.scala:110)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
        at org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$.inferFromDataset(JsonDataSource.scala:110)
        at org.apache.spark.sql.execution.datasources.json.TextInputJsonDataSource$.infer(JsonDataSource.scala:99)
        at org.apache.spark.sql.execution.datasources.json.JsonDataSource.inferSchema(JsonDataSource.scala:65)
        at org.apache.spark.sql.execution.datasources.json.JsonFileFormat.inferSchema(JsonFileFormat.scala:59)
        at org.apache.spark.sql.execution.datasources.DataSource.$anonfun$getOrInferFileFormatSchema$11(DataSource.scala:210)
        at scala.Option.orElse(Option.scala:447)
        at org.apache.spark.sql.execution.datasources.DataSource.getOrInferFileFormatSchema(DataSource.scala:207)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:411)
        at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:274)
        at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:245)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:245)
        at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:405)
        at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:386)
        at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.read(HugeGraphSparkLoader.java:232)
        at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.load(HugeGraphSparkLoader.java:133)
        at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.main(HugeGraphSparkLoader.java:84)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.sql.types.StructType
Note: To register this class use: kryo.register(org.apache.spark.sql.types.StructType.class);
        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:503)
        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:540)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:645)
        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:387)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
23/05/16 23:12:35 INFO SparkContext: Invoking stop() from shutdown hook
23/05/16 23:12:35 INFO SparkUI: Stopped Spark web UI at http://192.168.34.164:4040
23/05/16 23:12:35 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!

@simon824 @imbajin Hi, please assign this to me. This error occurs because some classes that do not exist in the Scala 2.12 build are referenced when registering the Kryo classes.
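
If that is the cause, one defensive option (a sketch of the idea only; registerIfPresent is a hypothetical helper, not necessarily the actual patch) is to resolve and register each class name individually, so a class that is missing in the current Scala build is skipped instead of aborting the whole registration and leaving even StructType unregistered:

import org.apache.spark.SparkConf;

public final class KryoRegistration {

    // Register each class individually; skip names that cannot be resolved in
    // the current Scala/Spark build instead of letting one ClassNotFoundException
    // cancel the registration of all the other classes.
    public static void registerIfPresent(SparkConf conf, String... classNames) {
        for (String name : classNames) {
            try {
                conf.registerKryoClasses(new Class<?>[]{Class.forName(name)});
            } catch (ClassNotFoundException e) {
                // e.g. "scala.reflect.ClassTag$$anon$1" may not exist under some
                // Scala versions; skipping it keeps StructType and the rest registered.
            }
        }
    }
}

Usage would look like registerIfPresent(conf, "org.apache.spark.sql.types.StructType", "scala.reflect.ClassTag$$anon$1", ...).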

Thanks, and you could also join our developer WeChat group if you are not already in it (via the WeChat official account).

Thanks, I have been pulled into the WeChat group by @JackyYangPassion.