audienceproject/spark-dynamodb

Null Pointer Exception with version 1.1.2

Opened this issue · 1 comments

Hi.

I am hitting the following NullPointerException with version 1.1.2.

Spark 3.1.1
Scala 2.12.10

Error:

Caused by: java.lang.NullPointerException
at org.apache.spark.sql.catalyst.InternalRow.getString(InternalRow.scala:34)
at com.audienceproject.spark.dynamodb.catalyst.JavaConverter$.convertRowValue(JavaConverter.scala:39)
at com.audienceproject.spark.dynamodb.connector.TableConnector.updateItem(TableConnector.scala:154)
at com.audienceproject.spark.dynamodb.datasource.DynamoDataUpdateWriter.write(DynamoDataUpdateWriter.scala:37)
at com.audienceproject.spark.dynamodb.datasource.DynamoDataUpdateWriter.write(DynamoDataUpdateWriter.scala:29)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$1(WriteToDataSourceV2Exec.scala:416)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1473)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:452)
at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.$anonfun$writeWithV2$2(WriteToDataSourceV2Exec.scala:360)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Just a note: this appeared to be fixed in issue #42, but it is now happening again.
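My suspicion is that the trigger is a null value in a string column: `InternalRow.getString` is roughly `getUTF8String(ordinal).toString`, so a null cell throws when `toString` is called. Here is a minimal plain-Java sketch of that pattern (no Spark involved; the class and helper names are hypothetical, just modelling the call shown in the trace):

```java
// Simplified stand-in for Spark's InternalRow.getString, which is roughly
// getUTF8String(ordinal).toString -- no null check before dereferencing.
public class NullStringDemo {
    static String getString(Object[] row, int ordinal) {
        // Throws NullPointerException when the cell holds null,
        // analogous to the frame at InternalRow.getString in the trace above.
        return row[ordinal].toString();
    }

    public static void main(String[] args) {
        Object[] row = { "id-1", null }; // second column is a null string
        System.out.println(getString(row, 0));
        try {
            getString(row, 1);
        } catch (NullPointerException e) {
            System.out.println("NPE on null string column");
        }
    }
}
```

If that guess is right, filtering out or defaulting null string columns before the write works around it, but the connector would ideally check for null before converting.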

@jacobfi do you have any idea what might be causing this?