audienceproject/spark-dynamodb

Error while connecting to DynamoDB

Closed this issue · 2 comments

Hi,
We are getting the error below when trying to read from DynamoDB with the following configuration (a rough build.sbt sketch of the dependency follows the list):

  1. Scala version : 2.11.12
  2. Spark version : 2.4.4
  3. Hadoop version : 2.8.5
  4. com.audienceproject:spark-dynamodb_2.12, version: 1.0.1
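
For reference, a minimal, hypothetical build.sbt fragment matching the versions above (our actual build file may differ):

// Hypothetical build.sbt fragment reflecting the versions listed above
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // Spark is provided by the YARN cluster
  "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided",
  // Note: the artifact suffix is pinned to _2.12 here rather than resolved with %%
  "com.audienceproject" % "spark-dynamodb_2.12" % "1.0.1"
)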

// Code snippet
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import com.audienceproject.spark.dynamodb.implicits._

val conf = new SparkConf()
  .set("spark.default.parallelism", "4")

val sparkSession = SparkSession
  .builder()
  .master("yarn")
  .appName("ApplicationName")
  .config(conf)
  .getOrCreate()

val par = sparkSession.sparkContext.defaultParallelism
println("Parallelism " + par)

val df = sparkSession.read.dynamodb("Table_Name") // Failing at this read

// Stack trace:

java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
at com.audienceproject.spark.dynamodb.datasource.DefaultSource.getDefaultParallelism(DefaultSource.scala:65)
at com.audienceproject.spark.dynamodb.datasource.DefaultSource.$anonfun$createReader$4(DefaultSource.scala:47)
at scala.runtime.java8.JFunction0$mcI$sp.apply(JFunction0$mcI$sp.java:23)
at scala.Option.getOrElse(Option.scala:121)
at com.audienceproject.spark.dynamodb.datasource.DefaultSource.createReader(DefaultSource.scala:47)
at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation$SourceHelpers.createReader(DataSourceV2Relation.scala:161)
at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation$.create(DataSourceV2Relation.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:204)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
at com.audienceproject.spark.dynamodb.implicits$DynamoDBDataFrameReader.dynamodb(implicits.scala:37)

Please suggest.

Thanks
ryge commented

It looks like you are trying to use a library built for Scala 2.12 on Scala 2.11. The NoSuchMethodError on scala.Some.value is the giveaway: that accessor only exists in the Scala 2.12 standard library (in 2.11 the field is called x). Try it with Scala 2.12 and see if the same error occurs.
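
A sketch of one way to align the versions in build.sbt, assuming a _2.11 artifact of spark-dynamodb 1.0.1 is published; if it is not, move the application to Scala 2.12 instead (Spark 2.4.x is released for both 2.11 and 2.12):

// Sketch: let sbt resolve the artifact matching scalaVersion via %%
scalaVersion := "2.11.12"  // or "2.12.10" to stay on the _2.12 build of the library

// %% appends the Scala binary suffix automatically, so the library and the
// application can no longer drift apart in Scala binary version
libraryDependencies += "com.audienceproject" %% "spark-dynamodb" % "1.0.1"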

It worked..
Thanks :)