audienceproject/spark-dynamodb

How to import the module in pyspark?

Opened this issue · 1 comment

I am trying to import the module in PySpark on EMR but cannot find it, even though the package does get downloaded:

Arguments: spark-submit --deploy-mode cluster --packages com.audienceproject:spark-dynamodb_2.12:1.1.1 s3://filepath
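
The --packages flag above resolves the connector's Maven coordinates and places the JAR on the driver and executor classpaths. For runs launched from a plain Python process instead of spark-submit, the same coordinates can also be passed through Spark's standard spark.jars.packages setting when the session is built — a minimal sketch, with the application name chosen purely for illustration:

from pyspark.sql import SparkSession

# Assumes the cluster has network access to a Maven repository to resolve the package.
spark = SparkSession.builder \
    .appName("dynamodb-test") \
    .config("spark.jars.packages", "com.audienceproject:spark-dynamodb_2.12:1.1.1") \
    .getOrCreate()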

# import com.audienceproject.spark.dynamodb_   <- Scala-style import; this is the module that cannot be found from Python
from pyspark.sql import SparkSession

if __name__ == '__main__':

    # create the Spark session (SparkContext/SQLContext are wrapped by SparkSession)
    sparkSession = SparkSession.builder.appName("reading csv").getOrCreate()

    # Scan the table for the first 10 items (the order is arbitrary) and print them.
    dynamoDf = sparkSession.read.option("tableName", "event-poc").format("dynamodb").load()
    dynamoDf.show(10)

    # write to some other table, overwriting existing items with the same keys
    dynamoDf.write.option("tableName", "emr-test").format("dynamodb").save()
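
As a quick sanity check on what the connector returned, the standard DataFrame introspection calls work on the result — a short usage sketch, assuming the dynamoDf from the snippet above:

# Print the schema the data source produced and the number of items read.
dynamoDf.printSchema()
print(dynamoDf.count())   # count() requires reading the table, so it can be slow on large tables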