audienceproject/spark-dynamodb

unresolved dependency error

Closed · 3 comments

Spawned an EMR cluster with Spark 2.4.4 on Hadoop 2.8.5, cloned the repo, and packaged it with sbt.

[hadoop@ip-xxxxxx spark-dynamodb]$ pyspark --packages com.audienceproject:spark-dynamodb_2.11.12:1.0.3
Python 2.7.16 (default, Oct 14 2019, 21:26:56) 
[GCC 4.8.5 20150623 (Red Hat 4.8.5-28)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Ivy Default Cache set to: /home/hadoop/.ivy2/cache
The jars for the packages stored in: /home/hadoop/.ivy2/jars
:: loading settings :: url = jar:file:/usr/lib/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.audienceproject#spark-dynamodb_2.11.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-6748f4f1-9ccf-4ef2-9556-6e70ac4121b1;1.0
	confs: [default]
:: resolution report :: resolve 780ms :: artifacts dl 0ms
	:: modules in use:
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
		module not found: com.audienceproject#spark-dynamodb_2.11.12;1.0.3
	==== local-m2-cache: tried
	  file:/home/hadoop/.m2/repository/com/audienceproject/spark-dynamodb_2.11.12/1.0.3/spark-dynamodb_2.11.12-1.0.3.pom
	  -- artifact com.audienceproject#spark-dynamodb_2.11.12;1.0.3!spark-dynamodb_2.11.12.jar:
	  file:/home/hadoop/.m2/repository/com/audienceproject/spark-dynamodb_2.11.12/1.0.3/spark-dynamodb_2.11.12-1.0.3.jar
	==== local-ivy-cache: tried
	  /home/hadoop/.ivy2/local/com.audienceproject/spark-dynamodb_2.11.12/1.0.3/ivys/ivy.xml
	  -- artifact com.audienceproject#spark-dynamodb_2.11.12;1.0.3!spark-dynamodb_2.11.12.jar:
	  /home/hadoop/.ivy2/local/com.audienceproject/spark-dynamodb_2.11.12/1.0.3/jars/spark-dynamodb_2.11.12.jar
	==== central: tried
	  https://repo1.maven.org/maven2/com/audienceproject/spark-dynamodb_2.11.12/1.0.3/spark-dynamodb_2.11.12-1.0.3.pom
	  -- artifact com.audienceproject#spark-dynamodb_2.11.12;1.0.3!spark-dynamodb_2.11.12.jar:
	  https://repo1.maven.org/maven2/com/audienceproject/spark-dynamodb_2.11.12/1.0.3/spark-dynamodb_2.11.12-1.0.3.jar
	==== spark-packages: tried
	  https://dl.bintray.com/spark-packages/maven/com/audienceproject/spark-dynamodb_2.11.12/1.0.3/spark-dynamodb_2.11.12-1.0.3.pom
	  -- artifact com.audienceproject#spark-dynamodb_2.11.12;1.0.3!spark-dynamodb_2.11.12.jar:
	  https://dl.bintray.com/spark-packages/maven/com/audienceproject/spark-dynamodb_2.11.12/1.0.3/spark-dynamodb_2.11.12-1.0.3.jar
		::::::::::::::::::::::::::::::::::::::::::::::
		::          UNRESOLVED DEPENDENCIES         ::
		::::::::::::::::::::::::::::::::::::::::::::::
		:: com.audienceproject#spark-dynamodb_2.11.12;1.0.3: not found
		::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.audienceproject#spark-dynamodb_2.11.12;1.0.3: not found]
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1310)
	at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:304)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:928)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:937)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "/usr/lib/spark/python/pyspark/shell.py", line 38, in <module>
    SparkContext._ensure_initialized()
  File "/usr/lib/spark/python/pyspark/context.py", line 324, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/usr/lib/spark/python/pyspark/java_gateway.py", line 46, in launch_gateway
    return _launch_gateway(conf)
  File "/usr/lib/spark/python/pyspark/java_gateway.py", line 108, in _launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
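
For what it's worth, one way to double-check what is actually published is Maven Central's search API; the query below is illustrative, searching by group ID only:

[hadoop@ip-xxxxxx spark-dynamodb]$ curl "https://search.maven.org/solrsearch/select?q=g:com.audienceproject&rows=20&wt=json"

The response lists every artifact published under that group, including the exact Scala-suffixed artifact IDs and their latest versions, so you can see whether the coordinate passed to --packages exists at all.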

We're a little caught up at the moment, but we'll look into it as soon as possible.

Hi,
It seems you have attempted to run version 1.0.3; however, that build is not yet released on Maven. The current release is 1.0.2.

Note also that the artifact ID carries the Scala binary version suffix, so the coordinate is spark-dynamodb_2.11, not spark-dynamodb_2.11.12, which is why every repository reported the module as not found.

Please try version 1.0.2 with the corrected coordinate, as sketched below.
Closing this issue.
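
For reference, a corrected invocation would look something like this (a sketch assuming Spark 2.4's Scala 2.11 build and the 1.0.2 release noted above):

[hadoop@ip-xxxxxx spark-dynamodb]$ pyspark --packages com.audienceproject:spark-dynamodb_2.11:1.0.2

With the binary-version suffix and a published version, Ivy should resolve the artifact from Maven Central instead of failing across all four resolvers.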