databricks/sbt-spark-package

sbt-spark-package does not follow redirects

sadikovi opened this issue · 4 comments

It looks like sbt-spark-package does not follow redirects.
The URL http://spark-packages.org/api/submit-package returns HTTP 302, and the plugin reports it as an error. Commit 1ee3cf2 updated the URL to https, but older plugin versions (at least 0.2.3) still fail with this error.
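
For reference, a minimal reproduction sketch (not the plugin's actual submission code) that mimics a client which does not follow redirects against the old http:// endpoint; the 302 response points at the https URL:

// Sketch only: shows the 302 a non-redirect-following client sees,
// which is roughly what 0.2.3 reports as an error.
import java.net.{HttpURLConnection, URL}

object RedirectCheck extends App {
  val conn = new URL("http://spark-packages.org/api/submit-package")
    .openConnection().asInstanceOf[HttpURLConnection]
  conn.setInstanceFollowRedirects(false)                   // mimic a client that ignores redirects
  println(s"HTTP ${conn.getResponseCode}")                 // prints 302 instead of following it
  println(s"Location: ${conn.getHeaderField("Location")}") // the https:// endpoint
  conn.disconnect()
}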

I am using 0.2.3 and get the following output:

[info] Packaging /developer/spark-netflow/target/scala-2.11/spark-netflow_2.11-1.1.0.jar ...
[info] Done packaging.

Zip File created at: /developer/spark-netflow/target/spark-netflow-1.1.0-s_2.11.zip

ERROR: 302 - 
[success] Total time: 52 s, completed 26/06/2016 7:29:06 PM

This can be solved by upgrading the plugin to 0.2.4, as sketched below.
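
For anyone hitting the same thing, a sketch of the project/plugins.sbt change (coordinates as documented in the sbt-spark-package README; please verify them against the current README before use):

// project/plugins.sbt
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.4") // 0.2.4+ submits to the https endpoint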

@brkyvz Can you have a look at this? Thanks.

Minor thing: it looks like the URL in the success message has not been updated either - http://spark-packages.org/staging?id=1261 should also be https.

Thank you for reporting this! Please use 0.2.4 and above from now on.
Thanks,
Burak

@brkyvz thank you. Yeah, I updated the version.