spark tests on travis
freeman-lab opened this issue · 2 comments
It appears that the Travis CI tests may not in fact be running against Spark via the --engine
flag, even though the same flag definitely works locally. We need to investigate this!
I made a branch that deliberately fails a test and includes a print statement showing the current "engine". The same thing run locally shows a SparkContext
as the engine.
https://travis-ci.org/thunder-project/thunder/jobs/131200190
I am getting the same thing locally: station.start(spark=True)
fails silently if it can't import Spark.
For me it was fixed by installing py4j.
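A minimal sketch of the failure mode described above, assuming pyspark is what station tries to import (this is not station's actual code): without py4j, the pyspark import raises, the exception is swallowed, and the engine silently stays None, so the "spark" tests quietly run in local mode.

```python
# Hedged sketch, not thunder/station's real implementation: a silent fallback
# like this would explain tests passing on Travis without ever touching Spark.
def start_engine(spark=True):
    if spark:
        try:
            from pyspark import SparkContext
            return SparkContext(master='local[2]', appName='thunder-tests')
        except ImportError:
            # the silent failure: no error raised, just no SparkContext
            return None
    return None

engine = start_engine(spark=True)
print('engine:', engine)  # locally a SparkContext; on Travis this printed None
```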
Ah! Nice job @boazmohar, you totally figured it out! I think it can be fixed on our Travis either by pip installing py4j (as you did) or by adding the version bundled with Spark to the PYTHONPATH. We used to do the latter, but as of Spark 1.6 the bundled version changed, so my line to add it no longer worked. I'll probably do the second thing just for consistency; making the change now.
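For reference, a sketch of the second option, assuming SPARK_HOME points at the Spark install (in .travis.yml this would typically be done with export lines instead). Globbing for the bundled py4j zip avoids hard-coding its version, which is what broke when Spark 1.6 shipped a newer py4j.

```python
# Hedged sketch: put Spark's bundled py4j on the path instead of pip installing it.
# The '/opt/spark' default and the glob pattern are assumptions, not thunder's config.
import glob
import os
import sys

spark_home = os.environ.get('SPARK_HOME', '/opt/spark')  # hypothetical location
sys.path.insert(0, os.path.join(spark_home, 'python'))
py4j_zips = glob.glob(os.path.join(spark_home, 'python', 'lib', 'py4j-*-src.zip'))
if py4j_zips:
    sys.path.insert(0, py4j_zips[0])

from pyspark import SparkContext  # should now import without a separate py4j install
```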