MrPowers/spark-daria

Java, JDK and Hadoop versions for CI

Closed this issue · 2 comments

Apache Spark itself uses these settings in its CI matrix:

matrix:
  java: [ '1.8', '11' ]
  hadoop: [ 'hadoop-2.7', 'hadoop-3.2' ]
  exclude:
  - java: '11'
    hadoop: 'hadoop-2.7'

I think spark-daria can simply be tested with Java 1.8 and no Hadoop version specified, correct? I don't think we need to test multiple Java / Hadoop combinations.
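A single-version setup along those lines might look like this (a hedged sketch, not the project's actual workflow: the file path, action versions, and `sbt test` command are assumptions about how spark-daria builds):

```yaml
# .github/workflows/ci.yml (hypothetical)
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Java 8 only -- no matrix, no Hadoop profile
      - uses: actions/setup-java@v4
        with:
          distribution: 'temurin'
          java-version: '8'
      # Assumes an sbt-based build; swap in the real test command
      - run: sbt test
```

Spark needs the matrix/exclude block above because it ships Hadoop-specific builds and supports multiple JDKs; a library that only depends on Spark as a provided dependency can usually get away with the one JDK it targets.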

I just learned that Java 8 and Java 1.8 are the same thing... what?!

I'm not even going to ask why Java 9 and Java 10 aren't included in this discussion. So confusing!!

@nvander1 - can you please help out here!?