Java, JDK and Hadoop versions for CI
Closed this issue · 2 comments
MrPowers commented
Apache Spark itself is using these settings:
```yaml
matrix:
  java: [ '1.8', '11' ]
  hadoop: [ 'hadoop-2.7', 'hadoop-3.2' ]
  exclude:
    - java: '11'
      hadoop: 'hadoop-2.7'
```
I think spark-daria can simply be tested with Java 1.8 and without specifying a Hadoop version, correct? I don't think we need to test multiple Java / Hadoop combinations.
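If we drop the matrix entirely, the whole workflow could be as small as the sketch below. This is just a sketch: the filename, workflow name, and the `sbt test` step are my assumptions, not taken from the spark-daria repo.

```yaml
# .github/workflows/ci.yml — hypothetical filename
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Single Java version, no Hadoop matrix
      - uses: actions/setup-java@v1
        with:
          java-version: '1.8'
      # Assumes the standard sbt test entry point
      - run: sbt test
```

With no `strategy.matrix`, there is only one job per push, which keeps CI fast and simple.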
I just learned that Java 8 and Java 1.8 are the same thing... what?!
I'm not even going to ask why Java 9 and Java 10 aren't included in this discussion. So confusing!!
MrPowers commented
Check out the spark-daria GitHub Actions run here: https://github.com/MrPowers/spark-daria/commit/493923bb19db857374892022cd121a9b96223fcc/checks?check_suite_id=249793167