viadee/bpmn.ai

Fix outdated integration test

fkoehne opened this issue · 6 comments

Running de.viadee.ki.sparkimporter.CSVImportAndProcessingApplicationIntegrationTest
0    [main] WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
7555 [main] WARN  de.viadee.ki.spark.importer  - Ignoring empty filter query.
7728 [main] WARN  org.apache.spark.util.Utils  - Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
8872 [main] WARN  de.viadee.ki.spark.importer  - Ignoring variable name mapping 'a' -> ''.
Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 29.39 sec <<< FAILURE!
testLineValuesHashes(de.viadee.ki.sparkimporter.CSVImportAndProcessingApplicationIntegrationTest)  Time elapsed: 0.005 sec  <<< FAILURE!
org.junit.ComparisonFailure: expected:<[54A250FFFBC2D61E7D98C68BACB67572]> but was:<[A768D78F9934947563E7C7E2ED49799A]>
	at org.junit.Assert.assertEquals(Assert.java:115)
	at org.junit.Assert.assertEquals(Assert.java:144)

Where do those tests fail? It works on my machine locally.

@CarolineMethner is on it - it is quite probably a problem of test execution order / test interdependency that is only visible on Travis (not locally, and not on our Jenkins machine either).
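If it really were an ordering problem, one quick way to check would be to make the execution order deterministic, so a hidden dependency between test methods shows up on every machine and not only on Travis. A minimal JUnit 4 sketch (the annotation here is only for diagnosis, not a suggestion for the final fix):

```java
import org.junit.FixMethodOrder;
import org.junit.runners.MethodSorters;

// Forcing a deterministic (here: alphabetical) method order makes a hidden
// dependency between test methods reproducible locally instead of only
// appearing in whatever order the JVM happens to pick on Travis.
@FixMethodOrder(MethodSorters.NAME_ASCENDING)
public class CSVImportAndProcessingApplicationIntegrationTest {
    // ... existing test methods ...
}
```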

Ah, OK. Alright!

We have a green light. :)

However, I think we should refactor the test in order to get rid of the checksum approach altogether. What do you think?
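For context, the checksum pattern under discussion looks roughly like the sketch below (the helper, the hashed content and the class name are illustrative, not the project's actual code; only the expected hash is taken from the failure above): the test hashes the produced output and compares it against a hard-coded MD5 value, so any environment-dependent difference fails with two opaque strings.

```java
import static org.junit.Assert.assertEquals;

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import org.junit.Test;

public class ChecksumStyleAssertionSketch {

    // Hypothetical helper standing in for however the real test obtains
    // the processed output; not the project's actual code.
    private String readProcessedOutput() {
        return "...";
    }

    @Test
    public void testLineValuesHashes() throws Exception {
        String output = readProcessedOutput();

        // Hash the entire output and compare against a hard-coded MD5 value.
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        byte[] digest = md5.digest(output.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02X", b & 0xFF));
        }

        // Any environment-dependent difference (encoding, line endings,
        // timezone-formatted dates) fails with two opaque hashes,
        // exactly as in the Travis log above.
        assertEquals("54A250FFFBC2D61E7D98C68BACB67572", hex.toString());
    }
}
```

Asserting directly on the parsed values instead (individual rows or columns) would make a failure point at the actual difference rather than at two hashes.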

Yes. I am just checking how to do it differently, as there are some tests that are not yet active again and which fail. That's the one with the checksum from the Java class we talked about, so it seems like we need to change that approach.

Works now. It was related to a different timezone on Travis. The tests now set a timezone before they start. The checksum approach is kept in for now, as it did not cause the problem.
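A minimal sketch of that kind of fix, assuming JUnit 4; the class name matches the failing test from the log, but the chosen zone and the method name are illustrative:

```java
import java.util.TimeZone;
import org.junit.BeforeClass;

public class CSVImportAndProcessingApplicationIntegrationTest {

    @BeforeClass
    public static void fixTimeZone() {
        // Pin the JVM default timezone before any dates are formatted, so
        // the output (and therefore the checksum) is identical on Travis,
        // Jenkins and developer machines. "UTC" here is illustrative.
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
    }

    // ... existing test methods ...
}
```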