Setting dedicated warehouse and metastore temp-directory doesn't work
qnob opened this issue · 0 comments
Hello,
I've noticed that when I run one of my tests, the default warehouse directory (the current directory) is used. The same goes for the metastore directory. These directories are reused by each SparkSession, so my tests end up having dependencies between each other.
I've found out that the framework actually tries to handle this problem by setting these directories, among other settings. See
DataFrameSuiteBaseLike#sqlBeforeAllTestCases
However, these settings are never applied, because by the time this configuration is created, the SparkContext already exists, and all settings made after context creation are ignored. More precisely, SharedSparkContext#beforeAll is called before DataFrameSuiteBaseLike#sqlBeforeAllTestCases.
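As a workaround in my own code, something along these lines seems to avoid the problem: set the warehouse and metastore locations to fresh temp directories before the first SparkSession/SparkContext is created. This is only a sketch under my assumptions, not the framework's API; the object name and the Derby ConnectionURL approach for the metastore are illustrative:

```scala
import java.nio.file.Files
import org.apache.spark.sql.SparkSession

// Hypothetical helper: create an isolated session per test suite by
// pointing the warehouse and the embedded Derby metastore at fresh
// temp directories. These configs must be set BEFORE the context
// exists; changes made afterwards are ignored.
object IsolatedSparkSession {
  def create(): SparkSession = {
    val warehouseDir = Files.createTempDirectory("spark-warehouse").toString
    val metastoreDir = Files.createTempDirectory("metastore").toString
    SparkSession.builder()
      .master("local[2]")
      .appName("isolated-test")
      .config("spark.sql.warehouse.dir", warehouseDir)
      .config("javax.jdo.option.ConnectionURL",
        s"jdbc:derby:;databaseName=$metastoreDir;create=true")
      .getOrCreate()
  }
}
```

This only helps if the suite controls session creation itself; it doesn't fix the ordering inside the framework, where SharedSparkContext creates the context first.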
Unfortunately, I haven't found a way to fix this inside the framework, so I won't be able to open a PR. Maybe someone else can help out.
Thanks.
Kuno