Spark configuration config path format
Closed this issue · 1 comment
qxzzxq commented
Is your feature request related to a problem? Please describe.
Currently, SETL loads its configuration from the config path setl.config by default. Users can define their Spark configuration under the path setl.config.spark, for example:
setl.config {
  spark {
    spark.app.name = "test_app"
    spark.sql.shuffle.partitions = 200
  }
}
However, we have a redundant "spark" keyword in our config path. I'd like to have a configuration like the following:
setl.config {
  spark.app.name = "test_app"
  spark.sql.shuffle.partitions = 200
}
SETL should be able to handle both cases.
Describe the solution you'd like
In SparkSessionBuilder, before setting a configuration entry, check whether its key has the prefix spark. and prepend the prefix if it is missing.
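A minimal Scala sketch of the proposed key normalization (the object SparkConfKeys and the method normalizeSparkKey are illustrative names, not part of SETL's API):

  object SparkConfKeys {

    private val SparkPrefix = "spark."

    // Prepend "spark." when the key does not already start with it,
    // so both config layouts resolve to the same Spark setting name.
    def normalizeSparkKey(key: String): String =
      if (key.startsWith(SparkPrefix)) key else SparkPrefix + key

    def main(args: Array[String]): Unit = {
      // Keys written with and without the prefix end up identical.
      println(normalizeSparkKey("spark.app.name"))          // spark.app.name
      println(normalizeSparkKey("app.name"))                // spark.app.name
      println(normalizeSparkKey("sql.shuffle.partitions"))  // spark.sql.shuffle.partitions
    }
  }

SparkSessionBuilder could apply such a normalization to every key it reads from setl.config before passing it to the Spark configuration, which would make both layouts shown above equivalent.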
nourrammal commented
I will take this!