Support for basic authentication (collection control)
pigeon-geng opened this issue · 2 comments
pigeon-geng commented
@kiranchitturi
I saw #118. Why not support something like this:
val options = Map(
  "collection" -> "{solr_collection_name}",
  "zkhost" -> "{zk_connect_string}",
  "httpBasicAuthUser" -> "{httpBasicAuthUser}",
  "httpBasicAuthPassword" -> "{httpBasicAuthPassword}"
)
val df = spark.read.format("solr")
  .options(options)
  .load()
This seems friendlier for multi-permission setups, since the credentials can be controlled through Spark options.
kiranchitturi commented
We didn't add this because the credentials would be exposed, but if the options are populated from external variables, then this does make sense. Patches are welcome.
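A minimal sketch of how the proposed options could be driven from external variables rather than hard-coded values. The httpBasicAuthUser / httpBasicAuthPassword option names are the ones suggested in this issue, not a confirmed spark-solr API, and the environment variable names and ZooKeeper/collection defaults are placeholders:

import org.apache.spark.sql.SparkSession

object SolrBasicAuthReadExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("solr-basic-auth-read").getOrCreate()

    // Credentials are pulled from the environment (e.g. injected by the cluster
    // manager or a secrets mechanism) instead of being committed in job source.
    // NOTE: the auth option names below follow the proposal in this issue,
    // not a released spark-solr option set.
    val options = Map(
      "zkhost" -> sys.env.getOrElse("SOLR_ZK_HOST", "localhost:9983"),
      "collection" -> sys.env.getOrElse("SOLR_COLLECTION", "my_collection"),
      "httpBasicAuthUser" -> sys.env.getOrElse("SOLR_USER", ""),
      "httpBasicAuthPassword" -> sys.env.getOrElse("SOLR_PASSWORD", "")
    )

    val df = spark.read.format("solr").options(options).load()
    df.show(10)

    spark.stop()
  }
}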
washcycle commented
I was looking for this feature as well. I use the solr-operator in Kubernetes and run many Solr instances, so right now I have to dedicate a Spark cluster to each Solr instance. I'd like to be able to switch between them on the fly. The patch as implemented looked good to me.
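For the multi-instance case described above, per-read credentials would let a single Spark session query several Solr clusters. A rough sketch under the same assumption about the proposed option names, with a hypothetical helper, an existing `spark` session in scope, and placeholder ZooKeeper hosts, collections, and environment variables:

// Hypothetical helper: builds per-cluster read options. The auth option names
// follow the proposal in this issue, not a released spark-solr API.
def solrOptions(zkHost: String, collection: String,
                user: String, password: String): Map[String, String] =
  Map(
    "zkhost" -> zkHost,
    "collection" -> collection,
    "httpBasicAuthUser" -> user,
    "httpBasicAuthPassword" -> password
  )

// Same SparkSession, two different Solr instances with different credentials.
val dfA = spark.read.format("solr")
  .options(solrOptions("zk-a:2181/solr", "collection_a",
    sys.env("SOLR_A_USER"), sys.env("SOLR_A_PASSWORD")))
  .load()

val dfB = spark.read.format("solr")
  .options(solrOptions("zk-b:2181/solr", "collection_b",
    sys.env("SOLR_B_USER"), sys.env("SOLR_B_PASSWORD")))
  .load()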