Burrow SASL_SSL configs
ashishvashisht1 opened this issue · 7 comments
Hello,
I am trying to configure Burrow to connect to our Kafka cluster, which is kerberized and uses SASL_SSL. Is there any sample config/example I could follow to add the specific settings to burrow.toml?
Thanks,
Ashish
Added a few SASL_SSL params:

```toml
[sasl.SASL_SSL]
#username=kafka
security_protocol="SASL_SSL"
sasl_mechanism="GSSAPI"
ssl_cafile="truststore.pem"
handshake-first=false
```
Getting this error:

```
{"level":"error","ts":1658961613.1148155,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"local","error":"kafka: invalid configuration (Net.SASL.User must not be empty when SASL is enabled)"}
```
Hi @ashishvashisht1, you can check this comment.
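For reference, the keys in that snippet (`security_protocol`, `sasl_mechanism`, `ssl_cafile`) look like kafka-python/librdkafka settings rather than Burrow ones; as far as I can tell, Burrow's `[sasl.*]` profile only reads `username`, `password`, `mechanism`, and `handshake-first`, which is why sarama complains that `Net.SASL.User` is empty. A minimal sketch with placeholder names and credentials:

```toml
# Hypothetical SASL profile; "mysasl", "burrow" and "secret" are placeholders.
[sasl.mysasl]
username="burrow"
password="secret"
handshake-first=true

# The profile is referenced by name from the client profile:
[client-profile.test]
client-id="burrow-test"
kafka-version="0.10.0"
sasl="mysasl"
tls="kafka-certs"
```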
Thanks @gklp
Seems like I have those configs (broadly) and am still not able to connect.
All configs for Burrow are below; still getting errors:
"level":"error","ts":1658860339.0624013,"msg":"failed to start client","type":"module","coordinator":"cluster","class":"kafka","name":"local","error":"kafka: client has run out of available brokers to talk to (Is your cluster reachable?)"}
{"level":"info","ts":1658860339.0624447,"msg":"stopping","type":"coordinator","name":"notifier"}
{"level":"info","ts":1658860339.062451,"msg":"shutdown","type":"coordinator","name":"httpserver"}
{"level":"info","ts":1658860339.0624893,"msg":"stopping","type":"coordinator","name":"evaluator"}
{"level":"info","ts":1658860339.062495,"msg":"stopping","type":"module","coordinator":"evaluator","class":"caching","name":"default"}
{"level":"info","ts":1658860339.0625021,"msg":"stopping","type":"coordinator","name":"storage"}
{"level":"info","ts":1658860339.062509,"msg":"stopping","type":"module","coordinator":"storage","class":"inmemory","name":"default"}
{"level":"info","ts":1658860339.0625546,"msg":"stopping","type":"coordinator","name":"zookeeper"}
{"level":"info","ts":1658860339.0648248,"msg":"recv loop terminated: err=EOF","type":"coordinator","name":"zookeeper"}
{"level":"info","ts":1658860339.0648563,"msg":"send loop terminated: err=<nil>","type":"coordinator","name":"zookeeper"}
```toml
[general]
pidfile="burrow.pid"
stdout-logfile="burrow.out"
access-control-allow-origin="mysite.example.com"
[logging]
filename="logs/burrow.log"
level="debug"
maxsize=100
maxbackups=30
maxage=10
use-localtime=true
use-compression=true
[zookeeper]
servers=[ "HOST1:2181", "HOST2:2181", "HOST2:2181" ]
timeout=6
root-path="/burrow"
[client-profile.test]
client-id="burrow-test"
kafka-version="0.10.0"
sasl="SASL_SSL"
tls="kafka-certs"
[tls.kafka-certs]
certfile="truststore.jks"
keyfile="keystore.jks"
cafile="rootca.pem"
noverify=true
[sasl.SASL_SSL]
#username=kafka
security_protocol="SASL_SSL"
sasl_mechanism="GSSAPI"
ssl_cafile="/truststore.pem"
handshake-first=false
[cluster.local]
class-name="kafka"
servers=[ "HOST1:9093", "HOST2:9093", "HOST2:9093" ]
client-profile="test"
topic-refresh=120
offset-refresh=30
groups-reaper-refresh=0
[consumer.local]
class-name="kafka"
cluster="local"
servers=[ "HOST1:9093", "HOST2:9093", "HOST2:9093" ]
client-profile="test"
group-denylist="^(console-consumer-|python-kafka-consumer-|quick-).*$"
group-allowlist=""
[consumer.local_zk]
class-name="kafka_zk"
cluster="local"
servers=[ "HOST1:2181", "HOST2:2181", "HOST2:2181" ]
zookeeper-path="/kafka-cluster"
zookeeper-timeout=30
group-denylist="^(console-consumer-|python-kafka-consumer-|quick-).*$"
group-allowlist=""
[httpserver.default]
address=":8000"
[storage.default]
class-name="inmemory"
workers=20
intervals=15
expire-group=604800
min-distance=1
```
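One thing that stands out in the `[tls.kafka-certs]` section: Burrow is a Go program, and Go's `crypto/tls` cannot load Java keystores, so `certfile`, `keyfile`, and `cafile` need to be PEM files rather than `.jks`. A sketch of what that profile might look like (paths are placeholders; you would first export the JKS material to PEM):

```toml
[tls.kafka-certs]
certfile="/etc/burrow/client-cert.pem"  # client certificate in PEM (placeholder path)
keyfile="/etc/burrow/client-key.pem"    # unencrypted private key in PEM (placeholder path)
cafile="/etc/burrow/rootca.pem"         # CA bundle in PEM
noverify=false                          # set true only to skip certificate verification
```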
I guess the documentation is missing some points. There might be one more setting; I've seen it in the code:
Burrow/core/internal/helpers/sarama.go, line 121 at be40f44
```toml
[sasl.SASL_SSL]
#username=kafka
security_protocol="SASL_SSL"
sasl_mechanism="GSSAPI"  # should be "mechanism", with two options (SCRAM-SHA-256 or SCRAM-SHA-512) -- you can see it in the code
ssl_cafile="/truststore.pem"
handshake-first=false
```
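Applying that comment, the section would look something like the sketch below (credentials are placeholders). Note that GSSAPI is not among the options in that code, so this path only works if SCRAM users are provisioned on the broker:

```toml
[sasl.SASL_SSL]
username="burrow"          # placeholder
password="secret"          # placeholder
mechanism="SCRAM-SHA-512"  # or SCRAM-SHA-256, per helpers/sarama.go
handshake-first=true       # usually true for Kafka >= 0.10
```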
Well, new error now; seems like enabling SASL requires a username & password:

```
Net.SASL.User must not be empty when SASL is enabled
Net.SASL.Password must not be empty when SASL is enabled
```

We don't use a plain user/pass; we connect via service principals that are specifically granted roles.
Not sure if we are the only ones doing it... I assumed SASL_SSL is the default protocol used by everyone.
@ashishvashisht1 did you try it without the sasl part? Maybe you only need the tls config.
@gklp,
I tried that as well, still getting errors. I'm not sure if I mentioned it clearly: we have to use Kerberos auth, and I declare an explicit jaas.conf and run kinit prior to starting Burrow.
Configs:
```toml
[client-profile.test]
client-id="burrow-test"
kafka-version="0.10.0"
#sasl="SASL_SSL"
tls="kafka-certs"
```
{"level":"debug","ts":1659116098.7474747,"msg":"Successful SASL handshake. Available mechanisms: [SCRAM-SHA-512 GSSAPI SCRAM-SHA-256]","name":"sarama"} {"level":"debug","ts":1659116098.7477207,"msg":"Failed to read response header while authenticating with SASL to broker HOST1:9093: EOF","name":"sarama"} {"level":"debug","ts":1659116098.7477582,"msg":"Closed connection to broker HOST1:9093","name":"sarama"} {"level":"debug","ts":1659116098.7477732,"msg":"client/metadata got error from broker -1 while fetching metadata: EOF","name":"sarama"} {"level":"debug","ts":1659116098.7477832,"msg":"client/metadata fetching metadata for all topics from broker