Kafka properties not getting passed properly during initialisation of datahub_stream
nj7 opened this issue · 2 comments
Kafka properties are not being passed properly during the initial deployment, when datahub-actions runs ingestion over the Kafka stream.
My Kafka cluster is protected: it does not use a plain SSL connection, but SASL instead.
This is the Kafka config that I am using:

```yaml
security.protocol: SASL_SSL
sasl.mechanism: SCRAM-SHA-512
client.sasl.mechanism: SCRAM-SHA-512
kafkastore.security.protocol: SSL
ssl.endpoint.identification.algorithm: https
ssl.keystore.type: JKS
ssl.protocol: TLSv1.2
ssl.truststore.type: JKS
```
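As a hedged sketch (this is NOT the DataHub source, and the prefix name is an assumption), a common convention for getting such properties from container environment variables into a consumer config is a prefix mapping, where something like `KAFKA_PROPERTIES_SASL_MECHANISM` becomes the Kafka key `sasl.mechanism`:

```python
# Hypothetical helper illustrating a prefix-mapping convention for Kafka
# properties passed via env vars; not taken from the datahub-actions code.
def kafka_props_from_env(environ, prefix="KAFKA_PROPERTIES_"):
    props = {}
    for key, value in environ.items():
        if key.startswith(prefix):
            # strip the prefix, lowercase, and turn underscores into dots
            props[key[len(prefix):].lower().replace("_", ".")] = value
    return props

env = {
    "KAFKA_PROPERTIES_SECURITY_PROTOCOL": "SASL_SSL",
    "KAFKA_PROPERTIES_SASL_MECHANISM": "SCRAM-SHA-512",
    "UNRELATED_VAR": "ignored",
}
print(kafka_props_from_env(env))
# {'security.protocol': 'SASL_SSL', 'sasl.mechanism': 'SCRAM-SHA-512'}
```

If the container does something along these lines, the bug would be that only one of the translated keys ends up in `consumer_config`.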
Even though the properties are passed correctly to the container's environment variables, they are not populated in the config file that gets executed.
Here is the config that was generated (found in the container logs):
```python
{'source':
    { 'type': 'datahub-stream',
      'config': {
          'auto_offset_reset': 'latest',
          'connection': {
              'bootstrap': 'XXXXXXXXXXXXXX',
              'schema_registry_url': 'XXXXXXXXXXXXX',
              'consumer_config': {'security.protocol': 'SASL_SSL'}
          },
          'actions': [
              { 'type': 'executor',
                'config': {
                    'local_executor_enabled': True,
                    'remote_executor_enabled': 'False',
                    'remote_executor_type': 'acryl.executor.sqs.producer.sqs_producer.SqsRemoteExecutor',
                    'remote_executor_config': {
                        'id': 'remote',
                        'aws_access_key_id': '""',
                        'aws_secret_access_key': '""',
                        'aws_session_token': '""',
                        'aws_command_queue_url': '""',
                        'aws_region': '""'
                    }
                }
              }
          ],
          'topic_routes': {
              'mae': 'MetadataAuditEvent_v4',
              'mcl': 'MetadataChangeLog_Versioned_v1'
          }
      }
    },
 'sink': {'type': 'console'},
 'datahub_api': {'server': 'http://datahub-datahub-gms:8080', 'extra_headers': {'Authorization': 'Basic __datahub_system:NOTPASSING'}}
}
```
As the config above shows, the other required Kafka properties are not getting passed: only `security.protocol` made it into `consumer_config`.
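To make the gap concrete, here is a small check (property names taken verbatim from the two configs above) of which intended consumer properties survived into the generated config:

```python
# Keys I set in my Kafka config vs. keys that appear in the generated
# consumer_config ({'security.protocol': 'SASL_SSL'} in the log above).
intended = {
    "security.protocol",
    "sasl.mechanism",
    "client.sasl.mechanism",
    "kafkastore.security.protocol",
    "ssl.endpoint.identification.algorithm",
    "ssl.keystore.type",
    "ssl.protocol",
    "ssl.truststore.type",
}
generated = {"security.protocol"}
missing = sorted(intended - generated)
print(missing)
# ['client.sasl.mechanism', 'kafkastore.security.protocol',
#  'sasl.mechanism', 'ssl.endpoint.identification.algorithm',
#  'ssl.keystore.type', 'ssl.protocol', 'ssl.truststore.type']
```

Seven of the eight properties are silently dropped.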
DataHub Actions version: v0.0.1-beta.13
Hi @nj7! I think I also responded on Slack, but in this case it seems you are using the old acryl-datahub-actions container.
Since releasing Actions, we've moved to the acryldata/datahub-actions container; please use the v0.0.1 version there, which does support custom Kafka properties.