- Info
- Assumptions / Requirements
- Deployed Resource URLs
- Running the Playbook
- Additional Resources
- Backlog for enhancements
- PRs welcome!
## Info

Ansible playbook for provisioning a Debezium demo using my Summit Lab Spring Music application as the "monolith". The Debezium connector is configured to use the Outbox Event Router.

The application is a simple Spring Boot application connected to a MySQL database. We'll install a 3-replica Kafka cluster with Kafka Connect and then install the Debezium MySQL connector. The database credentials are stored in a `Secret` and then mounted into the Kafka Connect cluster.

The Kafka broker, Kafka Connect, and Kafka Bridge are all authenticated via OAuth 2.0. Red Hat Single Sign-On is installed and used as the authorization server. A new realm is automatically created and provisioned.
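As a sketch of how the connector pieces above fit together, a `KafkaConnector` custom resource for the Debezium MySQL connector with the Outbox Event Router could look roughly like the following. All names, hostnames, and the outbox table are illustrative assumptions, not values taken from this playbook:

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: spring-music-connector        # illustrative name
  labels:
    strimzi.io/cluster: my-connect    # must match the KafkaConnect cluster name
spec:
  class: io.debezium.connector.mysql.MySqlConnector
  tasksMax: 1
  config:
    database.hostname: spring-music-db   # illustrative MySQL service name
    database.port: 3306
    # Credentials read from the mounted Secret; assumes the Secret is exposed
    # via the KafkaConnect externalConfiguration volume and the file config
    # provider is enabled on the Connect cluster
    database.user: "${file:/opt/kafka/external-configuration/db-creds/credentials.properties:username}"
    database.password: "${file:/opt/kafka/external-configuration/db-creds/credentials.properties:password}"
    database.server.id: 184054
    database.server.name: spring-music      # topic prefix
    table.include.list: music.outbox_events # illustrative outbox table
    # Route outbox rows to event topics instead of emitting raw change events
    transforms: outbox
    transforms.outbox.type: io.debezium.transforms.outbox.EventRouter
    # (database history and serde settings omitted for brevity)
```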
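The OAuth setup described above can be sketched as an excerpt of the Strimzi `Kafka` custom resource; the SSO hostname and realm name below are assumptions for illustration only:

```yaml
# Excerpt of a Strimzi Kafka CR listener secured with OAuth 2.0,
# with Red Hat Single Sign-On acting as the authorization server.
spec:
  kafka:
    listeners:
      - name: tls
        port: 9093
        type: internal
        tls: true
        authentication:
          type: oauth
          # Issuer/JWKS endpoints on the RH-SSO realm (illustrative values)
          validIssuerUri: https://sso.example.com/auth/realms/demo
          jwksEndpointUri: https://sso.example.com/auth/realms/demo/protocol/openid-connect/certs
          userNameClaim: preferred_username
```

Kafka Connect and the Kafka Bridge carry similar `authentication: type: oauth` blocks on their client side, each with its own client ID and secret in the realm.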
## Assumptions / Requirements

- The OpenShift `sso73-postgresql-persistent` template is installed in the `openshift` namespace
- OperatorHub is available with the required operators
- The `openssl` utility is installed
- The `keytool` utility is installed
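The tool requirements above can be checked up front before running the playbook. A minimal preflight sketch (adding `ansible-playbook` and `oc` to the list is an assumption, since the playbook targets an OpenShift cluster):

```shell
# Report any required CLI tool that is not on the PATH.
# openssl and keytool come from the requirements above; ansible-playbook
# and oc are assumed extras for actually running the playbook.
for tool in openssl keytool ansible-playbook oc; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing required tool: $tool"
done
```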
## Deployed Resource URLs

All the below resource URLs are suffixed with the apps URL of the cluster (i.e. for an RHPDS environment, `apps.cluster-##GUID##.##GUID##.example.opentlc.com`).
## Running the Playbook

To run this you would do something like:

```shell
$ ansible-playbook -v main.yml -e ocp_api_url=<OCP_API_URL> -e ocp_admin_pwd=<OCP_ADMIN_USER_PASSWORD>
```

You'll need to replace the following variables with appropriate values:

| Variable | Description |
|---|---|
| `<OCP_API_URL>` | API URL of your cluster |
| `<OCP_ADMIN_USER_PASSWORD>` | Password for the OCP admin account |

This playbook also makes some assumptions about things within the cluster. These variables can be overridden with the `-e` switch when running the playbook.
| Description | Variable | Default Value |
|---|---|---|
| OpenShift admin user name | `ocp_admin` | `opentlc-mgr` |
| OCP user to install demo into | `ocp_proj_user` | `user1` |
| OCP user password for above user | `ocp_proj_user_pwd` | `openshift` |
| Project name to install demo into | `proj_nm_demo` | `demo` |
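Putting the defaults above together, an invocation that overrides a couple of them might look like this (the URL, password, and override values are placeholders for illustration):

```shell
$ ansible-playbook -v main.yml \
    -e ocp_api_url=https://api.cluster.example.com:6443 \
    -e ocp_admin_pwd=<OCP_ADMIN_USER_PASSWORD> \
    -e ocp_proj_user=user2 \
    -e proj_nm_demo=debezium-demo
```

Any variable not passed with `-e` keeps the default from the table above.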
## Additional Resources

- MySQL Database Template
- AMQ Streams Template
  - Includes `Kafka`, `KafkaConnect`, `KafkaConnector`, and `KafkaBridge` custom resources
- Kafdrop Template
- Red Hat SSO Realm Config
- Guide used for help in setting this all up
  - Thanks @sigreen!
## Backlog for enhancements

PRs welcome!

- Enabling schema registry and using Avro serialization/deserialization
- Prometheus metrics
- Grafana dashboards
- Add authorization on the topics to the different clients
- Getting Kafdrop to authenticate with the broker
  - This will allow removal of the `plain` listener on the broker
- Integrate the `KafkaBridge` with something (3scale?)
- Build some kind of consumer(s) to read the messages & do something with them