Presentation: Presentation-Batches_to_Streams_with_Apache_Kafka.pdf
I plan to do a YouTube video with what was presented at Orlando Code Camp, and when I get that done I'll drop the link here.
NOTE: I apologize that the steps below seem a little disjointed, but the reality is that a real production application involves all of these various pieces - that is the nature of the beast. Honestly, stepping through all of them will give you an appreciation for the kinds of interactions that exist in a real environment, rather than just pushing a button and having everything magically happen.
-
Download the SQL Server JDBC Driver https://www.microsoft.com/en-us/download/details.aspx?id=57782
-
Extract mssql-jdbc-7.2.1.jre8.jar into the ./kafka-connect/jars folder NOTE: When unzipping the JDBC download, my jar was located in sqljdbc_7.2.1.0_enu.tar\sqljdbc_7.2.1.0_enu\sqljdbc_7.2\enu
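If you prefer the command line to a GUI archive tool, extraction looks roughly like this (a sketch - depending on platform the download may be a zip wrapping a tar, and the names will vary with the driver version you grab):

unzip sqljdbc_7.2.1.0_enu.zip
tar -xf sqljdbc_7.2.1.0_enu.tar
cp sqljdbc_7.2.1.0_enu/sqljdbc_7.2/enu/mssql-jdbc-7.2.1.jre8.jar ./kafka-connect/jars/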
-
Start the stack - look in the docker-compose.yml file to see a list of the services that will get started
docker-compose up -d
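To confirm everything came up, list the running services:

docker-compose ps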
-
See if SQL Server is alive
docker-compose logs --tail 100 sql-server
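If the logs look healthy, you can also sanity-check connectivity with sqlcmd from inside the container (a sketch - adjust the service name, tools path, and SA password to whatever your docker-compose.yml actually declares):

docker-compose exec sql-server /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'YourSaPassword!' -Q "SELECT @@VERSION"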
-
Run the Create DB script db-01.sql - I used SSMS
-
Run the select script db-02.sql to look at the rewards / transactions tables - do this in another tab in SSMS
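If you would rather stay in a terminal than SSMS, both scripts can also be run with a locally installed sqlcmd (a sketch - the password is whatever docker-compose.yml sets, and dbo.UserRewards is an illustrative table name; use whatever db-01.sql actually creates):

sqlcmd -S localhost -U sa -P 'YourSaPassword!' -i db-01.sql
sqlcmd -S localhost -U sa -P 'YourSaPassword!' -i db-02.sql
sqlcmd -S localhost -U sa -P 'YourSaPassword!' -Q "SELECT TOP 10 * FROM dbo.UserRewards"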
-
Start up Postman - Get it here https://www.getpostman.com/downloads/
-
Import the Postman collection OrlandoCC.postman_collection.json
-
Run the User Rewards Source Connector in Postman
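Under the hood that Postman request is just a POST to the Kafka Connect REST API on port 8083. A curl equivalent would look something like the sketch below - the connector name, connection string, table, and topic prefix here are assumptions, so check the Postman collection for the real values:

curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "user-rewards-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:sqlserver://sql-server:1433;databaseName=Rewards",
    "connection.user": "sa",
    "connection.password": "YourSaPassword!",
    "mode": "incrementing",
    "incrementing.column.name": "Id",
    "table.whitelist": "UserRewards",
    "topic.prefix": "rewards-"
  }
}'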
-
Launch Confluent Control Center at http://localhost:9021/
-
Look at the Topics tab
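If you prefer the CLI to the UI, you can also list topics from inside the broker container (assuming the broker service is named kafka and listens on 9092; on older images use --zookeeper zookeeper:2181 instead of --bootstrap-server):

docker-compose exec kafka kafka-topics --bootstrap-server localhost:9092 --list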
-
After we see some data in the rewards topic, run the User Rewards Transactions Source Connector in Postman
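Either way, the Connect REST API will confirm what is deployed and whether it is running (substitute whatever connector names the collection actually uses):

curl http://localhost:8083/connectors
curl http://localhost:8083/connectors/user-rewards-source/status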
-
Streams Time - Open up ksql-streams.txt and run the blocks of statements one at a time
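The statements in ksql-streams.txt are meant to be pasted into a KSQL CLI session. One way to get a prompt, assuming the compose file names the services ksql-cli and ksql-server (adjust to match yours):

docker-compose exec ksql-cli ksql http://ksql-server:8088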
-
Run the Sink Connectors in Postman - curl sketches for both follow the list below
- User Rewards Real Time DB
- User Rewards Real Time Elastic
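As with the sources, these are just POSTs to the Connect REST API. Rough shapes of the two sink configs are sketched below - every name, topic, and connection detail here is an assumption, so treat the Postman collection as the source of truth:

curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "user-rewards-realtime-db",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:sqlserver://sql-server:1433;databaseName=Rewards",
    "connection.user": "sa",
    "connection.password": "YourSaPassword!",
    "topics": "USER_REWARDS_REAL_TIME",
    "auto.create": "true"
  }
}'

curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "user-rewards-realtime-elastic",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "connection.url": "http://elasticsearch:9200",
    "topics": "USER_REWARDS_REAL_TIME",
    "type.name": "_doc",
    "key.ignore": "true"
  }
}'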