spring-cloud-stream-event-sourcing-testcontainers

The goal of this project is to create a Spring Boot application that handles users using Event Sourcing. So, besides the traditional create/update/delete operations, whenever a user is created, updated, or deleted, an event describing the change is sent to Kafka. Furthermore, we will implement another Spring Boot application that listens to those events and saves them in Cassandra. Finally, we will use Testcontainers for end-to-end testing.

Note: The kubernetes-minikube-environment repository shows how to deploy this project in Kubernetes (Minikube)

Proof-of-Concepts & Articles

On ivangfr.github.io, I have compiled my Proof-of-Concepts (PoCs) and articles. You can easily search for the technology you are interested in by using the filter. Who knows, perhaps I have already implemented a PoC or written an article about what you are looking for.

Additional Readings

Project Architecture

project-diagram

Applications

  • user-service

    Spring Boot Web Java application responsible for handling users. The user information is stored in MySQL. Once a user is created, updated, or deleted, an event is sent to Kafka (a hypothetical publishing sketch is shown right after this list).

    user-service

    • Serialization format

      user-service can use JSON or Avro to serialize data to the binary format used by Kafka. If we choose Avro, both services will benefit from the Schema Registry that runs as a Docker container. The serialization format is defined by the value of the environment variable SPRING_PROFILES_ACTIVE.

      Configuration                   Format
      SPRING_PROFILES_ACTIVE=default  JSON
      SPRING_PROFILES_ACTIVE=avro     Avro
  • event-service

    Spring Boot Web Java application responsible for listening to events from Kafka and saving them in Cassandra.

    event-service

    • Deserialization

      Unlike user-service, event-service has no specific Spring profile to select the deserialization format. Spring Cloud Stream provides a stack of MessageConverters that handle the conversion of many content types, including application/json. Besides, as event-service has a SchemaRegistryClient bean registered, Spring Cloud Stream auto-configures an Apache Avro message converter for schema management.

      In order to handle different content types, Spring Cloud Stream has a "content-type negotiation and transformation" strategy (more here). The precedence order is: first, the content-type present in the message header; second, the content-type defined in the binding; and finally, application/json (the default).

      The producer (in this case, user-service) always sets the content-type in the message header. The content-type can be application/json or application/*+avro, depending on the SPRING_PROFILES_ACTIVE with which user-service is started (a consumer-side sketch is shown right after this list).

    • Java classes from Avro Schema

      Run the following command in the spring-cloud-stream-event-sourcing-testcontainers root folder. It will regenerate the Java classes from the Avro schema present in event-service/src/main/resources/avro.

      ./mvnw compile --projects event-service
      
  • end-to-end-test

    Spring Boot Web Java application used to perform end-to-end tests on user-service and event-service. It uses Testcontainers, which automatically starts Zookeeper, Kafka, MySQL, Cassandra, user-service and event-service Docker containers before the tests begin and shuts them down when the tests finish.
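
The snippet below is a minimal, hypothetical sketch of the publishing pattern described for user-service: after a create/update/delete, an event is pushed to Kafka through Spring Cloud Stream's StreamBridge. Class, record, header and binding names (UserEventProducer, UserEventMessage, userEvent-out-0, partitionKey) are illustrative assumptions, not the project's actual code.

    // Hypothetical sketch, not the project's actual code: publishes a user event
    // through Spring Cloud Stream after a create/update/delete operation.
    import java.time.Instant;
    import java.util.UUID;

    import org.springframework.cloud.stream.function.StreamBridge;
    import org.springframework.messaging.support.MessageBuilder;
    import org.springframework.stereotype.Component;

    @Component
    public class UserEventProducer {

        // Illustrative payload; with SPRING_PROFILES_ACTIVE=avro the payload would be
        // an Avro-generated class and its schema would be registered in Schema Registry.
        public record UserEventMessage(String eventId, long eventTimestamp,
                                       String eventType, Long userId, String userJson) {
        }

        private final StreamBridge streamBridge;

        public UserEventProducer(StreamBridge streamBridge) {
            this.streamBridge = streamBridge;
        }

        public void publish(String eventType, Long userId, String userJson) {
            UserEventMessage event = new UserEventMessage(
                    UUID.randomUUID().toString(), Instant.now().toEpochMilli(),
                    eventType, userId, userJson);

            // Keying by userId (assuming a partition-key-expression on the binding)
            // keeps all events of the same user in the same Kafka partition.
            streamBridge.send("userEvent-out-0",
                    MessageBuilder.withPayload(event)
                            .setHeader("partitionKey", userId)
                            .build());
        }
    }

Whether the payload is serialized as JSON or Avro is decided by the active Spring profile (see the table above), not by this code.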
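
On the consuming side, the content-type negotiation described for event-service means a single listener can handle both JSON and Avro payloads. The sketch below is a rough illustration only, assuming a functional binding named userEventConsumer-in-0 and an illustrative UserEventStore abstraction standing in for the project's Spring Data Cassandra repository.

    // Hypothetical sketch, not the project's actual code: a functional listener that
    // receives user events from Kafka and hands them to a Cassandra-backed store.
    import java.util.function.Consumer;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.messaging.Message;

    @Configuration
    public class UserEventConsumerConfig {

        // Illustrative payload type; event-service defines its own (JSON or Avro-generated).
        public record UserEventMessage(String eventId, long eventTimestamp,
                                       String eventType, Long userId, String userJson) {
        }

        // Illustrative stand-in for a Spring Data Cassandra repository.
        public interface UserEventStore {
            void save(UserEventMessage event);
        }

        // Spring Cloud Stream binds this bean to the destination configured for
        // userEventConsumer-in-0. The registered MessageConverters (including the Avro
        // converter auto-configured because of the SchemaRegistryClient bean) pick the
        // deserializer from the contentType header set by user-service, so the same
        // listener works for application/json and application/*+avro.
        @Bean
        Consumer<Message<UserEventMessage>> userEventConsumer(UserEventStore store) {
            return message -> store.save(message.getPayload());
        }
    }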

Prerequisites

Start Environment

  • In a terminal and inside the spring-cloud-stream-event-sourcing-testcontainers root folder, run:

    docker compose up -d
    
  • Wait for the Docker containers to be up and running. To check their status, run:

    docker compose ps
    

Running Applications with Maven

  • user-service

    • In a terminal, make sure you are inside spring-cloud-stream-event-sourcing-testcontainers root folder;

    • In order to run the application, you can pick between JSON and Avro:

      • Using JSON
        ./mvnw clean spring-boot:run --projects user-service
        
      • Using Avro
        ./mvnw clean spring-boot:run --projects user-service -Dspring-boot.run.profiles=avro
        
  • event-service

    • In a new terminal, make sure you are inside spring-cloud-stream-event-sourcing-testcontainers root folder;

    • Run the following command:

      ./mvnw clean spring-boot:run --projects event-service
      

Running Applications as Docker containers

  • Build Docker Images

    • In a terminal, make sure you are inside spring-cloud-stream-event-sourcing-testcontainers root folder;

    • Run the following script to build the Docker images:

      • JVM
        ./docker-build.sh
        
      • Native (not implemented yet)
        ./docker-build.sh native
        
  • Environment Variables

    • user-service

      Environment Variable   Description
      MYSQL_HOST             Specify host of the MySQL database to use (default localhost)
      MYSQL_PORT             Specify port of the MySQL database to use (default 3306)
      KAFKA_HOST             Specify host of the Kafka message broker to use (default localhost)
      KAFKA_PORT             Specify port of the Kafka message broker to use (default 29092)
      SCHEMA_REGISTRY_HOST   Specify host of the Schema Registry to use (default localhost)
      SCHEMA_REGISTRY_PORT   Specify port of the Schema Registry to use (default 8081)
      ZIPKIN_HOST            Specify host of the Zipkin distributed tracing system to use (default localhost)
      ZIPKIN_PORT            Specify port of the Zipkin distributed tracing system to use (default 9411)
    • event-service

      Environment Variable   Description
      CASSANDRA_HOST         Specify host of the Cassandra database to use (default localhost)
      CASSANDRA_PORT         Specify port of the Cassandra database to use (default 9042)
      KAFKA_HOST             Specify host of the Kafka message broker to use (default localhost)
      KAFKA_PORT             Specify port of the Kafka message broker to use (default 29092)
      SCHEMA_REGISTRY_HOST   Specify host of the Schema Registry to use (default localhost)
      SCHEMA_REGISTRY_PORT   Specify port of the Schema Registry to use (default 8081)
      ZIPKIN_HOST            Specify host of the Zipkin distributed tracing system to use (default localhost)
      ZIPKIN_PORT            Specify port of the Zipkin distributed tracing system to use (default 9411)
  • Run Docker Containers

    • In a terminal, make sure you are inside spring-cloud-stream-event-sourcing-testcontainers root folder;

    • In order to run the applications' Docker containers, you can pick between JSON and Avro:

      • Using JSON
        ./start-apps.sh
        
      • Using Avro
        ./start-apps.sh avro
        

Applications URLs

Application     URL
user-service    http://localhost:9080/swagger-ui.html
event-service   http://localhost:9081/swagger-ui.html

Playing around

  1. Create a user:

    curl -i -X POST localhost:9080/api/users \
      -H  "Content-Type: application/json" \
      -d '{"email":"ivan.franchin@test.com","fullName":"Ivan Franchin","active":true}'
    
  2. Check whether the event related to the user creation was received by event-service:

    curl -i "localhost:9081/api/events?userId=1"
    
  3. You can check the traces in Zipkin at http://localhost:9411.

  4. Access user-service and create new users and/or update/delete existing ones. Then, access the event-service Swagger UI to validate whether the corresponding events were received correctly.

Useful Commands & Links

  • MySQL

    docker exec -it -e MYSQL_PWD=secret mysql mysql -uroot --database userdb
    SELECT * FROM users;
    

    Type exit to leave the MySQL Monitor

  • Cassandra

    docker exec -it cassandra cqlsh
    USE ivanfranchin;
    SELECT * FROM user_events;
    

    Type exit to leave the CQL shell

  • Zipkin

    Zipkin can be accessed at http://localhost:9411

    zipkin

  • Kafka Topics UI

    Kafka Topics UI can be accessed at http://localhost:8085

    kafka-topics-ui

  • Schema Registry UI

    Schema Registry UI can be accessed at http://localhost:8001

    schema-registry-ui

  • Kafka Manager

    Kafka Manager can be accessed at http://localhost:9000

    Configuration

    • First, you must create a new cluster. Click on Cluster (dropdown button in the header) and then on Add Cluster
    • Type the name of your cluster in the Cluster Name field, for example: MyCluster
    • Type zookeeper:2181 in the Cluster Zookeeper Hosts field
    • Enable the checkbox Poll consumer information (Not recommended for large # of consumers if ZK is used for offsets tracking on older Kafka versions)
    • Click on the Save button at the bottom of the page.

    The image below shows the topics present in Kafka, including the topic com.ivanfranchin.userservice.user with 3 partitions.

    kafka-manager

Shutdown

  • Stop applications

    • If they were started with Maven, go to the terminals where they are running and press Ctrl+C;
    • If they were started as Docker containers, run the script below:
      ./stop-apps.sh
      
  • To stop and remove docker compose containers, networks and volumes, make sure you are inside the spring-cloud-stream-event-sourcing-testcontainers root folder and run:

    docker compose down -v
    

Running tests

  • event-service

    • Run the command below to start the Unit Tests

      Note: Testcontainers will automatically start a Cassandra Docker container before some tests begin and shut it down when the tests finish.

      ./mvnw clean test --projects event-service
      
  • user-service

    • Run the command below to start the Unit Tests

      ./mvnw clean test --projects user-service
      
    • Run the command below to start the End-To-End Tests

      Warning: Make sure you have built the user-service and event-service Docker images.

      Note: Testcontainers will automatically start Zookeeper, Kafka, MySQL, Cassandra, user-service and event-service Docker containers before the tests begin and shut them down when the tests finish (a rough sketch of this setup follows the commands below).

      • Using JSON
        ./mvnw clean test --projects end-to-end-test -DargLine="-Dspring.profiles.active=test"
        
      • Using Avro
        ./mvnw clean test --projects end-to-end-test -DargLine="-Dspring.profiles.active=test,avro"
        

Cleanup

To remove the Docker images created by this project, go to a terminal and, inside the spring-cloud-stream-event-sourcing-testcontainers root folder, run the following script:

./remove-docker-images.sh