
REST API project using a microservices architecture to create a comic book encyclopedia based on the Super Hero API. Built with Java 11, Spring Boot, Node.js, Express.js, MongoDB, Apache Kafka and Docker.


Comics Encyclopedia API


Comics Encyclopedia API is a REST API project that uses a microservices architecture to create a comic book encyclopedia based on the Super Hero API.

Objectives

Well, everyone who knows me personally knows that I love comic books and super heroes, especially DC Comics. I have been a comic book reader and collector since 2013, and I had the idea of building a REST API that applies concepts of distributed systems and microservices, with synchronous and asynchronous tasks such as message processing with Apache Kafka (I already know RabbitMQ, but I chose Kafka to learn a new data streaming and messaging technology).

The project has two APIs. The main API, which is the one clients consume, is built with Java 11 and Spring Boot. The comic book data processing API, which fetches and processes data from the Super Hero API, is built with Node.js, Express.js and ES6 Modules (via Sucrase); it is consumed only by the main API, and the two communicate through Kafka message topics.
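As a rough illustration of that communication, the sketch below shows how the main API could publish a request to the DC Comics request topic and listen for the processed response, assuming Spring for Apache Kafka (spring-kafka) is on the classpath. The class name, consumer group and use of plain JSON strings as payloads are illustrative assumptions, not the exact code in this repository.

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Minimal sketch of topic-based communication from the main API.
// Assumes spring-kafka is configured and payloads are plain JSON strings.
@Service
public class DcComicsMessagingService {

    private static final String REQUEST_TOPIC = "dc_comics_request.topic";
    private static final String RESPONSE_TOPIC = "dc_comics_response.topic";

    private final KafkaTemplate<String, String> kafkaTemplate;

    public DcComicsMessagingService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Asks the Node.js processing API to build the data for a character.
    public void requestCharacterProcessing(String characterJson) {
        kafkaTemplate.send(REQUEST_TOPIC, characterJson);
    }

    // Receives the processed comic book data published by the Node.js API.
    @KafkaListener(topics = RESPONSE_TOPIC, groupId = "comics-encyclopedia-api")
    public void onCharacterProcessed(String processedJson) {
        // Persist the result to MongoDB via Spring Data MongoDB (omitted here).
    }
}

The Marvel Comics and not-informed-publisher topics follow the same request/response pattern.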

Stack

  • Java 11
  • Spring Boot 2
  • Javascript ES6
  • Node.js
  • MongoDB
  • Apache Kafka
  • Spring Data MongoDB
  • Spring Cloud OpenFeign
  • Mongoose
  • Kafdrop

Architecture

  • REST API architecture
  • Microservices architecture

Local infrastructure

  • Docker
  • Docker-compose

Architecture design

This diagram represents how the components work in the system and the execution flow.


Get your Super Hero API access token

First of all, go to the Super Hero API website: https://superheroapi.com

Then, log in with your Facebook account and get your access token. After that, set your access token in the docker-compose.yml file by changing the value of the SUPER_HERO_API_ACCESS_TOKEN variable in the comics-encyclopedia-api container.
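For reference, the relevant part of docker-compose.yml looks roughly like the fragment below; the surrounding keys are illustrative, and only the SUPER_HERO_API_ACCESS_TOKEN value needs to be changed.

# Illustrative docker-compose.yml fragment (other keys of the actual service are omitted).
services:
  comics-encyclopedia-api:
    environment:
      - SUPER_HERO_API_ACCESS_TOKEN=<your-access-token-here>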

Run application

As the project uses a docker-compose file, to run everything you just have to type:

docker-compose up --build

Run it in your terminal, from the same directory where the docker-compose file is located.

If you don't want to see the logs of each container during initialization, just add the -d flag at the end of the command, e.g. docker-compose up --build -d.

Access applications

After running the docker-compose file, there will be 6 Docker containers running, each one accessible at its own address.
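To confirm that all six containers are up, you can list them with Docker Compose; the names shown are whatever is defined in the docker-compose.yml file:

docker-compose ps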

Kafka Topics

We'll have 6 topics:

  • dc_comics_request.topic (for DC Comics data processing)
  • dc_comics_response.topic (for DC Comics data response to Comics Encyclopedia API)
  • marvel_comics_request.topic (for Marvel Comics data processing)
  • marvel_comics_response.topic (for Marvel Comics data response to Comics Encyclopedia API)
  • not_informed_publisher_request.topic (for processing data whose publisher is neither DC Comics nor Marvel Comics)
  • not_informed_publisher_response.topic (for responses with data whose publisher is neither DC Comics nor Marvel Comics)

Creating topics manually

All of the topics are created automatically once any of the microservices starts. But if you want to create the topics manually, use these commands:

DC Comics Request:

docker-compose exec kafka kafka-topics --create --topic dc_comics_request.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181

DC Comics Response:

docker-compose exec kafka kafka-topics --create --topic dc_comics_response.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181

Marvel Comics Request:

docker-compose exec kafka kafka-topics --create --topic marvel_comics_request.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181

Marvel Comics Response:

docker-compose exec kafka kafka-topics --create --topic marvel_comics_response.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181

Not Informed Publisher Request:

docker-compose exec kafka kafka-topics --create --topic not_informed_publisher_request.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181

Not Informed Publisher Response:

docker-compose exec kafka kafka-topics --create --topic not_informed_publisher_response.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
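To verify which topics exist, you can also list them from the CLI, using the same ZooKeeper address as the creation commands above:

docker-compose exec kafka kafka-topics --list --zookeeper zookeeper:2181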

If you don't want to use the CLI, you can also create or delete topics through Kafdrop.

Monitor your Kafka with Kafdrop

By accessing http://localhost:19000, you'll be able to monitor your Apache Kafka with Kafdrop.


There you can monitor your topics, your Kafka broker details, and the messages in each topic.

Author

  • Victor Hugo Negrisoli
  • Back-end Software Developer
  • LinkedIn