Comics Encyclopedia API is a REST API project using a microservices architecture to create a comic book encyclopedia based on the Super Hero API.
Well, everyone who knows me personally knows that I love comic books and superheroes, especially DC Comics. I've been a comic book reader and collector since 2013, and now I had the idea of building a REST API using concepts of distributed systems and microservices, with synchronous and asynchronous tasks such as message processing with Apache Kafka (I already know RabbitMQ, but I chose Kafka to learn a new data streaming and messaging technology).
For the two APIs, I'll use Java 11 with Spring Boot for the main API, which is the API that clients will consume, and Node.js with Express.js and ES6 Modules (with Sucrase) for the API that processes the comic book data from the Super Hero API. The processing API is consumed only by the main API, and the two communicate through Kafka message topics.
- Java 11
- Spring Boot 2
- JavaScript ES6
- Node.js
- MongoDB
- Apache Kafka
- Spring Data MongoDB
- Spring Cloud OpenFeign
- Mongoose
- Kafdrop
- REST API architecture
- Microservices architecture
- Docker
- Docker Compose
This diagram represents how the components work together in the system and the execution flow.
First of all, go to the Super Hero API website: https://superheroapi.com
Then, log in with your Facebook account and get your access token. After that, set your access token in the docker-compose.yml file by changing the value of the SUPER_HERO_API_ACCESS_TOKEN variable in the comics-encyclopedia-api container.
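For reference, the relevant excerpt of the docker-compose.yml file would look roughly like the sketch below; only the service name and the SUPER_HERO_API_ACCESS_TOKEN variable come from this project, the remaining settings are placeholders:

```yaml
# Sketch of the relevant docker-compose.yml excerpt (other settings are illustrative).
services:
  comics-encyclopedia-api:
    # ... build/image and remaining settings as defined in this repository ...
    environment:
      # Replace the placeholder with the token obtained from https://superheroapi.com
      SUPER_HERO_API_ACCESS_TOKEN: "your-access-token-here"
```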
As we use a docker-compose file, to run everything you just have to type:
docker-compose up --build
in your terminal, in the same directory as the docker-compose file.
If you don't want to see the logs of each container during initialization, just add the -d
flag at the end of the command, e.g.: docker-compose up --build -d
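As a quick reference, these are the usual Docker Compose commands for this setup (the service name comes from this repository; the rest is standard Docker Compose usage):

```sh
# Build and start everything in the background
docker-compose up --build -d

# Follow the logs of a single service, e.g. the main API
docker-compose logs -f comics-encyclopedia-api

# Stop and remove all containers
docker-compose down
```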
After running the docker-compose file, there will be 6 Docker containers, which can be accessed at:
- zookeeper -> localhost:2181
- kafka -> localhost:9092
- mongodb -> localhost:27017
- kafdrop -> http://localhost:19000
- comics-encyclopedia-api -> http://localhost:8080
- comics-processor-api -> http://localhost:8081
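To check that everything came up, a quick sanity check could look like the commands below (standard Docker Compose and curl usage, nothing project-specific):

```sh
# List the containers started by this compose file and their state
docker-compose ps

# Kafdrop serves a web UI, so an HTTP request to it should return a page
curl -I http://localhost:19000
```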
We'll have 6 topics:
- dc_comics_request.topic (for DC Comics data processing)
- dc_comics_response.topic (for DC Comics data response to Comics Encyclopedia API)
- marvel_comics_request.topic (for Marvel Comics data processing)
- marvel_comics_response.topic (for Marvel Comics data response to Comics Encyclopedia API)
- not_informed_publisher_request.topic (for processing data whose publisher is neither DC Comics nor Marvel)
- not_informed_publisher_response.topic (for responses about data whose publisher is neither DC Comics nor Marvel)
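As an illustration of the request/response flow over these topics, a Spring Kafka producer/listener pair in the main API could look roughly like the sketch below (class, method, and group names are hypothetical; only the topic names come from this project):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical sketch: the main API asks the processor API for DC Comics data
// by publishing to the request topic and listening on the response topic.
@Service
public class DcComicsMessagingService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public DcComicsMessagingService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publish a character name to be processed by comics-processor-api
    public void requestCharacterData(String characterName) {
        kafkaTemplate.send("dc_comics_request.topic", characterName);
    }

    // Consume the processed result sent back by comics-processor-api
    @KafkaListener(topics = "dc_comics_response.topic", groupId = "comics-encyclopedia-api")
    public void onCharacterDataProcessed(String message) {
        // e.g. deserialize the payload and persist it with Spring Data MongoDB
        System.out.println("Received processed DC Comics data: " + message);
    }
}
```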
All of the topics are automatically created when any of the microservices starts. But if you want to create the topics manually, use these commands:
DC Comics Request:
docker-compose exec kafka kafka-topics --create --topic dc_comics_request.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
DC Comics Response:
docker-compose exec kafka kafka-topics --create --topic dc_comics_response.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
Marvel Comics Request:
docker-compose exec kafka kafka-topics --create --topic marvel_comics_request.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
Marvel Comics Response:
docker-compose exec kafka kafka-topics --create --topic marvel_comics_response.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
Not Informed Publisher Request:
docker-compose exec kafka kafka-topics --create --topic not_informed_publisher_request.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
Not Informed Publisher Response:
docker-compose exec kafka kafka-topics --create --topic not_informed_publisher_response.topic --partitions 1 --replication-factor 1 --if-not-exists --zookeeper zookeeper:2181
And if you don't want to use the CLI, you can also create or delete topics through Kafdrop.
By accessing http://localhost:19000, you'll be able to monitor your Apache Kafka with Kafdrop.
It'll be possible to monitor your topics, your Kafka specs, and the messages in each topic.
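If you prefer the command line over Kafdrop, the same Kafka CLI used above can list topics and read messages (standard Kafka tooling, consistent with the commands in this README):

```sh
# List all topics
docker-compose exec kafka kafka-topics --list --zookeeper zookeeper:2181

# Read the messages published to a topic, e.g. the DC Comics response topic
docker-compose exec kafka kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic dc_comics_response.topic --from-beginning
```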
- Victor Hugo Negrisoli
- Back-end Software Developer