This project is divided into two main parts. The first part creates a MongoDB database with collections for authors and quotes, plus a script for querying this data. The second part uses RabbitMQ to simulate email notifications to contacts stored in the MongoDB database, demonstrating the producer and consumer patterns.
- MongoDB Collections:
  - `authors`: Contains author details.
  - `quotes`: Contains quotes with author references.
- Data Querying:
- Script to search quotes by tag, author name, or a set of tags.
- Support for abbreviated search queries.
- Redis Caching:
- Caching query results for faster retrieval.
- Set up an Atlas MongoDB cloud database.
- Create the `authors` and `quotes` collections.
- Use the Mongoengine ODM for modeling and data operations.
- Run the script and input commands in the format `command:value`.
- Supported commands: `name:<author>`, `tag:<tag>`, `tags:<tag1>,<tag2>`, `exit`.
- Use Redis for caching query results.
- MongoDB Contact Model:
- Model for storing contact details with fields for name, email, and a boolean flag indicating if an email has been sent.
- Producer Script (`producer.py`):
  - Generates fake contacts using Faker and stores them in MongoDB.
  - Sends a message to the RabbitMQ queue with the ObjectId of each contact.
- Consumer Script (`consumer.py`):
  - Receives messages from the RabbitMQ queue.
  - Simulates email sending and updates the contact's sent flag in MongoDB.
- Use RabbitMQ for message queuing.
- Set up queues for managing contact notifications.
- Run `producer.py` to generate contacts and queue messages.
- Run `consumer.py` to process messages and simulate email sending.
- Clone the repository.
- Requires Python with dependencies: Mongoengine, Faker, Redis, RabbitMQ client.
- Docker for running MongoDB and RabbitMQ containers.
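The backing services can be started from the official Docker images; the port mappings below are the defaults, and the container names are arbitrary.

```shell
# MongoDB for the collections, RabbitMQ (with management UI) for the queue,
# and Redis for the caching layer.
docker run -d --name mongodb -p 27017:27017 mongo
docker run -d --name rabbitmq -p 5672:5672 -p 15672:15672 rabbitmq:3-management
docker run -d --name redis -p 6379:6379 redis
```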
- Tested MongoDB models and query functionalities for accuracy.
- Validated the integration and message handling with RabbitMQ.
- Ensured Redis caching improves query response times.