Gogo Crawler is a web crawling tool that stores the data it collects in a MongoDB database.
Before you begin, make sure you have [Bun](https://bun.sh) installed and a MongoDB instance you can connect to. Then set up the project:
- Clone the repository:

  ```bash
  git clone https://github.com/DeveloperJosh/gogo-crawler.git
  cd gogo-crawler
  ```
- Install the dependencies:

  ```bash
  bun install
  ```
- Create a `.env` file in the root of the project and add your MongoDB connection details (a minimal connection sketch follows this list):

  ```env
  MONGO_URL=your_mongo_url
  ```
- Ensure your MongoDB instance is running and accessible.
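`MONGO_URL` should be a standard MongoDB connection string (for example, `mongodb://localhost:27017/gogo` for a local instance). As a minimal sketch of how a Bun script could read it and verify the connection, assuming the official `mongodb` driver (the project's actual database code may differ):

```typescript
// Hypothetical sketch, not Gogo Crawler's actual code: read MONGO_URL and test the connection.
// Bun loads .env automatically, so the variable is already available on process.env.
import { MongoClient } from "mongodb";

const uri = process.env.MONGO_URL;
if (!uri) {
  throw new Error("MONGO_URL is not set; check your .env file");
}

const client = new MongoClient(uri);
await client.connect();                 // throws if the MongoDB instance is unreachable
console.log("Connected to MongoDB:", client.db().databaseName);
await client.close();
```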
To start the Gogo Crawler, run the following command:

```bash
bun crawl
```
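`bun crawl` runs the `crawl` script defined in the project's `package.json` (Bun falls back to package scripts for unrecognized subcommands). The entry-point path below is only a guess to show the shape of such a script:

```jsonc
// package.json excerpt (hypothetical; the real entry point may differ)
{
  "scripts": {
    "crawl": "bun run src/index.ts"
  }
}
```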
And that's it! The crawler will start crawling the web and storing the data in your MongoDB database.
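Conceptually, the crawl loop follows the usual fetch, extract links, store pattern. The sketch below is illustrative only and is not taken from this repository; the collection name, seed URL, and link extraction are all assumptions:

```typescript
// Illustrative sketch of a generic crawl-and-store loop; not Gogo Crawler's actual implementation.
import { MongoClient } from "mongodb";

const client = new MongoClient(process.env.MONGO_URL!);
await client.connect();
const pages = client.db().collection("pages");       // collection name is an assumption

const queue = ["https://example.com/"];               // placeholder seed URL
const seen = new Set<string>();

while (queue.length > 0 && seen.size < 50) {          // small cap so the sketch terminates
  const url = queue.shift()!;
  if (seen.has(url)) continue;
  seen.add(url);

  const res = await fetch(url);
  const html = await res.text();

  // Store the raw page; a real crawler would parse out only the fields it needs.
  await pages.insertOne({ url, html, fetchedAt: new Date() });

  // Naive link extraction with a regex, good enough for a sketch.
  for (const [, href] of html.matchAll(/href="(https?:\/\/[^"]+)"/g)) {
    queue.push(href);
  }
}

await client.close();
```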
This project was inspired by the anime-crawler project by riimuru.