About CrawlBot

CrawlBot is a simple shell script that fetches and parses the robots.txt file of a given domain. On success, the results are written to three text files (allows.txt, disallows.txt, all.txt) inside a newly created directory named after the domain.
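The splitting step can be sketched as follows. This is an assumed reconstruction of the logic, not the actual crawlbot.sh: the real tool would fetch the file over the network (e.g. with curl), whereas this sketch uses a sample robots.txt body so it runs offline. The domain name "example.com" and the sample rules are illustrative only.

```shell
#!/bin/sh
# Sketch: split a robots.txt into all.txt, allows.txt, and disallows.txt
# inside a directory named after the domain (assumed behaviour of crawlbot.sh).

domain="example.com"
mkdir -p "$domain"

# The real script would likely run: curl -s "https://$domain/robots.txt"
# A sample body is used here so the sketch is self-contained.
robots='User-agent: *
Allow: /public/
Disallow: /private/
Disallow: /tmp/'

# all.txt keeps the full file; the grep calls pick out each rule type.
printf '%s\n' "$robots" > "$domain/all.txt"
grep -i '^Allow:' "$domain/all.txt" > "$domain/allows.txt"
grep -i '^Disallow:' "$domain/all.txt" > "$domain/disallows.txt"

cat "$domain/disallows.txt"
```

Running the sketch creates example.com/ containing the three files; the final `cat` prints the two Disallow rules.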


Installation

user@example:~$ git clone https://github.com/adhithyanmv/crawlbot.git
user@example:~$ cd crawlbot
user@example:~$ chmod +x crawlbot.sh

Usage

user@example:~$ ./crawlbot.sh domain_name

Example:

user@example:~$ ./crawlbot.sh google.com

Credits

Special thanks to:

Version

Current version is 1.0.