filamentgroup/glyphhanger

Prevent spider from crawling certain directories or files

seezee opened this issue · 0 comments

I've got some PHP files set up to trap scrapers and form spammers. If a spider ignores my robots.txt file, it gets blacklisted. Currently, glyphhanger doesn't respect robots.txt and tries to crawl those directories and files.

An option like `--spider-nocrawl=/wp-content/uploads/forbiddenfolder&&/wp-content/uploads/forbidden-file.php` would be nice!
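To illustrate the idea, here is a minimal sketch of how such an exclusion list could filter URLs before the spider visits them. The flag name, the `shouldCrawl` helper, and the prefix-matching behavior are all assumptions for this proposal, not part of glyphhanger's actual API:

```javascript
// Hypothetical filter for a --spider-nocrawl option: skip any URL whose
// path starts with one of the excluded prefixes. Names and matching
// semantics here are assumptions, not glyphhanger's real implementation.
function shouldCrawl(url, excludePatterns) {
  const { pathname } = new URL(url);
  return !excludePatterns.some((prefix) => pathname.startsWith(prefix));
}

// Example exclusion list, taken from the flag value suggested above.
const exclude = [
  "/wp-content/uploads/forbiddenfolder",
  "/wp-content/uploads/forbidden-file.php",
];

console.log(shouldCrawl("https://example.com/about/", exclude)); // true
console.log(
  shouldCrawl("https://example.com/wp-content/uploads/forbidden-file.php", exclude)
); // false
```

Prefix matching keeps the option simple; glob or regex patterns would be a possible extension if more flexibility is needed.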