robots.txt
ljm42 commented
To keep Unraid servers from being indexed by search engines, please consider adding a robots.txt in /usr/local/emhttp that contains:
User-agent: *
Disallow: /
and then adding this to rc.nginx to serve the file without requiring authentication:
#
# robots.txt available without authentication
#
location = /robots.txt {
    auth_basic off;
    allow all;
}
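For anyone who wants to try this out, here is a rough sketch of creating the file and checking that it is served without a password prompt. The restart command and server address are assumptions for illustration; adjust them for your own setup.

# Create the robots.txt (run as root on the Unraid server)
cat > /usr/local/emhttp/robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF

# Restart nginx so the new location block takes effect
# (exact path/invocation of rc.nginx may differ on your system)
/etc/rc.d/rc.nginx restart

# Verify: should return HTTP 200 with the file contents and no
# authentication challenge. Replace <server-ip> with your server's address.
curl -i http://<server-ip>/robots.txt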
ljm42 commented
Great! This should keep the well-behaved robots away without giving away any details to the bad bots.