The Nginx log indicates a 404 error even though the route exists
foremtehan opened this issue · 2 comments
I am using this exact nginx config file in a Laravel application. However, today I noticed that some bots are getting a 404 error on /robots.txt, whereas when I visit the route myself, everything works fine. What could be causing this?
This is a 404 log line from AhrefsBot:
172.71.131.89 - - [09/Jun/2023:04:25:08 +0000] "GET /robots.txt HTTP/1.1" 404 12175 "-" "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)" "54.36.148.162" 0.025 0.025 . -
Visiting it myself in a browser, I can see the contents of the file:
172.18.0.1 - - [09/Jun/2023:06:32:38 +0000] "GET /robots.txt HTTP/1.0" 200 24 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/113.0" "my.ip.here" 0.000 - . -
What could the issue be?
Hi @foremtehan,
I don't see how this could be caused by the Nginx configuration.
What I notice in your log lines is that the request from Ahrefs has an upstream response time (0.025) and the request from you doesn't. That could mean the Ahrefs request is being forwarded to PHP-FPM while yours is served directly by nginx.
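For context, this fallthrough behavior comes from the typical Laravel nginx setup: `try_files` serves a static file if it exists on disk, and otherwise hands the request to the front controller, where the framework returns its own 404 page. A minimal sketch (paths and the PHP-FPM socket name are assumptions; adjust to your setup):

```nginx
server {
    # Assumed Laravel document root; adjust to your deployment.
    root /var/www/app/public;
    index index.php;

    location / {
        # If /robots.txt exists in the public directory, nginx serves it
        # directly (no upstream time in the log). If the lookup fails,
        # the request falls through to index.php and Laravel decides the
        # response -- which can be a framework 404.
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        # Assumed socket path; yours may differ.
        fastcgi_pass unix:/run/php/php-fpm.sock;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```

That would explain the difference between the two log lines: the bot's request reached PHP-FPM (hence the 0.025 upstream time and the large 12175-byte error page), while your request was answered from disk (24 bytes, no upstream time).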
Is robots.txt an actual file on disk, or is it handled by the Laravel application?
It's an actual file under /public/robots.txt, but never mind; I haven't seen any 404 errors from that moment until now. Thanks for the response <3