Way to disable robots and sitemap on non-production envs
maykelesser opened this issue · 3 comments
Is your feature request related to a problem? Please describe.
No, it's just an enhancement.
Describe the solution you'd like
I'd like an option to disable robots.txt and sitemap generation on dev environments, for example. That way I can avoid being crawled on my dev/sandbox environments.
Describe alternatives you've considered
I'm not able to think of a solution, to be honest. Any ideas would be appreciated for sure.
I'm also looking for this. I'd like to generate a different robots.txt according to an env variable.
A partial solution for your question: you could disable robots.txt generation for non-production environments as shown below. However, there is a caveat: once robots.txt has been generated for production, it stays there. This approach won't remove or alter the robots.txt (which was made for production) when you deploy to a non-production environment, so be careful.
// next-sitemap.config.js
module.exports = {
  // generate robots.txt only when building for production
  generateRobotsTxt: process.env.NEXT_DEPLOY_ENV === 'production',
}

// package.json
"scripts": {
  "build:staging": "NEXT_DEPLOY_ENV=staging next build",
  "build:production": "NEXT_DEPLOY_ENV=production next build",
  ...
}
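To work around the caveat above, one option is to always generate robots.txt but switch its policy per environment, so a stale production file never survives a non-production deploy. Below is a sketch using next-sitemap's `robotsTxtOptions.policies`; the `NEXT_DEPLOY_ENV` variable follows the snippet above, and the `siteUrl` value is a placeholder.

```javascript
// next-sitemap.config.js — sketch, not a verified setup.
// Always generate robots.txt, but emit "disallow all" on
// non-production environments so crawlers are blocked there.
const isProd = process.env.NEXT_DEPLOY_ENV === 'production'

module.exports = {
  siteUrl: process.env.SITE_URL || 'https://example.com', // placeholder
  generateRobotsTxt: true,
  robotsTxtOptions: {
    policies: isProd
      ? [{ userAgent: '*', allow: '/' }]   // production: allow crawling
      : [{ userAgent: '*', disallow: '/' }], // staging/dev: block crawling
  },
}
```

Because the file is regenerated on every build, a staging deploy overwrites any production robots.txt left from a previous build.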
Closing this issue due to inactivity.
Why did this get closed?