nuxt-modules/robots

Merge with Nuxt Simple Robots

harlan-zw opened this issue · 7 comments

Migration Notes

@nuxtjs/robots v3 breaking changes

  • The configPath config is no longer supported. For custom runtime config you should use Nitro Hooks.
  • The rules config is deprecated but will continue to work. Any BlankLine or Comment rules will no longer work.
  • Using CleanParam, CrawlDelay and Disavow requires targeting the Yandex user agent.
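The notes above point to Nitro hooks for custom runtime config. A minimal sketch of what that could look like, assuming the module exposes a `robots:config` Nitro hook (the hook name and payload shape are assumptions here; check the module docs before relying on them):

```ts
// server/plugins/robots.ts — hypothetical sketch, not verified against the v4 API.
// defineNitroPlugin is auto-imported in Nitro server plugins.
export default defineNitroPlugin((nitroApp) => {
  // Assumed hook name: 'robots:config'; assumed payload: { groups: [...] }
  nitroApp.hooks.hook('robots:config', (config) => {
    // Append a runtime-computed rule, e.g. blocking an internal path
    config.groups.push({
      userAgent: ['*'],
      disallow: ['/internal/'],
    })
  })
})
```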

This issue is to discuss the possibility of merging with the nuxt-simple-robots module. This is not meant to imply intent, I'll only go ahead with merging if the current maintainers and team give approval for me to do so.

Why Merge

I built the nuxt-simple-robots package because I felt the feature set of this module didn't meet my needs, specifically around other modules integrating with it (e.g. the sitemap module).


The nuxt-simple-robots module offers many important features for end users and module integrations. Some of the ones I think are most useful:

  • Non-production environments hidden from robots by default
  • Validation of robots.txt output
  • Route rules support
  • Deep Nitro integrations
  • Dedicated docs
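To illustrate the route rules support listed above, a hypothetical nuxt.config sketch (the `robots: false` route rule key is my reading of the nuxt-simple-robots docs; treat it as an assumption and verify against the module's documentation):

```ts
// nuxt.config.ts — sketch of per-route robots control via route rules
export default defineNuxtConfig({
  modules: ['nuxt-simple-robots'],
  routeRules: {
    // Mark an admin section as non-indexable; the module would then
    // reflect this in robots.txt and/or the X-Robots-Tag header
    '/admin/**': { robots: false },
  },
})
```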

For a comparison in usage:

  • nuxt-simple-robots - 57k downloads / mo
  • @nuxtjs/robots - ~74k downloads / mo (v3 tag)

Having multiple modules in the ecosystem tackling the same issue is good in my opinion: it gives the end user choice and encourages innovation. However, given that this robots module is "official" and one of the most downloaded for Nuxt, I feel we should offer a single choice that delivers delightful DX out of the box.

What Does Merging Look Like

  • A v4 tag would be released that contains all of the code from nuxt-simple-robots. The nuxt-simple-robots repo would be updated to redirect users to nuxt-modules/robots.
  • The nuxt-simple-robots npm package would be deprecated in favour of @nuxtjs/robots.
  • A migration guide would be provided for users moving from v3 -> v4.
  • Existing maintainers would be kept in package.json.

Who Will Maintain v4 Onwards?

I'd like to take over as the primary maintainer, as I have the most context on how the code works, but it would be amazing to have help in continuing to maintain the module.

Hey @ricardogobbosouza, you seem to be the sole maintainer at the moment. Would you review this when you have a chance? It would be great to have your input!

cc: @atinux

Hey @harlan-zw. I tried to migrate to nuxt-simple-robots today but with no success.

As far as I can tell, nuxt-simple-robots only supports the Allow and Disallow rules of robots.txt. @nuxtjs/robots, on the other hand, supports many more rules (e.g. CleanParam, CrawlDelay, Host). And as I need to use CleanParam, I can't migrate.

I think it is a great idea to have a single official module, but it should not be merged as is; first it needs to support all features of @nuxtjs/robots.

All Google directives are supported.

I'm open to adding others if you want to make an issue for what's missing, but I wouldn't encourage end users to use them.

  • CleanParam is now supported when targeting Yandex user agent ✔️
  • CrawlDelay does not seem to be supported by Google or Yandex; happy to support it if there's a documented agent using it ❓
  • Host: which user agent supports this? ❓
  • Disavow: I looked at the docs linked in the PR and they don't mention robots.txt supporting it https://support.google.com/webmasters/answer/2648487?hl=en
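For reference on the first point above, Clean-param is a Yandex-specific directive, which is why it only makes sense inside a group targeting the Yandex user agent. A minimal robots.txt sketch (parameter names are illustrative):

```
User-agent: Yandex
# Strip tracking parameters site-wide so Yandex deduplicates URLs
Clean-param: utm_source&utm_medium /
```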

I want to actively avoid adding rules in robots.txt that don't do anything, this is part of the validation feature.

I am happy with this change in order to bring consistency to the Nuxt community 👍

I'll be going ahead with this and plan to release a v4 RC of @nuxtjs/robots that is nuxt-simple-robots ~tomorrow, along with a migration guide.

The migration is now complete as of v4.