This plugin enables 'enterprise-class' Google Sitemaps to be easily generated for a Rails site as a rake task, using a simple 'Rails Routes'-like DSL.
Most of the Sitemap plugins out there seem to try to recreate the Sitemap links by iterating the Rails routes. In some cases this is possible, but in many cases it isn't, because:
a) there are probably quite a few routes in your routes file that don't need inclusion in the Sitemap (AJAX routes, I'm looking at you); and
b) how would you infer the correct series of links for the following route?
map.zipcode 'location/:state/:city/:zipcode', :controller => 'zipcode', :action => 'index'
Don't tell me it's trivial, because it isn't. It just looks trivial.
So my idea is to have another file similar to 'routes.rb' called 'sitemap.rb', where you can define what goes into the Sitemap.
Here's my solution:
Zipcode.find(:all, :include => :city).each do |z|
  sitemap.add zipcode_path(:state => z.city.state, :city => z.city, :zipcode => z)
end
Easy hey?
Other Sitemap settings for the link, like lastmod, priority, changefreq and host, are entered automatically, although you can override them if you need to.
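For example, a small sketch of how this might look inside config/sitemap.rb (the '/about', '/news' and '/archive' paths are just placeholders):

  # all defaults apply: :priority => 0.5, :changefreq => 'weekly', :lastmod => Time.now, :host => default_host
  sitemap.add '/about'

  # override only the settings you need to
  sitemap.add '/news', :changefreq => 'hourly', :priority => 0.8
  sitemap.add '/archive', :lastmod => 1.year.ago, :changefreq => 'never'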
Other "difficult" Sitemap issues, solved by this plugin:
- Support for more than 50,000 urls (using a Sitemap Index file)
- Gzip of Sitemap files
- Variable priority of links
- Paging/sorting links (e.g. my_list?page=3; see the sketch after this list)
- SSL host links (e.g. https:)
- Rails apps which are installed on a sub-path (e.g. example.com/blog_app/)
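For the paging and SSL-host cases above, a rough sketch of the corresponding config/sitemap.rb entries might look like this (the /my_list path, the page count and the secure host name are made up for illustration):

  # paged listing pages, e.g. my_list?page=3
  (1..10).each do |page|
    sitemap.add "/my_list?page=#{page}", :changefreq => 'daily', :priority => 0.4
  end

  # a link that should be served over SSL
  sitemap.add '/purchase', :host => 'https://secure.example.com'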
As a gem
- Add the gem as a dependency in your config/environment.rb:
    config.gem 'sitemap_generator', :lib => false, :source => 'http://gemcutter.org'
- Install the gem:
    $ rake gems:install
- Add the following line to your RAILS_ROOT/Rakefile:
    require 'sitemap_generator/tasks' rescue LoadError
- Create the config file:
    $ rake sitemap:install
As a plugin
- Install the plugin as normal:
    $ ./script/plugin install git://github.com/adamsalter/sitemap_generator.git
Installation should create a 'config/sitemap.rb' file which will contain your logic for generating the Sitemap files. (If you want to recreate this file manually, run rake sitemap:install.)
You can run rake sitemap:refresh as needed to create the Sitemap files. This will also ping all the 'major' search engines. (If you want to disable all non-essential output, run the rake task thusly: rake -s sitemap:refresh.)
Sitemaps with many urls (100,000+) take quite a long time to generate, so if you need to refresh your Sitemaps regularly you can set the rake task up as a cron job. Most cron agents will only send you an email if there is output from the cron task.
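For example, a crontab entry along these lines would do a silent nightly refresh (the path and schedule are just an assumption; adjust for your server):

  # refresh the Sitemaps at 1:30am each night; -s keeps cron quiet unless something fails
  30 1 * * * cd /var/www/my_app/current && RAILS_ENV=production rake -s sitemap:refresh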
Optionally, you can add the following to your robots.txt file, so that robots can find the sitemap file.
Sitemap: <hostname>/sitemap_index.xml.gz
The robots.txt Sitemap URL should be the complete URL to the Sitemap Index, such as: http://www.example.org/sitemap_index.xml.gz
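For instance, a minimal robots.txt using that example host might read:

  User-agent: *
  Disallow:

  Sitemap: http://www.example.org/sitemap_index.xml.gz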
Here is an example 'config/sitemap.rb':

# Set the host name for URL creation
SitemapGenerator::Sitemap.default_host = "http://www.example.com"

SitemapGenerator::Sitemap.add_links do |sitemap|
  # Put links creation logic here.
  #
  # The Root Path ('/') and Sitemap Index file are added automatically.
  # Links are added to the Sitemap output in the order they are specified.
  #
  # Usage: sitemap.add path, options
  #        (default options are used if you don't specify them)
  #
  # Defaults: :priority => 0.5, :changefreq => 'weekly',
  #           :lastmod => Time.now, :host => default_host

  # Examples:

  # add '/articles'
  sitemap.add articles_path, :priority => 0.7, :changefreq => 'daily'

  # add all individual articles
  Article.find(:all).each do |a|
    sitemap.add article_path(a), :lastmod => a.updated_at
  end

  # add merchant path
  sitemap.add '/purchase', :priority => 0.7, :host => "https://www.example.com"
end
- Tested/working on Rails 1.x.x through 2.x.x; no guarantees made for Rails 3.0.
- For large sitemaps it may be useful to split your generation into batches to avoid running out of memory. E.g.:
    Movie.find_in_batches(:batch_size => 1000) do |movies|
      movies.each do |movie|
        sitemap.add "/movies/show/#{movie.to_param}", :lastmod => movie.updated_at, :changefreq => 'weekly'
      end
    end
- New Capistrano deploys will remove your Sitemap files unless you run rake sitemap:refresh. The way around this is to create a cap task:
    after "deploy:update_code", "deploy:copy_old_sitemap"

    namespace :deploy do
      task :copy_old_sitemap do
        run "if [ -e #{previous_release}/public/sitemap_index.xml.gz ]; then cp #{previous_release}/public/sitemap* #{current_release}/public/; fi"
      end
    end
- Sitemaps.org states that no Sitemap XML file should be more than 10MB uncompressed. The plugin will warn you about this, but does nothing to avoid it (such as moving some URLs into a later file).
- There's no check on the size of each URL, which is not supposed to exceed 2,048 bytes.
- Currently only supports one Sitemap Index file, which can contain 50,000 Sitemap files, each of which can contain 50,000 urls, so it only supports up to 2,500,000,000 (2.5 billion) urls. I personally have no need of support for more urls, but the plugin could be improved to support this.
Twitter: twitter.com/adamsalter
Github: github.com/adamsalter
Copyright (c) 2009 Adam @ Codebright.net, released under the MIT license