Ruby gem to check whether an IP address really belongs to a bot, typically a search engine. This can be of much help if you want to protect your web site from malicious scanners that pretend to be, say, Googlebot.
Suppose you have a Web request and you'd like to make sure it's not from a fake search engine:
```ruby
bot = Legitbot.bot(userAgent, ip)
```

`bot` will be `nil` if no bot signature was found in the `User-Agent`. Otherwise, it will be an instance with methods:

```ruby
bot.detected_as # => "Google"
bot.valid?      # => true
bot.fake?       # => false
```
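The check behind `valid?`/`fake?` follows the verification procedure search engines document for their crawlers: reverse-resolve the IP, make sure the hostname falls under the bot's official domains, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch of that idea using Ruby's stdlib `resolv` (the helper name and the injectable resolvers are illustrative, not the gem's internals):

```ruby
require "resolv"

# Reverse-DNS + forward-confirm validation sketch. The resolver calls are
# injectable so the logic can be exercised without network access.
def verified_bot_ip?(ip, domains,
                     reverse: ->(addr) { Resolv.getname(addr) },
                     forward: ->(host) { Resolv.getaddresses(host) })
  host = reverse.call(ip) # reverse DNS: IP -> claimed hostname
  return false unless domains.any? { |d| host == d || host.end_with?(".#{d}") }
  forward.call(host).include?(ip) # forward-confirm the round trip
rescue Resolv::ResolvError
  false
end
```

A spoofed client can set any `User-Agent`, but it cannot make the reverse/forward DNS round trip land in, e.g., `googlebot.com`, which is why this check is stronger than matching the `User-Agent` alone.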
Sometimes you already know which search engine to expect. For example, you may be using rack-attack:

```ruby
Rack::Attack.blocklist("fake Googlebot") do |req|
  req.user_agent =~ %r{Googlebot} && Legitbot::Google.fake?(req.ip)
end
```
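The order inside that block matters: the cheap `User-Agent` match runs first, and the DNS-backed IP check only runs for clients that actually claim to be Googlebot. A sketch of the same predicate factored out so it can be tested in isolation (the helper name is illustrative; the default `fake_check` assumes the legitbot gem is loaded, and is injectable so the flow can be exercised without network access):

```ruby
# Returns true only when the client claims to be Googlebot AND the IP
# fails Legitbot's validation for Google.
def fake_googlebot?(user_agent, ip,
                    fake_check: ->(addr) { Legitbot::Google.fake?(addr) })
  return false unless user_agent =~ /Googlebot/ # not claiming to be Googlebot
  fake_check.call(ip)                           # verify the claim by IP
end
```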
- Rails middleware
- More testing for Facebook
- Review for thread safety
- Make it possible to reload Facebook IP ranges
- Bots masquerading as someone else, e.g. Telegram (like Twitter): what to do?
Licensed under the Apache License 2.0.
- I initially created a Play Framework version in Scala: play-legitbot
- Article: "When (Fake) Googlebots Attack Your Rails App"
- Voight-Kampff is a Ruby gem which detects bots by the `User-Agent` header.
- browser is a Ruby gem which may tell you if the request comes from a search engine.