robots — a parser for robots.txt files, written in Go
$ go get github.com/gvenkat/robots
Here's an example of using robots:
import "github.com/gvenkat/robots"
...
// Instantiate from any io.Reader instance
parser := robots.FromReader(...)
// Or from a URL
parser := robots.FromURL(...)
// Or from a file
parser := robots.FromFile(...)
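To illustrate the kind of check a robots.txt parser performs, here is a minimal, self-contained sketch of robots.txt path matching. This is not this library's API — the `allowed` helper and its simplified prefix-matching rules are illustrative assumptions only (real matching per RFC 9309 handles groups, `Allow` rules, and wildcards):

```go
package main

import (
	"fmt"
	"strings"
)

// allowed reports whether userAgent may fetch path under the given
// robots.txt body. Simplified: only User-agent and Disallow lines,
// plain prefix matching, no Allow or wildcard support.
func allowed(robotsTxt, userAgent, path string) bool {
	applies := false // does the current User-agent group match us?
	disallowed := false
	for _, line := range strings.Split(robotsTxt, "\n") {
		// Strip comments and surrounding whitespace.
		if i := strings.Index(line, "#"); i >= 0 {
			line = line[:i]
		}
		line = strings.TrimSpace(line)
		if line == "" {
			continue
		}
		key, val, ok := strings.Cut(line, ":")
		if !ok {
			continue
		}
		key = strings.ToLower(strings.TrimSpace(key))
		val = strings.TrimSpace(val)
		switch key {
		case "user-agent":
			applies = val == "*" || strings.EqualFold(val, userAgent)
		case "disallow":
			// An empty Disallow value means "allow everything".
			if applies && val != "" && strings.HasPrefix(path, val) {
				disallowed = true
			}
		}
	}
	return !disallowed
}

func main() {
	robotsTxt := "User-agent: *\nDisallow: /private/\n"
	fmt.Println(allowed(robotsTxt, "Googlebot", "/public/page.html")) // true
	fmt.Println(allowed(robotsTxt, "Googlebot", "/private/data"))     // false
}
```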
The default crawler user-agent is:
Mozilla/5.0 (X11; Linux i686; rv:5.0) Gecko/20100101 Firefox/5.0
See the LICENSE file.