A simple script to crawl a website and create a sitemap.xml of all its public links.
Warning: this script only works with Python 3.
$ python main.py --domain http://blog.lesite.us --output sitemap.xml
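The generated file follows the standard sitemap protocol (https://www.sitemaps.org/protocol.html); a minimal output might look like the sketch below (the URL is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://blog.lesite.us/</loc>
  </url>
</urlset>
```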
Read parameters from a config file: any parameter defined in config.json overrides the command-line value (list parameters are extended).
$ python main.py --config config/config.json
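A minimal config.json might look like the following sketch; this assumes the keys mirror the command-line flag names, so check your config/config.json against the options you actually use:

```json
{
    "domain": "https://blog.lesite.us",
    "output": "sitemap.xml",
    "skipext": ["pdf", "xml"],
    "parserobots": true
}
```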
Enable debug output:
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --debug

Enable verbose output:
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --verbose
Include images in the sitemap (more information: https://support.google.com/webmasters/answer/178636?hl=en):
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --images
Print a summary report of the crawl:
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --report
Skip URLs with the given file extensions (repeat the flag for each extension):
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --skipext pdf --skipext xml
Drop the part of each URL matching a regular expression:
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --drop "id=[0-9]{5}"
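To illustrate what a `--drop` pattern like `id=[0-9]{5}` does, here is a hypothetical sketch in Python, assuming the matched portion is simply removed from each discovered URL before it is written to the sitemap:

```python
import re

# Illustrative only: remove the part of the URL matched by the --drop regex.
url = "https://blog.lesite.us/article?id=12345"
cleaned = re.sub(r"id=[0-9]{5}", "", url)
print(cleaned)  # -> https://blog.lesite.us/article?
```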
Exclude URLs containing a given pattern:
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --exclude "action=edit"
Respect the rules in robots.txt:
$ python main.py --domain https://blog.lesite.us --output sitemap.xml --parserobots
Pretty-print the generated XML with xmllint:
$ python3 main.py --domain https://blog.lesite.us --images --parserobots | xmllint --format -
Crawl with several workers in parallel:
$ python3 main.py --domain https://blog.lesite.us --num-workers 4
Build the Docker image:
$ docker build -t python-sitemap:latest .

Run with default parameters:
$ docker run -it python-sitemap

Run against a specific domain:
$ docker run -it python-sitemap --domain https://www.graylog.fr
Run with a config file (you need to configure config.json beforehand):
$ docker run -it -v `pwd`/config/:/config/ -v `pwd`:/home/python-sitemap/ python-sitemap --config config/config.json