
SpiderKeeper


A scalable admin UI for spider services: an open-source dashboard for managing Scrapy spiders, in the spirit of Scrapinghub.

Features

  • Manage your spiders from a dashboard and schedule them to run automatically
  • Deploy Scrapy projects with a single click
  • View spider running statistics
  • Provide an HTTP API

Currently supported spider service: Scrapyd

Screenshot

Screenshots: job dashboard, periodic jobs, running stats.

Getting Started

Installing

pip install spiderkeeper

Deployment


spiderkeeper [options]

Options:

  -h, --help          show this help message and exit
  --type=SERVER_TYPE  spider service type to connect to, default:scrapyd
  --host=HOST         host, default:0.0.0.0
  --port=PORT         port, default:5000
  --server=SERVERS    servers (multiple servers supported), default:http://localhost:6800
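
For example, to front two Scrapyd instances with a single dashboard, a startup command could look like the sketch below. It assumes --server may be passed once per Scrapyd server (as the plural "servers" above suggests), and the second host is a placeholder:

# Sketch: dashboard on port 5000 with two Scrapyd servers attached
# (192.168.0.2 is a placeholder for a second Scrapyd host)
spiderkeeper --type=scrapyd --port=5000 \
    --server=http://localhost:6800 \
    --server=http://192.168.0.2:6800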

Usage

Visit: 

- web ui : http://localhost:5000

1. Create a project

2. Use [scrapyd-client](https://github.com/scrapy/scrapyd-client) to generate an egg file (a minimal scrapy.cfg sketch follows these steps):

   scrapyd-deploy --build-egg output.egg

3. Upload the egg file (make sure the Scrapyd server is running)

4. Done & enjoy it
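
Step 2 assumes it is run from the root of a Scrapy project, i.e. a directory containing a scrapy.cfg. A minimal sketch of that file is shown below; the project name myproject is a placeholder, and the [deploy] section is only needed if you deploy with scrapyd-deploy directly rather than uploading the egg through the UI:

# scrapy.cfg (sketch) - myproject is a placeholder name
[settings]
default = myproject.settings

[deploy]
url = http://localhost:6800/
project = myproject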

- api swagger: http://localhost:5000/api.html
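
The concrete endpoints are documented on the Swagger page above. As an illustration only, a call to a hypothetical project-listing endpoint might look like the line below; the path is an assumption, so check api.html for the real routes:

# Hypothetical endpoint - verify the path against http://localhost:5000/api.html
curl http://localhost:5000/api/projects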

TODO

  • Job dashboard support filter
  • User Authentication
  • Collect & Show scrapy crawl stats
  • Optimize load balancing

Versioning

We use SemVer for versioning. For the versions available, see the tags on this repository.

Authors

See also the list of contributors who participated in this project.

License

This project is licensed under the MIT License - see the LICENSE.md file for details

Contributing

Contributions are welcome!

Contact