
scrapy_helper

A dynamically configurable web crawler.
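The idea behind a dynamically configurable crawler is that crawl rules (start URLs, field selectors, pagination) live in data that can be edited at runtime instead of being hard-coded into spider classes. Below is a minimal sketch of that pattern using Scrapy; the config keys and CSS selectors are illustrative assumptions, not scrapy_helper's actual rule schema.

    # Minimal sketch of a configuration-driven Scrapy spider.
    # The config shape and selectors are hypothetical examples,
    # not scrapy_helper's real rule format.
    import scrapy

    CRAWL_CONFIG = {
        "start_urls": ["https://example.com/articles"],
        "item_selectors": {
            "title": "h2.title::text",
            "link": "a.more::attr(href)",
        },
        "next_page_selector": "a.next::attr(href)",
    }

    class ConfigurableSpider(scrapy.Spider):
        name = "configurable"

        def __init__(self, config=None, *args, **kwargs):
            super().__init__(*args, **kwargs)
            self.config = config or CRAWL_CONFIG
            self.start_urls = self.config["start_urls"]

        def parse(self, response):
            # Build one item per page from the configured field selectors.
            yield {
                field: response.css(sel).get()
                for field, sel in self.config["item_selectors"].items()
            }

            # Follow pagination if a "next page" selector is configured.
            next_page = response.css(self.config["next_page_selector"]).get()
            if next_page:
                yield response.follow(next_page, callback=self.parse)

In scrapy_helper such rules would be created and edited through the web interface rather than in code, which is what makes the crawler "dynamic".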

Install

  • git clone git@github.com:facert/scrapy_helper.git && cd scrapy_helper
  • virtualenv .env
  • source .env/bin/activate
  • pip install -r requirements.txt
  • python manage.py migrate

Run

  • python manage.py runserver
  • open http://127.0.0.1:8000/ in your browser
  • log in with the test account (username: demo / password: demo)

Online site

http://www.anycrawl.info/

Screenshots