
Gerapy

Distributed Crawler Management Framework Based on Scrapy, Scrapyd, Scrapyd-Client, Scrapyd-API, Django and Vue.js.

Support

Gerapy is developed on Python 3.x. Support for Python 2.x will be added later.

Usage

Install Gerapy by pip:

pip3 install gerapy

After the installation, follow the steps below to run the Gerapy server.

If Gerapy has been installed successfully, the gerapy command will be available. If not, check your installation.
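
In case the gerapy command is not found even though pip reported success, here is a minimal sanity check from Python (it only confirms the package is importable for the current interpreter, not that the console script is on your PATH):

# Check that the gerapy package is importable in the current interpreter.
import importlib.util

if importlib.util.find_spec("gerapy") is None:
    print("gerapy is not installed for this interpreter; re-run pip3 install gerapy")
else:
    print("gerapy package found")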

First use this command to initialize the workspace:

gerapy init

Now you will get a folder named gerapy.

Then cd to this folder, and run this command to initialize the Database:

cd gerapy
gerapy migrate

Next, you can run the server with this command:

gerapy runserver

Then you can visit http://localhost:8000 to enjoy it.

Or you can configure host and port like this:

gerapy runserver 0.0.0.0:8888

It will then listen on all interfaces at port 8888.
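
If you want to confirm the server is reachable without opening a browser, here is a minimal sketch using requests (the host and port are placeholders; use whatever you passed to runserver):

# Minimal reachability check for the Gerapy web UI.
import requests

response = requests.get("http://127.0.0.1:8000/")
print(response.status_code)  # 200 means the UI is being served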

You can create a configurable project and then configure and generate code automatically. Alternatively, you can drag your existing Scrapy project into the gerapy/projects folder; after refreshing the web page, it will appear in the Project Index Page as an un-configurable project, but you can still edit it in the web interface.
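
For example, a minimal sketch of copying an existing Scrapy project into the workspace (the paths below are only examples; adjust them to your own layout):

# Copy an existing Scrapy project into gerapy/projects so it shows up
# in the Project Index Page after a refresh.
import shutil
from pathlib import Path

source = Path.home() / "spiders" / "quotes"              # your existing Scrapy project (example path)
target = Path.home() / "gerapy" / "projects" / "quotes"  # inside the Gerapy workspace
shutil.copytree(source, target)
print(f"Copied {source} -> {target}")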

As for deployment, go to the Deploy Page. First you need to build your project and add a client in the Client Index Page; then you can deploy the project by clicking the button.
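
Gerapy builds on Scrapyd and Scrapyd-API, so for context, here is a hedged sketch of roughly what such a deployment amounts to at the Scrapyd level (the egg filename, project name, and Scrapyd address below are placeholders):

# Push a packaged project egg to a Scrapyd client and list its versions.
from scrapyd_api import ScrapydAPI

scrapyd = ScrapydAPI("http://localhost:6800")  # the client added in the Client Index Page
with open("quotes.egg", "rb") as egg:          # egg produced by the build step (placeholder name)
    scrapyd.add_version("quotes", "1.0", egg)
print(scrapyd.list_versions("quotes"))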

After deployment, you can manage the jobs in the Monitor Page.
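
For context, here is a similar sketch of the kind of job-status query the Monitor Page relies on, again via Scrapyd-API (the project name and address are placeholders):

# List pending, running and finished jobs for a project on a Scrapyd client.
from scrapyd_api import ScrapydAPI

scrapyd = ScrapydAPI("http://localhost:6800")
jobs = scrapyd.list_jobs("quotes")
print("pending:", jobs["pending"])
print("running:", jobs["running"])
print("finished:", jobs["finished"])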

Docker

Just run this command:

docker run -d -v ~/gerapy:/app/gerapy -p 8000:8000 thsheep/gerapy:master

Then it will run at port 8000.

The general form of the command is:

docker run -d -v <your_workspace>:/app/gerapy -p <public_port>:<container_port> thsheep/gerapy:master

Specify your local workspace to mount as the Gerapy workspace with -v <your_workspace>:/app/gerapy, and map the server port with -p <public_port>:<container_port>.

If you run Gerapy with Docker, you can visit the Gerapy website at http://localhost:8000 and enjoy it; no other initialization steps are needed.

Preview

Client Management:

Spider Monitor:

Project Management:

Project Edit:

Project Deploy:

Project Configuration:

TodoList

  • Add Visual Configuration of Spider with Previewing Website
  • Add Scrapyd Auth Management
  • Add Automatic Python & Scrapyd Environment Deployment
  • Add MongoDB & Redis & MySQL Monitor
  • Add Timed Task Scheduler

Communication

If you have any questions or ideas, you can join this QQ Group: