A web crawler built with Scrapy that stores its results in MongoDB.
Clone the repository, then install prerequisites.
pip install scrapy
pip install pymongo
Install Scrapy and PyMongo.
mongod
scrapy crawl [SPIDER]
Make sure the MongoDB daemon is running, then run a spider to crawl web pages, replacing [SPIDER] with the spider's name.
mongo
Open the mongo shell to inspect the stored data.
show dbs
use stackoverflow
show collections
db.questions.count()
db.questions.find()
Some useful shell commands for viewing your data: list databases, switch to the stackoverflow database, list its collections, then count and display the stored questions.