Crawler

Simple crawler

Primary language: Java. License: MIT.


Simple crawler based on ForkJoinPool tasks.
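The general idea of ForkJoinPool-based crawling can be sketched as below. This is an illustrative sketch, not the project's actual code: the link graph is an in-memory map standing in for real HTTP fetches, and all class and method names are assumptions.

```java
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveAction;

public class CrawlSketch {
    // Hypothetical link graph: page URL -> links found on that page.
    static final Map<String, List<String>> LINKS = new HashMap<>();
    static {
        LINKS.put("http://example.com", Arrays.asList("http://example.com/a", "http://example.com/b"));
        LINKS.put("http://example.com/a", Arrays.asList("http://example.com/b"));
        LINKS.put("http://example.com/b", Collections.<String>emptyList());
    }

    static class CrawlTask extends RecursiveAction {
        final String url;
        final int depth;
        final Set<String> visited;

        CrawlTask(String url, int depth, Set<String> visited) {
            this.url = url;
            this.depth = depth;
            this.visited = visited;
        }

        @Override
        protected void compute() {
            // Stop at the depth limit, or if this page was already visited.
            if (depth < 0 || !visited.add(url)) return;
            // Fork one subtask per outgoing link and wait for all of them.
            List<CrawlTask> subtasks = new ArrayList<>();
            for (String link : LINKS.getOrDefault(url, Collections.<String>emptyList())) {
                subtasks.add(new CrawlTask(link, depth - 1, visited));
            }
            invokeAll(subtasks);
        }
    }

    public static void main(String[] args) {
        Set<String> visited = ConcurrentHashMap.newKeySet();
        new ForkJoinPool().invoke(new CrawlTask("http://example.com", 100, visited));
        System.out.println(visited.size()); // prints 3
    }
}
```

Each page becomes a `RecursiveAction` that forks child tasks for its outgoing links, so independent pages are fetched in parallel by the pool's worker threads.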

Software needed to run:

  • Java: 1.8 How to install
  • Maven: 3.3.x How to install
  • Make sure that the JAVA_HOME and MAVEN_HOME environment variables mentioned in the links above are set.

How to build project

Make sure that Maven and Java are installed, then in the project root folder run: mvn clean install -DskipTests

How to run tests

Make sure that Maven and Java are installed, then in the project root folder run: mvn clean install

How to run crawler

To run the crawler, execute crawler.bat (Windows) or crawler.sh (Unix/Linux) with the arguments given below:

usage: Windows: crawler -u [-d] [-g] | Linux/Unix: ./crawler.sh -u [-d] [-g]
 -u,--url       Initial URL from which the crawler starts.
                The URL must have an "http://" or "https://" prefix.
 -d,--depth     Depth limit of the crawler search.
                Default value is 100. [optional]
 -g,--grouped   Group the found links by PageLinkType
                and save them to separate files. [optional]

For example: ./crawler.sh -u https://example.com -d 5 -g

Result files

When the crawler finishes its work, the discovered links are saved as XML files to %root_project_folder%/output/%given_url_as_param%/
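The exact XML schema is not documented here, but serializing a list of discovered links with the JDK's built-in DOM API might look roughly like this (element names and the helper method are assumptions, not the project's actual format):

```java
import java.io.StringWriter;
import java.util.Arrays;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class XmlOutputSketch {
    // Serialize links into a simple <links><link>...</link></links> document.
    static String toXml(List<String> links) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
        Element root = doc.createElement("links");
        doc.appendChild(root);
        for (String link : links) {
            Element el = doc.createElement("link");
            el.setTextContent(link);
            root.appendChild(el);
        }
        // Pretty-print the DOM tree to a string (a real crawler would write to a file).
        StringWriter out = new StringWriter();
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.INDENT, "yes");
        t.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toXml(Arrays.asList("http://example.com/a", "http://example.com/b")));
    }
}
```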

Trade-offs

  • The software uses Apache Commons UrlValidator, which recognizes some correct links as invalid.
  • File existence is not validated when serializing links grouped by PageLinkType. (The crawler does not know which link types a given domain has, so it does not know which file names to look for.)
  • The crawler cannot find links in dynamically generated components. Maybe this could be a feature extension?
  • When a link without a protocol or domain address is found, the crawler prefixes it with the URL given as a parameter.
  • Performance depends on the user's internet connection and the visited server's domain.
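The relative-link behaviour described above (prefixing links that lack a protocol or domain) can be illustrated with the JDK's `URI.resolve`; this is a sketch of the general idea, not necessarily how the project implements it:

```java
import java.net.URI;

public class ResolveSketch {
    public static void main(String[] args) {
        URI base = URI.create("http://example.com/docs/");
        // A relative link is resolved against the start URL.
        System.out.println(base.resolve("page.html"));  // http://example.com/docs/page.html
        // A root-relative link keeps the domain but replaces the path.
        System.out.println(base.resolve("/about"));     // http://example.com/about
        // A link that already has a protocol is left unchanged.
        System.out.println(base.resolve("https://other.org/x")); // https://other.org/x
    }
}
```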

Feature extensions (TODO)

  • Serialization to other file types (e.g. JSON)
  • Mapping to other structure types (e.g. a Map where the key is a page and the value is its children)
  • Better handling of HTTP and connection exceptions
  • Excluding some domains, given as parameters, from serialization to files
  • Maybe improve the concurrency algorithm?
  • GUI
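The Map-based structure mentioned in the TODO list could be as simple as the following hypothetical sketch, where each page maps to the links discovered on it:

```java
import java.util.*;

public class SiteMapSketch {
    public static void main(String[] args) {
        // Key: page URL, value: links discovered on that page.
        Map<String, List<String>> siteMap = new HashMap<>();
        siteMap.put("http://example.com", Arrays.asList("http://example.com/a", "http://example.com/b"));
        siteMap.put("http://example.com/a", Collections.<String>emptyList());
        System.out.println(siteMap.get("http://example.com").size()); // prints 2
    }
}
```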