
WebCrawler

A web crawler application that crawls websites according to the configuration provided.
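The configuration format itself is not documented in this README. Purely as an illustration, a crawl configuration might carry fields like the ones below; every name here is hypothetical, not taken from the project:

```java
// Hypothetical sketch of a crawl configuration record; all field names are
// illustrative and do not come from this project's actual code.
public record CrawlConfig(
        String seedUrl,         // URL where the crawl starts
        int maxDepth,           // how many link hops to follow from the seed
        long politenessDelayMs  // pause between requests to the same host
) {
    // Conservative defaults for a small, polite crawl.
    public static CrawlConfig defaults(String seedUrl) {
        return new CrawlConfig(seedUrl, 2, 1000L);
    }
}
```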

Start a local MongoDB server (Windows command shown; adjust the data path to your machine):

```
mongod.exe --dbpath D:\mongodb_server_data\data
```
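Once mongod is running, the application can reach it with the standard MongoDB Java sync driver. A minimal connectivity sketch; the database and collection names below are assumptions, not taken from this project:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class MongoSmokeTest {
    public static void main(String[] args) {
        // Connect to the local mongod started above (default port 27017).
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            // "crawler" and "pages" are assumed names, used here only as a check.
            MongoDatabase db = client.getDatabase("crawler");
            db.getCollection("pages").insertOne(new Document("url", "https://example.com"));
            System.out.println("Inserted one test document into crawler.pages");
        }
    }
}
```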

Supporting services run in Docker (the compose file defines a container named broker):

```sh
docker compose up -d     # start the containers in the background
docker ps                # confirm the containers are running
docker logs -f broker    # follow the broker container's logs
docker compose down      # stop and remove the containers
```
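The compose file itself is not included in this section. Assuming the broker container is a single-node Kafka (a common reading of the name; the project's actual file may differ), a minimal docker-compose.yml sketch:

```yaml
# Minimal sketch only -- assumes "broker" is a single-node Kafka in KRaft mode.
services:
  broker:
    image: apache/kafka:3.7.0   # the official Apache Kafka image runs standalone by default
    container_name: broker
    ports:
      - "9092:9092"             # expose the Kafka listener to the host
```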

Elasticsearch, Kibana and Logstash

  1. Download Elasticsearch 8.9 or higher and run elasticsearch.bat. For a local setup, disable security in config/elasticsearch.yml:

```yaml
# Security features (disabled for local development)
xpack.security.enabled: false
xpack.security.enrollment.enabled: false

# Encryption for HTTP API client connections, such as Kibana, Logstash, and Agents
xpack.security.http.ssl:
  enabled: false

# Encryption and mutual authentication between cluster nodes
xpack.security.transport.ssl:
  enabled: false
```
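With security disabled, the node answers over plain HTTP, so step 2 below can also be done from the command line (a quick smoke test, not part of the original steps):

```
curl http://localhost:9200
```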

  2. Verify Elasticsearch is up at http://localhost:9200
  3. Download Kibana
  4. Configure the kibana.yml file in the config folder
  5. Uncomment the Elasticsearch hosts entry (stock default shown after the logstash.conf example below)
  6. Execute kibana.bat
  7. Verify Kibana is up at http://localhost:5601/
  8. Download Logstash
  9. Create a file logstash.conf in the config folder:

```conf
input {
  file {
    path => "D:/logs/elk-stack.log"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout {
    codec => rubydebug
  }
}
```
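For step 5 above, the kibana.yml line to uncomment is the Elasticsearch hosts entry; its stock default is shown below (adjust if your node runs elsewhere):

```yaml
# kibana.yml -- point Kibana at the local Elasticsearch node
elasticsearch.hosts: ["http://localhost:9200"]
```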

Microservice Configuration

Create a second pipeline file, logstash-microservice.conf, to receive logs over TCP:

```conf
input {
  tcp {
    port => 5000
    codec => json
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # index => "microservice-logs-inv"
    index => "micro-%{appName}"
  }
  stdout {
    codec => json_lines
  }
}
```
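The %{appName} reference in the index name must come from the log events themselves. The README does not show the client side; one common way for a Spring Boot microservice to ship such events is the logstash-logback-encoder TCP appender, sketched below (the appName value "report-service" is an assumption):

```xml
<!-- logback-spring.xml (sketch) -- ships JSON log events to Logstash on TCP 5000.
     Requires the net.logstash.logback:logstash-logback-encoder dependency.
     "report-service" is an assumed value for appName. -->
<configuration>
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>localhost:5000</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder">
      <!-- Adds {"appName":"report-service"} to every event, feeding micro-%{appName} -->
      <customFields>{"appName":"report-service"}</customFields>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>
```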

  1. Run Logstash with the desired pipeline configuration:

```sh
./logstash -f ./config/logstash.conf
./logstash -f ./config/logstash-microservice.conf
```

  2. Verify Logstash is up at http://localhost:9600/

  3. Eureka Discovery Service: http://localhost:8761/

  4. identity-service handles authentication and authorization

  5. report-service enables @EnableWebSecurity and @EnableMethodSecurity (a sketch follows this list)
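As referenced in step 5, a minimal security configuration for report-service could look like the following sketch. Only the two annotations come from this README; the class name and the authorization rule are assumptions:

```java
// Sketch of a report-service security configuration. Only @EnableWebSecurity and
// @EnableMethodSecurity are taken from this README; everything else is assumed.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.method.configuration.EnableMethodSecurity;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
@EnableWebSecurity      // activates Spring Security's web support
@EnableMethodSecurity   // enables @PreAuthorize and similar method-level checks
public class SecurityConfig {

    @Bean
    SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        // Require authentication for every request; the real rules may differ.
        http.authorizeHttpRequests(auth -> auth.anyRequest().authenticated());
        return http.build();
    }
}
```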