Web Crawler

This application crawls websites using the configuration provided.
- Start MongoDB locally:
  mongod.exe --dbpath D:\mongodb_server_data\data
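If the crawler is a Spring Boot application, the MongoDB connection can be configured in application.properties; the database name `crawler` here is an assumption, not taken from the project:

```properties
# Hypothetical connection settings; adjust host, port, and database to your setup
spring.data.mongodb.uri=mongodb://localhost:27017/crawler
```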
- Manage the containers with Docker Compose:
  docker compose up -d
  docker ps
  docker logs -f broker
  docker compose down
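The `broker` container above suggests a message broker, likely Kafka. A minimal docker-compose.yml sketch, assuming a single-node Kafka broker in KRaft mode; the image and tag are assumptions and should be replaced with the project's actual compose file:

```yaml
# Hypothetical single-node Kafka broker; adjust image/ports to your setup
services:
  broker:
    image: apache/kafka:3.7.0
    container_name: broker
    ports:
      - "9092:9092"
```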
Elasticsearch, Kibana and Logstash
- Download Elasticsearch 8.9 or higher and run elasticsearch.bat. For local development, disable security in config/elasticsearch.yml:
xpack.security.enabled: false
xpack.security.enrollment.enabled: false
xpack.security.http.ssl:
  enabled: false
xpack.security.transport.ssl:
  enabled: false
- Verify at http://localhost:9200
- Download Kibana
- Configure kibana.yml file in config folder
- Uncomment the elasticsearch.hosts entry
- Execute kibana.bat
- http://localhost:5601/
- Download Logstash
- Create a file logstash.conf in the config folder:
input {
  file {
    path => "D:/logs/elk-stack.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
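For the file input above to pick anything up, the services have to write their log to D:/logs/elk-stack.log. A minimal Logback sketch, assuming the services use Logback; the file name matches the Logstash input, but the pattern is an assumption:

```xml
<!-- logback-spring.xml: write logs where the Logstash file input reads them -->
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>D:/logs/elk-stack.log</file>
    <encoder>
      <pattern>%d{ISO8601} %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```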
- Create a file logstash-microservice.conf in the config folder:

input {
  tcp {
    port => 5000
    codec => json
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "micro-%{appName}"
    # index => "microservice-logs-inv"
  }
  stdout {
    codec => json_lines
  }
}
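One way to feed the TCP/JSON input above is the logstash-logback-encoder library; a sketch assuming that dependency is on the classpath, with `appName` supplied as a custom field so it can resolve the `micro-%{appName}` index pattern:

```xml
<!-- logback-spring.xml fragment: ship JSON logs to Logstash on port 5000 -->
<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <destination>localhost:5000</destination>
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <customFields>{"appName":"report-service"}</customFields>
  </encoder>
</appender>
<root level="INFO">
  <appender-ref ref="LOGSTASH"/>
</root>
```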
- Run Logstash with the desired pipeline:
  ./logstash -f ./config/logstash.conf
  ./logstash -f ./config/logstash-microservice.conf
- Eureka Discovery Service: http://localhost:8761/
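Each service registers with the Eureka server through its client configuration. A minimal application.yml sketch; the service name shown is an assumption:

```yaml
# Hypothetical Eureka client settings for one of the services
spring:
  application:
    name: report-service
eureka:
  client:
    service-url:
      defaultZone: http://localhost:8761/eureka/
```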
- identity-service handles authentication and authorization
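An identity service typically issues signed tokens that the other services verify. A self-contained sketch of HMAC-signed token issue/verify using only the JDK; the token format, class, and method names are illustrative, not the service's actual API:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class TokenSketch {

    // Issue a token of the form base64url(subject) + "." + base64url(hmac)
    static String issue(String subject, String secret) {
        String payload = b64(subject.getBytes(StandardCharsets.UTF_8));
        return payload + "." + hmac(payload, secret);
    }

    // Verify by recomputing the signature and comparing in constant time
    static boolean verify(String token, String secret) {
        String[] parts = token.split("\\.", 2);
        if (parts.length != 2) return false;
        return MessageDigest.isEqual(
                hmac(parts[0], secret).getBytes(StandardCharsets.UTF_8),
                parts[1].getBytes(StandardCharsets.UTF_8));
    }

    static String hmac(String data, String secret) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
            return b64(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    static String b64(byte[] bytes) {
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    public static void main(String[] args) {
        String token = issue("alice", "local-dev-secret");
        System.out.println(token + " valid=" + verify(token, "local-dev-secret"));
    }
}
```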
- report-service is annotated with @EnableWebSecurity and @EnableMethodSecurity
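A sketch of what that configuration might look like in report-service with Spring Security 6 style APIs; the endpoint paths and access rules are assumptions, not the service's actual rules:

```java
// Hypothetical security configuration; endpoint rules are illustrative only
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.Customizer;
import org.springframework.security.config.annotation.method.configuration.EnableMethodSecurity;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
@EnableWebSecurity
@EnableMethodSecurity
public class SecurityConfig {

    @Bean
    SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http.authorizeHttpRequests(auth -> auth
                .requestMatchers("/actuator/**").permitAll()
                .anyRequest().authenticated())
            .httpBasic(Customizer.withDefaults());
        return http.build();
    }
}
```

With @EnableMethodSecurity in place, individual service methods can additionally be guarded with annotations such as @PreAuthorize.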