Run the latest version of the ELK (Elasticsearch, Logstash, Kibana) stack to process syslog logs with Docker and Docker-compose. You can redirect logs from your Docker containers to this stack with the syslog logging driver, then analyze any data set using the search and aggregation capabilities of Elasticsearch and the visualization power of Kibana.
Based on the official images:
Installed in the Dockerfile and configured in ./logstash/pipeline/logstash.conf
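As an illustration, a minimal syslog pipeline for this kind of setup could look like the sketch below; the filter and output settings are assumptions, so check the repository's logstash.conf for the authoritative version:

```conf
# Sketch of a syslog pipeline: listen on 5140, ship to Elasticsearch.
input {
  # Accept syslog messages over both TCP and UDP on port 5140.
  tcp { port => 5140 type => "syslog" }
  udp { port => 5140 type => "syslog" }
}
filter {
  if [type] == "syslog" {
    # Parse the standard syslog line into structured fields.
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}
output {
  # "elasticsearch" is assumed to be the Compose service name.
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```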
- Install Docker.
- Install Docker-compose.
```
git clone https://github.com/yangjunsss/docker-elk-syslog
cd docker-elk-syslog && ./install.sh
```
- After the installation succeeds, access the Kibana UI at http://localhost:5601 with a web browser.
By default, the stack exposes the following ports:
- 5140: Logstash syslog input
- 9200: Elasticsearch HTTP
- 9300: Elasticsearch TCP transport
- 5601: Kibana
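To check that the syslog input is reachable, you can hand-craft an RFC 3164 message and send it over UDP; the host and port below are assumptions matching the default Logstash input above:

```python
import socket

def format_rfc3164(facility: int, severity: int, tag: str, message: str) -> bytes:
    """Build a minimal RFC 3164 syslog payload: <PRI>TAG: MESSAGE."""
    pri = facility * 8 + severity
    return f"<{pri}>{tag}: {message}".encode("utf-8")

def send_syslog(payload: bytes, host: str = "localhost", port: int = 5140) -> None:
    """Fire-and-forget UDP send to the Logstash syslog input."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

if __name__ == "__main__":
    # facility 16 (local0), severity 6 (info) -> PRI 134
    send_syslog(format_rfc3164(16, 6, "demo", "hello from the docs"))
```

Once the message is indexed, it should show up in Kibana's Discover view.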
```
docker-compose -f docker-compose.yml down -v
```
Take nginx as an example:
```
echo "192.168.0.4 syslog_host" >> /etc/hosts
docker-compose -f docker-compose.nginx.yml up
```
After it starts, you can verify the logs in Kibana at http://192.168.0.4:5601.
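The compose file referenced above presumably wires nginx's container logs to the stack through the syslog logging driver; the following is a sketch under that assumption (service name and options are illustrative, not the repository's actual file):

```yaml
# Hypothetical docker-compose.nginx.yml: ship nginx container logs
# to the Logstash syslog input via Docker's syslog logging driver.
services:
  nginx:
    image: nginx:latest
    ports:
      - "80:80"
    logging:
      driver: syslog
      options:
        # syslog_host resolves via the /etc/hosts entry added above.
        syslog-address: "tcp://syslog_host:5140"
        tag: "nginx"
```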
NOTE: Configuration is not reloaded dynamically; you will need to restart the stack after any change to a component's configuration.
- The Kibana default configuration is stored in kibana/config/kibana.yml.
- The Logstash configuration is stored in logstash/config/logstash.yml.
- The Logstash pipeline configuration is stored in logstash/pipeline/logstash.conf.
- The Elasticsearch configuration is stored in elasticsearch/config/elasticsearch.yml.
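As an illustration of the kind of settings these files carry, a minimal single-node elasticsearch.yml might look like the fragment below; the values are assumptions for demonstration, not the repository's actual configuration:

```yaml
# Hypothetical minimal single-node Elasticsearch configuration.
cluster.name: docker-cluster
# Bind to all interfaces so other containers can reach the node.
network.host: 0.0.0.0
```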
Follow the instructions from the Wiki: Scaling up Elasticsearch
The data stored in Elasticsearch is persisted under ./elasticsearch/data.