Elasticsearch. Logstash. Kibana. Nginx. Docker.
All with logstash-forwarder, secured with nginx, and gift wrapped with docker-compose.
Yesh. That's some serious awesomesauce.
git clone https://github.com/caktux/elk-compose.git
cd elk-compose
Replace your.logstashdomain.tld with your actual hostname wherever it appears in the configuration files.
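If the placeholder shows up in more than one file, a one-liner like this swaps it everywhere (with logs.example.com standing in for your real hostname):

grep -rl your.logstashdomain.tld . | xargs sed -i 's/your.logstashdomain.tld/logs.example.com/g'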
openssl req -x509 -batch -nodes -newkey rsa:2048 \
-keyout logstash/conf/logstash-forwarder.key \
-out logstash/conf/logstash-forwarder.crt \
-subj /CN=your.logstashdomain.tld
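You can sanity-check the certificate before moving on:

openssl x509 -in logstash/conf/logstash-forwarder.crt -noout -subject -dates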
htpasswd -c nginx/conf/htpasswd username
The -c flag writes the htpasswd file straight into nginx/conf; if you generate it elsewhere, copy it into that folder so the nginx container can pick it up.
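If the htpasswd command isn't installed, it ships with apache2-utils on Debian/Ubuntu:

sudo apt-get install apache2-utils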
Add your filters in logstash/conf.d, which is mounted as a volume in the logstash container at /etc/logstash/conf.d. Extra patterns can be dropped in logstash/patterns and referenced with patterns_dir => '/opt/logstash/patterns_extra' in the grok sections of your filters, as in the sketch below.
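As a rough example, a filter for nginx access logs could live in logstash/conf.d/11-nginx.conf; the file name, the nginx-access type and the NGINXACCESS pattern are placeholders here, with NGINXACCESS assumed to be defined in logstash/patterns:

filter {
  if [type] == "nginx-access" {
    grok {
      # custom patterns from logstash/patterns, mounted at /opt/logstash/patterns_extra
      patterns_dir => "/opt/logstash/patterns_extra"
      match => { "message" => "%{NGINXACCESS}" }
    }
  }
}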
Keep the certificate and key you created earlier handy; the forwarders will need the certificate to trust the logstash server.
On every machine you need to send logs from, install logstash-forwarder:
wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb http://packages.elasticsearch.org/logstashforwarder/debian stable main" | sudo tee -a /etc/apt/sources.list.d/elasticsearch.list
sudo apt-get update && sudo apt-get install logstash-forwarder
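Then copy the certificate over (for example to /etc/pki/tls/certs/) and point the forwarder config (typically /etc/logstash-forwarder.conf) at your logstash host. A minimal sketch, assuming the lumberjack input listens on port 5000 and that you want to ship syslog and auth logs; check this repo's logstash input config for the actual port:

{
  "network": {
    "servers": [ "your.logstashdomain.tld:5000" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
      "fields": { "type": "syslog" }
    }
  ]
}

Restart the forwarder to pick up the config, e.g. sudo service logstash-forwarder restart.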
docker-compose up
Use with -d once you like what you're seeing.
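To confirm everything came up and that your filters loaded cleanly (assuming the service is named logstash in docker-compose.yml):

docker-compose ps
docker-compose logs logstash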
Your data and indices get stored in /var/lib/elasticsearch, also mounted as a volume.
Released under the MIT License; see the LICENSE file.