# Run the ELK stack in Docker containers

Runs the latest version of the ELK (Elasticsearch, Logstash, Kibana) stack with Docker and Docker-compose.

It gives you the ability to analyze logs (or any dataset) using the search capabilities of Elasticsearch and the visualization power of Kibana.

Based on the official images: elasticsearch, logstash and kibana.

# Requirements

  * Docker
  * Docker-compose

# How to start

  1. Install Docker.
  2. Install Docker-compose.
  3. Clone this repository and run `docker-compose up` from the main directory (see the example below).
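
For example, assuming a placeholder repository URL (substitute the real URL of this repository):

```sh
# clone the repository (placeholder URL) and start the stack
git clone https://github.com/example/docker-elk.git
cd docker-elk
docker-compose up
```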

# Usage

Start the ELK stack using docker-compose:

$ docker-compose up

You can also choose to run it in background (detached mode):

$ docker-compose up -d
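
When running in detached mode, you can still follow the containers' output with:

$ docker-compose logs -f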

Now that the stack is running, you'll want to inject logs into it. The shipped Logstash configuration expects the Filebeat log shipper to forward syslog entries to the stack.
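
As a sketch, a minimal `filebeat.yml` on the host shipping the logs could look like the following; the input section assumes a recent Filebeat version (older releases use `filebeat.prospectors` instead of `filebeat.inputs`):

```yml
# read the local syslog...
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog

# ...and forward it to the Logstash Beats input
output.logstash:
  hosts: ["localhost:5044"]
```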

Then access the Kibana UI by opening http://localhost:5601 in a web browser.

By default, the stack exposes the following ports:

  * 5044: Logstash input
  * 9200: Elasticsearch API
  * 9300: Elasticsearch TCP transport
  * 5601: Kibana
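
You can check that Elasticsearch is answering on its API port with a plain HTTP request, which returns a JSON document describing the cluster:

$ curl http://localhost:9200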

If you want to override the default configuration, edit the corresponding service declaration in the `docker-compose.yml` file. For example, the Logstash service is declared as:

```yml
logstash:
  image: logstash:latest
  command: logstash -f /etc/logstash/conf.d/logstash.conf
  volumes:
    - ./logstash/config:/etc/logstash/conf.d
  ports:
    - "5000:5000"
  links:
    - elasticsearch
```
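
The `logstash.conf` mounted above defines the pipeline itself. As a sketch (not necessarily the shipped file), a pipeline that accepts Beats connections and forwards every event to Elasticsearch could look like:

```
# accept events from Filebeat over the Beats protocol
input {
  beats {
    port => 5044
  }
}

# index every event into Elasticsearch
# (the compose link name resolves the host)
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```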


# Storage

## How can I store Elasticsearch data?

Data stored in Elasticsearch is persisted across container restarts, but is lost once the container is removed.

In order to persist Elasticsearch data even after removing the Elasticsearch container, you'll have to mount a volume on your Docker host. Update the elasticsearch container declaration to:

```yml
elasticsearch:
  build: elasticsearch/
  command: elasticsearch -Des.network.host=_non_loopback_
  ports:
    - "9200:9200"
  volumes:
    - /opt/storage:/usr/share/elasticsearch/data
```

Elasticsearch data will now be stored under /opt/storage on the Docker host.
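
To convince yourself that the data survives container removal, you can index a test document, recreate the Elasticsearch container, and query the document again. This is a sketch assuming Elasticsearch's classic document API; newer versions also require a `Content-Type: application/json` header on the PUT request:

```sh
# index a test document
curl -XPUT 'http://localhost:9200/test/doc/1' -d '{"persisted": true}'

# remove and recreate the Elasticsearch container
docker-compose stop elasticsearch
docker-compose rm -f elasticsearch
docker-compose up -d elasticsearch

# the same document should still be returned
curl 'http://localhost:9200/test/doc/1'
```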