# Dockerized Python Logstash Test App
Uses Logstash to collect all of this Python app's log events, while running Python in a Docker container.
## How do I use this?
- Clone this docker-compose ELK stack example; it is the recommended example of how to run Elasticsearch, Logstash, and Kibana (ELK) with docker-compose.
- Add a `tcp` input to the Logstash config:

  ```
  tcp { port => 5999 codec => json }
  ```
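  For context, that line has to live inside the config's `input` block. A minimal sketch of the relevant part of the pipeline file (the exact filename depends on the ELK example, e.g. logstash.conf):

  ```
  input {
    tcp {
      port => 5999
      codec => json
    }
  }
  ```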
- Add the json codec to the Logstash Dockerfile so it is installed at build time:

  ```
  RUN logstash-plugin install logstash-codec-json
  ```
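  If it helps, the resulting Dockerfile looks roughly like this; the base image and tag below are assumptions, not something this repo pins:

  ```dockerfile
  # Hypothetical Logstash Dockerfile; the base image tag is an assumption.
  FROM docker.elastic.co/logstash/logstash:7.17.0
  RUN logstash-plugin install logstash-codec-json
  ```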
- Add port 5999 to logstash in the docker-compose.yml.
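  For example, assuming the service is named `logstash` as it is in the ELK example, the relevant docker-compose.yml fragment would look roughly like this:

  ```yaml
  logstash:
    ports:
      - "5999:5999"
  ```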
- Create a .env file in this repository's directory. This contains the environment for your app (see env.sample for an example). Set the following variables:

  ```
  LOGSTASH_HOST=dockerelk_logstash_1
  LOGSTASH_PORT=5999
  FLASK_APP=src/flask_app.py
  FLASK_DEBUG=1
  ```
- Now build this app's Docker image:

  ```
  ./mn_build
  ```
- Run the first test app to generate some log data:

  ```
  ./python src/app.py
  ```
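  If you're curious what "generating log data" involves, here is a minimal sketch of what an app like src/app.py could do using only the standard library. This is illustrative, not the repo's actual code; `JsonTcpHandler` is a hypothetical helper:

  ```python
  import json
  import logging
  import os
  import socket

  class JsonTcpHandler(logging.Handler):
      """Hypothetical handler: ship each log record to Logstash as one JSON line over TCP."""

      def __init__(self, host, port):
          super().__init__()
          self.sock = socket.create_connection((host, port))

      def emit(self, record):
          doc = {
              "message": record.getMessage(),
              "level": record.levelname,
              "logger": record.name,
          }
          # Terminate each JSON document with a newline so the tcp input can split events.
          self.sock.sendall((json.dumps(doc) + "\n").encode("utf-8"))

  logger = logging.getLogger("test-app")
  logger.setLevel(logging.INFO)
  logger.addHandler(
      JsonTcpHandler(os.environ["LOGSTASH_HOST"], int(os.environ["LOGSTASH_PORT"]))
  )
  logger.info("Hello from the dockerized test app")
  ```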
- Open Kibana and see if you have content in Elasticsearch.
  - Navigate to http://localhost:5601
  - Click Management in the left menu, then Index Patterns. Choose timestamp for the time series value and click Save.
  - Click Discover in the left menu. You should see logstash as the selected index in the gray column in the middle left. If you see "No Results Found", run your app again and click the magnifying glass in the search box above.
  - When you have data, you should see blue bars in the timeline and log records in a list below. It may take several seconds for log events to reach Elasticsearch; if you don't see them within 20 seconds, something is probably configured incorrectly.
- Run the Flask app:

  ```
  ./flask run --host=0.0.0.0
  ```
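  For reference, a minimal sketch of what src/flask_app.py might contain (an illustration, not the repo's actual code):

  ```python
  from flask import Flask

  app = Flask(__name__)

  @app.route("/")
  def hello():
      # Requests show up in Kibana once app.logger is wired to a Logstash
      # handler, e.g. the hypothetical JsonTcpHandler sketched above.
      app.logger.info("handled a request to /")
      return "Hello World."
  ```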
- Hit the website and see "Hello World.":

  ```
  curl http://localhost:6000
  ```
- Check Kibana again to see your logged messages from Flask.