HASecuritySolutions/VulnWhisperer

Issues with ELK7

vivekshwarup opened this issue · 5 comments

Set fielddata=true on [plugin_name] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field
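One way to follow the error's own second suggestion ("use a keyword field") is to make sure the index template maps the offending field with a keyword sub-field and point aggregations at that instead of enabling fielddata. A minimal mapping sketch (field name taken from the error message; whether the shipped logstash-vulnwhisperer template already defines this sub-field depends on your version, so treat this as an illustration):

```json
{
  "mappings": {
    "properties": {
      "plugin_name": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword", "ignore_above": 256 }
        }
      }
    }
  }
}
```

With a mapping like this, visualizations and scripted fields can reference plugin_name.keyword, which supports doc values, instead of turning fielddata on for the analyzed text field.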

Same problem here with Elastic Stack 7.3.1.
Logstash correctly pulls the data from my Nessus installation, but it seems to fail to write to Elasticsearch. Kibana has the correct visualizations imported, but the index pattern gives an error on the scripted field "scan_fingerprint", in the script "doc['asset.keyword']+'_'+doc['plugin_id']", with the same error and suggested fix that vivekshwarup wrote.
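If plugin_id is mapped as text in your index, one possible workaround (untested here, and it assumes the template defines a plugin_id.keyword sub-field; if plugin_id is mapped as a numeric type, doc['plugin_id'].value already works) is to edit the scripted field in the index pattern so every text field it touches goes through its keyword sub-field:

```painless
// Sketch of a fixed scan_fingerprint scripted field; note that Painless
// needs .value to read a doc value, which the error message abbreviates.
doc['asset.keyword'].value + '_' + doc['plugin_id.keyword'].value
```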

Same error on ELK 7.5.1. Is there a workaround until the bug fix is released?

Could someone assist with the steps to reproduce this? I encountered a similar issue in the beta-2.0 branch, albeit with a different index field (scan_name). My novice understanding is that one or more of the visualizations are set to aggregate on a text field, so adding the keyword value to the index or visualization (or both) is required. I can't reproduce it on the master branch though -- I'm assuming this issue references master because plugin_name was renamed to signature in beta-2.0.
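To make the failure mode concrete: on ES 7 a terms aggregation against an analyzed text field triggers exactly the fielddata error quoted above, while the same aggregation against the keyword sub-field works. A sketch, with the field name assumed from the error message:

```json
{
  "aggs": {
    "by_plugin": {
      "terms": { "field": "plugin_name.keyword" }
    }
  }
}
```

Pointing the visualization at "plugin_name" instead of "plugin_name.keyword" is what produces the "Set fielddata=true" message.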

Here are the exact steps I followed on a fresh Ubuntu VM with Docker and Docker Compose:

# Clone
scott@elk:~$ git clone https://github.com/HASecuritySolutions/VulnWhisperer.git
scott@elk:~$ cd VulnWhisperer

# Build image
scott@elk:~/VulnWhisperer$ sudo docker build -t vulnwhisperer-local .

# Bump ELK images to 7.5.2
# Change kibana-config template to logstash-vulnwhisperer-template_elk7.json
# Add "discovery.type=single-node" to elasticsearch environment
# Change vulnwhisperer image to vulnwhisperer-local
scott@elk:~/VulnWhisperer$ cp docker-compose.v6.yml docker-compose.yml && nano docker-compose.yml

# Make necessary config changes
scott@elk:~/VulnWhisperer$ nano resources/elk6/vulnwhisperer.ini

# Up
scott@elk:~/VulnWhisperer$ sudo docker-compose up -d

# No errors observed in Kibana with any dashboard or visualization

That said, my fix for scan_name in beta-2.0 was to add field entries for scan_name.keyword to both the index-pattern and the "VulnWhisperer - ScanName" visualization API calls in kibana_APIonly.json.
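For reference, this is the shape of the entry I added to the index-pattern's field list (a sketch of a Kibana 6.x field definition; exact attributes vary between Kibana versions, so check against an existing entry in kibana_APIonly.json):

```json
{
  "name": "scan_name.keyword",
  "type": "string",
  "count": 0,
  "scripted": false,
  "searchable": true,
  "aggregatable": true,
  "readFromDocValues": true
}
```

The same scan_name.keyword name then goes into the terms aggregation of the visualization.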

I was able to get past this with these steps:

  1. Install VW/ELK6 using the provided docker-compose
  2. This step may not be necessary, but I ensured it was fully functional by pulling in data from a Nessus scan
  3. Delete the VW and ELK containers, but do not delete the data
  4. Change the ELK version in the docker-compose to 7.5.1 and run docker-compose again

After that, the issue in this thread did not occur.
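The steps above, sketched as commands (compose file name and working directory assumed to match the repo; the important part is that `down` without `-v` keeps the data volumes, so ES 7 can migrate the ES 6 indices on startup):

```shell
# 1-2. Install VW/ELK6 and optionally verify by pulling in a Nessus scan
sudo docker-compose up -d

# 3. Stop and remove the containers, keeping the data (do NOT pass -v)
sudo docker-compose down

# 4. Edit docker-compose.yml to bump the ELK image tags to 7.5.1, then:
sudo docker-compose up -d
```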

I don't think this is related to the issue discussed here, but one change to the docker-compose was necessary to keep ES from crashing: add "discovery.type=single-node" to the elasticsearch environment.
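Concretely, the elasticsearch service ends up looking something like this (service name and image path assumed to match the repo's compose file):

```yaml
elasticsearch:
  image: docker.elastic.co/elasticsearch/elasticsearch:7.5.1
  environment:
    - discovery.type=single-node
```

In my experience, without discovery.type=single-node, ES 7 treats the default discovery settings as a production cluster configuration and fails its bootstrap checks in a single-container setup.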

Hey guys, just going to give you my two cents:

  • at the moment the ELK part is paused and will not receive any updates; as we are looking forward to refactoring the project and implementing the Elasticsearch vulnerability schema standard, all the dashboards will need to be modified anyway (right now there are different dashboards for different scanners, and the goal is to have one to rule them all).

  • the beta-2.0 branch is under development and has several changes regarding the implementation of the vulnerability standard mapping that are not yet tested or ready for production; I would personally not recommend using that version for now.

  • the reason the steps mentioned by @threatangler-ga work (if I am not wrong) is that once ELK 7 starts up with the data from ELK 6, it migrates the data/schemas "correctly" to the new version; this should indeed be a useful workaround.

If you guys manage to make things work nicely in ELK 7, feel free to open a PR to the project and I will review and merge it; my intention is to make the project easy for the community to contribute to, so that we can all build a useful piece of code that makes our lives easier :)