This GitHub repository hosts my project files toward more standardized SOC reports. The goal is to define an open-source reporting standard for Security Operations Center (SOC) reports.
It starts with the paper on Use Case Applicability, which creates a standard for measuring security monitoring capabilities. This repo also includes the presentations I have given about my taxonomy.
Suggested KPIs, as presented at HACK.LU 2019, are:
KPI | Explanation | Target Value | Owner | Business Impact
---|---|---|---|---
Number of Log Management Rule Configuration Error events per month | This value reflects the rules configured in the SIEM by the SOC analysts. A high number suggests poor rule quality; more training or experience is needed. | < 10 % | Compliance/Operational | SOC operational risk
Number of Announced Administrative/User Action events per month | This value reflects suppressions that should be improved. | < 10 % | Operational/Contractual | Operational risk management
Number of Bad IOC/rule pattern value events per month | If too many events were created by bad IOCs or rule pattern values, the source or the trust placed in it should be questioned. | < 5 % | Compliance/Operational | SOC operational risk
Number of Confirmed Attack attempt without IR actions (best matched with Log Source Category) | Number of events that were detected but prevented by measures in place, or where the alert is not viewed as a high risk. | > 50 % | Policy/Operational/Contractual | Potentially poor risk acceptance practices
Number of Confirmed Attack attempt with IR actions (best matched with Log Source Category) | Very high numbers → the security architecture should be updated. Very low numbers → the rules aren't detecting, or you are safe :) | | Policy | Potential intrusion/prioritise investigation
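The targets above are expressed as percentages of all monthly events, so computing them amounts to dividing each category's event count by the monthly total and comparing against the threshold. The following is a minimal, hypothetical sketch of that calculation; the category names, counts, and the `kpi_percentage` helper are illustrative assumptions, not part of the taxonomy itself:

```python
# Hypothetical sketch: monthly KPI percentages from per-category event counts.
# Category keys and sample counts are made up for illustration; the thresholds
# come from the KPI table above.

def kpi_percentage(category_count: int, total_events: int) -> float:
    """Share (in %) of all monthly events falling into one KPI category."""
    if total_events == 0:
        return 0.0
    return 100.0 * category_count / total_events

# Example monthly event counts per taxonomy category (illustrative numbers)
monthly_counts = {
    "rule_configuration_error": 12,
    "announced_admin_user_action": 30,
    "bad_ioc_rule_pattern": 4,
    "confirmed_attack_no_ir": 140,
    "confirmed_attack_with_ir": 14,
}

# Target values from the KPI table: (comparison, threshold in %)
targets = {
    "rule_configuration_error": ("<", 10.0),
    "announced_admin_user_action": ("<", 10.0),
    "bad_ioc_rule_pattern": ("<", 5.0),
    "confirmed_attack_no_ir": (">", 50.0),
}

total = sum(monthly_counts.values())
for category, (op, threshold) in targets.items():
    pct = kpi_percentage(monthly_counts[category], total)
    met = pct < threshold if op == "<" else pct > threshold
    print(f"{category}: {pct:.1f}% (target {op} {threshold}%) "
          f"-> {'OK' if met else 'REVIEW'}")
```

With the sample counts, the rule-configuration-error share comes out at 6 %, which meets the "< 10 %" target, while the announced-admin/user-action share (15 %) would be flagged for review.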
The full slides can be found in this repository; the video recording is available here: https://www.youtube.com/watch?v=NifSKzogSrI
Before FIRST, a podcast was also published in which I talked with Chris John Riley about this taxonomy and the thinking behind it. The recording can be found here: https://media.first.org/podcasts/FIRST2019-DesireeSacher.mp3
FIRST gave me the opportunity to have the paper published in a special journal edition of ACM DTRAP; this peer-reviewed version can be found here: https://dl.acm.org/doi/10.1145/3370084
Frequently asked questions and a collection of KPIs are gathered in the wiki: https://github.com/d3sre/Use_Case_Applicability/wiki/Frequently-Asked-Questions
This paper was written by Desiree Sacher.
Thank you to Christoph Weber and Michael Kurth for initially trialing the idea.
Thank you to everyone who proofread it before publication: Raphaël Vinot, Corsin Camichel, Eireann Leverett, Florian Roth, Ian Amit, Meline Sieber, Frank Boldewin, Jochen Raymaekers, Francesco Picasso and Amanda Berlin.
The paper was published under the Creative Commons BY license: https://creativecommons.org/licenses/by/4.0/