ge-high-assurance/VERDICT

Cutset likelihood switches between scientific and decimal notation


Cutset likelihood is represented in scientific notation for likelihoods of 1e-05 and smaller, but as a plain decimal otherwise. This notation affects the MBAS output, which maps likelihood levels to severity levels; plain decimal values are not expected there and default to the passing "E" severity.
To reproduce: set a defense DAL to 4 on an otherwise passing AADL configuration, run MBAS, and verify the results still pass. Then generate the GSN security fragments on the same AADL model to visualize the formatting differences between cutset likelihoods.

Background: the soteria_pp visualization:sprint_each_csImp function prints all likelihoods with string_of_float from the stdlib, which uses the format pattern "%.12g", with g defined as:

g or G: convert a floating-point argument to decimal notation, in style f or e, E (whichever is more compact). Moreover, any trailing zeros are removed from the fractional part of the result and the decimal-point character is removed if there is no fractional part remaining.
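This behaviour is easy to reproduce in an OCaml toplevel. A minimal sketch, using illustrative power-of-ten likelihoods rather than values from a specific VERDICT run:

```ocaml
(* string_of_float is "%.12g" plus a trailing "." when the result
   contains neither '.' nor 'e' (see the stdlib link below). *)
let () =
  [1.0; 1e-3; 1e-4; 1e-5; 1e-7]
  |> List.iter (fun l -> print_endline (string_of_float l))
(* prints:
   1.
   0.001
   0.0001
   1e-05
   1e-07 *)
```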

Implementation Details:
Soteria++ output should always format cutset likelihoods in e±dd scientific notation instead of the decimal forms string_of_float produces ("1." in the case of 1e+00).
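A minimal sketch of one way to get that formatting in OCaml; the helper name sprint_likelihood is hypothetical, not the actual soteria_pp code, and the 0-digit precision assumes likelihoods are exact powers of ten (use a wider precision such as "%e" otherwise):

```ocaml
(* Hypothetical helper: always print with an explicit exponent. *)
let sprint_likelihood (l : float) : string =
  Printf.sprintf "%.0e" l
(* sprint_likelihood 1.0   = "1e+00"
   sprint_likelihood 1e-04 = "1e-04"
   sprint_likelihood 1e-07 = "1e-07" *)
```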

References:


https://github.com/ocaml/ocaml/blob/trunk/stdlib/stdlib.ml#L287
https://v2.ocaml.org/api/Printf.html