usnistgov/SP800-90B_EntropyAssessment

Is ARM supported in this project?

Closed this issue · 8 comments

Hi,

I'm trying to cross-compile these testing tools for ARM, but the build throws several errors. I was able to build it for ARM64 after making a few modifications to the source code, for example in the main Makefile. Attached is the patch for the Makefile:

From: Arturo Buzarra <arturo.buzarra@digi.com>
Date: Mon, 31 Jan 2022 15:13:55 +0100
Subject: [PATCH 1/2] Makefile: add support to cross compilation

Signed-off-by: Arturo Buzarra <arturo.buzarra@digi.com>
---
 cpp/Makefile | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/cpp/Makefile b/cpp/Makefile
index 86e58e8..a1c1138 100644
--- a/cpp/Makefile
+++ b/cpp/Makefile
@@ -1,7 +1,14 @@
-CXX = g++
-CXXFLAGS = -std=c++11 -fopenmp -O2 -msse2 -ffloat-store -march=native
+ARCH ?= x86
+
+CXX ?= $(CROSS_COMPILE)g++
+
+CXXFLAGS = -std=c++11 -fopenmp -O2 -ffloat-store
+ifeq ($(ARCH),x86)
+CXXFLAGS += -msse2 -march=native
+endif
+
 #CXX = clang++-8
 #CXXFLAGS = -Wno-padded -Wno-disabled-macro-expansion -Wno-gnu-statement-expression -Wno-bad-function-cast -fopenmp -O1 -fsanitize=address -fsanitize=undefined -fdenormal-fp-math=ieee -msse2 -march=native
 #static analysis in clang using
 #scan-build-8 --use-c++=/usr/bin/clang++-8 make
 LIB = -lbz2 -lpthread -ldivsufsort 

Is there ARM support for your project?

Thanks in advance,

Arturo.

celic commented

We do not actively support ARM. If it works and you would like to contribute this to the repository, great. It is not something we will test for future development.

I have compiled and used this tool on Amazon EC2 A1 instances (which are ARM64 based), but as you indicate, it requires some changes to the Makefile. It is slower (and less cost efficient) than EC2 r5.12xlarge (large single-NUMA-node Intel instances), so I have not done a great deal of work on ARM architectures.

> We do not actively support ARM. If it works and you would like to contribute this to the repository, great. It is not something we will test for future development.

I sent a PR for that. The change to the Makefile is only a few lines, with minimal risk to the x86 compilation (I tested it).

As I said in my previous comment, I was able to compile it for ARM64. However, I can't compile it for 32-bit ARM, because cpp/shared/utils.h uses a 128-bit integer type (__uint128_t) internally for its calculations before returning a uint64_t. My question is whether it would be possible to use another data structure compatible with these 32-bit architectures to perform the calculations (maybe two uint64_t for the upper and lower halves).
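The split suggested above (two uint64_t halves) could look like a portable 64×64→128-bit multiply built from 32-bit limb products. This is a generic sketch of that technique, not the code currently in cpp/shared/utils.h, and the names here are hypothetical:

```cpp
#include <cstdint>

// Hypothetical portable 64x64 -> 128-bit multiply, returning the result
// as two uint64_t halves. Built from 32-bit limb products, so it needs
// no __uint128_t support from the compiler.
struct u128parts { uint64_t hi; uint64_t lo; };

u128parts mul_64x64(uint64_t a, uint64_t b) {
    const uint64_t mask = 0xFFFFFFFFULL;
    uint64_t a_lo = a & mask, a_hi = a >> 32;
    uint64_t b_lo = b & mask, b_hi = b >> 32;

    uint64_t p0 = a_lo * b_lo;  // contributes to bits 0..63
    uint64_t p1 = a_lo * b_hi;  // contributes to bits 32..95
    uint64_t p2 = a_hi * b_lo;  // contributes to bits 32..95
    uint64_t p3 = a_hi * b_hi;  // contributes to bits 64..127

    // Sum the middle 32-bit columns, keeping the carry out of bit 63.
    uint64_t mid = (p0 >> 32) + (p1 & mask) + (p2 & mask);

    u128parts r;
    r.lo = (mid << 32) | (p0 & mask);
    r.hi = p3 + (p1 >> 32) + (p2 >> 32) + (mid >> 32);
    return r;
}
```

Each 32×32 product fits in a uint64_t, so no intermediate step can overflow.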

Thanks,

Arturo.

This is in randomRange64? That does genuinely require 128-bit integers. If those aren't available, restructuring this code is possible, but, frankly, it isn't clear to me why that's useful. Why are you trying to use this tool on a 32-bit architecture?
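As an illustration of why a 64-bit range routine can genuinely need 128-bit integers: a common way to map a uniform 64-bit value into [0, range) is a widening multiply followed by a shift, whose intermediate product is up to 128 bits wide. This is a sketch of that general multiply-shift technique, not necessarily how randomRange64 is actually implemented:

```cpp
#include <cstdint>

// Map a uniform 64-bit value x into [0, range) via a widening multiply.
// The product of two 64-bit values can be up to 128 bits wide, which is
// why __uint128_t (a GCC/Clang extension on 64-bit targets) appears here.
// Illustrative only; the actual randomRange64 code may differ.
uint64_t reduce_to_range(uint64_t x, uint64_t range) {
    __uint128_t wide = static_cast<__uint128_t>(x) * range;
    return static_cast<uint64_t>(wide >> 64);  // always < range
}
```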

> This is in randomRange64? That does genuinely require 128-bit integers. If those aren't available, restructuring this code is possible, but, frankly, it isn't clear to me why that's useful. Why are you trying to use this tool on a 32-bit architecture?

AFAIK, we can use these tools to measure the quality of an entropy source, whether internal (SoC) or external (maybe a crypto authentication chip), but in my case it is running on a system based on ARM/ARM64.

These tools are used during validation assessment (e.g., during a FIPS 140 validation process), and are not expected to run on the environment where the entropy source is implemented. One can get a dataset from any entropy source (hardware without a general purpose processor, TPM, SE, etc.) and then perform validation testing using the dataset with some architecture more appropriate for large-scale analysis.

> These tools are used during validation assessment (e.g., during a FIPS 140 validation process), and are not expected to run on the environment where the entropy source is implemented. One can get a dataset from any entropy source (hardware without a general purpose processor, TPM, SE, etc.) and then perform validation testing using the dataset with some architecture more appropriate for large-scale analysis.

I thought this suite provided both the tools to collect the entropy data and the tools to test its quality, which is why I wanted to run both on the same platform. Is there any tool to collect entropy data in the right format for this test suite?

Thanks in advance,

Arturo.

celic commented

Data can be formatted for this tool by using Python or any scripting language. The tool expects binary files; there are no special formatting requirements beyond that.
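As a minimal sketch of what "no special formatting" means in practice: the input is just a flat file of raw sample bytes, with no header or delimiters. The filename and sample values below are illustrative:

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

// Write raw 8-bit samples to a flat binary file: no header, no
// delimiters, one byte per sample. That flat byte stream is the only
// formatting the assessment tool expects. Filename is illustrative.
bool write_samples(const std::vector<uint8_t>& samples, const char* path) {
    std::ofstream out(path, std::ios::binary);
    if (!out) return false;
    out.write(reinterpret_cast<const char*>(samples.data()),
              static_cast<std::streamsize>(samples.size()));
    return out.good();
}
```

Any language that can write raw bytes (Python's `bytes`, a shell `dd`, etc.) produces an equivalent file.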