mc2-project/opaque-sql

Build error: (*:buildFlatbuffers) Flatbuffers build failed.

Closed this issue · 15 comments

Hi, I am trying to reproduce the benchmark, but I failed to compile the whole project. Your prebuilt Docker image works fine, but I am afraid that benchmarking inside a container may hurt peak performance. Have you seen this FlatBuffers error before? Thanks!

./build/sbt compile
./build/sbt: line 201: [[: .: syntax error: operand expected (error token is ".")
OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
[info] Loading project definition from /home/lqp0562/opaque/project
[info] Compiling 1 Scala source to /home/lqp0562/opaque/project/target/scala-2.10/sbt-0.13/classes...
[info] Set current project to opaque (in build file:/home/lqp0562/opaque/)
[info] Executing in batch mode. For better performance use sbt's shell
[info] Generating flatbuffers for /home/lqp0562/opaque/src/flatbuffers/Expr.fbs
/home/lqp0562/opaque/target/flatbuffers/flatbuffers-1.7.0/flatc: error: Unable to generate C++ for Expr
Usage: /home/lqp0562/opaque/target/flatbuffers/flatbuffers-1.7.0/flatc [OPTION]... FILE... [-- FILE...]
--binary -b Generate wire format binaries for any data definitions.
--json -t Generate text output for any data definitions.
--cpp -c Generate C++ headers for tables/structs.
--go -g Generate Go files for tables/structs.
--java -j Generate Java classes for tables/structs.
--js -s Generate JavaScript code for tables/structs.
--ts -T Generate TypeScript code for tables/structs.
--csharp -n Generate C# classes for tables/structs.
--python -p Generate Python files for tables/structs.
--php Generate PHP files for tables/structs.
-o PATH Prefix PATH to all generated files.
-I PATH Search for includes in the specified path.
-M Print make rules for generated files.
--version Print the version number of flatc and exit.
--strict-json Strict JSON: field names must be / will be quoted,
no trailing commas in tables/vectors.
--allow-non-utf8 Pass non-UTF-8 input through parser and emit nonstandard
\x escapes in JSON. (Default is to raise parse error on
non-UTF-8 input.)
--defaults-json Output fields whose value is the default when
writing JSON
--unknown-json Allow fields in JSON that are not defined in the
schema. These fields will be discared when generating
binaries.
--no-prefix Don't prefix enum values with the enum type in C++.
--scoped-enums Use C++11 style scoped and strongly typed enums.
also implies --no-prefix.
--gen-includes (deprecated), this is the default behavior.
If the original behavior is required (no include
statements) use --no-includes.
--no-includes Don't generate include statements for included
schemas the generated file depends on (C++).
--gen-mutable Generate accessors that can mutate buffers in-place.
--gen-onefile Generate single output file for C#.
--gen-name-strings Generate type name functions for C++.
--escape-proto-ids Disable appending '_' in namespaces names.
--gen-object-api Generate an additional object-based API.
--cpp-ptr-type T Set object API pointer type (default std::unique_ptr)
--cpp-str-type T Set object API string type (default std::string)
T::c_str() and T::length() must be supported
--no-js-exports Removes Node.js style export lines in JS.
--goog-js-export Uses goog.exports* for closure compiler exporting in JS.
--go-namespace Generate the overrided namespace in Golang.
--raw-binary Allow binaries without file_indentifier to be read.
This may crash flatc given a mismatched schema.
--proto Input is a .proto, translate to .fbs.
--grpc Generate GRPC interfaces for the specified languages
--schema Serialize schemas instead of JSON (use with -b)
--bfbs-comments Add doc comments to the binary schema files.
--conform FILE Specify a schema the following schemas should be
an evolution of. Gives errors if not.
--conform-includes Include path for the schema given with --conform
PATH
--include-prefix Prefix this path to any generated include statements.
PATH
--keep-prefix Keep original prefix of schema include statement.
--no-fb-import Don't include flatbuffers import statement for TypeScript.
--no-ts-reexport Don't re-export imported dependencies for TypeScript.
FILEs may be schemas, or JSON files (conforming to preceding schema)
FILEs after the -- must be binary flatbuffer format files.
Output files are named using the base file name of the input,
and written to the current directory or the path given by -o.
example: /home/lqp0562/opaque/target/flatbuffers/flatbuffers-1.7.0/flatc -c -b schema1.fbs schema2.fbs data.json
java.lang.RuntimeException: Flatbuffers build failed.
at scala.sys.package$.error(package.scala:27)
at $97f688414a332020e117$$anonfun$$sbtdef$1$$anonfun$apply$2.apply(/home/lqp0562/opaque/build.sbt:219)
at $97f688414a332020e117$$anonfun$$sbtdef$1$$anonfun$apply$2.apply(/home/lqp0562/opaque/build.sbt:215)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at $97f688414a332020e117$$anonfun$$sbtdef$1.apply(/home/lqp0562/opaque/build.sbt:215)
at $97f688414a332020e117$$anonfun$$sbtdef$1.apply(/home/lqp0562/opaque/build.sbt:200)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
[error] (*:buildFlatbuffers) Flatbuffers build failed.
[error] Total time: 0 s, completed Aug 14, 2020, 12:28:43 PM

Hi, we haven't seen this before, but it could be an environment problem. What is the output of java -version for you?

openjdk version "11.0.7" 2020-04-14
OpenJDK Runtime Environment (build 11.0.7+10-post-Ubuntu-2ubuntu218.04)
OpenJDK 64-Bit Server VM (build 11.0.7+10-post-Ubuntu-2ubuntu218.04, mixed mode, sharing)

I don't think Opaque will work with OpenJDK 11. I believe this limitation comes from our use of Spark 2.4.0 and Scala 2.11. I'd suggest downgrading to OpenJDK 8 (sudo apt install openjdk-8-jdk; sudo update-alternatives --config java).
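If changing the system-wide default is undesirable, a per-shell sketch is also possible (the path is an assumption based on Ubuntu's openjdk-8-jdk package layout; verify it with ls /usr/lib/jvm):

```shell
# Pin JDK 8 for the current shell only, without touching update-alternatives.
# The java-8-openjdk-amd64 path is an assumption from Ubuntu's package layout.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # should now report a 1.8.0_x version
```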

It appears that our build/sbt script parses the Java version incorrectly for OpenJDK 11. We should upgrade to the latest sbt.

I'm not sure how that could produce the Flatbuffers error you see, though.
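The mis-parse is plausible because the output of java -version changed shape between JDK 8 and JDK 9+. A hedged sketch of the kind of check newer launcher scripts perform (the function name is mine, not from build/sbt):

```shell
# Extract the major Java version from a version string.
# JDK 8 reports "1.8.0_252" (legacy "1.x" scheme); JDK 11 reports "11.0.7".
# A script that only expects the "1.x" scheme chokes on the new format.
parse_major() {
  ver="$1"
  case "$ver" in
    1.*) v="${ver#1.}"; echo "${v%%.*}" ;;  # "1.8.0_252" -> "8"
    *)   echo "${ver%%.*}" ;;               # "11.0.7"    -> "11"
  esac
}

# Feed it the actual version string (empty if java is not installed).
parse_major "$(java -version 2>&1 | awk -F '"' '/version/ {print $2; exit}')"
```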

Thanks for your suggestion! I have another concern. As I said before, I can run the prebuilt Docker image provided by the Opaque repo, and I ran several benchmarks in the container; here are some results. The Spark SQL performance is good: over 1 million rows, BDB1 takes 0.2 seconds, BDB2 takes 3.5 seconds, and BDB3 takes 24.2 seconds. However, the encrypted Opaque mode is much slower than Spark SQL: over 1 million rows, BDB1 takes 8 seconds, BDB2 takes 32 seconds, and BDB3 takes 237 seconds. Have you tried running the BDB queries over 1 million rows in your Docker image? I am quite confused by this performance gap between encrypted Opaque and Spark SQL, because according to the Opaque NSDI '17 paper they should have similar performance over 1 million rows:

In the single-machine setting, Opaque’s encryption mode performance varies from 58% performance gain to 2.5x performance loss when compared with the Spark SQL baseline.

When querying the BDB "tiny" dataset, the performance gap between Spark SQL and encrypted Opaque is acceptable; see the size=10000 results in the logs below. The Opaque paper does not mention the BDB dataset size in the evaluation section. As the second author of the Opaque paper, do you still remember it? Thanks!

I generated benchmark datasets of different sizes (10k, 100k, 1 million, 10 million rows). Here are my testing scripts (I modified the build.sbt file to raise the JVM memory upper bound to ~60 GB):

javaOptions in run ++= Seq("-Xmx60048m", "-XX:ReservedCodeCacheSize=384m", "-Dspark.master=local[1]")

sudo docker run -it -m 64g --name opaque-bench -w /home/opaque/opaque ankurdave/opaque
sudo docker start opaque-bench
sudo docker cp ../benchmark/opaque/build.sbt opaque-bench:/home/opaque/opaque/
sudo docker cp ../benchmark/opaque/Benchmark.scala opaque-bench:/home/opaque/opaque/src/main/scala/edu/berkeley/cs/rise/opaque/benchmark/
sudo docker cp ../benchmark/data/rankings opaque-bench:/home/opaque/opaque/data/bdb/
sudo docker cp ../benchmark/data/uservisits opaque-bench:/home/opaque/opaque/data/bdb/
sudo docker exec -w /home/opaque/opaque opaque-bench build/sbt run edu.berkeley.cs.rise.opaque.benchmark.Benchmark >> ../benchmark/opaque_log.txt 2>&1
sudo docker stop opaque-bench

Here are the Spark SQL results:

20/08/15 03:39:08 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 40.065295}
20/08/15 03:39:08 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 47.199406}
20/08/15 03:39:09 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 352.605568}
20/08/15 03:39:13 INFO Utils: {"size": "10000000", "system": "spark sql", "query": "big data 1", "distributed": true, "sgx": "sim", "time": 1654.752109}
20/08/15 03:39:13 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 32.353021}
20/08/15 03:39:13 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 90.206368}
20/08/15 03:39:14 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 345.923982}
20/08/15 03:39:17 INFO Utils: {"size": "10000000", "system": "spark sql", "query": "big data 1", "distributed": true, "sgx": "sim", "time": 1415.399961}
20/08/15 03:39:18 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 30.165402}
20/08/15 03:39:18 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 42.678564}
20/08/15 03:39:18 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 345.353915}
20/08/15 03:39:22 INFO Utils: {"size": "10000000", "system": "spark sql", "query": "big data 1", "distributed": true, "sgx": "sim", "time": 1678.413155}
20/08/15 03:39:25 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 2177.881876}
20/08/15 03:39:33 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 1764.519742}
20/08/15 03:40:26 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 3237.214506}
20/08/15 03:49:01 INFO Utils: {"size": "10000000", "system": "spark sql", "query": "big data 2", "distributed": true, "sgx": "sim", "time": 21724.740592}
20/08/15 03:49:06 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 4342.017193}
20/08/15 03:49:08 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 1502.131684}
20/08/15 03:49:13 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 2915.579597}
20/08/15 03:49:50 INFO Utils: {"size": "10000000", "system": "spark sql", "query": "big data 2", "distributed": true, "sgx": "sim", "time": 20585.059524}
20/08/15 03:49:51 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 698.273579}
20/08/15 03:49:53 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 1872.915314}
20/08/15 03:49:58 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 2867.894131}
20/08/15 03:50:38 INFO Utils: {"size": "10000000", "system": "spark sql", "query": "big data 2", "distributed": true, "sgx": "sim", "time": 21935.647826}
20/08/20 17:23:46 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 846.47859}
20/08/20 17:23:48 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 208.491093}
20/08/20 17:23:54 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 237.443078}
20/08/20 17:23:57 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 2257.048925}
20/08/20 17:24:05 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 1913.835306}
20/08/20 17:24:59 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 3511.305008}
20/08/20 17:25:06 INFO Utils: {"size": "10000", "system": "spark sql", "query": "big data 3", "distributed": false, "sgx": "sim", "time": 6651.975442}
20/08/20 17:25:14 INFO Utils: {"size": "100000", "system": "spark sql", "query": "big data 3", "distributed": false, "sgx": "sim", "time": 8026.138777}
20/08/20 17:25:41 INFO Utils: {"size": "1000000", "system": "spark sql", "query": "big data 3", "distributed": false, "sgx": "sim", "time": 24220.148276}

Here are the encrypted Opaque results:

20/08/20 17:25:44 INFO Utils: {"size": "10000", "system": "encrypted", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 158.813051}
20/08/20 17:25:46 INFO Utils: {"size": "100000", "system": "encrypted", "query": "big data 1", "distributed": false, "sgx": "sim", "time": 1096.49591}
20/08/20 17:26:04 INFO Utils: {"size": "1000000", "system": "encrypted", "query": "big data 1", "distributed": true, "sgx": "sim", "time": 8236.803192}
20/08/20 17:26:07 INFO Utils: {"size": "10000", "system": "encrypted", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 2409.730804}
20/08/20 17:26:36 INFO Utils: {"size": "100000", "system": "encrypted", "query": "big data 2", "distributed": false, "sgx": "sim", "time": 23512.367981}
20/08/20 17:33:16 INFO Utils: {"size": "1000000", "system": "encrypted", "query": "big data 2", "distributed": true, "sgx": "sim", "time": 323343.223713}
20/08/20 17:33:20 INFO Utils: {"size": "10000", "system": "encrypted", "query": "big data 3", "distributed": false, "sgx": "sim", "time": 3318.489524}
20/08/20 17:33:43 INFO Utils: {"size": "100000", "system": "encrypted", "query": "big data 3", "distributed": false, "sgx": "sim", "time": 18009.025002}
20/08/20 17:38:31 INFO Utils: {"size": "1000000", "system": "encrypted", "query": "big data 3", "distributed": true, "sgx": "sim", "time": 236927.872135}

Yes, this is a known performance regression that was introduced in #12: we switched from (manual) compilation to interpretation. If you want to reproduce the paper's performance numbers, I would suggest using this version of the code: https://github.com/mc2-project/opaque/tree/c42fe1bb758a93239fae284885c3d64991affddf

Also, I believe the BDB size in the paper was 1 million.

I see. Would you mind providing a Docker image of that version? We are writing a paper and want to compare against Opaque in the prior-work section, and it would clearly be unfair to compare against the slower version. I ran into too many errors while compiling Opaque :( Thanks!

By the way, I hit this error while building the current Opaque Docker image from docker/Dockerfile myself. Maybe you should replace the URL with a stable one; I think the updated URL should be https://dl.bintray.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.17/sbt-launch.jar
I am not familiar with sbt, but perhaps you should update the sbt version used in the project?

Downloading sbt launcher for 0.13.17:
From http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.17/sbt-launch.jar
To /home/opaque/.sbt/launchers/0.13.17/sbt-launch.jar
Download failed. Obtain the jar manually and place it at /home/opaque/.sbt/launchers/0.13.17/sbt-launch.jar
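For reference, a workaround sketch that fetches the launcher manually and places it at the path the build script probes (paths are from the error message above; the Maven Central mirror URL is an assumption, since the typesafe repo has moved and mirrors change):

```shell
# Fetch the sbt launcher jar manually into the location build/sbt expects.
SBT_VERSION=0.13.17
DEST="$HOME/.sbt/launchers/$SBT_VERSION/sbt-launch.jar"
mkdir -p "$(dirname "$DEST")"
curl -fL -o "$DEST" \
  "https://repo1.maven.org/maven2/org/scala-sbt/sbt-launch/$SBT_VERSION/sbt-launch.jar" \
  || echo "Download failed; place the jar at $DEST manually."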

I am trying to build the c42fe1bb758a93239fae284885c3d64991affddf version of Opaque, which uses a different sbt version. I added "RUN git checkout c42fe1bb758a93239fae284885c3d64991affddf" after the repo clone in the Dockerfile, but I cannot build the image successfully:

CXX <= ServiceProvider/Main.cpp
CXX <= ServiceProvider/ecp.cpp
ServiceProvider/ecp.cpp:136:60: note: #pragma message: Default key derivation function is used.
#pragma message ("Default key derivation function is used.")
^
CXX <= ServiceProvider/service_provider.cpp
CXX <= ServiceProvider/ias_ra.cpp
CXX <= ServiceProvider/sp_crypto.cpp
LINK => libservice_provider.so
GEN => App/Enclave_u.c
error: The attribute 'sizefunc' is deprecated. Please use 'size' attribute instead.
Makefile:176: recipe for target 'App/Enclave_u.c' failed
make: *** [App/Enclave_u.c] Error 255
java.lang.RuntimeException: C++ build failed.
at scala.sys.package$.error(package.scala:27)
at $a51b140b3d8bfd9910ef$$anonfun$$sbtdef$1.apply(/home/opaque/opaque/build.sbt:33)
at $a51b140b3d8bfd9910ef$$anonfun$$sbtdef$1.apply(/home/opaque/opaque/build.sbt:30)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
[error] (*:enclaveBuild) C++ build failed.
[error] Total time: 27 s, completed Aug 20, 2020 8:07:26 PM

Unfortunately I don't have a Docker image of that version, so I'd have to go through these steps as well.

Regarding the sizefunc deprecation error, perhaps you could try disabling fatal warnings by removing -Werror from https://github.com/mc2-project/opaque/blob/c42fe1bb758a93239fae284885c3d64991affddf/src/enclave/Makefile?
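Something like the following edit, demonstrated on a scratch file rather than the real tree (in the repo the target would be src/enclave/Makefile; whether -Werror is the only blocker is an assumption):

```shell
# Strip -Werror so deprecation warnings (like 'sizefunc') stop being fatal.
# Demonstrated on a scratch copy; swap in src/enclave/Makefile for real use.
printf 'CXXFLAGS := -O2 -Werror -Wall\n' > /tmp/Makefile.demo
sed -i.bak 's/ -Werror//g' /tmp/Makefile.demo
cat /tmp/Makefile.demo
```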

Oh, I will. I did this before when I tried to compile it outside the Docker container.

I had to modify

src/enclave/Enclave/Edger8rSyntax/Pointers.edl

public void ecall_pointer_sizefunc([size = get_buffer_len, in, out] char *buf);

to

public void ecall_pointer_sizefunc([user_check] char *buf);

to get it to compile, because the build cannot find the get_buffer_len method.

I am benchmarking it again. Thanks for your support!

I still want to build it on the host machine and run it in hardware mode, but I hit a strange error:

build.sh: 5: build.sh: Bad substitution
CXX  <=  ServiceProvider/Main.cpp
CXX  <=  ServiceProvider/ecp.cpp
ServiceProvider/ecp.cpp:136:60: note: #pragma message: Default key derivation function is used.
 #pragma message ("Default key derivation function is used.")
                                                            ^
CXX  <=  ServiceProvider/service_provider.cpp
CXX  <=  ServiceProvider/ias_ra.cpp
CXX  <=  ServiceProvider/sp_crypto.cpp
ServiceProvider/sp_crypto.cpp: In function ‘lc_status_t lc_ecdsa_sign(const uint8_t*, uint32_t, lc_ec256_private_t*, lc_ec256_signature_t*, lc_ecc_state_handle_t)’:
ServiceProvider/sp_crypto.cpp:555:16: error: invalid use of incomplete type ‘ECDSA_SIG {aka struct ECDSA_SIG_st}’
   BN_bn2bin(sig->r, (uint8_t *) x_);
                ^~
In file included from /usr/include/openssl/x509.h:22:0,
                 from /usr/include/openssl/pem.h:17,
                 from ServiceProvider/sp_crypto.h:41,
                 from ServiceProvider/sp_crypto.cpp:1:
/usr/include/openssl/ec.h:1120:16: note: forward declaration of ‘ECDSA_SIG {aka struct ECDSA_SIG_st}’
 typedef struct ECDSA_SIG_st ECDSA_SIG;
                ^~~~~~~~~~~~
ServiceProvider/sp_crypto.cpp:556:16: error: invalid use of incomplete type ‘ECDSA_SIG {aka struct ECDSA_SIG_st}’
   BN_bn2bin(sig->s, (uint8_t *) y_);
                ^~
In file included from /usr/include/openssl/x509.h:22:0,
                 from /usr/include/openssl/pem.h:17,
                 from ServiceProvider/sp_crypto.h:41,
                 from ServiceProvider/sp_crypto.cpp:1:
/usr/include/openssl/ec.h:1120:16: note: forward declaration of ‘ECDSA_SIG {aka struct ECDSA_SIG_st}’
 typedef struct ECDSA_SIG_st ECDSA_SIG;
                ^~~~~~~~~~~~
Makefile:227: recipe for target 'ServiceProvider/sp_crypto.o' failed
make: *** [ServiceProvider/sp_crypto.o] Error 1

This could be due to a libssl version problem. What version do you have installed? You may want to try building using version 1.1.0.
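One way to check which side of the API break you are on (this reflects my understanding that OpenSSL 1.1.0 made ECDSA_SIG an opaque struct, so direct sig->r / sig->s access stops compiling and ECDSA_SIG_get0 accessors must be used instead; the helper name below is mine):

```shell
# Classify an OpenSSL version string by whether ECDSA_SIG is still a
# transparent struct (pre-1.1.0) or opaque (1.1.0+).
classify_ssl() {
  case "$1" in
    0.*|1.0.*) echo "pre-1.1.0: sig->r / sig->s direct access compiles" ;;
    *)         echo "1.1.0+: use ECDSA_SIG_get0() accessors" ;;
  esac
}

# Classify the locally installed version (empty string if openssl is absent).
classify_ssl "$(openssl version 2>/dev/null | awk '{print $2}')"
```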

Thanks, it was indeed an SSL version problem. After downgrading to 1.0.2, I can compile Opaque now.