SparkRDMA is a high-performance ShuffleManager plugin for Apache Spark that uses RDMA (instead of TCP) when performing Shuffle data transfers in Spark jobs.
This open-source project is developed, maintained and supported by Mellanox Technologies.
Example performance speedup for HiBench TeraSort:
Running TeraSort with SparkRDMA is 1.41x faster than standard Spark (runtime measured in seconds)
Testbed:
175GB Workload
15 Workers, 2x Intel Xeon E5-2697 v3 @ 2.60GHz, 28 cores per Worker, 256GB RAM, non-flash storage (HDD)
Mellanox ConnectX-4 network adapter with 100GbE RoCE fabric, connected with a Mellanox Spectrum switch
For more information on configuration, performance tuning and troubleshooting, please visit the SparkRDMA GitHub Wiki
- Apache Spark 2.0.0/2.1.0/2.2.0
- Java 8
- An RDMA-supported network, e.g. RoCE or InfiniBand
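A quick way to check that an RDMA-capable device is visible on a node (assuming the libibverbs utilities are installed) is to run:

ibv_devinfo

It should list at least one device (for example mlx5_0) with its port in the PORT_ACTIVE state; otherwise the RDMA fabric is not ready for SparkRDMA.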
Please use the "Releases" page to download pre-built binaries.
If you would like to build the project yourself, please refer to the "Build" section below.
The pre-built binaries are packed as an archive that contains the following files:
- spark-rdma-1.0-for-spark-2.0.0-jar-with-dependencies.jar
- spark-rdma-1.0-for-spark-2.1.0-jar-with-dependencies.jar
- spark-rdma-1.0-for-spark-2.2.0-jar-with-dependencies.jar
- libdisni.so
libdisni.so must be installed on every Spark Master and Worker (usually in /usr/lib)
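One simple way to distribute the library is sketched below; it assumes password-less SSH and a hosts.txt file listing your Master and Worker host names (both of which are placeholders here):

# copy libdisni.so to /usr/lib on every node listed in hosts.txt
for host in $(cat hosts.txt); do
  scp libdisni.so ${host}:/usr/lib/
done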
Provide Spark with the location of the SparkRDMA plugin jars by using the extraClassPath option. For standalone mode this can be added to either spark-defaults.conf or any runtime configuration file. For client mode this must be added to spark-defaults.conf. For Spark 2.0.0 (replace with 2.1.0 or 2.2.0 according to your Spark version):
spark.driver.extraClassPath /path/to/SparkRDMA/target/spark-rdma-1.0-for-spark-2.0.0-jar-with-dependencies.jar
spark.executor.extraClassPath /path/to/SparkRDMA/target/spark-rdma-1.0-for-spark-2.0.0-jar-with-dependencies.jar
To enable the SparkRDMA Shuffle Manager plugin, add the following line to either spark-defaults.conf or any runtime configuration file:
spark.shuffle.manager org.apache.spark.shuffle.rdma.RdmaShuffleManager
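In standalone mode the same settings can also be passed at submit time instead of editing spark-defaults.conf; a minimal sketch for Spark 2.0.0, in which the master URL, application class and application jar are placeholders:

spark-submit --master spark://master:7077 \
  --conf spark.driver.extraClassPath=/path/to/SparkRDMA/target/spark-rdma-1.0-for-spark-2.0.0-jar-with-dependencies.jar \
  --conf spark.executor.extraClassPath=/path/to/SparkRDMA/target/spark-rdma-1.0-for-spark-2.0.0-jar-with-dependencies.jar \
  --conf spark.shuffle.manager=org.apache.spark.shuffle.rdma.RdmaShuffleManager \
  --class com.example.MyApp /path/to/my-app.jar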
Building the SparkRDMA plugin requires Apache Maven and Java 8
- Obtain a clone of SparkRDMA
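For example (assuming the upstream repository location; substitute your own fork if needed):

git clone https://github.com/Mellanox/SparkRDMA.git
cd SparkRDMA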
- Build the plugin for your Spark version (either 2.0.0, 2.1.0 or 2.2.0), e.g. for Spark 2.0.0:
mvn -DskipTests clean package -Pspark-2.0.0
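A successful build places the plugin jar under target/, e.g. target/spark-rdma-1.0-for-spark-2.0.0-jar-with-dependencies.jar, which is the path referenced in the extraClassPath configuration above.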
- Obtain a clone of DiSNI for building libdisni:
git clone https://github.com/zrlio/disni.git
cd disni
git checkout tags/v1.3 -b v1.3
- Compile and install only libdisni (the DiSNI jars are already included in the SparkRDMA plugin jars):
cd libdisni
./autoprepare.sh
./configure --with-jdk=/path/to/java8/jdk
make
make install
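To confirm the installation, check where the shared library landed (the autotools default prefix is /usr/local unless overridden with --prefix) and refresh the dynamic linker cache; the exact paths below are assumptions:

ls -l /usr/lib/libdisni* /usr/local/lib/libdisni* 2>/dev/null
sudo ldconfig

If the library is not in the location expected by your Spark nodes (usually /usr/lib, as noted above), copy or symlink it there.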
For any questions, issues or suggestions, please use our Google group: https://groups.google.com/forum/#!forum/sparkrdma
Pull request submissions are also welcome