Review the CI deployment scripts to solve the dependency issue when using Scala 2.11
qxzzxq opened this issue · 1 comment
During the deployment of setl_2.11, the spark_2.4 profile is used. It provides the correct dependency versions, as some of the dependencies used in the default spark_3.0 profile don't support Scala 2.11. However, when setl_2.11 is included in a project that uses Spark 2.4 and Scala 2.11, only the default properties of pom.xml are taken into account, so some transitive dependencies cannot be resolved by the package manager.
Example: #172
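For illustration, a minimal sketch of how such a profile-based layout typically looks (the property names and default values below are purely illustrative, not necessarily those of SETL's actual pom.xml). A downstream project only sees the default properties block of the published pom, because Maven does not activate a dependency's profiles during resolution:
<!-- hypothetical excerpt; property names and the default value are illustrative -->
<properties>
  <!-- defaults match the spark_3.0 profile and are the only values visible
       to projects that depend on the published setl_2.11 artifact -->
  <delta.core.version>x.y.z</delta.core.version>
</properties>
<profiles>
  <profile>
    <id>spark_2.4</id>
    <properties>
      <!-- override only active while setl_2.11 itself is built and deployed -->
      <delta.core.version>0.6.1</delta.core.version>
    </properties>
  </profile>
</profiles>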
A temporary fix is to manually declare these dependencies with their correct versions in the pom or sbt file, for example:
<dependency>
  <groupId>com.audienceproject</groupId>
  <artifactId>spark-dynamodb_2.11</artifactId>
  <version>1.0.4</version>
</dependency>
<dependency>
  <groupId>io.delta</groupId>
  <artifactId>delta-core_2.11</artifactId>
  <version>0.6.1</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.5.1</version>
</dependency>
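The sbt equivalent would look roughly like this (a sketch, assuming the project sets scalaVersion to a 2.11.x release so that %% resolves the _2.11 artifacts):
// build.sbt -- pin the Scala 2.11 compatible versions explicitly
libraryDependencies ++= Seq(
  "com.audienceproject" %% "spark-dynamodb"            % "1.0.4",
  "io.delta"            %% "delta-core"                % "0.6.1",
  "com.datastax.spark"  %% "spark-cassandra-connector" % "2.5.1"
)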
But we could solve this issue properly by modifying the deployment stage of the CI: e.g. change the default versions of these dependencies in the change-scala-version.sh script, or maintain a separate pom file in the dev directory.
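As a sketch of the first option (the sed expressions and property names below are assumptions and would need to be adapted to the properties actually defined in pom.xml), change-scala-version.sh could rewrite the default version properties at the same time as it rewrites the Scala suffixes:
# sketch only: extra rules change-scala-version.sh could apply when switching to 2.11
# (the property names <spark.dynamodb.version>, <delta.core.version> and
#  <spark.cassandra.version> are assumed; check the real pom.xml)
sed -i -e 's|<spark.dynamodb.version>.*</spark.dynamodb.version>|<spark.dynamodb.version>1.0.4</spark.dynamodb.version>|' pom.xml
sed -i -e 's|<delta.core.version>.*</delta.core.version>|<delta.core.version>0.6.1</delta.core.version>|' pom.xml
sed -i -e 's|<spark.cassandra.version>.*</spark.cassandra.version>|<spark.cassandra.version>2.5.1</spark.cassandra.version>|' pom.xml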
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.