Use mvn -f scala2.13/ in the build scripts to build the 2.13 jars #11608

Merged · 1 commit · Oct 16, 2024
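This PR replaces `cd scala2.13` steps in the build scripts with Maven's `-f`/`--file` option, which points Maven at an alternate POM (or the directory containing one) without changing the working directory. A minimal sketch of the substitution, using `verify` as an illustrative goal:

```shell
# Before: enter the Scala 2.13 build tree, then run Maven there
cd scala2.13
mvn verify

# After: stay in the repository root; -f selects the scala2.13 POM
mvn verify -f scala2.13/
```

Keeping the working directory stable means relative paths in the scripts (for example `jenkins/` and integration-test resources) resolve against the repo root in every code path.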
8 changes: 2 additions & 6 deletions .github/workflows/mvn-verify-check.yml
@@ -246,12 +246,10 @@ jobs:
echo "Generated Scala 2.13 build files don't match what's in repository"
exit 1
fi
# change to Scala 2.13 Directory
cd scala2.13
# test command, will retry for 3 times if failed.
max_retry=3; delay=30; i=1
while true; do
mvn package \
mvn package -f scala2.13/ \
-pl integration_tests,tests,tools -am -P 'individual,pre-merge' \
-Dbuildver=${{ matrix.spark-version }} -Dmaven.scalastyle.skip=true \
-Drat.skip=true ${{ env.COMMON_MVN_FLAGS }} && break || {
@@ -303,12 +301,10 @@ jobs:
echo "Generated Scala 2.13 build files don't match what's in repository"
exit 1
fi
# change to Scala 2.13 Directory
cd scala2.13
# test command, will retry for 3 times if failed.
max_retry=3; delay=30; i=1
while true; do
mvn verify \
mvn verify -f scala2.13/ \
-P "individual,pre-merge,source-javadoc" -Dbuildver=${{ matrix.spark-version }} \
${{ env.COMMON_MVN_FLAGS }} && break || {
if [[ $i -le $max_retry ]]; then
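The hunk above is truncated mid-loop. For context, a hypothetical completion of the retry pattern, reconstructed from the `max_retry`, `delay`, and `i` variables visible in the diff (illustrative only, not the verbatim workflow body):

```shell
max_retry=3; delay=30; i=1
while true; do
  # succeed -> break out; fail -> run the retry/give-up block
  mvn verify -f scala2.13/ && break || {
    if [[ $i -le $max_retry ]]; then
      echo "mvn failed; retry $i/$max_retry in ${delay}s"
      sleep $delay
      i=$((i + 1))
    else
      echo "mvn failed after $max_retry retries"
      exit 1
    fi
  }
done
```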
7 changes: 2 additions & 5 deletions build/buildall
@@ -86,7 +86,7 @@ function bloopInstall() {

 function versionsFromDistProfile() {
   [[ "$BUILD_ALL_DEBUG" == "1" ]] && set -x
-  versionRawStr=$(mvn -B help:evaluate -q -pl dist -P"$1" -Dexpression=included_buildvers -DforceStdout)
+  versionRawStr=$($MVN -B help:evaluate -q -pl dist -P"$1" -Dexpression=included_buildvers -DforceStdout)
   versionStr=${versionRawStr//[$'\n',]/}
   echo -n $versionStr
 }
@@ -171,6 +171,7 @@ fi
 export MVN="mvn -Dmaven.wagon.http.retryHandler.count=3 ${MVN_OPT}"
 
 if [[ "$SCALA213" == "1" ]]; then
+  MVN="$MVN -f scala2.13/"
   DIST_PROFILE=${DIST_PROFILE:-"noSnapshotsScala213"}
   $(dirname $0)/make-scala-version-build-files.sh 2.13
 else
@@ -234,10 +235,6 @@ if [[ "$SKIP_CLEAN" != "1" ]]; then
   $MVN -q clean
 fi
 
-if [[ "$SCALA213" == "1" ]]; then
-  cd scala2.13
-fi
-
 echo "Building a combined dist jar with Shims for ${SPARK_SHIM_VERSIONS[@]} ..."
 
 function build_single_shim() {
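Because `buildall` funnels every Maven invocation through the `$MVN` variable, appending `-f scala2.13/` once covers the whole script, and the per-directory `cd` block becomes dead weight. A condensed sketch of the mechanism (assuming, as the diff shows, that all later calls go through `$MVN`):

```shell
MVN="mvn -Dmaven.wagon.http.retryHandler.count=3 ${MVN_OPT}"
if [[ "$SCALA213" == "1" ]]; then
  MVN="$MVN -f scala2.13/"   # every later $MVN call now targets the 2.13 POM
fi

$MVN -q clean   # e.g. cleans the scala2.13 tree when SCALA213=1
```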
9 changes: 4 additions & 5 deletions jenkins/spark-nightly-build.sh
@@ -19,24 +19,23 @@ set -ex

 . jenkins/version-def.sh
 
+## MVN_OPT : maven options environment, e.g. MVN_OPT='-Dspark-rapids-jni.version=xxx' to specify spark-rapids-jni dependency's version.
+MVN="mvn -Dmaven.wagon.http.retryHandler.count=3 -DretryFailedDeploymentCount=3 ${MVN_OPT} -Psource-javadoc"
+
 SCALA_BINARY_VER=${SCALA_BINARY_VER:-"2.12"}
 if [ $SCALA_BINARY_VER == "2.13" ]; then
     # Run scala2.13 build and test against JDK17
     export JAVA_HOME=$(echo /usr/lib/jvm/java-1.17.0-*)
     update-java-alternatives --set $JAVA_HOME
     java -version
 
-    cd scala2.13
-    ln -sf ../jenkins jenkins
+    MVN="$MVN -f scala2.13/"
 fi
 
 WORKSPACE=${WORKSPACE:-$(pwd)}
 ## export 'M2DIR' so that shims can get the correct Spark dependency info
 export M2DIR=${M2DIR:-"$WORKSPACE/.m2"}
 
-## MVN_OPT : maven options environment, e.g. MVN_OPT='-Dspark-rapids-jni.version=xxx' to specify spark-rapids-jni dependency's version.
-MVN="mvn -Dmaven.wagon.http.retryHandler.count=3 -DretryFailedDeploymentCount=3 ${MVN_OPT} -Psource-javadoc"
-
 DIST_PL="dist"
 function mvnEval {
     $MVN help:evaluate -q -pl $DIST_PL $MVN_URM_MIRROR -Prelease320 -Dmaven.repo.local=$M2DIR -DforceStdout -Dexpression=$1
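Note that the `MVN=` definition moves above the Scala version check: the 2.13 branch now appends to the variable, so it must already hold the base command. A sketch of the ordering constraint:

```shell
MVN="mvn -DretryFailedDeploymentCount=3 ${MVN_OPT} -Psource-javadoc"  # define first
if [ "$SCALA_BINARY_VER" == "2.13" ]; then
    MVN="$MVN -f scala2.13/"   # appends, so MVN must be set above this line
fi
```

This change also drops the `ln -sf ../jenkins jenkins` symlink, which appears to have existed only so that `jenkins/`-relative paths kept resolving after `cd scala2.13`.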
11 changes: 4 additions & 7 deletions jenkins/spark-premerge-build.sh
@@ -191,9 +191,6 @@ ci_scala213() {
     update-java-alternatives --set $JAVA_HOME
     java -version
 
-    cd scala2.13
-    ln -sf ../jenkins jenkins
-
     # Download a Scala 2.13 version of Spark
     prepare_spark 3.3.0 2.13
 
@@ -202,15 +199,15 @@
     do
         echo "Spark version (Scala 2.13): $version"
         env -u SPARK_HOME \
-            $MVN_CMD -U -B $MVN_URM_MIRROR -Dbuildver=$version clean install $MVN_BUILD_ARGS -Dpytest.TEST_TAGS=''
+            $MVN_CMD -f scala2.13/ -U -B $MVN_URM_MIRROR -Dbuildver=$version clean install $MVN_BUILD_ARGS -Dpytest.TEST_TAGS=''
         # Run filecache tests
         env -u SPARK_HOME SPARK_CONF=spark.rapids.filecache.enabled=true \
-            $MVN_CMD -B $MVN_URM_MIRROR -Dbuildver=$version test -rf tests $MVN_BUILD_ARGS -Dpytest.TEST_TAGS='' \
+            $MVN_CMD -f scala2.13/ -B $MVN_URM_MIRROR -Dbuildver=$version test -rf tests $MVN_BUILD_ARGS -Dpytest.TEST_TAGS='' \
             -DwildcardSuites=org.apache.spark.sql.rapids.filecache.FileCacheIntegrationSuite
     done
 
-    $MVN_CMD -U -B $MVN_URM_MIRROR clean package $MVN_BUILD_ARGS -DskipTests=true
-    cd .. # Run integration tests in the project root dir to leverage test cases and resource files
+    $MVN_CMD -f scala2.13/ -U -B $MVN_URM_MIRROR clean package $MVN_BUILD_ARGS -DskipTests=true
+
     export TEST_TAGS="not premerge_ci_1"
     export TEST_TYPE="pre-commit"
     # SPARK_HOME (and related) must be set to a Spark built with Scala 2.13
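With `-f scala2.13/` the premerge script never leaves the project root, so the old `cd ..` before the integration tests (which existed to "leverage test cases and resource files" from the root) disappears. A before/after sketch, with the script's real flags elided for brevity:

```shell
# Before: build inside scala2.13/, then pop back out for integration tests
cd scala2.13
$MVN_CMD clean package -DskipTests=true
cd ..   # integration tests expect the project root

# After: the working directory is the project root throughout
$MVN_CMD -f scala2.13/ clean package -DskipTests=true
```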
6 changes: 2 additions & 4 deletions scala2.13/README.md
@@ -25,8 +25,7 @@ You can use Maven to build the plugin. Like with Scala 2.12, we recommend buildi
 phase.
 
 ```shell script
-cd scala2.13
-mvn verify
+mvn verify -f scala2.13/
 ```
 
 After a successful build, the RAPIDS Accelerator jar will be in the `scala2.13/dist/target/` directory.
@@ -45,7 +44,6 @@ You can also use the `buildall` script in the parent directory to build against
 of Apache Spark.
 
 ```shell script
-cd ..
 ./build/buildall --profile=noSnapshotsScala213
 ```
 
@@ -72,4 +70,4 @@ That way any new dependencies or other changes will be picked up in the Scala 2.
 You should be able to open the `scala2.13` directory directly in IntelliJ as a separate project. You can build and
 debug as normal, although there are slight differences in how to navigate the source. In particular, when you select
 a particular build profile, you will only be able to navigate the source used by modules that are included for that
-spark version.
\ No newline at end of file
+spark version.