Reorganize IT directory to prevent unintentional execution from UT (opensearch-project#501)

* Refactor IT and UT task and remove UT in CI

Signed-off-by: Chen Dai <[email protected]>

* Update dev guide

Signed-off-by: Chen Dai <[email protected]>

* Update doc with IT dependency

Signed-off-by: Chen Dai <[email protected]>

---------

Signed-off-by: Chen Dai <[email protected]>
dai-chen authored Aug 6, 2024
1 parent 7b52d81 commit a91b3ef
Showing 52 changed files with 25 additions and 11 deletions.
5 changes: 1 addition & 4 deletions .github/workflows/test-and-build-workflow.yml
@@ -23,10 +23,7 @@ jobs:
           java-version: 11
 
       - name: Integ Test
-        run: sbt integtest/test
-
-      - name: Unit Test
-        run: sbt test
+        run: sbt integtest/integration
 
       - name: Style check
         run: sbt scalafmtCheckAll
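Taken together, the workflow change above maps to the following local commands (as described by the DEVELOPER_GUIDE changes in this same commit; the exact task wiring lives in build.sbt):

```shell
# Unit tests alone (no longer a separate CI step after this commit)
sbt test

# Integration tests; per the updated guide, these run only if all unit tests pass
sbt integtest/integration

# Style check (unchanged by this commit)
sbt scalafmtCheckAll
```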
17 changes: 11 additions & 6 deletions DEVELOPER_GUIDE.md
@@ -6,11 +6,16 @@ If you want to package the single jar for the project, you can do so by running the following command:
 sbt assembly
 ```
 
+## Unit Test
+To execute the unit tests, run the following command:
+```
+sbt test
+```
+
 ## Integration Test
-The integration test is defined in the integ-test directory of the project. If you want to run the integration test for the project, you
-can do so by running the following command:
+The integration test is defined in the `integration` directory of the project. The integration tests automatically trigger the unit tests and run only if all unit tests pass. If you want to run the integration tests, you can do so by running the following command:
 ```
-sbt integtest/test
+sbt integtest/integration
 ```
 If you get integration test failures with the error message "Previous attempts to find a Docker environment failed" on macOS, fix the issue by following this checklist:
 1. Check you've installed Docker in your dev host. If not, install Docker first.
@@ -19,7 +24,7 @@ If you get integration test failures with the error message "Previous attempts to find a Docker environment failed"
 4. If you use Docker Desktop, as an alternative to `3`, check the "Allow the default Docker socket to be used (requires password)" setting in the advanced settings of Docker Desktop.
 
 ### AWS Integration Test
-The integration folder contains tests for cloud server providers. For instance, test against AWS OpenSearch domain, configure the following settings. The client will use the default credential provider to access the AWS OpenSearch domain.
+The `aws-integration` folder contains tests for cloud service providers. For instance, to test against an AWS OpenSearch domain, configure the following settings. The client will use the default credential provider to access the AWS OpenSearch domain.
 ```
 export AWS_OPENSEARCH_HOST=search-xxx.us-west-2.on.aws
 export AWS_REGION=us-west-2
@@ -29,9 +34,9 @@ export AWS_S3_CODE_BUCKET=xxx
 export AWS_S3_CODE_PREFIX=xxx
 export AWS_OPENSEARCH_RESULT_INDEX=query_execution_result_glue
 ```
-And run the
+And run the following command:
 ```
-sbt integtest/integration
+sbt integtest/awsIntegration
 [info] AWSOpenSearchAccessTestSuite:
 [info] - should Create Pit on AWS OpenSearch
14 changes: 13 additions & 1 deletion build.sbt
@@ -217,10 +217,12 @@ lazy val flintSparkIntegration = (project in file("flint-spark-integration"))
     assembly / test := (Test / test).value)
 
 lazy val IntegrationTest = config("it") extend Test
+lazy val AwsIntegrationTest = config("aws-it") extend Test
 
 // Test assembly package with integration test.
 lazy val integtest = (project in file("integ-test"))
   .dependsOn(flintCommons % "test->test", flintSparkIntegration % "test->test", pplSparkIntegration % "test->test", sparkSqlApplication % "test->test")
+  .configs(IntegrationTest, AwsIntegrationTest)
   .settings(
     commonSettings,
     name := "integ-test",
@@ -231,10 +233,17 @@ lazy val integtest = (project in file("integ-test"))
       s"-DpplJar=${(pplSparkIntegration / assembly).value.getAbsolutePath}",
     ),
     inConfig(IntegrationTest)(Defaults.testSettings ++ Seq(
+      IntegrationTest / javaSource := baseDirectory.value / "src/integration/java",
+      IntegrationTest / scalaSource := baseDirectory.value / "src/integration/scala",
       IntegrationTest / parallelExecution := false,
       IntegrationTest / fork := true,
-    )),
+    )),
+    inConfig(AwsIntegrationTest)(Defaults.testSettings ++ Seq(
+      AwsIntegrationTest / javaSource := baseDirectory.value / "src/aws-integration/java",
+      AwsIntegrationTest / scalaSource := baseDirectory.value / "src/aws-integration/scala",
+      AwsIntegrationTest / parallelExecution := false,
+      AwsIntegrationTest / fork := true,
+    )),
     libraryDependencies ++= Seq(
       "com.amazonaws" % "aws-java-sdk" % "1.12.397" % "provided"
         exclude ("com.fasterxml.jackson.core", "jackson-databind"),
@@ -249,9 +258,12 @@ lazy val integtest = (project in file("integ-test"))
       (sparkSqlApplication / assembly).value
     ),
     IntegrationTest / dependencyClasspath ++= (Test / dependencyClasspath).value,
+    AwsIntegrationTest / dependencyClasspath ++= (Test / dependencyClasspath).value,
     integration := (IntegrationTest / test).value,
+    awsIntegration := (AwsIntegrationTest / test).value
   )
 lazy val integration = taskKey[Unit]("Run integration tests")
+lazy val awsIntegration = taskKey[Unit]("Run AWS integration tests")
 
 lazy val standaloneCosmetic = project
   .settings(
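The build.sbt changes above follow a standard sbt pattern: a custom configuration that extends `Test`, wired to its own source directory and exposed through a named task. A minimal, self-contained sketch of that pattern (the `SlowTest`/`slowTest` names are illustrative, not from this commit):

```scala
import sbt._
import sbt.Keys._

// Hypothetical names for illustration; the commit uses IntegrationTest/AwsIntegrationTest.
lazy val SlowTest = config("slow") extend Test      // inherits Test's classpath and settings
lazy val slowTest = taskKey[Unit]("Run slow tests") // friendly alias: `sbt example/slowTest`

lazy val example = (project in file("example"))
  .configs(SlowTest)                                // register the configuration on the project
  .settings(
    inConfig(SlowTest)(Defaults.testSettings ++ Seq(
      // Dedicated source root, so plain `sbt example/test` never picks these suites
      // up accidentally -- the same separation this commit creates for integ-test.
      SlowTest / scalaSource := baseDirectory.value / "src" / "slow" / "scala",
      SlowTest / fork := true
    )),
    slowTest := (SlowTest / test).value
  )
```

Giving each configuration its own `scalaSource` is what makes the reorganization effective: suites compiled under `src/slow/scala` (here) or `src/integration/scala` (in the commit) are simply invisible to the default `Test` configuration.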
