Build (branch-4.0, Scala 2.13, Hadoop 3, JDK 17) #50
Triggered via schedule on February 21, 2025 12:07
Status: Failure
Total duration: 1h 48m 20s
Artifacts: 15
build_branch40.yml (on: schedule)
Run / Check changes (42s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Run TPC-DS queries with SF=1 (1h 20m)
Run / Run Docker integration tests (1h 39m)
Run / Run Spark on Kubernetes Integration test (59m 57s)
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (26m 38s)
Run / Linters, licenses, and dependencies (28m 10s)
Run / Documentation generation (0s)
Matrix: Run / pyspark
Annotations
11 errors and 1 warning
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-82a455952880ae82-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-8e2fe99528819990-exec-1".
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$826/0x00007f83dc6de128@d627ff9 rejected from java.util.concurrent.ThreadPoolExecutor@d2f80a8[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 327]
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$826/0x00007f83dc6de128@45159e99 rejected from java.util.concurrent.ThreadPoolExecutor@d2f80a8[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 326]
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-0e21cc952894856a-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-30e7b6952895752d-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-04ab0b95289927ad-exec-1".
Run / Run Spark on Kubernetes Integration test
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-b6940707c21b433fa9831542d44b7451-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-b6940707c21b433fa9831542d44b7451-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Run / Run Docker integration tests
Process completed with exit code 18.
Run / Base image build
Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists
Artifacts
Produced during runtime
Name | Size
---|---
apache~spark~7O1E99.dockerbuild | 30.1 KB
apache~spark~VQ09EM.dockerbuild | 25.4 KB
test-results-api, catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 647 KB
test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, variant--17-hadoop3-hive2.3 | 808 KB
test-results-docker-integration--17-hadoop3-hive2.3 | 43.1 KB
test-results-hive-- other tests-17-hadoop3-hive2.3 | 234 KB
test-results-hive-- slow tests-17-hadoop3-hive2.3 | 221 KB
test-results-mllib-local, mllib, graphx, profiler--17-hadoop3-hive2.3 | 477 KB
test-results-sparkr--17-hadoop3-hive2.3 | 17.3 KB
test-results-sql-- extended tests-17-hadoop3-hive2.3 | 1.14 MB
test-results-sql-- other tests-17-hadoop3-hive2.3 | 1.34 MB
test-results-sql-- slow tests-17-hadoop3-hive2.3 | 1.14 MB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect--17-hadoop3-hive2.3 | 381 KB
test-results-tpcds--17-hadoop3-hive2.3 | 4.91 KB
unit-tests-log-docker-integration--17-hadoop3-hive2.3 | 38.8 MB