[SPARK-49549][SQL] Assign a name to the error conditions _LEGACY_ERRO… #35
build_main.yml
on: push
Job | Duration
---|---
Run / Check changes | 35s
Run / Protobuf breaking change detection and Python CodeGen check | 59s
Run / Run TPC-DS queries with SF=1 | 1h 36m
Run / Run Docker integration tests | 1h 24m
Run / Run Spark on Kubernetes Integration test | 1h 1m
Run / Run Spark UI tests | 21s
Matrix: Run / build |
Run / Build modules: sparkr | 29m 46s
Run / Linters, licenses, and dependencies | 28m 1s
Run / Documentation generation | 48m 54s
Matrix: Run / pyspark |
Annotations
9 errors and 1 warning
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-6dce0492729f3e7b-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-f005369272a026a0-exec-1".
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$625/0x00007fe8484f6460@151a6599 rejected from java.util.concurrent.ThreadPoolExecutor@625e3940[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 358]
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$625/0x00007fe8484f6460@26af764a rejected from java.util.concurrent.ThreadPoolExecutor@625e3940[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 359]
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-f8e2119272b2d07e-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-a7c2e49272b3bc4e-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-7e44e99272b76d11-exec-1".
- Run / Run Spark on Kubernetes Integration test: Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-8377504b807d4f82b3611171f543c306-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-8377504b807d4f82b3611171f543c306-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
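The "No files were found" warning above is the standard notice emitted by the actions/upload-artifact action when its path glob matches nothing. Below is a minimal sketch of such an upload step; the step name and artifact name are illustrative assumptions, not taken from build_main.yml.

```yaml
# Hypothetical excerpt: upload JUnit-style test reports after a test job.
# If the glob matches no files, upload-artifact logs the warning seen above;
# the behavior is controlled by if-no-files-found (warn | error | ignore).
- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: test-results-pyspark-core--17-hadoop3-hive2.3-python3.11
    path: "**/target/test-reports/*.xml"
    if-no-files-found: warn
```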
Artifacts
Produced during runtime
Name | Size
---|---
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3-python3.11 (Expired) | 110 KB