This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 1321dd604480 [SPARK-48097][INFRA] Limit GHA job execution time to up to 3 hours in `build_and_test.yml`
1321dd604480 is described below

commit 1321dd6044809dbbdd8c1887b8345b0f8d76797d
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Thu May 2 15:10:33 2024 -0700

    [SPARK-48097][INFRA] Limit GHA job execution time to up to 3 hours in `build_and_test.yml`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to limit GHA job execution time to at most 3 hours in `build_and_test.yml` in order to avoid long idle hangs.
    The new limit applies to all jobs except three (`precondition`, `infra-image`, and `breaking-changes-buf`) that have not hung before.
    
    ### Why are the changes needed?
    
    Since SPARK-45010, Apache Spark has used a 5-hour limit.
    - #42727

    This is shorter than GitHub Actions' default value of 6 hours.
    
    - https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idtimeout-minutes
      > The maximum number of minutes to let a job run before GitHub automatically cancels it. Default: 360
    
    This PR reduces the limit to 3 hours to follow the new ASF INFRA policy, which has been in effect since April 20, 2024.
    - https://infra.apache.org/github-actions-policy.html
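    For reference, a minimal sketch (not taken from this PR; the job name is hypothetical) of where the `timeout-minutes` key sits in a GitHub Actions workflow, using the 3-hour cap this PR adopts:

    ```yaml
    jobs:
      example-job:            # hypothetical job name, for illustration only
        runs-on: ubuntu-latest
        timeout-minutes: 180  # GitHub cancels the job after 3 hours (default: 360)
        steps:
          - uses: actions/checkout@v4
    ```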
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Manual review.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #46344 from dongjoon-hyun/SPARK-48097.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 .github/workflows/build_and_test.yml | 18 +++++++++---------
 1 file changed, 9 insertions(+), 9 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 7e59f7b792b4..92fda7adeb33 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -123,7 +123,7 @@ jobs:
     needs: precondition
     if: fromJson(needs.precondition.outputs.required).build == 'true'
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     strategy:
       fail-fast: false
       matrix:
@@ -333,7 +333,7 @@ jobs:
     if: (!cancelled()) && fromJson(needs.precondition.outputs.required).pyspark == 'true'
     name: "Build modules: ${{ matrix.modules }}"
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     container:
       image: ${{ needs.precondition.outputs.image_url }}
     strategy:
@@ -480,7 +480,7 @@ jobs:
     if: (!cancelled()) && fromJson(needs.precondition.outputs.required).sparkr == 'true'
     name: "Build modules: sparkr"
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     container:
       image: ${{ needs.precondition.outputs.image_url }}
     env:
@@ -602,7 +602,7 @@ jobs:
     if: (!cancelled()) && fromJson(needs.precondition.outputs.required).lint == 'true'
     name: Linters, licenses, dependencies and documentation generation
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     env:
       LC_ALL: C.UTF-8
       LANG: C.UTF-8
@@ -801,7 +801,7 @@ jobs:
           - java: 21
             os: macos-14 
     runs-on: ${{ matrix.os }}
-    timeout-minutes: 300
+    timeout-minutes: 180
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v4
@@ -853,7 +853,7 @@ jobs:
     name: Run TPC-DS queries with SF=1
     # Pin to 'Ubuntu 20.04' due to 'databricks/tpcds-kit' compilation
     runs-on: ubuntu-20.04
-    timeout-minutes: 300
+    timeout-minutes: 180
     env:
       SPARK_LOCAL_IP: localhost
     steps:
@@ -954,7 +954,7 @@ jobs:
     if: fromJson(needs.precondition.outputs.required).docker-integration-tests == 'true'
     name: Run Docker integration tests
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     env:
       HADOOP_PROFILE: ${{ inputs.hadoop }}
       HIVE_PROFILE: hive2.3
@@ -1022,7 +1022,7 @@ jobs:
     if: fromJson(needs.precondition.outputs.required).k8s-integration-tests == 'true'
     name: Run Spark on Kubernetes Integration test
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     steps:
       - name: Checkout Spark repository
         uses: actions/checkout@v4
@@ -1094,7 +1094,7 @@ jobs:
     if: fromJson(needs.precondition.outputs.required).ui == 'true'
     name: Run Spark UI tests
     runs-on: ubuntu-latest
-    timeout-minutes: 300
+    timeout-minutes: 180
     steps:
       - uses: actions/checkout@v4
       - name: Use Node.js

