This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.5
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.5 by this push:
     new 6dbbf081a7d2 [SPARK-48133][INFRA] Run `sparkr` only in PR builders and Daily CIs
6dbbf081a7d2 is described below

commit 6dbbf081a7d248ddce62b62e979ff06a3c793f22
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Sun May 5 13:19:23 2024 -0700

    [SPARK-48133][INFRA] Run `sparkr` only in PR builders and Daily CIs
    
    This PR aims to run `sparkr` only in PR builders and Daily CIs. In other words, only the commit builder will skip it by default.
    
    To reduce GitHub Action usage to meet ASF INFRA policy.
    - https://infra.apache.org/github-actions-policy.html
    
        > All workflows MUST have a job concurrency level less than or equal to 20. This means a workflow cannot have more than 20 jobs running at the same time across all matrices.
    
    Does this PR introduce any user-facing change? No.

    How was this patch tested? Manual review.

    Was this patch authored or co-authored using generative AI tooling? No.
    
    Closes #46389 from dongjoon-hyun/SPARK-48133.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
    (cherry picked from commit 32ba5c1db62caaaa2674e8acced56f89ed840bf9)
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .github/workflows/build_and_test.yml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
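
The net effect, condensed from the diff below (only the `sparkr` handling is shown; the other flags are computed the same way, and Daily CIs appear to bypass this block entirely by passing an explicit `jobs` input):

    # Inside the `set-outputs` step of the precondition job in build_and_test.yml
    if [ -z "${{ inputs.jobs }}" ]; then
      if [[ "${{ github.repository }}" != 'apache/spark' ]]; then
        # PR builders run on forks, so they still probe for SparkR changes;
        # is-changed.py appears to print true/false based on what the PR touches.
        sparkr=`./dev/is-changed.py -m sparkr`
      else
        # Commit builds on apache/spark now skip SparkR by default.
        sparkr=false
      fi
    fi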

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 645054dc2087..4ad4a243c76d 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -79,17 +79,17 @@ jobs:
       id: set-outputs
       run: |
         if [ -z "${{ inputs.jobs }}" ]; then
-          pyspark=true; sparkr=true; tpcds=true; docker=true;
           pyspark_modules=`cd dev && python -c "import sparktestsupport.modules as m; print(','.join(m.name for m in m.all_modules if m.name.startswith('pyspark')))"`
           pyspark=`./dev/is-changed.py -m $pyspark_modules`
           if [[ "${{ github.repository }}" != 'apache/spark' ]]; then
             pandas=$pyspark
             kubernetes=`./dev/is-changed.py -m kubernetes`
+            sparkr=`./dev/is-changed.py -m sparkr`
           else
             pandas=false
             kubernetes=false
+            sparkr=false
           fi
-          sparkr=`./dev/is-changed.py -m sparkr`
           tpcds=`./dev/is-changed.py -m sql`
           docker=`./dev/is-changed.py -m docker-integration-tests`
           build=`./dev/is-changed.py -m "core,unsafe,kvstore,avro,utils,network-common,network-shuffle,repl,launcher,examples,sketch,graphx,catalyst,hive-thriftserver,streaming,sql-kafka-0-10,streaming-kafka-0-10,mllib-local,mllib,yarn,mesos,kubernetes,hadoop-cloud,spark-ganglia-lgpl,sql,hive"`

