This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 10d5303  [SPARK-36839][INFRA][FOLLOW-UP] Add Hadoop 2 daily job into schedule
10d5303 is described below

commit 10d5303174bf4a47508f6227bbdb1eaf4c92fcdb
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Fri Oct 8 09:40:57 2021 +0900

    [SPARK-36839][INFRA][FOLLOW-UP] Add Hadoop 2 daily job into schedule
    
    ### What changes were proposed in this pull request?
    
    This is a followup of https://github.com/apache/spark/pull/34091 to set the Hadoop 2 daily job properly.
    The previous PR mistakenly did not set the schedule, so the daily jobs were not being triggered - it looks like I forgot to sync the PR with the testing base in my fork.
    
    ### Why are the changes needed?
    
    In order to run the daily job with Hadoop 2.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    It will be tested once it's merged. It won't break any build in any event.
    
    Closes #34217 from HyukjinKwon/SPARK-36839-followup.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 .github/workflows/build_and_test.yml | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index c51afd6..1262e65 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -25,6 +25,8 @@ on:
     - '**'
     - '!branch-*.*'
   schedule:
+    # master, Hadoop 2
+    - cron: '0 1 * * *'
     # master
     - cron: '0 4 * * *'
     # branch-3.2

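For context, the `schedule` entries above use standard five-field cron syntax (minute, hour, day-of-month, month, day-of-week), evaluated in UTC by GitHub Actions. A rough sketch of what the `on.schedule` section of .github/workflows/build_and_test.yml looks like after this change (only the two entries visible in the diff are shown; the branch-3.2 entry that follows is not reproduced here):

```yaml
on:
  schedule:
    # master, Hadoop 2 - runs daily at 01:00 UTC
    - cron: '0 1 * * *'
    # master - runs daily at 04:00 UTC
    - cron: '0 4 * * *'
    # branch-3.2 entry continues in the actual file
```

Note that a workflow triggered by `schedule` cannot tell from the event alone which cron entry fired; the workflow typically inspects `github.event.schedule` to dispatch the matching build profile (here, the Hadoop 2 job versus the default master job).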