This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new bd54e633121c [SPARK-48192][INFRA] Enable TPC-DS tests in forked repository

bd54e633121c is described below

commit bd54e633121c77293bbb0cd343eeebb167ca5edf
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Wed May 8 17:13:11 2024 +0900

    [SPARK-48192][INFRA] Enable TPC-DS tests in forked repository

    ### What changes were proposed in this pull request?

    This PR is a followup of https://github.com/apache/spark/pull/46361. It proposes to run the TPC-DS and Docker integration tests in PRs opened against forked repositories (which do not consume ASF resources).

    ### Why are the changes needed?

    The TPC-DS and Docker integration tests should at least run in a PR when the PR touches the related code.

    ### Does this PR introduce any user-facing change?

    No, test-only.

    ### How was this patch tested?

    Manually.

    ### Was this patch authored or co-authored using generative AI tooling?

    No.

    Closes #46470 from HyukjinKwon/SPARK-48192.

    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
    (cherry picked from commit f693abc8de949b1fd5f77b9e74037b0cc2298aef)
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
    (cherry picked from commit 82779217b1fa1dea2b18772795969c04c1f34532)
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .github/workflows/build_and_test.yml | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 0166395ceb4a..64f18b5163b1 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -84,17 +84,19 @@ jobs:
           if [ -f "./dev/is-changed.py" ]; then
             pyspark_modules=`cd dev && python -c "import sparktestsupport.modules as m; print(','.join(m.name for m in m.all_modules if m.name.startswith('pyspark')))"`
             pyspark=`./dev/is-changed.py -m $pyspark_modules`
-            tpcds=`./dev/is-changed.py -m sql`
-            docker=`./dev/is-changed.py -m docker-integration-tests`
           fi
           if [[ "${{ github.repository }}" != 'apache/spark' ]]; then
             pandas=$pyspark
             kubernetes=`./dev/is-changed.py -m kubernetes`
             sparkr=`./dev/is-changed.py -m sparkr`
+            tpcds=`./dev/is-changed.py -m sql`
+            docker=`./dev/is-changed.py -m docker-integration-tests`
           else
             pandas=false
             kubernetes=false
             sparkr=false
+            tpcds=false
+            docker=false
           fi
           # 'build', 'scala-213', and 'java-11-17' are always true for now.
           # It does not save significant time and most of PRs trigger the build.

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
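Editor's note: the decision the diff encodes can be sketched as a plain function. This is an illustrative sketch only, not part of the commit; `select_jobs` and its inner `is_changed` are hypothetical stand-ins for the workflow's shell logic and for `./dev/is-changed.py`.

```python
def select_jobs(repository: str, changed_modules: set[str]) -> dict[str, bool]:
    """Return whether the optional TPC-DS and Docker jobs should run.

    Sketch of the workflow change above: in forked repositories the jobs
    run only when the related modules changed; in apache/spark itself
    they are disabled to avoid consuming ASF runner resources.
    """
    def is_changed(module: str) -> bool:
        # Stand-in for `./dev/is-changed.py -m <module>`.
        return module in changed_modules

    if repository != "apache/spark":
        return {
            "tpcds": is_changed("sql"),
            "docker": is_changed("docker-integration-tests"),
        }
    return {"tpcds": False, "docker": False}
```

For example, a fork whose PR touches the `sql` module would get `tpcds` enabled but `docker` disabled, while the same change in `apache/spark` enables neither.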