This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 96296aac4f4 [SPARK-44544][INFRA][3.4] Deduplicate `run_python_packaging_tests`
96296aac4f4 is described below

commit 96296aac4f4f168a7f30ff1ccb33c3b52b433ba4
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Thu Jul 27 09:22:36 2023 +0900

    [SPARK-44544][INFRA][3.4] Deduplicate `run_python_packaging_tests`
    
    ### What changes were proposed in this pull request?
    cherry-pick https://github.com/apache/spark/pull/42146 to 3.4
    
    ### Why are the changes needed?
    The original PR cannot be cherry-picked cleanly, so this separate PR is made.
    
    ### Does this PR introduce _any_ user-facing change?
    No, infra-only.
    
    ### How was this patch tested?
    Updated CI.
    
    Closes #42172 from zhengruifeng/cp_fix.
    
    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 .github/workflows/build_and_test.yml | 16 ++++++++++++++--
 dev/run-tests.py                     |  2 +-
 2 files changed, 15 insertions(+), 3 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 657fec27d52..06f94ea0b25 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -192,6 +192,7 @@ jobs:
       HIVE_PROFILE: ${{ matrix.hive }}
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
+      SKIP_PACKAGING: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v3
@@ -328,6 +329,8 @@ jobs:
         java:
           - ${{ inputs.java }}
         modules:
+          - >-
+            pyspark-errors
           - >-
             pyspark-sql, pyspark-mllib, pyspark-resource
           - >-
@@ -337,7 +340,7 @@ jobs:
           - >-
             pyspark-pandas-slow
           - >-
-            pyspark-connect, pyspark-errors
+            pyspark-connect
     env:
       MODULES_TO_TEST: ${{ matrix.modules }}
       HADOOP_PROFILE: ${{ inputs.hadoop }}
@@ -346,6 +349,7 @@ jobs:
       SPARK_LOCAL_IP: localhost
       SKIP_UNIDOC: true
       SKIP_MIMA: true
+      SKIP_PACKAGING: true
       METASPACE_SIZE: 1g
     steps:
     - name: Checkout Spark repository
@@ -394,14 +398,20 @@ jobs:
         python3.9 -m pip list
         pypy3 -m pip list
     - name: Install Conda for pip packaging test
+      if: ${{ matrix.modules == 'pyspark-errors' }}
       run: |
        curl -s https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh > miniconda.sh
         bash miniconda.sh -b -p $HOME/miniconda
     # Run the tests.
     - name: Run tests
       env: ${{ fromJSON(inputs.envs) }}
+      shell: 'script -q -e -c "bash {0}"'
       run: |
-        export PATH=$PATH:$HOME/miniconda/bin
+        if [[ "$MODULES_TO_TEST" == "pyspark-errors" ]]; then
+          export PATH=$PATH:$HOME/miniconda/bin
+          export SKIP_PACKAGING=false
+          echo "Python Packaging Tests Enabled!"
+        fi
         ./dev/run-tests --parallelism 1 --modules "$MODULES_TO_TEST"
     - name: Upload coverage to Codecov
       if: fromJSON(inputs.envs).PYSPARK_CODECOV == 'true'
@@ -437,6 +447,7 @@ jobs:
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
       SKIP_MIMA: true
+      SKIP_PACKAGING: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v3
@@ -850,6 +861,7 @@ jobs:
       SPARK_LOCAL_IP: localhost
       ORACLE_DOCKER_IMAGE_NAME: gvenzl/oracle-xe:21.3.0
       SKIP_MIMA: true
+      SKIP_PACKAGING: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v3
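
Note: the following is a rough Python sketch (not part of this commit) of the
contract the workflow changes above encode: every test job now sets
SKIP_PACKAGING=true by default, and only the dedicated `pyspark-errors`
matrix entry re-enables the packaging tests and puts the miniconda install
on PATH. The helper name `configure_job_env` is hypothetical.

    import os

    def configure_job_env(modules_to_test: str) -> dict:
        # Default after this change: all test jobs skip the packaging tests.
        env = {"SKIP_PACKAGING": "true"}
        if modules_to_test == "pyspark-errors":
            # Mirrors the bash conditional in the "Run tests" step above:
            # re-enable packaging tests and expose the miniconda binaries.
            env["PATH"] = os.environ.get("PATH", "") + os.pathsep + os.path.expanduser("~/miniconda/bin")
            env["SKIP_PACKAGING"] = "false"
        return env

    print(configure_job_env("pyspark-errors"))   # packaging tests enabled
    print(configure_job_env("pyspark-connect"))  # packaging tests skipped
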
diff --git a/dev/run-tests.py b/dev/run-tests.py
index 92768c96905..dab3dcf7fe6 100755
--- a/dev/run-tests.py
+++ b/dev/run-tests.py
@@ -396,7 +396,7 @@ def run_python_tests(test_modules, parallelism, with_coverage=False):
 
 
 def run_python_packaging_tests():
-    if not os.environ.get("SPARK_JENKINS"):
+    if not os.environ.get("SPARK_JENKINS") and os.environ.get("SKIP_PACKAGING", "false") != "true":
         set_title_and_block("Running PySpark packaging tests", "BLOCK_PYSPARK_PIP_TESTS")
         command = [os.path.join(SPARK_HOME, "dev", "run-pip-tests")]
         run_cmd(command)
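
Note: a minimal, self-contained sketch of the gating logic introduced in
run-tests.py above, runnable on its own; `should_run_packaging_tests` is a
hypothetical name, the real check is inline in `run_python_packaging_tests`.

    import os

    def should_run_packaging_tests() -> bool:
        # Packaging tests run only off Jenkins, and only when the CI job has
        # not set SKIP_PACKAGING (which defaults to "false") to "true".
        return (
            not os.environ.get("SPARK_JENKINS")
            and os.environ.get("SKIP_PACKAGING", "false") != "true"
        )

    os.environ["SKIP_PACKAGING"] = "true"
    print(should_run_packaging_tests())   # False: packaging tests skipped

    os.environ["SKIP_PACKAGING"] = "false"
    print(should_run_packaging_tests())   # True when SPARK_JENKINS is unset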


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
