Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23117#discussion_r236486298
  
    --- Diff: dev/run-tests.py ---
    @@ -434,6 +434,63 @@ def run_python_tests(test_modules, parallelism):
         run_cmd(command)
     
     
    +def run_python_tests_with_coverage(test_modules, parallelism):
    +    set_title_and_block("Running PySpark tests with coverage report", "BLOCK_PYSPARK_UNIT_TESTS")
    +
    +    command = [os.path.join(SPARK_HOME, "python", "run-tests-with-coverage")]
    +    if test_modules != [modules.root]:
    +        command.append("--modules=%s" % ','.join(m.name for m in test_modules))
    +    command.append("--parallelism=%i" % parallelism)
    +    run_cmd(command)
    +    post_python_tests_results()
    +
    +
    +def post_python_tests_results():
    +    if "SPARK_TEST_KEY" not in os.environ:
    +        print("[error] 'SPARK_TEST_KEY' environment variable was not set. Unable to post"
    +              "PySpark coverage results.")
    +        sys.exit(1)
    --- End diff --
    
    @shaneknapp can you add another environment variable that indicates the PR builder and spark-master-test-sbt-hadoop-2.7, where we're going to run Python coverage? Then I can check it and explicitly enable coverage only under that condition.
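
    A minimal sketch (not the actual change) of what that check could look like in `run-tests.py`, assuming a hypothetical `SPARK_TEST_PYTHON_COVERAGE` variable exported only by those builds:

    ```python
    import os


    def python_coverage_enabled():
        # Hypothetical opt-in variable; the name is an assumption, not an
        # existing Jenkins setting. Coverage would run only when the builder
        # explicitly exports it.
        return os.environ.get("SPARK_TEST_PYTHON_COVERAGE", "") == "true"
    ```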
    
    True, if the condition below (which I previously checked in #17669):
    
    ```python
    (os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE", "") == "hadoop2.7"
        and os.environ.get("SPARK_BRANCH", "") == "master"
        and os.environ.get("AMPLAB_JENKINS", "") == "true"
        and os.environ.get("AMPLAB_JENKINS_BUILD_TOOL", "") == "sbt")
    ```
    
    is `True` in a Jenkins build or another user's environment, it might cause some problems (even though that looks quite unlikely).
    
    Similarly, if `AMPLAB_JENKINS` is set in the environment of users who run the tests locally, it wouldn't work anyway.
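
    For illustration only, that condition could be combined with the explicit opt-in flag sketched above, so the Jenkins variables alone (e.g. a stray `AMPLAB_JENKINS` in a local environment) are never enough to enable coverage; the flag name is still an assumption:

    ```python
    import os


    def should_run_python_coverage():
        # Existing Jenkins-profile condition quoted above.
        on_master_sbt_hadoop27 = (
            os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE", "") == "hadoop2.7"
            and os.environ.get("SPARK_BRANCH", "") == "master"
            and os.environ.get("AMPLAB_JENKINS", "") == "true"
            and os.environ.get("AMPLAB_JENKINS_BUILD_TOOL", "") == "sbt")
        # Hypothetical explicit opt-in, so users who happen to have the Jenkins
        # variables set locally do not trigger coverage by accident.
        explicit_opt_in = os.environ.get("SPARK_TEST_PYTHON_COVERAGE", "") == "true"
        return on_master_sbt_hadoop27 and explicit_opt_in
    ```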

