Github user icexelloss commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21107#discussion_r183409329
  
    --- Diff: python/run-tests.py ---
    @@ -152,65 +172,17 @@ def parse_opts():
         return opts
     
     
    -def _check_dependencies(python_exec, modules_to_test):
    -    if "COVERAGE_PROCESS_START" in os.environ:
    -        # Make sure if coverage is installed.
    -        try:
    -            subprocess_check_output(
    -                [python_exec, "-c", "import coverage"],
    -                stderr=open(os.devnull, 'w'))
    -        except:
    -            print_red("Coverage is not installed in Python executable '%s' "
    -                      "but 'COVERAGE_PROCESS_START' environment variable is set, "
    -                      "exiting." % python_exec)
    -            sys.exit(-1)
    -
    -    # If we should test 'pyspark-sql', it checks if PyArrow and Pandas are installed and
    -    # explicitly prints out. See SPARK-23300.
    -    if pyspark_sql in modules_to_test:
    -        # TODO(HyukjinKwon): Relocate and deduplicate these version specifications.
    -        minimum_pyarrow_version = '0.8.0'
    --- End diff --
    
    Gotcha. Thanks for the explanation!
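    For context, the removed `_check_dependencies` helper follows a common pattern: probing a Python executable for an importable module by running `python -c "import <module>"` in a subprocess. A minimal standalone sketch of that pattern (the function name and structure here are illustrative, not the exact code from `run-tests.py`):

    ```python
    import os
    import subprocess
    import sys


    def check_module_installed(python_exec, module_name):
        """Return True if `module_name` can be imported by `python_exec`.

        Hypothetical standalone adaptation of the subprocess-based probe
        used in python/run-tests.py.
        """
        with open(os.devnull, "w") as devnull:
            try:
                # Run the target interpreter and ask it to import the module;
                # a non-zero exit code means the import failed.
                subprocess.check_output(
                    [python_exec, "-c", "import %s" % module_name],
                    stderr=devnull)
                return True
            except (subprocess.CalledProcessError, OSError):
                return False


    if __name__ == "__main__":
        # Standard-library modules are always importable by the running interpreter.
        print(check_module_installed(sys.executable, "json"))
        print(check_module_installed(sys.executable, "no_such_module_xyz"))
    ```

    Probing via a subprocess (rather than importing in-process) matters here because `python_exec` may be a different interpreter than the one running the test script.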

