Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21007#discussion_r181083759
  
    --- Diff: python/pyspark/sql/tests.py ---
    @@ -3062,6 +3071,64 @@ def test_sparksession_with_stopped_sparkcontext(self):
                 sc.stop()
     
     
    +class QueryExecutionListenerTests(unittest.TestCase, SQLTestUtils):
    +    # These tests are separate because they use 'spark.sql.queryExecutionListeners',
    +    # which is static and immutable; it can't be set or unset, for example,
    +    # via `spark.conf`.
    +
    +    @classmethod
    +    def setUpClass(cls):
    +        import glob
    +        from pyspark.find_spark_home import _find_spark_home
    +
    +        SPARK_HOME = _find_spark_home()
    +        filename_pattern = (
    +            "sql/core/target/scala-*/test-classes/org/apache/spark/sql/"
    +            "TestQueryExecutionListener.class")
    +        if not glob.glob(os.path.join(SPARK_HOME, filename_pattern)):
    +            raise unittest.SkipTest(
    --- End diff --
    
    And, as for:
    
    > If it happens, should we just silently skip this test like this?
    
    Yeah, ideally we should warn explicitly in the console. The problem is our own testing script: we could make changes to warn explicitly, but it seems we'd need some duplicated changes.
    
    There are some discussions / changes going on here - https://github.com/apache/spark/pull/20909

