Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22104#discussion_r212794910
  
    --- Diff: python/pyspark/sql/utils.py ---
    @@ -152,6 +152,22 @@ def require_minimum_pyarrow_version():
                               "your version was %s." % 
(minimum_pyarrow_version, pyarrow.__version__))
     
     
    +def require_test_compiled():
    +    """ Raise Exception if test classes are not compiled
    +    """
    +    import os
    +    try:
    +        spark_home = os.environ['SPARK_HOME']
    +    except KeyError:
    +        raise RuntimeError('SPARK_HOME is not defined in environment')
    +
    +    test_class_path = os.path.join(
    +        spark_home, 'sql', 'core', 'target', 'scala-2.11', 'test-classes')
    --- End diff ---
    
    Eh, @icexelloss, can we avoid hardcoding the specific Scala version (`scala-2.11`) here?
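    
    For example, a minimal sketch (hypothetical, not part of this PR) that discovers whichever `scala-*` build directory exists instead of hardcoding the Scala binary version:
    
        import glob
        import os
    
        # Hypothetical sketch: locate the compiled test classes for whatever
        # Scala binary version was built, rather than hardcoding scala-2.11.
        spark_home = os.environ['SPARK_HOME']
        candidates = glob.glob(os.path.join(
            spark_home, 'sql', 'core', 'target', 'scala-*', 'test-classes'))
        test_class_path = candidates[0] if candidates else None
    
    (Just a sketch; the actual directory name depends on how Spark was built.)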


---
