Yikun Jiang created SPARK-37721:
-----------------------------------

             Summary: Failed to execute pyspark test in Win WSL
                 Key: SPARK-37721
                 URL: https://issues.apache.org/jira/browse/SPARK-37721
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 3.3.0
            Reporter: Yikun Jiang
{code:java}
Launching unittests with arguments python -m unittest test_rdd.RDDTests.test_range in /home/yikun/spark/python/pyspark/tests

Traceback (most recent call last):
  File "/mnt/d/Program Files/JetBrains/PyCharm 2021.1.3/plugins/python/helpers/pycharm/_jb_unittest_runner.py", line 35, in <module>
    sys.exit(main(argv=args, module=None, testRunner=unittestpy.TeamcityTestRunner, buffer=not JB_DISABLE_BUFFERING))
  File "/usr/lib/python3.8/unittest/main.py", line 100, in __init__
    self.parseArgs(argv)
  File "/usr/lib/python3.8/unittest/main.py", line 147, in parseArgs
    self.createTests()
  File "/usr/lib/python3.8/unittest/main.py", line 158, in createTests
    self.test = self.testLoader.loadTestsFromNames(self.testNames,
  File "/usr/lib/python3.8/unittest/loader.py", line 220, in loadTestsFromNames
    suites = [self.loadTestsFromName(name, module) for name in names]
  File "/usr/lib/python3.8/unittest/loader.py", line 220, in <listcomp>
    suites = [self.loadTestsFromName(name, module) for name in names]
  File "/usr/lib/python3.8/unittest/loader.py", line 154, in loadTestsFromName
    module = __import__(module_name)
  File "/home/yikun/spark/python/pyspark/tests/test_rdd.py", line 37, in <module>
    from pyspark.testing.utils import ReusedPySparkTestCase, SPARK_HOME, QuietTest
  File "/home/yikun/spark/python/pyspark/testing/utils.py", line 47, in <module>
    SPARK_HOME = os.environ["SPARK_HOME"]#_find_spark_home()
  File "/usr/lib/python3.8/os.py", line 675, in __getitem__
    raise KeyError(key) from None
KeyError: 'SPARK_HOME'
{code}

It looks like we should change {{SPARK_HOME = os.environ["SPARK_HOME"]}} in pyspark/testing/utils.py to {{SPARK_HOME = _find_spark_home()}}, so that SPARK_HOME is resolved automatically instead of failing with a KeyError when the environment variable is not set.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)