Zhang created SPARK-31483:
-----------------------------

             Summary: pyspark shell IPython launch throws ".../pyspark/bin/load-spark-env.sh: No such file or directory"
                 Key: SPARK-31483
                 URL: https://issues.apache.org/jira/browse/SPARK-31483
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Shell
    Affects Versions: 2.4.5
         Environment: $ uname -a
Darwin mengyu-C02Z7885LVDQ 19.3.0 Darwin Kernel Version 19.3.0: Thu Jan 9 20:58:23 PST 2020; root:xnu-6153.81.5~1/RELEASE_X86_64 x86_64
$ python -V
Python 3.7.7
$ ipython -V
7.13.0
            Reporter: Zhang


I'm trying to launch the pyspark shell with the IPython interface via {{PYSPARK_DRIVER_PYTHON=ipython pyspark}}. However, it fails with ".../pyspark/bin/load-spark-env.sh: No such file or directory":

{{(py3-spark) mengyu@mengyu-C02Z7885LVDQ:~/workspace/tmp$ PYSPARK_DRIVER_PYTHON=ipython pyspark}}
{{/Users/mengyu/opt/anaconda2/envs/py3-spark/bin/pyspark: line 24: /Users/mengyu/opt/anaconda2/envs/py3-spark/lib/python3.7/site-packages/pyspark/bin/load-spark-env.sh: No such file or directory}}
{{/Users/mengyu/opt/anaconda2/envs/py3-spark/bin/pyspark: line 77: /Users/mengyu/workspace/tmp//Users/mengyu/opt/anaconda2/envs/py3-spark/lib/python3.7/site-packages/pyspark/bin/spark-submit: No such file or directory}}
{{/Users/mengyu/opt/anaconda2/envs/py3-spark/bin/pyspark: line 77: exec: /Users/mengyu/workspace/tmp//Users/mengyu/opt/anaconda2/envs/py3-spark/lib/python3.7/site-packages/pyspark/bin/spark-submit: cannot execute: No such file or directory}}

This is strange, because the path {{/Users/mengyu/opt/anaconda2/envs/py3-spark/lib/python3.7/site-packages/pyspark/bin/load-spark-env.sh}} does exist:

{{$ file /Users/mengyu/opt/anaconda2/envs/py3-spark/lib/python3.7/site-packages/pyspark/bin/load-spark-env.sh}}
{{/Users/mengyu/opt/anaconda2/envs/py3-spark/lib/python3.7/site-packages/pyspark/bin/load-spark-env.sh: Bourne-Again shell script text executable, ASCII text}}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
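One detail worth noting: the line-77 error shows the working directory prepended to an absolute path ({{/Users/mengyu/workspace/tmp//Users/mengyu/opt/...}}), which suggests the launcher's computed {{SPARK_HOME}} value is corrupted rather than the file actually being absent. The sketch below is a hypothetical reproduction, not taken from the report: it assumes a stray character (here a carriage return) has crept into the path, which is one way bash can report "No such file or directory" for a script that clearly exists.

```shell
#!/bin/bash
# Hypothetical reproduction (assumption: the computed SPARK_HOME picked up a
# stray character; the report itself does not confirm this root cause).
dir=$(mktemp -d)
printf '#!/bin/bash\necho ok\n' > "$dir/load-spark-env.sh"
chmod +x "$dir/load-spark-env.sh"

ls "$dir/load-spark-env.sh"             # the file clearly exists

bad_home="$dir"$'\r'                    # simulate a corrupted SPARK_HOME value
bash "$bad_home/load-spark-env.sh" || true  # prints "No such file or directory"
```

Checking what the launcher actually resolves for {{SPARK_HOME}} (e.g. by adding an {{echo "$SPARK_HOME" | od -c}} near the top of the {{pyspark}} script) would confirm or rule out this kind of corruption.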