Fwd: How to import PySpark into Jupyter

2020-04-10 Thread Yasir Elgohary
Peace, dear all. I hope you are all well and healthy. I am brand new to Spark/Hadoop. My environment is Windows 7 with Jupyter/Anaconda and Spark/Hadoop all installed on my laptop. How can I run the following without errors?

import findspark
findspark.init()
findspark.find()
from pyspark.sql import S
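For context on what those `findspark` calls do: `findspark.init()` locates your Spark installation (via `SPARK_HOME` or common install paths) and puts Spark's bundled Python bindings on `sys.path`, so that `import pyspark` succeeds afterward. A minimal sketch of roughly what that entails; the helper name `add_spark_to_path` is mine, not part of findspark's API, and in practice you would just use findspark itself:

```python
import glob
import os
import sys

def add_spark_to_path(spark_home):
    """Roughly what findspark.init() does: expose Spark's bundled
    Python bindings (pyspark and the py4j zip) to this interpreter."""
    python_dir = os.path.join(spark_home, "python")
    # Spark ships py4j as a versioned zip under python/lib/.
    py4j_zips = glob.glob(os.path.join(python_dir, "lib", "py4j-*.zip"))
    paths = [python_dir] + py4j_zips
    for p in paths:
        if p not in sys.path:
            sys.path.insert(0, p)
    return paths

# Typical use (assumes SPARK_HOME points at your Spark install):
# add_spark_to_path(os.environ["SPARK_HOME"])
# import pyspark  # should now resolve
```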

Re: How to import PySpark into Jupyter

2020-04-10 Thread Akchhaya S
Hello Yasir, you need to check your 'PYTHONPATH' environment variable. On Windows, if I do a "pip install", the package is installed in "lib\site-packages" under the Python folder. If I "print(sys.path)", I see "lib\site-packages" as one of the entries, and I can expect "import <package>" to work. Find
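The check described above can be done directly in a Jupyter cell: inspect `sys.path` to see where the interpreter looks for packages, and inspect `PYTHONPATH` to see what extra directories (if any) were prepended at startup:

```python
import os
import sys

# Directories Python searches for imports; a pip-installed package
# should live under one of the site-packages entries listed here.
for entry in sys.path:
    print(entry)

# PYTHONPATH entries (if set) are added to sys.path at startup.
print("PYTHONPATH =", os.environ.get("PYTHONPATH", "<not set>"))
```

If Spark's Python directory does not appear in this list, `import pyspark` will fail, which is exactly the gap findspark (or a manually set PYTHONPATH) fills.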