Re: Apache Spark 3.2.0 | Pyspark | Pycharm Setup

2021-11-17 Thread Mich Talebzadeh
Yep, the latest PySpark is 3.2. You can easily install it from the available packages.
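Installing from "available packages" in PyCharm is equivalent to a pip install into the project interpreter. A minimal sketch (the version pin matches the release discussed in this thread):

```shell
# Install PySpark 3.2.0 into the interpreter PyCharm uses --
# the same effect as adding it through the PyCharm packages UI.
pip install pyspark==3.2.0

# Confirm the install picked up the expected version.
python -c "import pyspark; print(pyspark.__version__)"
```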

Re: Apache Spark 3.2.0 | Pyspark | Pycharm Setup

2021-11-17 Thread Khalid Mammadov
Hi Anil, You don't need to download and install Spark. It's enough to add pyspark to PyCharm as a package for your environment and start developing and testing locally. The thing is, PySpark includes a local Spark that is installed as part of the pip install. When it comes to your particular issue, I

Re: Apache Spark 3.2.0 | Pyspark | Pycharm Setup

2021-11-17 Thread Gourav Sengupta
Hi Anil, I generally create an anaconda environment, then install pyspark in it, and then configure the interpreter to point to that particular environment. I have never faced an issue with this approach. Regards, Gourav Sengupta On Wed, Nov 17, 2021 at 7:39 AM Anil Kulkarni wrote: > Hi Spark
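Gourav's workflow can be sketched as the following commands (the environment name "spark32" and the Python version are illustrative choices, not from the original message):

```shell
# Create an isolated conda environment and install pyspark into it.
conda create -n spark32 python=3.9 -y
conda activate spark32
pip install pyspark==3.2.0

# Then in PyCharm: Settings > Project > Python Interpreter >
# Add Interpreter > Conda Environment > select the existing "spark32" env.
```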

Apache Spark 3.2.0 | Pyspark | Pycharm Setup

2021-11-16 Thread Anil Kulkarni
Hi Spark community, I am having a hard time setting up PyCharm to work with pyspark. Can any of you point me to available documentation? Things I have tried till now:
1. Download and install Apache Spark
2. Add the pyspark package in PyCharm
3. Add SPARK_HOME, PYTHONPATH, HADOOP_HOME
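For step 3, the environment variables can also be set from Python before pyspark is imported, instead of through PyCharm's run-configuration dialog. A sketch — the paths below are hypothetical and must match your actual installation:

```python
import os

# Hypothetical install locations -- replace with your own paths.
os.environ["SPARK_HOME"] = "/opt/spark-3.2.0"
os.environ["HADOOP_HOME"] = "/opt/hadoop"  # on Windows: the dir holding winutils.exe

# Make the bundled Python bindings importable; only needed when using a
# full Spark download rather than a pip-installed pyspark package.
os.environ["PYTHONPATH"] = os.path.join(os.environ["SPARK_HOME"], "python")

print(os.environ["SPARK_HOME"])
```

Note that with a plain `pip install pyspark` (as suggested elsewhere in this thread), none of these variables are strictly required for local development.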