Hi Spark community,

I am having a hard time setting up PyCharm to work with PySpark. Can any
of you point me to any available documentation?

Things I have tried so far:

   1. Downloaded and installed Apache Spark.
   2. Added the pyspark package in PyCharm.
   3. Added the SPARK_HOME, PYTHONPATH, and HADOOP_HOME environment
   variables to the Run configuration (see the test script below).
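
In case it helps, this is roughly the minimal script I am testing the
setup with. The install paths are placeholders for my local machine, not
part of any standard layout:

import os

# Placeholder paths -- point these at your local installs
os.environ.setdefault("SPARK_HOME", "/opt/spark")
os.environ.setdefault("HADOOP_HOME", "/opt/hadoop")

from pyspark.sql import SparkSession

# Minimal local session to confirm PyCharm can locate and start Spark
spark = (SparkSession.builder
         .master("local[*]")
         .appName("pycharm-smoke-test")
         .getOrCreate())

# Tiny DataFrame round trip as a smoke test
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()

spark.stop()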

The error I am getting:

Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
21/11/16 23:26:28 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):


-- 
Cheers,
Anil Kulkarni
https://anilkulkarni.com/
