Are you using PySpark?

If so, you can try setting the PYSPARK_PYTHON and SPARK_HOME environment variables.
Example:


import os

# Replace these placeholders with the actual paths on your machine
os.environ['PYSPARK_PYTHON'] = '/path/to/python'  # Python interpreter PySpark should use
os.environ['SPARK_HOME'] = '/path/to/spark'       # root of your Spark installation


You can try this code; it may resolve the issue.
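A slightly fuller sketch of the same idea (the paths below are hypothetical placeholders, and the SparkContext lines are shown as comments since they require a working Spark install): the key point is that the variables must be set before the SparkContext is created, because PySpark reads them when it launches its JVM gateway.

```python
import os

# Hypothetical example paths -- replace with your own
os.environ['PYSPARK_PYTHON'] = '/usr/bin/python3'  # interpreter for Spark workers
os.environ['SPARK_HOME'] = '/opt/spark'            # Spark installation directory

# Set the variables BEFORE creating the SparkContext, e.g.:
# from pyspark import SparkContext
# sc = SparkContext(appName='env-check')

# Confirm the variables are now visible to this process and its children
print(os.environ['PYSPARK_PYTHON'])
print(os.environ['SPARK_HOME'])
```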


On 20 Sep 2022 at 17:34, Bjørn Jørgensen <bjornjorgen...@gmail.com> wrote:


Hi, we have a user group at user@spark.apache.org 


You must install a Java runtime (JRE).


If you are on Ubuntu you can type:
apt-get install openjdk-17-jre-headless
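Since the Py4JJavaError at SparkContext startup often means no Java runtime is visible, a small diagnostic sketch like this (not part of Spark itself) can check whether `java` is on the PATH before retrying:

```python
import shutil
import subprocess

# PySpark launches a JVM via Py4J, so a Java runtime must be findable on PATH
java = shutil.which('java')
if java is None:
    print('java not found on PATH -- install a JRE, e.g. openjdk-17-jre-headless')
else:
    # 'java -version' prints its version banner to stderr, not stdout
    result = subprocess.run([java, '-version'], capture_output=True, text=True)
    print(result.stderr.strip().splitlines()[0])
```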



On Tue, 20 Sep 2022 at 06:15, yogita bhardwaj <yogita.bhard...@iktara.ai> wrote:

 
I am getting a py4j.protocol.Py4JJavaError while running SparkContext. Can you please help me resolve this issue?
 
 




-- 

Bjørn Jørgensen 
Vestre Aspehaug 4, 6010 Ålesund 
Norge

+47 480 94 297
