Re: [DISCUSS] SPIP: Support Docker Official Image for Spark

2022-09-20 Thread 416161...@qq.com
+1 RuifengZheng ruife...@foxmail.com --Original-- From: "Chao Sun"

Re: [DISCUSS] SPIP: Support Docker Official Image for Spark

2022-09-20 Thread Chao Sun
+1 (non-binding) On Mon, Sep 19, 2022 at 10:17 PM Wenchen Fan wrote: > > +1 > > On Mon, Sep 19, 2022 at 2:59 PM Yang,Jie(INF) wrote: >> >> +1 (non-binding) >> >> >> >> Yang Jie >> >> >> From: Yikun Jiang >> Sent: September 19, 2022, 14:23:14 >> To: Denny Lee >> Cc: bo

Re:

2022-09-20 Thread javacaoyu
Try: import os os.environ['PYSPARK_PYTHON'] = "python path" os.environ['SPARK_HOME'] = "SPARK path" On September 20, 2022, at 17:51, yogita bhardwaj wrote: I have installed pyspark using pip. I'm getting an error while running the following code. from pyspark import SparkContext sc=SparkContext()

[no subject]

2022-09-20 Thread yogita bhardwaj
I have installed pyspark using pip. I'm getting an error while running the following code. from pyspark import SparkContext sc=SparkContext() a=sc.parallelize([1,2,3,4]) print(f"a_take:{a.take(2)}") py4j.protocol.Py4JJavaError: An error occurred while calling

Re: Issue with SparkContext

2022-09-20 Thread javacaoyu
Are you using pyspark? If so, you can try setting the PYSPARK_PYTHON and SPARK_HOME environment variables. Example: import os os.environ['PYSPARK_PYTHON'] = "python path" os.environ['SPARK_HOME'] = "SPARK path" You can try this code; it may resolve the issue. On September 20, 2022, at 17:34, Bjørn Jørgensen wrote: Hi,
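The environment-variable fix suggested above can be sketched as a runnable snippet. The paths below are placeholders, not values from the thread; substitute your own interpreter and Spark locations:

```python
import os

# Placeholder paths -- replace with your actual Python interpreter
# and Spark installation directory (illustrative values only).
os.environ['PYSPARK_PYTHON'] = '/usr/bin/python3'
os.environ['SPARK_HOME'] = '/opt/spark'

# These must be set BEFORE SparkContext() is created, because PySpark
# reads them when it launches the JVM gateway process via py4j.
print(os.environ['PYSPARK_PYTHON'], os.environ['SPARK_HOME'])
```

Setting the variables in `os.environ` inside the script is equivalent to exporting them in the shell before running Python, but keeps the configuration next to the code.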

Re: Issue with SparkContext

2022-09-20 Thread Bjørn Jørgensen
Hi, we have a user group at u...@spark.apache.org You must install a Java JRE. If you are on Ubuntu you can type apt-get install openjdk-17-jre-headless On Tue, Sep 20, 2022, at 06:15, yogita bhardwaj < yogita.bhard...@iktara.ai> wrote: > > > I am getting the py4j.protocol.Py4JJavaError while
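A missing Java runtime, as Bjørn points out, is a common cause of Py4JJavaError when creating a SparkContext. One way to check for it up front is a small sketch like the following (the check itself is my assumption, not something from the thread):

```python
import shutil

# PySpark launches a JVM through py4j, so a 'java' executable must be on
# PATH (or reachable via JAVA_HOME). If it is missing, SparkContext()
# typically fails with py4j.protocol.Py4JJavaError or a
# "Java gateway process exited" message.
java = shutil.which("java")
if java is None:
    print("No 'java' found -- install a JRE, e.g. openjdk-17-jre-headless")
else:
    print("Java runtime found at:", java)
```

Running this before building the SparkContext turns an opaque py4j stack trace into an actionable "install Java" message.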

Pyspark SparkContext issue

2022-09-20 Thread yogita bhardwaj
I am getting the py4j.protocol.Py4JJavaError while running SparkContext. Can you please help me to resolve this issue. from pyspark import SparkContext sc=SparkContext() a=sc.parallelize([1,2,3,4]) print(f"a_take:{a.take(2)}") py4j.protocol.Py4JJavaError: An error occurred while calling