Hi,

I am not sure at all that we need to use SQLContext and HiveContext
anymore; both have been deprecated since Spark 2.0, and SparkSession is
now the single entry point for SQL and Hive functionality.
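
A minimal sketch of the SparkSession-only approach (the app name and
query here are just placeholders):

from pyspark.sql import SparkSession

# one unified entry point; no SQLContext or HiveContext needed
spark = SparkSession.builder \
    .appName("example") \
    .enableHiveSupport() \
    .getOrCreate()

spark.sql("SHOW DATABASES").show()  # SQL runs directly on the session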

Can you please check your JAVA_HOME and SPARK_HOME? I use the findspark
library to set up all the Spark-related environment variables for me, or
install pyspark with conda from the conda-forge channel.
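
For example (assuming SPARK_HOME points at a local Spark installation):

import findspark
findspark.init()  # reads SPARK_HOME and adds pyspark to sys.path

from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("jupyter-test").getOrCreate()
print(spark.sparkContext.master)

Or, with conda:

conda install -c conda-forge pyspark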


Regards,
Gourav Sengupta


On Wed, Jan 5, 2022 at 4:08 PM Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> hm,
>
> If I understand correctly
>
> from pyspark.sql import SparkSession
> from pyspark.sql import HiveContext  # deprecated since Spark 2.0
> from pyspark import SparkContext
>
> def spark_session(appName):
>     # unified entry point; enableHiveSupport() covers Hive functionality
>     return SparkSession.builder \
>         .appName(appName) \
>         .enableHiveSupport() \
>         .getOrCreate()
>
> def sparkcontext():
>     # returns the active SparkContext, creating one if necessary
>     return SparkContext.getOrCreate()
>
> def hivecontext():
>     # legacy wrapper; prefer spark_session() above
>     return HiveContext(sparkcontext())
>
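> A minimal usage sketch in a Jupyter cell (assuming the functions above
> are importable; the app name is just a placeholder):
>
> spark = spark_session("myApp")
> sc = spark.sparkContext  # the sc that Jupyter reports as undefined
> print(sc.master)
>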
>
> HTH
>
>
>    view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Wed, 5 Jan 2022 at 16:00, “流年以东” <2538974...@qq.com.invalid> wrote:
>
>>
>> When using pyspark, there is no Spark context when opening Jupyter;
>> entering sc.master shows that sc is not defined. We want to initialize
>> the Spark context from a script, but this raises an error.
>> Hope to receive your reply.
>> ------------------------------
>> Sent from my iPhone
>>
>
>
