Ok, thanks a lot for the info. I will try installing Hive and getting it to work.
I will reach out to you if I run into any roadblocks.
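
In the meantime, I will sanity-check whether the Hive catalog is actually
active with something like the paragraph below (this assumes the %spark
interpreter exposes the usual spark session variable):

    // Prints "hive" when Hive support is enabled, "in-memory" otherwise.
    println(spark.conf.get("spark.sql.catalogImplementation", "in-memory"))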

On Tue, Apr 11, 2023 at 1:43 PM Jeff Zhang <zjf...@gmail.com> wrote:

> For local mode, if you don't have Hive installed, then you cannot use
> HiveContext.
>
> On Tue, Apr 11, 2023 at 4:00 PM VIVEK NARAYANASETTY <vive....@gmail.com>
> wrote:
>
>> Hi Jeff,
>>
>> This is on my local system. I am using the Zeppelin Docker image and
>> passing the SPARK_HOME directory as below:
>>
>> -v C:\spark-3.1.3-bin-hadoop3.2:/opt/spark -e SPARK_HOME=/opt/spark
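>>
>> A quick way to confirm the variable is visible to the interpreter itself
>> (assuming the %spark interpreter inherits the container environment):
>>
>>     println(sys.env.get("SPARK_HOME"))  // expecting Some(/opt/spark)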
>>
>> On Tue, Apr 11, 2023 at 12:22 PM Jeff Zhang <zjf...@gmail.com> wrote:
>>
>>> Ask your Hadoop cluster admin for the hive-site.xml you should use.
>>>
>>> On Tue, Apr 11, 2023 at 2:19 PM VIVEK NARAYANASETTY <vive....@gmail.com>
>>> wrote:
>>>
>>>> Hi Jeff,
>>>>
>>>> Thanks a lot for responding. I have now placed the hive-site.xml file in
>>>> the spark/conf folder, but I am encountering a new error.
>>>>
>>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>>> java.lang.RuntimeException: Unable to instantiate
>>>> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>>>
>>>> I took the hive-site.xml from the Git repository below. I am not sure
>>>> whether it is correct.
>>>>
>>>> https://github.com/FRosner/docker-zeppelin/blob/master/conf.templates/hive-site.xml.template
>>>>
>>>> Please let me know if I need to make any other changes.
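>>>>
>>>> For what it's worth, here is the kind of check I can run in a %spark
>>>> paragraph to confirm that Spark can actually see that hive-site.xml and
>>>> which metastore settings it contains (property names assume a standard
>>>> hive-site.xml):
>>>>
>>>>     // Rough diagnostic: locate hive-site.xml on the driver classpath and
>>>>     // print the metastore settings it defines.
>>>>     import org.apache.hadoop.conf.Configuration
>>>>     val url = Thread.currentThread().getContextClassLoader.getResource("hive-site.xml")
>>>>     println(url) // null means Spark cannot see the file at all
>>>>     val hiveConf = new Configuration(false)
>>>>     if (url != null) hiveConf.addResource(url)
>>>>     println(hiveConf.get("hive.metastore.uris"))
>>>>     println(hiveConf.get("javax.jdo.option.ConnectionURL"))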
>>>>
>>>>
>>>> On Tue, Apr 11, 2023 at 11:10 AM Jeff Zhang <zjf...@gmail.com> wrote:
>>>>
>>>>> Have you put hive-site.xml under SPARK_CONF_DIR?
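>>>>>
>>>>> A rough way to check from inside the notebook (assuming SPARK_CONF_DIR
>>>>> falls back to $SPARK_HOME/conf when it is not set, which is the default):
>>>>>
>>>>>     // Does hive-site.xml exist where Spark looks for its configuration?
>>>>>     val confDir = sys.env.getOrElse("SPARK_CONF_DIR", sys.env("SPARK_HOME") + "/conf")
>>>>>     println(new java.io.File(confDir, "hive-site.xml").exists())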
>>>>>
>>>>>
>>>>> On Tue, Apr 11, 2023 at 1:01 PM VIVEK NARAYANASETTY <
>>>>> vive....@gmail.com> wrote:
>>>>>
>>>>>> Hi Users,
>>>>>>
>>>>>> I would appreciate any leads on the issue below.
>>>>>>
>>>>>> On Sat, Apr 8, 2023 at 1:33 PM VIVEK NARAYANASETTY <
>>>>>> vive....@gmail.com> wrote:
>>>>>>
>>>>>>> Hello Everyone,
>>>>>>>
>>>>>>> I am trying to create some tables using Spark SQL and am encountering
>>>>>>> the error below in Zeppelin. While debugging, I could see that
>>>>>>> "zeppelin.spark.useHiveContext" is set to true, which means HiveContext
>>>>>>> should be used in place of SQLContext, but it does not look like it is
>>>>>>> able to use/initiate HiveContext. I am attaching a screenshot of my
>>>>>>> analysis in the Zeppelin notebook. Please advise on how to resolve this.
>>>>>>>
>>>>>>> Hive support is required to CREATE Hive TABLE (AS SELECT);
>>>>>>> 'CreateTable `test_schema_default_location`.`managed_table`,
>>>>>>> org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, ErrorIfExists
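>>>>>>>
>>>>>>> From what I understand, only the Hive SerDe table path needs Hive support;
>>>>>>> a plain data-source table should still work with the in-memory catalog. A
>>>>>>> rough sketch of what I mean (hypothetical table name and columns):
>>>>>>>
>>>>>>>     // Hypothetical example: USING parquet creates a Spark data-source
>>>>>>>     // table, which does not go through the Hive SerDe path.
>>>>>>>     spark.sql("CREATE TABLE managed_table_ds (id INT, name STRING) USING parquet")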
>>>>>>>
>>>>>>>
>>>>>>> [image: image.png]
>>>>>>>
>>>>>>> --
>>>>>>> Thanks & Regards
>>>>>>> *Vivek Narayanasetty*
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Thanks & Regards
>>>>>> *Vivek Narayanasetty*
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Best Regards
>>>>>
>>>>> Jeff Zhang
>>>>>
>>>>
>>>>
>>>> --
>>>> Thanks & Regards
>>>> *Vivek Narayanasetty*
>>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>
>>
>> --
>> Thanks & Regards
>> *Vivek Narayanasetty*
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>


-- 
Thanks & Regards
*Vivek Narayanasetty*




*Go Green: Think before you print this e-mail or its attachment. You can
save paper if you do not really need to print.*
