Thanks Doug,

I already have all four of the configs you mentioned in my hive-site.xml. Do
I need to create a hive-site.xml in Spark's conf directory (it is not there
by default in 1.6.1)? Please suggest.
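
For reference, here is what I think such a file under $SPARK_HOME/conf would
look like, based on the four properties you listed (the principal and
metastore host below are just placeholders; the real values would be copied
from the cluster's hive-site.xml):

<configuration>
  <property>
    <name>hive.metastore.kerberos.principal</name>
    <!-- placeholder; use the value from the cluster hive-site.xml -->
    <value>hive/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <!-- placeholder metastore host and port -->
    <value>thrift://metastore.example.com:9083</value>
  </property>
  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.security.authorization.enabled</name>
    <value>false</value>
  </property>
</configuration>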


On Mon, May 23, 2016 at 9:53 PM, Doug Balog <doug.sparku...@dugos.com>
wrote:

> I have a custom hive-site.xml for Spark in Spark's conf directory.
> These properties are the minimal ones that you need for Spark, I believe.
>
> hive.metastore.kerberos.principal = copy from your hive-site.xml, e.g.
> "hive/_h...@foo.com"
> hive.metastore.uris = copy from your hive-site.xml, e.g.
> thrift://ms1.foo.com:9083
> hive.metastore.sasl.enabled = true
> hive.security.authorization.enabled = false
>
> Cheers,
>
> Doug
>
>
>
> > On May 23, 2016, at 7:41 AM, Chandraprakash Bhagtani <
> cpbhagt...@gmail.com> wrote:
> >
> > Hi,
> >
> > My Spark job is failing with Kerberos issues while creating a HiveContext
> > in yarn-cluster mode. However, it runs fine in yarn-client mode. My Spark
> > version is 1.6.1.
> >
> > I am passing hive-site.xml through the --files option.
> >
> > I searched online and found that this issue was fixed by SPARK-6207 in
> > Spark 1.4, but I am running 1.6.1.
> >
> > Am I missing any configuration here?
> >
> >
> > --
> > Thanks & Regards,
> > Chandra Prakash Bhagtani
>
>


-- 
Thanks & Regards,
Chandra Prakash Bhagtani
