Hi, Chetan.

Did you copy your `hive-site.xml` into the Spark conf directory? For example,

cp /usr/local/hive/conf/hive-site.xml /usr/local/spark/conf
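
The key piece Spark reads from that file is the metastore connection. If
your Hive setup runs a standalone metastore service, the relevant property
looks like this (a minimal sketch; the host and port are placeholders for
your environment):

  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>

If you are on Hive's default embedded Derby metastore instead, the
connection is defined by javax.jdo.option.ConnectionURL in the same file.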

If you want to use the existing Hive metastore, you need to provide that 
information to Spark.
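
Once the file is copied, a quick way to confirm Spark is talking to the
existing metastore from spark-shell (a sketch for Spark 2.x; the output
will depend on your setup):

  // Should return "hive" when Spark is backed by the Hive metastore
  // rather than its default in-memory catalog.
  spark.conf.get("spark.sql.catalogImplementation")

  // Databases created through Hive should now be visible from Spark.
  spark.sql("SHOW DATABASES").show()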

Bests,
Dongjoon.

On 2017-01-16 21:36 (-0800), Chetan Khatri <chetan.opensou...@gmail.com> wrote: 
> Hello,
> 
> I have the following services installed and configured successfully:
> 
> Hadoop 2.7.x
> Spark 2.0.x
> HBase 1.2.4
> Hive 1.2.1
> 
> *Installation Directories:*
> 
> /usr/local/hadoop
> /usr/local/spark
> /usr/local/hbase
> 
> *Hive Environment variables:*
> 
> #HIVE VARIABLES START
> export HIVE_HOME=/usr/local/hive
> export PATH=$PATH:$HIVE_HOME/bin
> #HIVE VARIABLES END
> 
> So, I can access Hive from anywhere since the environment variables are
> configured. Now, if I start spark-shell and Hive from /usr/local/hive,
> both use the same Hive metastore; otherwise, Spark creates its own
> metastore in whatever directory I start spark-shell from.
> 
> That is, I am reading from HBase and writing to Hive using Spark. I
> don't know why this weird issue occurs.
> 
> Thanks.
> 
