So I have an existing Hive set up and configured; how would I be able to use
the same in Spark?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Link-existing-Hive-to-Spark-tp21531.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
Re: Link existing Hive to Spark
Hi Ashu,
Per the documents:
Configuration of Hive is done by placing your hive-site.xml file in conf/.
For example, you can place something like this in your
$SPARK_HOME/conf/hive-site.xml file:
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <!-- Ensure
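With hive-site.xml in place under conf/, a HiveContext picks up the existing metastore automatically. A minimal sketch, assuming the Spark 1.x API current at the time of this thread (the app name is a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object ExistingHiveExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ExistingHiveExample")
    val sc   = new SparkContext(conf)
    // HiveContext reads hive-site.xml from $SPARK_HOME/conf/ automatically
    val hive = new HiveContext(sc)
    // Tables registered in the existing Hive metastore are visible directly
    hive.sql("SHOW TABLES").collect().foreach(println)
  }
}
```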
Ok. Is there no way to specify it in code, when I create SparkConf?
From: Todd Nist tsind...@gmail.com
Sent: Friday, February 6, 2015 10:08 PM
To: Ashutosh Trivedi (MT2013030)
Cc: user@spark.apache.org
Subject: Re: Link existing Hive to Spark
You can always just
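For the programmatic route Ashutosh asks about, a hedged sketch, assuming the Spark 1.x `setConf` API on HiveContext (host and port are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc   = new SparkContext(new SparkConf().setAppName("HiveFromCode"))
val hive = new HiveContext(sc)
// Set the metastore URI in code instead of via hive-site.xml;
// do this before the first query triggers metastore access.
hive.setConf("hive.metastore.uris", "thrift://metastore-host:9083")
hive.sql("SHOW TABLES").show()
```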