Hm, this is a common confusion... Although the variable is named
`sqlContext` in the Spark shell, it's actually a `HiveContext`, which
extends `SQLContext` and can talk to the Hive metastore.
So your program needs to instantiate an
`org.apache.spark.sql.hive.HiveContext` instead.
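To illustrate, here is a minimal sketch of such a standalone program against the Spark 1.x API (the app name is a placeholder, and hive-site.xml is assumed to be in $SPARK_HOME/conf or on the driver's classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveQueryApp {
  def main(args: Array[String]): Unit = {
    // Placeholder app name; master URL comes from spark-submit.
    val conf = new SparkConf().setAppName("HiveQueryApp")
    val sc = new SparkContext(conf)

    // HiveContext extends SQLContext and reads the metastore
    // configuration from hive-site.xml.
    val sqlContext = new HiveContext(sc)

    // Queries now resolve tables defined in the Hive metastore.
    sqlContext.sql("SHOW TABLES").collect().foreach(println)
  }
}
```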
Thanks for your help!
Switching to HiveContext fixed the issue.
Just one side comment:
In the documentation regarding Hive Tables and HiveContext
https://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables,
we see the comment and the code run together on one line:
// sc is an existing JavaSparkContext.HiveContext sqlContext =
Thanks for pointing out the documentation error :) Opened
https://github.com/apache/spark/pull/6749 to fix this.
On 6/11/15 1:18 AM, James Pirz wrote:
I am using Spark (standalone) to run queries (from a remote client) against
data in tables that are already defined/loaded in Hive.
I have started the metastore service in Hive successfully, and by putting
hive-site.xml, with the proper metastore.uri, in the $SPARK_HOME/conf
directory, I tried to share its