Just trying to get started with Spark and attempting to use HiveContext from
spark-shell to interact with existing Hive tables on my CDH cluster, but I keep
running into errors (please see below) when I run
hiveContext.sql("show tables"). Wanted to know which JARs need to be included
to get this working.
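For reference, a minimal sketch of the session that hits the error (assuming spark-shell on Spark 1.x, where the shell provides `sc`; this mirrors the steps described above, not a verbatim transcript):

```scala
// Inside spark-shell; `sc` is the SparkContext the shell creates for you.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// This is the call that throws the SessionHiveMetaStoreClient error:
hiveContext.sql("show tables").collect().foreach(println)
```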
Hi bdev,

Derby is the default embedded database for the Hive metastore when
hive.metastore.uris is not set. Take a look at the lib directory of Hive; you
can find the Derby JAR there. Spark does not ship Derby by default.
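In other words, either point Spark at the existing Hive metastore or put the Derby JAR on Spark's classpath. A sketch of both options as a config fragment (the paths are typical CDH locations and the Derby version is illustrative; adjust both for your cluster):

```shell
# Option 1: reuse the existing Hive metastore by giving Spark the same
# hive-site.xml (path assumed; adjust for your CDH layout):
cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/

# Option 2: add the Derby JAR from Hive's lib directory when launching
# spark-shell (jar version shown is an example):
spark-shell --jars /usr/lib/hive/lib/derby-10.11.1.1.jar
```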
At 2015-07-07 17:07:28, bdev buntu...@gmail.com wrote:
From: bdev [mailto:buntu...@gmail.com]
Sent: Tuesday, July 7, 2015 5:07 PM
To: user@spark.apache.org
Subject: HiveContext throws
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient