You are using a Hive version which is not supported by Spark SQL. Spark
SQL 1.1.x and prior versions only support Hive 0.12.0. Spark SQL 1.2.0
supports Hive 0.12.0 or Hive 0.13.1.
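If downgrading Hive is not convenient, the alternative is to build Spark against a matching Hive version. A minimal sketch, assuming the Maven build profiles from the Spark 1.2 era (the exact profile names are an assumption; check the build documentation for your release):

```shell
# Sketch only: build Spark 1.2.0 with Hive support. Hive 0.13.1 was the
# default in 1.2.0, so a plain Hive build targets it:
mvn -Phive -Phive-thriftserver -DskipTests clean package

# To target Hive 0.12.0 instead (profile name is an assumption):
mvn -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package
```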
On 2/27/15 12:12 AM, sandeep vura wrote:
Hi Cheng,
Oh, thanks for the clarification. I will try to downgrade Hive.
On Thu, Feb 26, 2015 at 9:44 PM, Cheng Lian lian.cs@gmail.com wrote:
You are using a Hive version which is not supported by Spark SQL. Spark SQL
1.1.x and prior versions only support Hive 0.12.0. Spark SQL 1.2.0 supports
Hive 0.12.0 or Hive 0.13.1.
It seems that you are running the Hive metastore over MySQL, but don’t have
the MySQL JDBC driver on the classpath:
Caused by:
org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException:
The specified datastore driver ("com.mysql.jdbc.Driver") was not
found in the CLASSPATH.
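One common fix is to put the MySQL connector jar on the driver classpath when launching the shell (the jar path below is an example, not taken from the thread):

```shell
# Example only: the connector jar path is hypothetical; point it at your
# local copy. --driver-class-path and --jars are standard spark-shell flags.
spark-shell \
  --driver-class-path /path/to/mysql-connector-java-5.1.34-bin.jar \
  --jars /path/to/mysql-connector-java-5.1.34-bin.jar
```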
Hi Cheng,
Thanks, the above issue has been resolved. I have configured a remote
metastore (not a local metastore) in Hive.
While creating a table in Spark SQL, another error appears on the
terminal. The error is given below:
sqlContext.sql(LOAD DATA LOCAL INPATH
'/home/spark12/sandeep_data/sales_pg.csv'
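For reference, a complete form of such a statement looks like the following sketch. The thread truncates before the target table, so the table name `sales` and its single-column schema are assumptions for illustration:

```shell
# Sketch: pipe complete statements into spark-shell, where sqlContext is a
# HiveContext. Table name and schema are hypothetical.
spark-shell <<'EOF'
sqlContext.sql("CREATE TABLE IF NOT EXISTS sales (line STRING)")
sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep_data/sales_pg.csv' INTO TABLE sales")
EOF
```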
Hi Sparkers,
I am trying to create a Hive table in Spark SQL, but I am unable to create
it. Below are the errors generated so far.
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at