The error you're seeing typically means that you cannot connect to the Hive
metastore itself. Some quick thoughts:
- If you run SHOW TABLES (instead of the CREATE TABLE statement), do you
still get the same error?
- Can you confirm that the Hive metastore (the MySQL database) is up and running?
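For reference on that second point, a MySQL-backed metastore is normally wired up through JDBC properties in hive-site.xml along these lines. The hostname, database name, and credentials below are placeholders, not values from this thread:

```xml
<!-- hive-site.xml: hypothetical remote-metastore settings; all values are placeholders -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>
```

If these point at a MySQL server that is down or unreachable, HiveContext initialization will fail with a metastore connection error.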
Hi Denny,
Still facing the same issue. Please find the errors below.
scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@4e4f880c
scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS
No, I am just running the ./spark-shell command in the terminal. I will try
with the above command.
On Wed, Mar 25, 2015 at 11:09 AM, Denny Lee denny.g@gmail.com wrote:
Did you include the connection to a MySQL connector jar so that way
spark-shell / hive can connect to the metastore?
For example, when I run my spark-shell instance in standalone mode, I use:
./spark-shell --master spark://servername:7077 --driver-class-path
/lib/mysql-connector-java-5.1.27.jar
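Before launching, it can also help to verify that the connector jar actually exists at the path passed to --driver-class-path, since a bad path fails silently at launch time. A quick sketch (the helper name is hypothetical; the jar path is just the example from the command above):

```shell
# check_jar: hypothetical helper that reports whether a jar is present
# before it is handed to spark-shell via --driver-class-path
check_jar() {
  if [ -f "$1" ]; then
    echo "found: $1"
  else
    echo "missing: $1"
  fi
}

# example path from the command above; adjust to your install
check_jar /lib/mysql-connector-java-5.1.27.jar
```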
I was actually just able to reproduce the issue. I do wonder if this is a
bug -- the docs say "When not configured by the hive-site.xml, the context
automatically creates metastore_db and warehouse in the current directory."
But as you can see from the message, warehouse is not in the current
Hi Yana,
I have removed hive-site.xml from the spark/conf directory but am still
getting the same errors. Is there any other way to work around this?
Regards,
Sandeep
On Fri, Feb 27, 2015 at 9:38 PM, Yana Kadiyska yana.kadiy...@gmail.com
wrote:
I think you're mixing two things: the docs say "When *not* configured by
the hive-site.xml, the context automatically creates metastore_db and
warehouse in the current directory." AFAIK, if you want a local metastore,
you don't put hive-site.xml anywhere. You only need the file if you're
going to
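To summarize the distinction being drawn here, spark-shell falls back to a local Derby metastore (creating metastore_db in whatever directory it was launched from) only when no hive-site.xml is found on the conf path. A small sketch of that decision (the helper name is hypothetical):

```shell
# metastore_mode: hypothetical helper that reports which metastore setup a
# spark-shell launch would pick up, based only on hive-site.xml presence
metastore_mode() {
  if [ -f "$1/hive-site.xml" ]; then
    echo "configured: using hive-site.xml"
  else
    echo "local: metastore_db will be created in the current directory"
  fi
}

metastore_mode "${SPARK_HOME:-/usr/local/spark}/conf"
```

So if a stale hive-site.xml is lying around in spark/conf, the shell will keep trying to reach whatever metastore it describes instead of creating a local one.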