Ok.
I modified it as per your suggestions:
export SPARK_HOME=/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4
export SPARK_JAR=$SPARK_HOME/lib/spark-assembly-1.3.0-hadoop2.4.0.jar
export HADOOP_CONF_DIR=/apache/hadoop/conf
cd $SPARK_HOME
./bin/spark-sql -v --driver-class-path
Hey Deepak,
It seems that your hive-site.xml says your Hive metastore setup is using
MySQL. If that's not the case, you need to adjust your hive-site.xml
configurations. As for the version of the MySQL driver, it should match
the MySQL server.
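For reference, a MySQL-backed metastore is typically indicated by properties like these in hive-site.xml (the host, port, and database names below are illustrative placeholders, not values taken from this thread):

```xml
<!-- hive-site.xml: illustrative metastore settings for a MySQL-backed setup -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://metastore-host:3306/hive_metastore</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
```

If these properties point at Derby (the default) or another database instead, no MySQL driver is involved.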
Cheng
On 3/27/15 11:07 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
I am unable to run spark-sql from the command line. I attempted the following:
1)
export SPARK_HOME=/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4
export SPARK_JAR=$SPARK_HOME/lib/spark-assembly-1.3.0-hadoop2.4.0.jar
export
I do not use MySQL. I want to read Hive tables from Spark SQL and transform
them in Spark SQL. Why do I need a MySQL driver? If I still need it, which
version should I use?
Assuming I need it, I downloaded the latest version of it from
If you're not using MySQL as your metastore for Hive, out of curiosity what
are you using?
The error you are seeing is common when Spark cannot connect to the Hive
metastore because the correct JDBC driver isn't on the classpath.
As well, I noticed that you're using
As the exception suggests, you don't have the MySQL JDBC driver on your
classpath.
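A minimal sketch of passing the driver jar to spark-sql, assuming a hypothetical location for the MySQL connector jar (use the connector version that matches your MySQL server):

```shell
# Assumed location of the MySQL JDBC connector jar -- adjust for your machine.
MYSQL_JDBC_JAR=$HOME/jars/mysql-connector-java-5.1.34.jar

export SPARK_HOME=/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4
export HADOOP_CONF_DIR=/apache/hadoop/conf

cd "$SPARK_HOME"
# --driver-class-path makes the JDBC driver visible to the driver JVM,
# which is where the Hive metastore connection is opened.
./bin/spark-sql -v --driver-class-path "$MYSQL_JDBC_JAR"
```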
On 3/27/15 10:45 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
I am unable to run spark-sql from the command line. I attempted the following:
1)
export SPARK_HOME=/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4
export