Yes, Spark 1.5 uses Hive 1.2's metastore client by default. You can change
it by putting the following settings in your Spark conf.

spark.sql.hive.metastore.version = 0.13.1
spark.sql.hive.metastore.jars = maven (or a classpath containing your Hive
0.13 jars and Hadoop jars)

spark.sql.hive.metastore.jars tells Spark SQL where to find the classes of
the Hive 0.13.1 metastore client. If you set it to maven, Spark will
download the needed jars directly (which is an easy way to test things out).
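
For example (the jar paths below are just placeholders for wherever your
Hive 0.13 and Hadoop jars actually live), you could put the settings in
conf/spark-defaults.conf:

spark.sql.hive.metastore.version  0.13.1
spark.sql.hive.metastore.jars     /path/to/hive-0.13.1/lib/*:/path/to/hadoop/lib/*

or pass them on the command line when you launch the shell:

bin/spark-shell --conf spark.sql.hive.metastore.version=0.13.1 \
  --conf spark.sql.hive.metastore.jars=maven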

On Thu, Sep 10, 2015 at 7:45 PM, StanZhai <m...@zhaishidan.cn> wrote:

> Thank you for the swift reply!
>
> The version of my Hive metastore server is 0.13.1. I've built Spark using
> sbt like this:
> build/sbt -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver assembly
>
> Does Spark 1.5 bind to the Hive 1.2 client by default?
>
>
