I want to build Hive and Spark so that Hive runs on the Spark execution engine.
I chose Hive 2.3.0 and Spark 2.0.0, which the Hive official documentation
claims are compatible.
According to the Hive official documentation, I have to build Spark without the
Hive profile to avoid a conflict between the original Hive and the
Spark-integrated Hive. The build succeeded, but then the problem comes: I
cannot use spark-sql anymore, because spark-sql relies on the Hive libraries
and my Spark is a no-Hive build.
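For reference, this is roughly the build invocation I used (a sketch based on the "Hive on Spark: Getting Started" guide; the exact profile list depends on your Hadoop version, so treat the profiles below as assumptions to adjust):

```shell
# Build a Spark distribution WITHOUT the -Phive profile, so that the
# resulting Spark does not bundle its own Hive 1.2.1 jars.
# Adjust -Phadoop-2.7 to match your Hadoop version.
./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz \
  "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided"
```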

I don't know the relationship between the Spark-integrated Hive and the
original Hive. Below are the Spark-integrated Hive jars:

hive-beeline-1.2.1.spark2.jar
hive-cli-1.2.1.spark2.jar
hive-exec-1.2.1.spark2.jar
hive-jdbc-1.2.1.spark2.jar
hive-metastore-1.2.1.spark2.jar
spark-hive_2.11-2.0.0.jar
spark-hive-thriftserver_2.11-2.0.0.jar
It seems that Spark 2.0.0 relies on Hive 1.2.1.

Can I just add my Hive 2.3.0 libraries to Spark's classpath?
