[ https://issues.apache.org/jira/browse/HIVE-15302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15704508#comment-15704508 ]

Rui Li commented on HIVE-15302:
-------------------------------

With Spark 2.0, we can use {{spark.yarn.archive}} or {{spark.yarn.jars}} to 
specify the Spark jars needed on the YARN side. Therefore, even if Spark is 
built with Hive dependencies, we can exclude the conflicting jars (e.g. 
hive-exec-1.2.1.spark2.jar, spark-sql.jar) from {{spark.yarn.archive}}, and 
then we won't hit class conflicts when running HoS. I did some simple tests 
and it worked. I'll do more investigation to verify.
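To make the idea concrete, here's a rough sketch of the manual setup (the jar 
names, paths, and exclusion patterns below are just my assumptions - the exact 
minimum exclusion set still needs to be verified):

{code}
# Build a YARN archive from Spark's jars directory, leaving out the
# Hive-related jars that conflict with Hive's own classes (the patterns
# are illustrative, not a verified exclusion list).
cd $SPARK_HOME
zip -r spark-archive.zip jars \
    -x 'jars/hive-*.jar' -x 'jars/spark-sql*.jar'

# Publish the archive on HDFS and point spark.yarn.archive at it, e.g. in
# spark-defaults.conf (or via "set spark.yarn.archive=..." in Hive):
hdfs dfs -put spark-archive.zip /spark/spark-archive.zip
echo "spark.yarn.archive hdfs:///spark/spark-archive.zip" >> conf/spark-defaults.conf
{code}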
After that, I think we can update our wiki to tell users how to achieve this 
(ideally identifying the minimum set of required jars). In addition, we may 
also consider setting this automatically on the Hive side - since we already 
ask users to set {{spark.home}}, it won't be difficult to locate the jars.
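Just to illustrate the auto-configuration idea (again purely a sketch - the 
exclusion patterns and the property we'd populate are assumptions):

{code}
# Derive a comma-separated spark.yarn.jars value from spark.home,
# filtering out the jars that would conflict with HoS.
SPARK_HOME=/path/to/spark   # would come from spark.home
SPARK_YARN_JARS=$(ls "$SPARK_HOME"/jars/*.jar \
    | grep -v -e '/hive-' -e '/spark-sql' \
    | paste -sd, -)
# Hive would then pass this as spark.yarn.jars when launching the
# remote Spark application.
echo "spark.yarn.jars=$SPARK_YARN_JARS"
{code}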
[~xuefuz], any idea on this?

> Relax the requirement that HoS needs Spark built w/o Hive
> ---------------------------------------------------------
>
>                 Key: HIVE-15302
>                 URL: https://issues.apache.org/jira/browse/HIVE-15302
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Rui Li
>            Assignee: Rui Li
>
> This requirement becomes more and more unacceptable as SparkSQL becomes more 
> widely adopted. Let's use this JIRA to find out how we can relax the 
> limitation.


