[ https://issues.apache.org/jira/browse/HIVE-15302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15709208#comment-15709208 ]

Marcelo Vanzin commented on HIVE-15302:
---------------------------------------

bq. I plan to find the needed jars from the Spark installed in the cluster

That's kind of what I meant. Wouldn't it be better to just directly depend on 
the parts of Spark that Hive needs, package those with Hive, and not have to 
depend on any cluster deployment of Spark?

Then users wouldn't need to care about a separate Spark installation when they 
want to run HoS.
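As a rough illustration of that idea, Hive could declare the Spark pieces it needs as ordinary build-time dependencies and ship them, rather than scanning a cluster's Spark install for jars. The sketch below is hypothetical: the exact artifact list, versions, and Scala suffix are assumptions, not a proposal from this thread.

```xml
<!-- Hypothetical sketch: depend directly on the Spark modules HoS uses,
     instead of locating jars in a cluster-side Spark deployment.
     Artifact names and versions are illustrative assumptions. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Packaging these with Hive would decouple HoS from whatever Spark build (with or without Hive classes) happens to be deployed on the cluster, which is the crux of the comment above.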

> Relax the requirement that HoS needs Spark built w/o Hive
> ---------------------------------------------------------
>
>                 Key: HIVE-15302
>                 URL: https://issues.apache.org/jira/browse/HIVE-15302
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Rui Li
>            Assignee: Rui Li
>
> This requirement becomes more and more unacceptable as SparkSQL becomes 
> widely adopted. Let's use this JIRA to find out how we can relax the 
> limitation.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
