[
https://issues.apache.org/jira/browse/HIVE-8795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14208484#comment-14208484
]
Xuefu Zhang commented on HIVE-8795:
-----------------------------------
Requiring a Spark installation can be as simple as downloading a tarball as
part of the build and untarring it to a folder to which SPARK_HOME can be set.
It's possible that with the Spark assembly jar plus certain scripts we may not
even need the tarball at all. I haven't tried this combination, but according
to Spark folks, it is possible.
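A minimal sketch of the tarball approach described above, assuming a hypothetical Spark version, download URL, and unpack location (none of these are specified in the issue):

```shell
# Sketch only: version, URL, and target directory are illustrative assumptions.
SPARK_VERSION="1.2.0"
SPARK_DIST="spark-${SPARK_VERSION}-bin-hadoop2"
TARBALL_URL="https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_DIST}.tgz"

# Download and unpack once as part of the build (commented out here so the
# sketch has no network dependency), then point SPARK_HOME at the result.
# curl -fsSL "$TARBALL_URL" -o "/tmp/${SPARK_DIST}.tgz"
# tar -xzf "/tmp/${SPARK_DIST}.tgz" -C /tmp
export SPARK_HOME="/tmp/${SPARK_DIST}"
echo "$SPARK_HOME"
```

The test harness would then only need SPARK_HOME to resolve the launch scripts; the assembly jar itself can come from the build's own dependencies.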
> Switch precommit test from local to local-cluster [Spark Branch]
> ----------------------------------------------------------------
>
> Key: HIVE-8795
> URL: https://issues.apache.org/jira/browse/HIVE-8795
> Project: Hive
> Issue Type: Sub-task
> Components: Spark
> Reporter: Xuefu Zhang
> Assignee: Szehon Ho
>
> It seems unlikely that the Spark community will provide an MRMiniCluster
> equivalent (SPARK-3691); Spark local-cluster was their recommendation. The
> latest research shows that Spark local-cluster works with Hive. Therefore,
> for now, we will use Spark local-cluster (instead of the current local mode)
> for our precommit test.
> It was previously believed (HIVE-7382) that a Spark installation is required
> and that the SPARK_HOME env variable needs to be set. Since Hive pulls in
> Spark's assembly jar, it's now believed we only need a few scripts from the
> Spark installation instead.
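For reference, the difference between the two test modes comes down to the Spark master setting; a hedged illustration (the worker/core/memory numbers are assumptions, not values from this issue):

```properties
# "local" runs driver and executors in a single JVM -- fast, but hides
# serialization and distribution bugs.
# "local-cluster[N,C,M]" launches N worker processes with C cores and M MB
# of memory each, which behaves much more like a real standalone cluster.
spark.master=local-cluster[2,2,1024]
```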
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)