[
https://issues.apache.org/jira/browse/HIVE-7436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Chengxiang Li updated HIVE-7436:
--------------------------------
Attachment: HIVE-7436-Spark.2.patch
Updated the patch with default spark.master and spark.app.name.
> Load Spark configuration into Hive driver
> -----------------------------------------
>
> Key: HIVE-7436
> URL: https://issues.apache.org/jira/browse/HIVE-7436
> Project: Hive
> Issue Type: Sub-task
> Components: Spark
> Reporter: Chengxiang Li
> Assignee: Chengxiang Li
> Attachments: HIVE-7436-Spark.1.patch, HIVE-7436-Spark.2.patch
>
>
> Load Spark configuration into the Hive driver. There are three ways to set up
> Spark configuration:
> # Properties in the Spark configuration file (spark-defaults.conf).
> # Java system properties.
> # System environment variables.
> Spark supports configuration through environment variables only for compatibility
> with previous scripts; we won't support it in Hive on Spark. Hive on Spark loads
> defaults from Java system properties, then loads properties from the configuration
> file, overriding any existing properties.
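> The described precedence (Java system properties as defaults, spark-defaults.conf
> overriding them) could be sketched roughly as follows. This is an illustrative
> sketch, not the actual patch code; the class and method names are hypothetical:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class SparkConfLoader {
    // Hypothetical sketch: collect spark.* Java system properties as defaults,
    // then let entries from spark-defaults.conf override them.
    public static Properties load(String confFile) throws IOException {
        Properties conf = new Properties();
        // 1. Defaults from Java system properties.
        for (String name : System.getProperties().stringPropertyNames()) {
            if (name.startsWith("spark.")) {
                conf.setProperty(name, System.getProperty(name));
            }
        }
        // 2. Properties from spark-defaults.conf override existing ones.
        // java.util.Properties accepts the whitespace-separated
        // "key value" format that spark-defaults.conf uses.
        try (FileInputStream in = new FileInputStream(confFile)) {
            Properties fileProps = new Properties();
            fileProps.load(in);
            for (String name : fileProps.stringPropertyNames()) {
                conf.setProperty(name, fileProps.getProperty(name));
            }
        }
        return conf;
    }
}
```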
> configuration steps:
> # Create spark-defaults.conf, and place it in the /etc/spark/conf
> configuration directory.
> Please refer to [http://spark.apache.org/docs/latest/configuration.html]
> for the configuration options available in spark-defaults.conf.
> # Set the $SPARK_CONF_DIR environment variable to the location
> of spark-defaults.conf:
> export SPARK_CONF_DIR=/etc/spark/conf
> # Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable:
> export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
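> A minimal spark-defaults.conf following the steps above might look like this
> (the property values are illustrative, not required settings):

```
spark.master            yarn-cluster
spark.app.name          Hive on Spark
spark.executor.memory   2g
```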
> NO PRECOMMIT TESTS. This is for spark-branch only.
--
This message was sent by Atlassian JIRA
(v6.2#6252)