[ 
https://issues.apache.org/jira/browse/HIVE-7436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chengxiang Li updated HIVE-7436:
--------------------------------

    Description: 
Load Spark configuration into the Hive driver. There are 3 ways to set up
Spark configuration:
#  Configure properties in the Spark configuration file (spark-defaults.conf).
#  Java system properties.
#  System environment variables.
Spark supports configuration through system environment variables only for
compatibility with older scripts; Hive on Spark will not support it. Hive on
Spark loads defaults from Java system properties, then loads properties from
the configuration file, overriding any existing values.
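
For illustration, a minimal Java sketch of this precedence. SparkConfLoader
and loadSparkConf() are hypothetical names for this example, not the actual
Hive on Spark implementation:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    public class SparkConfLoader {
      // Hypothetical helper: builds the effective Spark configuration with
      // the precedence described above.
      public static Properties loadSparkConf() throws IOException {
        Properties conf = new Properties();

        // Defaults first: copy spark.* entries from Java system properties.
        for (String name : System.getProperties().stringPropertyNames()) {
          if (name.startsWith("spark.")) {
            conf.setProperty(name, System.getProperty(name));
          }
        }

        // Then spark-defaults.conf: Properties.load() accepts the file's
        // whitespace-separated "key value" lines and overwrites existing
        // keys, so the file wins over the Java-property defaults.
        String confDir = System.getenv("SPARK_CONF_DIR");
        if (confDir != null) {
          try (InputStream in =
              new FileInputStream(confDir + "/spark-defaults.conf")) {
            conf.load(in);
          }
        }
        return conf;
      }
    }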

Configuration steps:
1. Create spark-defaults.conf and place it in the /etc/spark/conf
configuration directory (see the example file after these steps).
    Please refer to [http://spark.apache.org/docs/latest/configuration.html]
for the properties that can be set in spark-defaults.conf.
2. Create the $SPARK_CONF_DIR environment variable and set it to the location
of spark-defaults.conf.
    export SPARK_CONF_DIR=/etc/spark/conf
3. Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable.
    export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
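
For reference, a minimal example spark-defaults.conf; the property values
are illustrative, see the Spark configuration page linked in step 1 for the
full property list:

    spark.master            spark://master:7077
    spark.executor.memory   2g
    spark.serializer        org.apache.spark.serializer.KryoSerializer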

NO PRECOMMIT TESTS. This is for spark-branch only.

  was:
Load Spark configuration into the Hive driver:
# Load Spark configuration through the Spark configuration file.
# Load Spark configuration through Java properties and override.
# Ship Spark configuration and Hive configuration to the Spark cluster.

NO PRECOMMIT TESTS. This is for spark-branch only.


> Load Spark configuration into Hive driver
> -----------------------------------------
>
>                 Key: HIVE-7436
>                 URL: https://issues.apache.org/jira/browse/HIVE-7436
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>
> Load Spark configuration into the Hive driver. There are 3 ways to set up
> Spark configuration:
> #  Configure properties in the Spark configuration file (spark-defaults.conf).
> #  Java system properties.
> #  System environment variables.
> Spark supports configuration through system environment variables only for
> compatibility with older scripts; Hive on Spark will not support it. Hive on
> Spark loads defaults from Java system properties, then loads properties from
> the configuration file, overriding any existing values.
> Configuration steps:
> 1. Create spark-defaults.conf and place it in the /etc/spark/conf
> configuration directory.
>     Please refer to [http://spark.apache.org/docs/latest/configuration.html]
> for the properties that can be set in spark-defaults.conf.
> 2. Create the $SPARK_CONF_DIR environment variable and set it to the location
> of spark-defaults.conf.
>     export SPARK_CONF_DIR=/etc/spark/conf
> 3. Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable.
>     export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
> NO PRECOMMIT TESTS. This is for spark-branch only.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
