[ https://issues.apache.org/jira/browse/HIVE-7436?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14079131#comment-14079131 ]

Xuefu Zhang commented on HIVE-7436:
-----------------------------------

[~chengxiang li], I guess expecting spark-defaults.conf on the Hadoop 
classpath is fine for now, though we might need to go back, revisit, and 
brainstorm on this again later. Note that we don't have to follow exactly what 
Tez did in every aspect, but I agree it can serve as a good reference point, 
giving users a similar experience.


> Load Spark configuration into Hive driver
> -----------------------------------------
>
>                 Key: HIVE-7436
>                 URL: https://issues.apache.org/jira/browse/HIVE-7436
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>             Fix For: spark-branch
>
>         Attachments: HIVE-7436-Spark.1.patch, HIVE-7436-Spark.2.patch, 
> HIVE-7436-Spark.3.patch
>
>
> Load Spark configuration into the Hive driver. There are three ways to set 
> up Spark configuration:
> # Java properties.
> # Properties in the Spark configuration file (spark-defaults.conf).
> # The Hive configuration file (hive-site.xml).
> Configurations later in this list take higher priority and overwrite earlier 
> configuration with the same property name; for example, a spark.master value 
> set in hive-site.xml overrides one set in spark-defaults.conf.
> Please refer to [http://spark.apache.org/docs/latest/configuration.html] for 
> all configurable Spark properties. You can configure Spark in Hive in the 
> following ways:
> # Configure through the Spark configuration file.
> #* Create spark-defaults.conf and place it in the /etc/spark/conf 
> configuration directory. Configure properties in spark-defaults.conf in Java 
> properties format (see the sample spark-defaults.conf sketch after this 
> list).
> #* Set the SPARK_CONF_DIR environment variable to the directory containing 
> spark-defaults.conf:
>     export SPARK_CONF_DIR=/etc/spark/conf
> #* Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable:
>     export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
> # Configure through the Hive configuration file.
> #* Edit hive-site.xml in the Hive conf directory and configure the same 
> Spark properties in XML format (see the hive-site.xml snippet after this 
> list).
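> As a minimal sketch, spark-defaults.conf might look like the following. The 
> first two property names and values come from the defaults table below; the 
> spark.executor.memory line is only an illustrative extra (a standard Spark 
> property, not something this patch requires):
>     # Java properties format: property name and value separated by whitespace.
>     spark.master           local
>     spark.app.name         Hive on Spark
>     spark.executor.memory  1g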
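> The same properties can be expressed in hive-site.xml using the standard 
> Hadoop configuration XML format, for example:
>     <property>
>       <name>spark.master</name>
>       <value>local</value>
>     </property>
>     <property>
>       <name>spark.app.name</name>
>       <value>Hive on Spark</value>
>     </property>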
> Default Spark properties set by the Hive driver:
> ||name||default value||description||
> |spark.master|local|Spark master URL.|
> |spark.app.name|Hive on Spark|Default Spark application name.|
> NO PRECOMMIT TESTS. This is for spark-branch only.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
