Ryan Blue created SPARK-13403:
---------------------------------

             Summary: HiveConf used for SparkSQL is not based on the Hadoop 
configuration
                 Key: SPARK-13403
                 URL: https://issues.apache.org/jira/browse/SPARK-13403
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.6.0
            Reporter: Ryan Blue


The HiveConf instances used by HiveContext are not instantiated from the 
SparkContext's Hadoop configuration; they are built only from the config files 
found in the environment. Hadoop best practice is to instantiate a single 
Configuration from the environment and then pass that conf when instantiating 
further Configuration (or HiveConf) instances, so that programmatic 
modifications aren't lost.
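
A rough sketch of the difference (assuming a plain SparkContext {{sc}}; this is 
an illustration, not the actual HiveContext code path):

{code:scala}
import org.apache.hadoop.hive.conf.HiveConf

// What effectively happens today: the HiveConf is loaded only from the
// *-site.xml / hive-site.xml files visible in the environment.
val fromEnvironment = new HiveConf()

// Hadoop best practice: seed the new conf with the existing Hadoop
// configuration so settings applied programmatically to
// sc.hadoopConfiguration are carried over.
val fromSparkContext = new HiveConf(sc.hadoopConfiguration, classOf[HiveConf])
{code}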

Spark will set configuration variables that start with "spark.hadoop." from 
spark-defaults.conf when creating {{sc.hadoopConfiguration}}, which are not 
correctly passed to the HiveConf because of this.
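
For example (using a hypothetical property name to show the symptom):

{code:scala}
import org.apache.hadoop.hive.conf.HiveConf

// spark-defaults.conf contains:
//   spark.hadoop.my.custom.setting  value-from-spark-defaults

// The "spark.hadoop." prefix is stripped and the value lands on the
// SparkContext's Hadoop configuration:
sc.hadoopConfiguration.get("my.custom.setting")  // "value-from-spark-defaults"

// But a HiveConf created from the environment alone never sees it:
new HiveConf().get("my.custom.setting")          // null
{code}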



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
