Repository: spark
Updated Branches:
  refs/heads/branch-1.4 9d6475b93 -> 4940630f5


[SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf makes metadataHive get constructed too early

https://issues.apache.org/jira/browse/SPARK-8020

Author: Yin Huai <yh...@databricks.com>

Closes #6571 from yhuai/SPARK-8020-1 and squashes the following commits:

0398f5b [Yin Huai] First populate the SQLConf and then construct executionHive and metadataHive.

(cherry picked from commit 7b7f7b6c6fd903e2ecfc886d29eaa9df58adcfc3)
Signed-off-by: Yin Huai <yh...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4940630f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/4940630f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/4940630f

Branch: refs/heads/branch-1.4
Commit: 4940630f56d3e95a01526bf1fdfc88517b8e661b
Parents: 9d6475b
Author: Yin Huai <yh...@databricks.com>
Authored: Tue Jun 2 00:16:56 2015 -0700
Committer: Yin Huai <yh...@databricks.com>
Committed: Tue Jun 2 00:17:09 2015 -0700

----------------------------------------------------------------------
 .../scala/org/apache/spark/sql/SQLContext.scala | 25 +++++++++++++++++---
 1 file changed, 22 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/4940630f/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala 
b/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala
index 7384b24..91e6385 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala
@@ -182,9 +182,28 @@ class SQLContext(@transient val sparkContext: SparkContext)
     conf.dialect
   }
 
-  sparkContext.getConf.getAll.foreach {
-    case (key, value) if key.startsWith("spark.sql") => setConf(key, value)
-    case _ =>
+  {
+    // We extract spark sql settings from SparkContext's conf and put them to
+    // Spark SQL's conf.
+    // First, we populate the SQLConf (conf). So, we can make sure that other values using
+    // those settings in their construction can get the correct settings.
+    // For example, metadataHive in HiveContext may need both spark.sql.hive.metastore.version
+    // and spark.sql.hive.metastore.jars to get correctly constructed.
+    val properties = new Properties
+    sparkContext.getConf.getAll.foreach {
+      case (key, value) if key.startsWith("spark.sql") => properties.setProperty(key, value)
+      case _ =>
+    }
+    // We put those settings directly into conf to avoid calling setConf, which may have
+    // side-effects. For example, in HiveContext, setConf may cause executionHive and metadataHive
+    // to get constructed. If we called setConf directly, the constructed metadataHive might have
+    // wrong settings, or the construction might fail.
+    conf.setConf(properties)
+    // After we have populated SQLConf, we call setConf to populate other confs in the subclass
+    // (e.g. hiveconf in HiveContext).
+    properties.foreach {
+      case (key, value) => setConf(key, value)
+    }
   }
 
   @transient

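The initialization-order pattern in the patch can be sketched outside of Spark. The following is a minimal, self-contained Scala example with hypothetical stand-ins (`SQLConf`, `populate`, and the hook parameter are illustrative names, not Spark's real API): filter the `spark.sql.*` keys into a `Properties` bag, bulk-load them into the conf with no side effects, and only then run the per-key, potentially side-effecting `setConf` hook so that anything it constructs sees a fully populated conf.

```scala
import java.util.Properties

object ConfOrderingSketch {
  // Hypothetical stand-in (not Spark's real class) for a conf that can be
  // bulk-populated without triggering any side effects.
  class SQLConf {
    private val settings = scala.collection.mutable.Map[String, String]()
    def setConf(props: Properties): Unit =
      props.stringPropertyNames().forEach(k => settings(k) = props.getProperty(k))
    def get(key: String): Option[String] = settings.get(key)
  }

  // Mirrors the patch's ordering:
  //   1. collect only spark.sql.* entries into a Properties bag;
  //   2. bulk-populate the SQLConf (no side effects);
  //   3. only then invoke the side-effecting per-key hook, which can now
  //      read complete settings from the conf.
  def populate(sparkConf: Seq[(String, String)],
               sideEffectingSetConf: (String, String) => Unit): SQLConf = {
    val properties = new Properties
    sparkConf.foreach {
      case (key, value) if key.startsWith("spark.sql") =>
        properties.setProperty(key, value)
      case _ => // non-SQL settings are ignored
    }
    val conf = new SQLConf
    conf.setConf(properties) // step 2: populate first, no side effects
    properties.stringPropertyNames().forEach { k =>
      sideEffectingSetConf(k, properties.getProperty(k)) // step 3: hooks run last
    }
    conf
  }
}
```

Reversing steps 2 and 3 reproduces the bug described above: the hook would run while the conf is still empty, so objects built inside it (the analogue of `metadataHive`) would see missing settings.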
