Repository: spark
Updated Branches:
  refs/heads/master d4e7f20f5 -> 10b3ca3e9


[SPARK-21574][SQL] Point out user to set hive config before SparkSession is 
initialized

## What changes were proposed in this pull request?
Since Spark 2.0.0, `SET` commands for Hive configs do not pass the values to 
HiveClient. This PR warns users to set Hive configs before the SparkSession is 
initialized when they try to set a Hive config at runtime.
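As the new warning advises, a Hive config can instead be forwarded to the
HiveClient by prefixing it with `spark.hadoop` when the application is
launched. A sketch (the config key, value, class, and jar names below are
illustrative, not from this PR):

```shell
# Forward a Hive config to the Hive client at startup by giving it the
# spark.hadoop prefix (key/value are examples; MyApp/myapp.jar are placeholders):
spark-submit \
  --conf spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict \
  --class com.example.MyApp myapp.jar
```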

## How was this patch tested?
manual tests

<img width="1637" alt="spark-set" 
src="https://user-images.githubusercontent.com/5399861/29001141-03f943ee-7ab3-11e7-8584-ba5a5e81f6ad.png">

Author: Yuming Wang <wgy...@gmail.com>

Closes #18769 from wangyum/SPARK-21574.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/10b3ca3e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/10b3ca3e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/10b3ca3e

Branch: refs/heads/master
Commit: 10b3ca3e9382e2b407492a4ca008a887f706f763
Parents: d4e7f20
Author: Yuming Wang <wgy...@gmail.com>
Authored: Sun Aug 6 10:08:44 2017 -0700
Committer: gatorsmile <gatorsm...@gmail.com>
Committed: Sun Aug 6 10:08:44 2017 -0700

----------------------------------------------------------------------
 .../org/apache/spark/sql/execution/command/SetCommand.scala | 9 +++++++++
 1 file changed, 9 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/10b3ca3e/sql/core/src/main/scala/org/apache/spark/sql/execution/command/SetCommand.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/SetCommand.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/SetCommand.scala
index 5f12830..7477d02 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/SetCommand.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/SetCommand.scala
@@ -21,6 +21,7 @@ import org.apache.spark.internal.Logging
 import org.apache.spark.sql.{Row, SparkSession}
 import org.apache.spark.sql.catalyst.expressions.Attribute
 import org.apache.spark.sql.internal.SQLConf
+import org.apache.spark.sql.internal.StaticSQLConf.CATALOG_IMPLEMENTATION
 import org.apache.spark.sql.types.{StringType, StructField, StructType}
 
 
@@ -87,6 +88,14 @@ case class SetCommand(kv: Option[(String, Option[String])]) extends RunnableComm
     // Configures a single property.
     case Some((key, Some(value))) =>
       val runFunc = (sparkSession: SparkSession) => {
+        if (sparkSession.conf.get(CATALOG_IMPLEMENTATION.key).equals("hive") &&
+            key.startsWith("hive.")) {
+          logWarning(s"'SET $key=$value' might not work, since Spark doesn't support changing " +
+            "the Hive config dynamically. Please pass the Hive-specific config by adding the " +
+            s"prefix spark.hadoop (e.g., spark.hadoop.$key) when starting a Spark application. " +
+            "For details, see the link: https://spark.apache.org/docs/latest/configuration.html#" +
+            "dynamically-loading-spark-properties.")
+        }
         sparkSession.conf.set(key, value)
         Seq(Row(key, value))
       }
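The guard added above boils down to a simple predicate: warn only when the
catalog implementation is Hive and the key being set is a Hive config. A
standalone sketch in plain Scala (not the actual Spark code; the function name
is hypothetical):

```scala
// Mirrors the condition added in SetCommand.scala: the warning fires only
// for Hive-backed sessions and keys in the "hive." namespace.
def shouldWarnOnSet(catalogImplementation: String, key: String): Boolean =
  catalogImplementation == "hive" && key.startsWith("hive.")
```

Note the `SET` still goes through (`sparkSession.conf.set(key, value)` runs
unconditionally); the patch only adds a warning, it does not reject the command.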

