[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-29 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/23031


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-28 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r237364466
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1610,6 +1610,14 @@ object SQLConf {
         """ "... N more fields" placeholder.""")
     .intConf
     .createWithDefault(25)
+
+  val SET_COMMAND_REJECTS_SPARK_CONFS =
+    buildConf("spark.sql.execution.setCommandRejectsSparkConfs")
--- End diff --

shall we use the legacy prefix?
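A minimal sketch of what that rename could look like (the exact key name below is only an illustration, not something settled in this thread):

```scala
// Hypothetical: the same entry, moved under the spark.sql.legacy.* namespace.
val SET_COMMAND_REJECTS_SPARK_CONFS =
  buildConf("spark.sql.legacy.setCommandRejectsSparkConfs")
    .booleanConf
    .createWithDefault(true)
```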


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-28 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r237075228
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala ---
@@ -2715,4 +2715,11 @@ abstract class DDLSuite extends QueryTest with SQLTestUtils {
       }
     }
   }
+
+  test("set command rejects SparkConf entries") {
+    val ex = intercept[AnalysisException] {
+      sql("SET spark.task.cpus = 4")
--- End diff --

ditto
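In other words, the same suggestion as for `RuntimeConfigSuite`: reference the constant instead of hardcoding the key. A sketch, assuming `org.apache.spark.internal.config` is accessible from this suite:

```scala
sql(s"SET ${config.CPUS_PER_TASK.key} = 4")
```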


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-28 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r237075170
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/RuntimeConfigSuite.scala ---
@@ -68,4 +68,13 @@ class RuntimeConfigSuite extends SparkFunSuite {
 assert(!conf.isModifiable(""))
 assert(!conf.isModifiable("invalid config parameter"))
   }
+
+  test("reject SparkConf entries") {
+val conf = newConf()
+
+val ex = intercept[AnalysisException] {
+  conf.set("spark.task.cpus", 4)
--- End diff --

can we use `config.CPUS_PER_TASK` instead of hardcoding it?
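For reference, a sketch of the suggested change (assuming the suite imports `org.apache.spark.internal.config`; the assertion is illustrative):

```scala
import org.apache.spark.internal.config

test("reject SparkConf entries") {
  val conf = newConf()

  val ex = intercept[AnalysisException] {
    // Use the registered entry's key instead of the literal "spark.task.cpus".
    conf.set(config.CPUS_PER_TASK.key, 4)
  }
  assert(ex.getMessage.contains(config.CPUS_PER_TASK.key))
}
```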


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-28 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r237074727
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala ---
@@ -154,5 +154,9 @@ class RuntimeConfig private[sql](sqlConf: SQLConf = new SQLConf) {
     if (SQLConf.staticConfKeys.contains(key)) {
       throw new AnalysisException(s"Cannot modify the value of a static config: $key")
     }
+    if (sqlConf.setCommandRejectsSparkConfs &&
+        ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)) {
--- End diff --

we should only reject configs that are registered as SparkConf entries. For configs that are registered as both a SparkConf and a SQLConf, we shouldn't reject them.
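
Concretely (example keys only): `spark.task.cpus` is registered only in Spark core, so it should be rejected, while `spark.sql.shuffle.partitions` is registered in `SQLConf` and must stay settable, which appears to be what the predicate in the diff already does:

```scala
// Sketch of the rule discussed here, using the same checks as the diff above:
//   "spark.task.cpus"              -> core-only entry        -> reject
//   "spark.sql.shuffle.partitions" -> registered in SQLConf  -> allow
def isSparkCoreOnly(key: String): Boolean =
  ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)
```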


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-28 Thread ueshin
Github user ueshin commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r237071928
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala ---
@@ -154,5 +154,9 @@ class RuntimeConfig private[sql](sqlConf: SQLConf = new SQLConf) {
     if (SQLConf.staticConfKeys.contains(key)) {
       throw new AnalysisException(s"Cannot modify the value of a static config: $key")
     }
+    if (sqlConf.setCommandRejectsSparkConfs &&
+        ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)) {
--- End diff --

I'm sorry for the delay.
As per the comment 
https://github.com/apache/spark/pull/22887#issuecomment-442425557, I'd leave it 
as is for now.


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-16 Thread vanzin
Github user vanzin commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r234314168
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala ---
@@ -154,5 +154,9 @@ class RuntimeConfig private[sql](sqlConf: SQLConf = new SQLConf) {
     if (SQLConf.staticConfKeys.contains(key)) {
       throw new AnalysisException(s"Cannot modify the value of a static config: $key")
     }
+    if (sqlConf.setCommandRejectsSparkConfs &&
+        ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)) {
--- End diff --

Actually, thinking of it, isn't `ConfigEntry.findEntry(key) != null` redundant with the other check?

Removing it would also cover other settings that don't have constants defined.
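
A sketch of the simplification being suggested (not the patch as posted):

```scala
// Dropping the findEntry check: any key that is not a registered SQL config is
// rejected, including keys that have no ConfigEntry constant defined at all.
if (sqlConf.setCommandRejectsSparkConfs && !SQLConf.sqlConfEntries.containsKey(key)) {
  throw new AnalysisException(s"Cannot modify the value of a Spark config: $key")
}
```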


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-16 Thread vanzin
Github user vanzin commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r234314374
  
--- Diff: docs/sql-migration-guide-upgrade.md ---
@@ -17,6 +17,8 @@ displayTitle: Spark SQL Upgrading Guide
 
   - The `ADD JAR` command previously returned a result set with the single value 0. It now returns an empty result set.
 
+  - In Spark version 2.4 and earlier, the `SET` command works without any warnings even if the specified key is for `SparkConf` entries and it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a non-SQL or static SQL config key is used. You can disable such a check by setting `spark.sql.execution.setCommandRejectsSparkConfs` to `false`.
--- End diff --

Now that I've looked at that code again, the static config part already threw an error before, so it can probably be removed here.


---




[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-15 Thread vanzin
Github user vanzin commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r233987571
  
--- Diff: docs/sql-migration-guide-upgrade.md ---
@@ -17,6 +17,8 @@ displayTitle: Spark SQL Upgrading Guide
 
   - The `ADD JAR` command previously returned a result set with the single value 0. It now returns an empty result set.
 
+  - In Spark version 2.4 and earlier, the `SET` command works without any warnings even if the specified key is for `SparkConf` entries and it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if the key is for `SparkConf` as same as for static sql config keys. You can disable such a check by setting `spark.sql.execution.setCommandRejectsSparkConfs` to `false`.
--- End diff --

...fails if a non-SQL or static SQL config key is used.


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #23031: [SPARK-26060][SQL] Track SparkConf entries and ma...

2018-11-15 Thread vanzin
Github user vanzin commented on a diff in the pull request:

https://github.com/apache/spark/pull/23031#discussion_r233986989
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala ---
@@ -154,5 +154,9 @@ class RuntimeConfig private[sql](sqlConf: SQLConf = new SQLConf) {
     if (SQLConf.staticConfKeys.contains(key)) {
       throw new AnalysisException(s"Cannot modify the value of a static config: $key")
     }
+    if (sqlConf.setCommandRejectsSparkConfs &&
+        ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)) {
+      throw new AnalysisException(s"Cannot modify the value of a spark config: $key")
--- End diff --

Spark


---
