[spark] branch master updated: [SPARK-44404][SQL] Assign names to the error class _LEGACY_ERROR_TEMP_[1009,1010,1013,1015,1016,1278]

2023-08-12 Thread maxgekk
This is an automated email from the ASF dual-hosted git repository.

maxgekk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 295c615b16b [SPARK-44404][SQL] Assign names to the error class _LEGACY_ERROR_TEMP_[1009,1010,1013,1015,1016,1278]
295c615b16b is described below

commit 295c615b16b8a77f242ffa99006b4fb95f8f3487
Author: panbingkun 
AuthorDate: Sat Aug 12 12:22:28 2023 +0500

[SPARK-44404][SQL] Assign names to the error class _LEGACY_ERROR_TEMP_[1009,1010,1013,1015,1016,1278]

### What changes were proposed in this pull request?
This PR aims to assign names to the following error classes (a usage sketch follows the list):
- _LEGACY_ERROR_TEMP_1009 => VIEW_EXCEED_MAX_NESTED_DEPTH
- _LEGACY_ERROR_TEMP_1010 => UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION
- _LEGACY_ERROR_TEMP_1013 => UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION / UNSUPPORTED_TEMP_VIEW_OPERATION.WITH_SUGGESTION
- _LEGACY_ERROR_TEMP_1014 => UNSUPPORTED_TEMP_VIEW_OPERATION.WITHOUT_SUGGESTION
- _LEGACY_ERROR_TEMP_1015 => UNSUPPORTED_TABLE_OPERATION.WITH_SUGGESTION
- _LEGACY_ERROR_TEMP_1016 => UNSUPPORTED_TEMP_VIEW_OPERATION.WITHOUT_SUGGESTION
- _LEGACY_ERROR_TEMP_1278 => UNSUPPORTED_TABLE_OPERATION.WITHOUT_SUGGESTION
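
For context, "assigning a name" means callers raise the error through an error class plus message parameters instead of a hard-coded message string. A minimal Scala sketch, assuming a helper name and parameter key modeled on Spark's error framework (illustrative, not copied from this commit):

    // Hedged sketch: the helper name and parameter key are assumptions.
    import org.apache.spark.sql.AnalysisException

    def unsupportedViewOperationError(viewName: String): AnalysisException =
      new AnalysisException(
        errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
        messageParameters = Map("viewName" -> s"`$viewName`"))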

### Why are the changes needed?
The changes improve the error framework: users get stable, documented error classes instead of temporary legacy identifiers.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
- Pass GA.
- Manually tested.
- Updated UTs (see the test-side sketch below).
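
The test-side sketch referenced above, assuming Spark's `checkError` helper from SparkFunSuite; the error class and parameter values here are illustrative, not copied from this commit:

    // Hedged sketch inside a SparkFunSuite test: it matches the error class
    // and its message parameters rather than a rendered message string.
    val e = intercept[AnalysisException] {
      sql("ALTER TABLE tempView RECOVER PARTITIONS")
    }
    checkError(
      exception = e,
      errorClass = "UNSUPPORTED_TEMP_VIEW_OPERATION.WITH_SUGGESTION",
      parameters = Map(
        "tempViewName" -> "`tempView`",       // assumed parameter key
        "alternative" -> "MSCK REPAIR TABLE")) // assumed parameter key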

Closes #42109 from panbingkun/SPARK-44404.

Lead-authored-by: panbingkun 
Co-authored-by: panbingkun <84731...@qq.com>
Signed-off-by: Max Gekk 
---
 R/pkg/tests/fulltests/test_sparkSQL.R  |   3 +-
 .../src/main/resources/error/error-classes.json|  91 ---
 ...ions-unsupported-table-operation-error-class.md |  36 +++
 ...-unsupported-temp-view-operation-error-class.md |  36 +++
 ...tions-unsupported-view-operation-error-class.md |  36 +++
 docs/sql-error-conditions.md   |  30 +++
 .../spark/sql/catalyst/analysis/Analyzer.scala |   9 +-
 .../sql/catalyst/analysis/v2ResolutionPlans.scala  |   4 +-
 .../spark/sql/catalyst/parser/AstBuilder.scala |  32 ++-
 .../spark/sql/errors/QueryCompilationErrors.scala  |  90 ---
 .../spark/sql/catalyst/parser/DDLParserSuite.scala | 104 
 .../apache/spark/sql/execution/command/views.scala |   2 +-
 .../apache/spark/sql/internal/CatalogImpl.scala|   2 +-
 .../analyzer-results/change-column.sql.out |  16 +-
 .../sql-tests/results/change-column.sql.out|  16 +-
 .../spark/sql/connector/DataSourceV2SQLSuite.scala |   7 +-
 .../apache/spark/sql/execution/SQLViewSuite.scala  | 267 ++---
 .../spark/sql/execution/SQLViewTestSuite.scala |  23 +-
 .../AlterTableAddPartitionParserSuite.scala|   4 +-
 .../AlterTableDropPartitionParserSuite.scala   |   8 +-
 .../AlterTableRecoverPartitionsParserSuite.scala   |   8 +-
 .../AlterTableRenamePartitionParserSuite.scala |   4 +-
 .../command/AlterTableSetLocationParserSuite.scala |   6 +-
 .../command/AlterTableSetSerdeParserSuite.scala|  16 +-
 .../spark/sql/execution/command/DDLSuite.scala |  36 ++-
 .../command/MsckRepairTableParserSuite.scala   |  13 +-
 .../command/ShowPartitionsParserSuite.scala|  10 +-
 .../command/TruncateTableParserSuite.scala |   6 +-
 .../execution/command/TruncateTableSuiteBase.scala |  45 +++-
 .../execution/command/v1/ShowPartitionsSuite.scala |  57 -
 .../apache/spark/sql/internal/CatalogSuite.scala   |  13 +-
 .../spark/sql/hive/execution/HiveDDLSuite.scala|  94 +++-
 32 files changed, 717 insertions(+), 407 deletions(-)

diff --git a/R/pkg/tests/fulltests/test_sparkSQL.R b/R/pkg/tests/fulltests/test_sparkSQL.R
index d61501d248a..47688d7560c 100644
--- a/R/pkg/tests/fulltests/test_sparkSQL.R
+++ b/R/pkg/tests/fulltests/test_sparkSQL.R
@@ -4193,8 +4193,7 @@ test_that("catalog APIs, listTables, getTable, listColumns, listFunctions, funct
 
   # recoverPartitions does not work with temporary view
   expect_error(recoverPartitions("cars"),
-   paste("Error in recoverPartitions : analysis error - cars is a 
temp view.",
- "'recoverPartitions()' expects a table"), fixed = TRUE)
+   "[UNSUPPORTED_TEMP_VIEW_OPERATION.WITH_SUGGESTION]*`cars`*")
   expect_error(refreshTable("cars"), NA)
   expect_error(refreshByPath("/"), NA)
 
diff --git a/common/utils/src/main/resources/error/error-classes.json b/common/utils/src/main/resources/error/error-classes.json
index 133c2dd826c..08f79bcecbb 100644
--- a/common/utils/src/main/resources/error/error-classes.json
+++ b/common/utils/src/main/resources/error/error-classes.json
@@ -3394,12 +3394,63 @@
 },
 "sqlState" : "0A000"
   },
+  "UNSUPPORTED_TABLE_OPERATION" : {
+"message" : [
+  "The 

[spark] branch master updated: [SPARK-44242][CORE][FOLLOWUP] Use the `assertThrows` method to fix Java linter issue

2023-08-12 Thread yumwang
This is an automated email from the ASF dual-hosted git repository.

yumwang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 2f9c8ac25ba [SPARK-44242][CORE][FOLLOWUP] Use the `assertThrows` method to fix Java linter issue
2f9c8ac25ba is described below

commit 2f9c8ac25ba634affe366ce55eb3f9e969e71ae3
Author: Yuming Wang 
AuthorDate: Sat Aug 12 17:37:32 2023 +0800

[SPARK-44242][CORE][FOLLOWUP] Use the `assertThrows` method to fix Java linter issue

### What changes were proposed in this pull request?

Use the `assertThrows` method to test for exceptions.
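
A minimal Scala sketch of the same JUnit 4.13 API (the actual change below is in Java): `assertThrows` pins the expected exception to a single statement, so an unrelated exception thrown earlier in the test fails it instead of passing it. The `buildCommand` stand-in here is hypothetical:

    import org.junit.Assert.assertThrows
    import org.junit.Test

    class CommandBuilderSketchSuite {
      // Hypothetical stand-in for the real method under test.
      private def buildCommand(args: Seq[String]): Unit =
        if (args.exists(_.contains("-Xmx"))) {
          throw new IllegalArgumentException("Set memory via spark.driver.memory")
        }

      @Test
      def testBadJavaOptionsThrow(): Unit = {
        // Only this call may satisfy the expectation.
        assertThrows(classOf[IllegalArgumentException],
          () => buildCommand(Seq("spark.driver.extraJavaOptions=-Xmx64g")))
      }
    }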

### Why are the changes needed?

Fix Java linter issue.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

N/A.

Closes #42466 from wangyum/SPARK-44242.

Authored-by: Yuming Wang 
Signed-off-by: Yuming Wang 
---
 .../org/apache/spark/launcher/SparkSubmitCommandBuilderSuite.java | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/launcher/src/test/java/org/apache/spark/launcher/SparkSubmitCommandBuilderSuite.java b/launcher/src/test/java/org/apache/spark/launcher/SparkSubmitCommandBuilderSuite.java
index 7a623bb76f3..e07095167da 100644
--- a/launcher/src/test/java/org/apache/spark/launcher/SparkSubmitCommandBuilderSuite.java
+++ b/launcher/src/test/java/org/apache/spark/launcher/SparkSubmitCommandBuilderSuite.java
@@ -72,7 +72,7 @@ public class SparkSubmitCommandBuilderSuite extends BaseSuite {
   cmd.contains("org.apache.spark.deploy.SparkSubmit"));
   }
 
-  @Test(expected = IllegalArgumentException.class)
+  @Test
   public void testCheckJavaOptionsThrowException() throws Exception {
 Map<String, String> env = new HashMap<>();
 List<String> sparkSubmitArgs = Arrays.asList(
@@ -84,7 +84,7 @@ public class SparkSubmitCommandBuilderSuite extends BaseSuite {
   "-Xmx64g -Dprop=Other -Dprop1=\"-Xmx -Xmx\" -Dprop2=\"-Xmx '-Xmx\" " +
 "-Dprop3='-Xmx -Xmx' -Dprop4='-Xmx \"-Xmx'",
   SparkLauncher.NO_RESOURCE);
-    buildCommand(sparkSubmitArgs, env);
+    assertThrows(IllegalArgumentException.class, () -> buildCommand(sparkSubmitArgs, env));
   }
 
   @Test





[spark] branch master updated: [MINOR][SQL] Rename shouldBroadcast to isDynamicPruning in InSubqueryExec

2023-08-12 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 7070b3672d8 [MINOR][SQL] Rename shouldBroadcast to isDynamicPruning in InSubqueryExec
7070b3672d8 is described below

commit 7070b3672d8426834ff936fff4543b10093042fc
Author: Ziqi Liu 
AuthorDate: Sat Aug 12 21:40:22 2023 +0800

[MINOR][SQL] Rename shouldBroadcast to isDynamicPruning in InSubqueryExec

### What changes were proposed in this pull request?

Rename `shouldBroadcast` param to `isDynamicPruning`.

### Why are the changes needed?

Explicitly indicate whether the subquery is used for dynamic partition pruning (DPP), rather than naming one consequence of that mode (broadcasting the result).
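
A toy Scala sketch of the relationship the rename encodes (a stand-in class, not the real InSubqueryExec): per the diff below, `isDynamicPruning` is the exact negation of the old `shouldBroadcast`, so call sites now name the mode rather than its mechanism:

    // Toy stand-in illustrating only the renamed flag.
    final case class InSubqueryStub(exprId: Long, isDynamicPruning: Boolean = true) {
      def updateResult(result: Array[Any]): Unit = {
        // Regular IN-subqueries broadcast the materialized values so executors
        // can probe them; DPP consumes the values during planning instead.
        if (!isDynamicPruning) {
          println(s"subquery#$exprId: broadcast ${result.length} values")
        }
      }
    }

    // Old call site: InSubqueryStub(exprId = 7, shouldBroadcast = true)
    // New call site states the mode explicitly:
    val regularIn = InSubqueryStub(exprId = 7, isDynamicPruning = false)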

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?

Closes #42286 from liuzqt/insubqueryexec-rename.

Authored-by: Ziqi Liu 
Signed-off-by: Wenchen Fan 
---
 .../spark/sql/execution/adaptive/PlanAdaptiveSubqueries.scala   | 2 +-
 .../src/main/scala/org/apache/spark/sql/execution/subquery.scala| 6 +++---
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/PlanAdaptiveSubqueries.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/PlanAdaptiveSubqueries.scala
index 5b4a7e50db7..7816fbd52c0 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/PlanAdaptiveSubqueries.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/PlanAdaptiveSubqueries.scala
@@ -45,7 +45,7 @@ case class PlanAdaptiveSubqueries(
   )
 }
 val subquery = SubqueryExec(s"subquery#${exprId.id}", subqueryMap(exprId.id))
-InSubqueryExec(expr, subquery, exprId, shouldBroadcast = true)
+InSubqueryExec(expr, subquery, exprId, isDynamicPruning = false)
   case expressions.DynamicPruningSubquery(value, buildPlan,
   buildKeys, broadcastKeyIndex, onlyInBroadcast, exprId, _) =>
 val name = s"dynamicpruning#${exprId.id}"
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/subquery.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/subquery.scala
index 2a28f6848aa..41230c7792c 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/subquery.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/subquery.scala
@@ -117,7 +117,7 @@ case class InSubqueryExec(
 child: Expression,
 plan: BaseSubqueryExec,
 exprId: ExprId,
-shouldBroadcast: Boolean = false,
+isDynamicPruning: Boolean = true,
 private var resultBroadcast: Broadcast[Array[Any]] = null,
 @transient private var result: Array[Any] = null)
   extends ExecSubqueryExpression with UnaryLike[Expression] with Predicate {
@@ -136,7 +136,7 @@ case class InSubqueryExec(
 } else {
   rows.map(_.get(0, child.dataType))
 }
-if (shouldBroadcast) {
+if (!isDynamicPruning) {
   resultBroadcast = plan.session.sparkContext.broadcast(result)
 }
   }
@@ -198,7 +198,7 @@ case class PlanSubqueries(sparkSession: SparkSession) extends Rule[SparkPlan] {
 }
 val executedPlan = QueryExecution.prepareExecutedPlan(sparkSession, query)
 InSubqueryExec(expr, SubqueryExec(s"subquery#${exprId.id}", executedPlan),
-  exprId, shouldBroadcast = true)
+  exprId, isDynamicPruning = false)
 }
   }
 }

