[spark] branch branch-3.3 updated: [SPARK-39087][SQL][3.3] Improve messages of error classes

2022-05-03 Thread maxgekk
This is an automated email from the ASF dual-hosted git repository.

maxgekk pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.3 by this push:
 new d3aadb40370 [SPARK-39087][SQL][3.3] Improve messages of error classes
d3aadb40370 is described below

commit d3aadb40370c0613c2d2ce41d8b905f0fafcd69c
Author: Max Gekk 
AuthorDate: Wed May 4 08:45:03 2022 +0300

[SPARK-39087][SQL][3.3] Improve messages of error classes

### What changes were proposed in this pull request?
In the PR, I propose to modify error messages of the following error 
classes:
- INVALID_JSON_SCHEMA_MAP_TYPE
- INCOMPARABLE_PIVOT_COLUMN
- INVALID_ARRAY_INDEX_IN_ELEMENT_AT
- INVALID_ARRAY_INDEX
- DIVIDE_BY_ZERO

This is a backport of https://github.com/apache/spark/pull/36428.

### Why are the changes needed?
To improve readability of error messages.

### Does this PR introduce _any_ user-facing change?
Yes. It changes user-facing error messages.

### How was this patch tested?
By running the modified test suites:
```
$ build/sbt "sql/testOnly *QueryCompilationErrorsSuite*"
$ build/sbt "sql/testOnly *QueryExecutionErrorsSuite*"
$ build/sbt "sql/testOnly *QueryExecutionAnsiErrorsSuite"
$ build/sbt "test:testOnly *SparkThrowableSuite"
```

Authored-by: Max Gekk 
Signed-off-by: Max Gekk 
(cherry picked from commit 040526391a45ad610422a48c05aa69ba5133f922)
Signed-off-by: Max Gekk 

Closes #36439 from MaxGekk/error-class-improve-msg-3.3.

Authored-by: Max Gekk 
Signed-off-by: Max Gekk 
---
 core/src/main/resources/error/error-classes.json   | 12 -
 .../org/apache/spark/SparkThrowableSuite.scala |  2 +-
 .../spark/sql/errors/QueryCompilationErrors.scala  |  6 ++---
 .../expressions/ArithmeticExpressionSuite.scala| 30 +++---
 .../expressions/CollectionExpressionsSuite.scala   |  4 +--
 .../catalyst/expressions/ComplexTypeSuite.scala|  4 +--
 .../expressions/IntervalExpressionsSuite.scala | 10 
 .../expressions/StringExpressionsSuite.scala   |  6 ++---
 .../sql/catalyst/util/IntervalUtilsSuite.scala |  2 +-
 .../resources/sql-tests/results/ansi/array.sql.out | 24 -
 .../sql-tests/results/ansi/interval.sql.out|  4 +--
 .../resources/sql-tests/results/interval.sql.out   |  4 +--
 .../test/resources/sql-tests/results/pivot.sql.out |  4 +--
 .../sql-tests/results/postgreSQL/case.sql.out  |  6 ++---
 .../sql-tests/results/postgreSQL/int8.sql.out  |  6 ++---
 .../results/postgreSQL/select_having.sql.out   |  2 +-
 .../results/udf/postgreSQL/udf-case.sql.out|  6 ++---
 .../udf/postgreSQL/udf-select_having.sql.out   |  2 +-
 .../sql-tests/results/udf/udf-pivot.sql.out|  4 +--
 .../apache/spark/sql/ColumnExpressionSuite.scala   | 12 -
 .../org/apache/spark/sql/DataFrameSuite.scala  |  2 +-
 .../apache/spark/sql/execution/SQLViewSuite.scala  |  4 +--
 .../sql/streaming/FileStreamSourceSuite.scala  |  2 +-
 23 files changed, 79 insertions(+), 79 deletions(-)

diff --git a/core/src/main/resources/error/error-classes.json 
b/core/src/main/resources/error/error-classes.json
index 463a5eae534..78934667ac0 100644
--- a/core/src/main/resources/error/error-classes.json
+++ b/core/src/main/resources/error/error-classes.json
@@ -37,7 +37,7 @@
 "sqlState" : "22008"
   },
   "DIVIDE_BY_ZERO" : {
-"message" : [ "divide by zero. To return NULL instead, use 'try_divide'. 
If necessary set  to false (except for ANSI interval type) to bypass 
this error." ],
+"message" : [ "Division by zero. To return NULL instead, use `try_divide`. 
If necessary set  to false (except for ANSI interval type) to bypass 
this error." ],
 "sqlState" : "22012"
   },
   "DUPLICATE_KEY" : {
@@ -72,7 +72,7 @@
 "message" : [ "Grouping sets size cannot be greater than " ]
   },
   "INCOMPARABLE_PIVOT_COLUMN" : {
-"message" : [ "Invalid pivot column ''. Pivot columns must be 
comparable." ],
+"message" : [ "Invalid pivot column . Pivot columns must be 
comparable." ],
 "sqlState" : "42000"
   },
   "INCOMPATIBLE_DATASOURCE_REGISTER" : {
@@ -89,10 +89,10 @@
 "message" : [ "" ]
   },
   "INVALID_ARRAY_INDEX" : {
-"message" : [ "Invalid index: , numElements: . If 
necessary set  to false to bypass this error." ]
+"message" : [ "The index  is out of bounds. The array has 
 elements. If necessary set  to false to bypass this error." 
]
   },
   "INVALID_ARRAY_INDEX_IN_ELEMENT_AT" : {
-"message" : [ "Invalid index: , numElements: . To 
return NULL instead, use 'try_element_at'. If necessary set  to false 
to bypass this error." ]
+"message" : [ "The index  is out of bounds. The array has 
 elements. To return NULL instead, use `try_element_at`. If 
necessary set  to false to bypass 

[spark] branch master updated: [SPARK-39029][PYTHON][TESTS] Improve the test coverage for pyspark/broadcast.py

2022-05-03 Thread gurwls223

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 0d3093cc983 [SPARK-39029][PYTHON][TESTS] Improve the test coverage for 
pyspark/broadcast.py
0d3093cc983 is described below

commit 0d3093cc983f8fe236cc99e546e0a154f696b3f9
Author: pralabhkumar 
AuthorDate: Wed May 4 09:14:43 2022 +0900

[SPARK-39029][PYTHON][TESTS] Improve the test coverage for 
pyspark/broadcast.py

### What changes were proposed in this pull request?
This PR adds test cases for broadcast.py.

### Why are the changes needed?
To cover corner test cases and increase coverage

### Does this PR introduce _any_ user-facing change?
No - test only

### How was this patch tested?
CI in this PR should test it out

Closes #36432 from pralabhkumar/rk_test_broadcast.

Lead-authored-by: pralabhkumar 
Co-authored-by: Kumar, Pralabh 
Signed-off-by: Hyukjin Kwon 
---
 python/pyspark/tests/test_broadcast.py | 29 -
 1 file changed, 28 insertions(+), 1 deletion(-)

diff --git a/python/pyspark/tests/test_broadcast.py 
b/python/pyspark/tests/test_broadcast.py
index 56763e8d80a..8185e812e66 100644
--- a/python/pyspark/tests/test_broadcast.py
+++ b/python/pyspark/tests/test_broadcast.py
@@ -15,12 +15,15 @@
 # limitations under the License.
 #
 import os
+import pickle
 import random
 import time
 import tempfile
 import unittest
 
-from pyspark import SparkConf, SparkContext
+from py4j.protocol import Py4JJavaError
+
+from pyspark import SparkConf, SparkContext, Broadcast
 from pyspark.java_gateway import launch_gateway
 from pyspark.serializers import ChunkedStream
 
@@ -99,6 +102,30 @@ class BroadcastTest(unittest.TestCase):
         finally:
             b.destroy()
 
+    def test_broadcast_when_sc_none(self):
+        # SPARK-39029: Test the case when SparkContext is None and the
+        # Broadcast value is accessed on the executor side.
+        conf = SparkConf()
+        conf.setMaster("local-cluster[2,1,1024]")
+        self.sc = SparkContext(conf=conf)
+        bs = self.sc.broadcast([10])
+        bs_sc_none = Broadcast(sc=None, path=bs._path)
+        self.assertEqual(bs_sc_none.value, [10])
+
+    def test_broadcast_for_error_condition(self):
+        # SPARK-39029: Test the cases where broadcast should raise an error.
+        conf = SparkConf()
+        conf.setMaster("local-cluster[2,1,1024]")
+        self.sc = SparkContext(conf=conf)
+        bs = self.sc.broadcast([1])
+        with self.assertRaisesRegex(pickle.PickleError, "Could.*not.*serialize.*broadcast"):
+            self.sc.broadcast(self.sc)
+        with self.assertRaisesRegex(Py4JJavaError, "RuntimeError.*Broadcast.*destroyed.*driver"):
+            self.sc.parallelize([1]).map(lambda x: bs.destroy()).collect()
+        with self.assertRaisesRegex(Py4JJavaError, "RuntimeError.*Broadcast.*unpersisted.*driver"):
+            self.sc.parallelize([1]).map(lambda x: bs.unpersist()).collect()
+
 
 class BroadcastFrameProtocolTest(unittest.TestCase):
 @classmethod

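The `sc=None` case in the test above works because a PySpark `Broadcast` constructed with only a `path` lazily unpickles its value from that file. A rough stand-in class showing that behavior (hypothetical; not PySpark's real `Broadcast` implementation):

```python
import os
import pickle
import tempfile

class FileBackedBroadcast:
    """Toy model of an executor-side broadcast: value loaded lazily from a file."""

    def __init__(self, sc=None, path=None):
        # On the executor, sc is None and only the file path is available.
        self._path = path
        self._value = None
        self._loaded = False

    @property
    def value(self):
        if not self._loaded:
            with open(self._path, "rb") as f:
                self._value = pickle.load(f)
            self._loaded = True
        return self._value

# "Driver" side: pickle the value to a file and ship only the path.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    pickle.dump([10], f)

# "Executor" side: no SparkContext, value comes from the file.
bs = FileBackedBroadcast(sc=None, path=path)
print(bs.value)  # → [10]
os.remove(path)
```

This is why `Broadcast(sc=None, path=bs._path)` in the new test still yields `[10]` without a driver-side context.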

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-3.3 updated (4177626e634 -> 0515536e6d1)

2022-05-03 Thread maxgekk

maxgekk pushed a change to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


from 4177626e634 [SPARK-35320][SQL][FOLLOWUP] Remove duplicated test
 add 482b7d54b52 Preparing Spark release v3.3.0-rc1
 new 0515536e6d1 Preparing development version 3.3.1-SNAPSHOT

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 6 +++---
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 38 files changed, 40 insertions(+), 40 deletions(-)





[spark] 01/01: Preparing development version 3.3.1-SNAPSHOT

2022-05-03 Thread maxgekk

maxgekk pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git

commit 0515536e6d1b4819eeab59cecb9a045b1a0d3325
Author: Maxim Gekk 
AuthorDate: Tue May 3 18:15:51 2022 +

Preparing development version 3.3.1-SNAPSHOT
---
 R/pkg/DESCRIPTION  | 2 +-
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 6 +++---
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 38 files changed, 40 insertions(+), 40 deletions(-)
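All 38 files change in lockstep because the release tooling rewrites the parent `<version>` element in every pom. Spark's real scripts live under `dev/create-release`; the helper below is an illustrative sketch of that kind of edit, not those scripts:

```python
import re

# Sample parent block as it appears in the pom.xml diffs below (illustrative).
POM_SNIPPET = """\
<parent>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-parent_2.12</artifactId>
  <version>3.3.0</version>
</parent>
"""

def bump_version(pom_text: str, old: str, new: str) -> str:
    # Only touch the exact <version>old</version> element, nothing else.
    return re.sub(
        rf"<version>{re.escape(old)}</version>",
        f"<version>{new}</version>",
        pom_text,
    )

print(bump_version(POM_SNIPPET, "3.3.0", "3.3.1-SNAPSHOT"))
```

Anchoring the substitution on the full `<version>…</version>` element avoids accidentally rewriting dependency versions that happen to share the same number.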

diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 9479bb3bf87..0e449e841cf 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 3.3.0
+Version: 3.3.1
 Title: R Front End for 'Apache Spark'
 Description: Provides an R Front end for 'Apache Spark' <https://spark.apache.org/>.
 Authors@R:
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 2e9c4d9960b..d12f2ad73fa 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0</version>
+    <version>3.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 2a9acfa335e..842d63f5d38 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0</version>
+    <version>3.3.1-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 7b17e625d75..f7d187bf952 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0</version>
+    <version>3.3.1-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index c5c920e7747..53f38df8851 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0</version>
+    <version>3.3.1-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 697b5a3928e..845f6659407 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0</version>
+    <version>3.3.1-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/sketch/pom.xml b/common/sketch/pom.xml
index ad2db11370a..8e159089193 100644
--- a/common/sketch/pom.xml
+++ b/common/sketch/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0</version>
+    <version>3.3.1-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/tags/pom.xml b/common/tags/pom.xml
index 1a7bdee70f3..1987c133285 100644
--- a/common/tags/pom.xml
+++ 

[spark] 01/01: Preparing Spark release v3.3.0-rc1

2022-05-03 Thread maxgekk

maxgekk pushed a commit to tag v3.3.0-rc1
in repository https://gitbox.apache.org/repos/asf/spark.git

commit 482b7d54b522c4d1e25f3e84eabbc78126f22a3d
Author: Maxim Gekk 
AuthorDate: Tue May 3 18:15:45 2022 +

Preparing Spark release v3.3.0-rc1
---
 assembly/pom.xml   | 2 +-
 common/kvstore/pom.xml | 2 +-
 common/network-common/pom.xml  | 2 +-
 common/network-shuffle/pom.xml | 2 +-
 common/network-yarn/pom.xml| 2 +-
 common/sketch/pom.xml  | 2 +-
 common/tags/pom.xml| 2 +-
 common/unsafe/pom.xml  | 2 +-
 core/pom.xml   | 2 +-
 docs/_config.yml   | 4 ++--
 examples/pom.xml   | 2 +-
 external/avro/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml  | 2 +-
 external/kafka-0-10-assembly/pom.xml   | 2 +-
 external/kafka-0-10-sql/pom.xml| 2 +-
 external/kafka-0-10-token-provider/pom.xml | 2 +-
 external/kafka-0-10/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml  | 2 +-
 external/kinesis-asl/pom.xml   | 2 +-
 external/spark-ganglia-lgpl/pom.xml| 2 +-
 graphx/pom.xml | 2 +-
 hadoop-cloud/pom.xml   | 2 +-
 launcher/pom.xml   | 2 +-
 mllib-local/pom.xml| 2 +-
 mllib/pom.xml  | 2 +-
 pom.xml| 2 +-
 repl/pom.xml   | 2 +-
 resource-managers/kubernetes/core/pom.xml  | 2 +-
 resource-managers/kubernetes/integration-tests/pom.xml | 2 +-
 resource-managers/mesos/pom.xml| 2 +-
 resource-managers/yarn/pom.xml | 2 +-
 sql/catalyst/pom.xml   | 2 +-
 sql/core/pom.xml   | 2 +-
 sql/hive-thriftserver/pom.xml  | 2 +-
 sql/hive/pom.xml   | 2 +-
 streaming/pom.xml  | 2 +-
 tools/pom.xml  | 2 +-
 37 files changed, 38 insertions(+), 38 deletions(-)

diff --git a/assembly/pom.xml b/assembly/pom.xml
index 0f88fe4feaf..2e9c4d9960b 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 15f7b8fa828..2a9acfa335e 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d652b6d1c8d..7b17e625d75 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index db36da4799f..c5c920e7747 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 9e0a202edd1..697b5a3928e 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/sketch/pom.xml b/common/sketch/pom.xml
index 068ef60b77f..ad2db11370a 100644
--- a/common/sketch/pom.xml
+++ b/common/sketch/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/tags/pom.xml b/common/tags/pom.xml
index 5081579e38d..1a7bdee70f3 100644
--- a/common/tags/pom.xml
+++ b/common/tags/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git a/common/unsafe/pom.xml b/common/unsafe/pom.xml
index 500f4083805..66dc93de059 100644
--- a/common/unsafe/pom.xml
+++ b/common/unsafe/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.3.0-SNAPSHOT</version>
+    <version>3.3.0</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
diff --git 

[spark] tag v3.3.0-rc1 created (now 482b7d54b52)

2022-05-03 Thread maxgekk

maxgekk pushed a change to tag v3.3.0-rc1
in repository https://gitbox.apache.org/repos/asf/spark.git


  at 482b7d54b52 (commit)
This tag includes the following new commits:

 new 482b7d54b52 Preparing Spark release v3.3.0-rc1







[spark] branch master updated (db7f346729d -> a1ac5c57c7b)

2022-05-03 Thread gurwls223

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


from db7f346729d [SPARK-39085][SQL] Move the error message of 
`INCONSISTENT_BEHAVIOR_CROSS_VERSION` to error-classes.json
 add a1ac5c57c7b [SPARK-35320][SQL][FOLLOWUP] Remove duplicated test

No new revisions were added by this update.

Summary of changes:
 .../src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala  | 8 ++--
 1 file changed, 2 insertions(+), 6 deletions(-)





[spark] branch branch-3.3 updated: [SPARK-35320][SQL][FOLLOWUP] Remove duplicated test

2022-05-03 Thread gurwls223

gurwls223 pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.3 by this push:
 new 4177626e634 [SPARK-35320][SQL][FOLLOWUP] Remove duplicated test
4177626e634 is described below

commit 4177626e634cbb0ee446fe042a4a6201b9d8531e
Author: itholic 
AuthorDate: Tue May 3 19:28:02 2022 +0900

[SPARK-35320][SQL][FOLLOWUP] Remove duplicated test

### What changes were proposed in this pull request?

Follow-up for https://github.com/apache/spark/pull/33525 to remove 
duplicated test.

### Why are the changes needed?

We don't need to do the same test twice.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

This patch removes the duplicated test, so the existing tests should pass.

Closes #36436 from itholic/SPARK-35320.

Authored-by: itholic 
Signed-off-by: Hyukjin Kwon 
(cherry picked from commit a1ac5c57c7b79fb70656638d284b77dfc4261d35)
Signed-off-by: Hyukjin Kwon 
---
 .../src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala  | 8 ++--
 1 file changed, 2 insertions(+), 6 deletions(-)

diff --git 
a/sql/core/src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala 
b/sql/core/src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala
index c86e1f6e297..1c6bbc5a09d 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala
@@ -391,14 +391,10 @@ class JsonFunctionsSuite extends QueryTest with SharedSparkSession {
   test("SPARK-24027: from_json of a map with unsupported key type") {
     val schema = MapType(StructType(StructField("f", IntegerType) :: Nil), StringType)
     val startMsg = "cannot resolve 'entries' due to data type mismatch:"
-    val exception1 = intercept[AnalysisException] {
+    val exception = intercept[AnalysisException] {
       Seq("""{{"f": 1}: "a"}""").toDS().select(from_json($"value", schema))
     }.getMessage
-    assert(exception1.contains(startMsg))
-    val exception2 = intercept[AnalysisException] {
-      Seq("""{{"f": 1}: "a"}""").toDS().select(from_json($"value", schema))
-    }.getMessage
-    assert(exception2.contains(startMsg))
+    assert(exception.contains(startMsg))
   }
 
   test("SPARK-24709: infers schemas of json strings and pass them to from_json") {

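The cleanup above keeps a single intercept-and-assert where two identical blocks existed. The same assert-once pattern in Python's unittest (a toy stand-in for the Scala suite, not Spark code; `parse` is a hypothetical helper that fails the way `from_json` does for an unsupported map key type):

```python
import unittest

class FromJsonStyleTest(unittest.TestCase):
    def parse(self, value):
        # Hypothetical stand-in for from_json with an unsupported map key type.
        raise ValueError("cannot resolve 'entries' due to data type mismatch:")

    def test_unsupported_key_type(self):
        # One assertion on the expected failure; no need to repeat the
        # identical intercept block twice.
        with self.assertRaisesRegex(ValueError, "cannot resolve 'entries'"):
            self.parse('{{"f": 1}: "a"}')

# Run the single test case programmatically.
result = unittest.TestResult()
FromJsonStyleTest("test_unsupported_key_type").run(result)
print(result.wasSuccessful())
```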
