[spark] branch master updated (af536459501 -> 70f34278cbf)

2023-06-28 Thread maxgekk
This is an automated email from the ASF dual-hosted git repository.

maxgekk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


from af536459501 [SPARK-44237][CORE] Simplify DirectByteBuffer constructor lookup logic
 add 70f34278cbf [SPARK-44079][SQL] Fix `ArrayIndexOutOfBoundsException` when parse array as struct using PERMISSIVE mode with corrupt record

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/catalyst/csv/UnivocityParser.scala |  4 ++--
 .../spark/sql/catalyst/json/JacksonParser.scala  | 20 +++-
 .../spark/sql/catalyst/util/BadRecordException.scala | 14 --
 .../spark/sql/catalyst/util/FailureSafeParser.scala  |  9 +++--
 .../sql/execution/datasources/json/JsonSuite.scala   | 15 +++
 5 files changed, 51 insertions(+), 11 deletions(-)
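The fix touches Spark's failure-safe parsing path (`FailureSafeParser`, `BadRecordException`). As a rough standalone sketch of the pattern involved — these class and method names are inventions for illustration, not Spark's actual APIs — a PERMISSIVE-mode parser captures a malformed record in a corrupt-record slot instead of letting an exception escape:

```java
import java.util.Optional;

// Hypothetical, simplified illustration of PERMISSIVE-mode parsing:
// input that does not match the expected struct shape (e.g. a top-level
// JSON array) is kept as a corrupt record rather than raising an error.
public class PermissiveSketch {
    record ParsedRow(Optional<String> value, Optional<String> corruptRecord) {}

    static ParsedRow parseAsStruct(String json) {
        String trimmed = json.trim();
        if (trimmed.startsWith("{") && trimmed.endsWith("}")) {
            // Looks like the expected struct: parse succeeds.
            return new ParsedRow(Optional.of(trimmed), Optional.empty());
        }
        // Malformed for this schema: preserve the raw text in the
        // corrupt-record column instead of failing the whole job.
        return new ParsedRow(Optional.empty(), Optional.of(json));
    }

    public static void main(String[] args) {
        System.out.println(parseAsStruct("[1, 2, 3]").corruptRecord().get());
    }
}
```

In real Spark the corrupt text surfaces in the column configured by `columnNameOfCorruptRecord`; the sketch only shows the catch-and-record shape of the fix.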


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated: [SPARK-44237][CORE] Simplify DirectByteBuffer constructor lookup logic

2023-06-28 Thread dongjoon

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new af536459501 [SPARK-44237][CORE] Simplify DirectByteBuffer constructor lookup logic
af536459501 is described below

commit af5364595015acaed7e0499a70d2d40fddc1a400
Author: Dongjoon Hyun 
AuthorDate: Wed Jun 28 22:54:18 2023 -0700

[SPARK-44237][CORE] Simplify DirectByteBuffer constructor lookup logic

### What changes were proposed in this pull request?

This PR aims to simplify `DirectByteBuffer` constructor lookup logic.

### Why are the changes needed?

The `try-catch` statement is no longer needed because the JVM major version is already known.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass the CIs.

Closes #41780 from dongjoon-hyun/SPARK-44237.

Authored-by: Dongjoon Hyun 
Signed-off-by: Dongjoon Hyun 
---
 .../src/main/java/org/apache/spark/unsafe/Platform.java   | 11 +++
 1 file changed, 3 insertions(+), 8 deletions(-)

diff --git a/common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java b/common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java
index 4dd51991ba4..a91ea2ee6b5 100644
--- a/common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java
+++ b/common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java
@@ -68,14 +68,9 @@ public final class Platform {
 }
 try {
   Class cls = Class.forName("java.nio.DirectByteBuffer");
-  Constructor constructor;
-  try {
-constructor = cls.getDeclaredConstructor(Long.TYPE, Integer.TYPE);
-  } catch (NoSuchMethodException e) {
-// DirectByteBuffer(long,int) was removed in
-// https://github.com/openjdk/jdk/commit/a56598f5a534cc9223367e7faa8433ea38661db9
-constructor = cls.getDeclaredConstructor(Long.TYPE, Long.TYPE);
-  }
+  Constructor constructor = (majorVersion < 21) ?
+cls.getDeclaredConstructor(Long.TYPE, Integer.TYPE) :
+cls.getDeclaredConstructor(Long.TYPE, Long.TYPE);
   Field cleanerField = cls.getDeclaredField("cleaner");
   try {
 constructor.setAccessible(true);
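The simplified lookup can be sketched as a standalone snippet — a minimal illustration of the same idea, not Spark's `Platform` class itself (the class name and method here are inventions for the sketch):

```java
import java.lang.reflect.Constructor;

// Pick the DirectByteBuffer constructor signature from the running JVM's
// major version instead of probing with try/catch: JDK 21 replaced
// DirectByteBuffer(long, int) with DirectByteBuffer(long, long).
public class CtorLookup {
    static Constructor<?> directBufferCtor() throws Exception {
        int majorVersion = Runtime.version().feature(); // e.g. 17 or 21
        Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
        return (majorVersion < 21)
            ? cls.getDeclaredConstructor(Long.TYPE, Integer.TYPE)
            : cls.getDeclaredConstructor(Long.TYPE, Long.TYPE);
    }

    public static void main(String[] args) throws Exception {
        // Either branch yields a two-argument constructor.
        System.out.println(directBufferCtor().getParameterCount());
    }
}
```

Branching on the version is cheaper and clearer than catching `NoSuchMethodException`, since the version check fully determines which signature exists.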





[spark] branch master updated: [SPARK-44220][SQL] Move StringConcat to sql/api

2023-06-28 Thread yao

yao pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new e5a5921968c [SPARK-44220][SQL] Move StringConcat to sql/api
e5a5921968c is described below

commit e5a5921968c84601ce005a7785bdd08c41a2d862
Author: Rui Wang 
AuthorDate: Thu Jun 29 11:52:06 2023 +0800

[SPARK-44220][SQL] Move StringConcat to sql/api

### What changes were proposed in this pull request?

Move StringConcat to `sql/api` module.

### Why are the changes needed?

StringConcat is widely used by the data types. As we plan to move the entire data type family to sql/api, we should move StringConcat as well.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Existing UT

Closes #41764 from amaliujia/move_out_string_concat.

Authored-by: Rui Wang 
Signed-off-by: Kent Yao 
---
 common/unsafe/pom.xml  |  7 +++
 .../spark/unsafe/array/ByteArrayMethods.java   |  7 +--
 .../apache/spark/unsafe/array/ByteArrayUtils.java  | 27 +
 sql/api/pom.xml|  5 ++
 .../spark/sql/catalyst/util/StringUtils.scala  | 65 ++
 sql/catalyst/pom.xml   |  5 ++
 .../spark/sql/catalyst/util/StringUtils.scala  | 47 
 .../org/apache/spark/sql/types/ArrayType.scala |  2 +-
 .../org/apache/spark/sql/types/DataType.scala  |  2 +-
 .../scala/org/apache/spark/sql/types/MapType.scala |  2 +-
 .../org/apache/spark/sql/types/StructField.scala   |  2 +-
 .../org/apache/spark/sql/types/StructType.scala|  8 +--
 .../org/apache/spark/sql/types/DataTypeSuite.scala |  2 +-
 .../apache/spark/sql/execution/debug/package.scala |  2 +-
 14 files changed, 120 insertions(+), 63 deletions(-)
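StringConcat itself is not shown in this mail, but the idea of a length-bounded string accumulator — useful when rendering very wide schemas — can be sketched as follows. This is a hedged illustration in the spirit of the utility, not Spark's actual implementation; the class name `BoundedConcat` is an invention for the sketch:

```java
// A minimal length-bounded string accumulator: appends past maxLength are
// truncated or dropped, so building a preview of a huge value stays cheap.
public class BoundedConcat {
    private final StringBuilder sb = new StringBuilder();
    private final int maxLength;

    BoundedConcat(int maxLength) { this.maxLength = maxLength; }

    void append(String s) {
        int remaining = maxLength - sb.length();
        if (remaining > 0) {
            // Keep only as many characters as still fit under the cap.
            sb.append(s, 0, Math.min(s.length(), remaining));
        }
    }

    @Override public String toString() { return sb.toString(); }

    public static void main(String[] args) {
        BoundedConcat c = new BoundedConcat(5);
        c.append("abc");
        c.append("defgh");
        System.out.println(c); // prints "abcde"
    }
}
```

Capping at append time, rather than truncating at the end, avoids ever materializing the full unbounded string.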

diff --git a/common/unsafe/pom.xml b/common/unsafe/pom.xml
index a61f00084eb..bdf82d9285e 100644
--- a/common/unsafe/pom.xml
+++ b/common/unsafe/pom.xml
@@ -38,6 +38,13 @@
 
   org.apache.spark
   spark-tags_${scala.binary.version}
+  ${project.version}
+
+
+
+  org.apache.spark
+  spark-common-utils_${scala.binary.version}
+  ${project.version}
 
 
 

[spark] branch master updated: [SPARK-43265][CORE][FOLLOW-UP] Move Error framework to a common utils module

2023-06-28 Thread yao

yao pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new a1592fa7ff2 [SPARK-43265][CORE][FOLLOW-UP] Move Error framework to a common utils module
a1592fa7ff2 is described below

commit a1592fa7ff2674f6be956f86a61e8da3554601cf
Author: Rui Wang 
AuthorDate: Thu Jun 29 11:49:46 2023 +0800

[SPARK-43265][CORE][FOLLOW-UP] Move Error framework to a common utils module

### What changes were proposed in this pull request?

We can also move `error-classes.json` to `common-utils`.

### Why are the changes needed?

So the Scala client can re-use the file without needing to depend on Spark core.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Existing tests

Closes #41774 from amaliujia/move_error_file.

Authored-by: Rui Wang 
Signed-off-by: Kent Yao 
---
 {core => common/utils}/src/main/resources/error/README.md  | 0
 {core => common/utils}/src/main/resources/error/error-classes.json | 0
 core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala | 2 +-
 3 files changed, 1 insertion(+), 1 deletion(-)

diff --git a/core/src/main/resources/error/README.md b/common/utils/src/main/resources/error/README.md
similarity index 100%
rename from core/src/main/resources/error/README.md
rename to common/utils/src/main/resources/error/README.md
diff --git a/core/src/main/resources/error/error-classes.json b/common/utils/src/main/resources/error/error-classes.json
similarity index 100%
rename from core/src/main/resources/error/error-classes.json
rename to common/utils/src/main/resources/error/error-classes.json
diff --git a/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala b/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
index e9554da082a..96c4e3b8ab7 100644
--- a/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
+++ b/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
@@ -47,7 +47,7 @@ class SparkThrowableSuite extends SparkFunSuite {
}}}
*/
   private val errorJsonFilePath = getWorkspaceFilePath(
-"core", "src", "main", "resources", "error", "error-classes.json")
+    "common", "utils", "src", "main", "resources", "error", "error-classes.json")
 
  private val errorReader = new ErrorClassesJsonReader(Seq(errorJsonFilePath.toUri.toURL))
 





[spark] branch master updated: [SPARK-44231][BUILD] Update ORC to 1.9.0

2023-06-28 Thread dongjoon

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 1a5b9f295a6 [SPARK-44231][BUILD] Update ORC to 1.9.0
1a5b9f295a6 is described below

commit 1a5b9f295a623901100b6f41e785138fba0bc156
Author: William Hyun 
AuthorDate: Wed Jun 28 15:18:19 2023 -0700

[SPARK-44231][BUILD] Update ORC to 1.9.0

### What changes were proposed in this pull request?
This PR aims to update ORC to 1.9.0.

### Why are the changes needed?
This is the newest version of ORC with the following improvements:
- https://github.com/apache/orc/milestone/10?closed=1
- https://github.com/apache/orc/releases/tag/v1.9.0

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Pass the CIs.

Closes #41775 from williamhyun/orc190.

Authored-by: William Hyun 
Signed-off-by: Dongjoon Hyun 
---
 dev/deps/spark-deps-hadoop-3-hive-2.3 | 8 
 pom.xml   | 2 +-
 2 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/dev/deps/spark-deps-hadoop-3-hive-2.3 b/dev/deps/spark-deps-hadoop-3-hive-2.3
index 8630040abad..aac141ccfe0 100644
--- a/dev/deps/spark-deps-hadoop-3-hive-2.3
+++ b/dev/deps/spark-deps-hadoop-3-hive-2.3
@@ -4,7 +4,7 @@ JTransforms/3.1//JTransforms-3.1.jar
 RoaringBitmap/0.9.45//RoaringBitmap-0.9.45.jar
 ST4/4.0.4//ST4-4.0.4.jar
 activation/1.1.1//activation-1.1.1.jar
-aircompressor/0.21//aircompressor-0.21.jar
+aircompressor/0.24//aircompressor-0.24.jar
 algebra_2.12/2.0.1//algebra_2.12-2.0.1.jar
 aliyun-java-sdk-core/4.5.10//aliyun-java-sdk-core-4.5.10.jar
 aliyun-java-sdk-kms/2.11.0//aliyun-java-sdk-kms-2.11.0.jar
@@ -208,9 +208,9 @@ opencsv/2.3//opencsv-2.3.jar
 opentracing-api/0.33.0//opentracing-api-0.33.0.jar
 opentracing-noop/0.33.0//opentracing-noop-0.33.0.jar
 opentracing-util/0.33.0//opentracing-util-0.33.0.jar
-orc-core/1.8.4/shaded-protobuf/orc-core-1.8.4-shaded-protobuf.jar
-orc-mapreduce/1.8.4/shaded-protobuf/orc-mapreduce-1.8.4-shaded-protobuf.jar
-orc-shims/1.8.4//orc-shims-1.8.4.jar
+orc-core/1.9.0/shaded-protobuf/orc-core-1.9.0-shaded-protobuf.jar
+orc-mapreduce/1.9.0/shaded-protobuf/orc-mapreduce-1.9.0-shaded-protobuf.jar
+orc-shims/1.9.0//orc-shims-1.9.0.jar
 oro/2.0.8//oro-2.0.8.jar
 osgi-resource-locator/1.0.3//osgi-resource-locator-1.0.3.jar
 paranamer/2.8//paranamer-2.8.jar
diff --git a/pom.xml b/pom.xml
index c8dbd0ab2d8..1c60a5c7db7 100644
--- a/pom.xml
+++ b/pom.xml
@@ -141,7 +141,7 @@
 
 10.14.2.0
 1.13.1
-1.8.4
+1.9.0
 shaded-protobuf
 9.4.51.v20230217
 4.0.3





[spark] branch master updated (97a94b74496 -> cd6aa5819f3)

2023-06-28 Thread dongjoon

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


from 97a94b74496 [SPARK-44205][SQL] Extract Catalyst Code from DecimalType
 add cd6aa5819f3 [SPARK-44230][SQL][TESTS] Make `sql` module passes in Java 21

No new revisions were added by this update.

Summary of changes:
 .../sql/execution/arrow/ArrowConvertersSuite.scala | 59 ++
 1 file changed, 59 insertions(+)





[spark] branch master updated (d2054fc358d -> 97a94b74496)

2023-06-28 Thread wenchen

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


from d2054fc358d [SPARK-44128][BUILD] Upgrade netty to 4.1.93
 add 97a94b74496 [SPARK-44205][SQL] Extract Catalyst Code from DecimalType

No new revisions were added by this update.

Summary of changes:
 project/MimaExcludes.scala  |  4 +++-
 .../spark/sql/catalyst/analysis/DecimalPrecision.scala  | 17 +
 .../apache/spark/sql/catalyst/optimizer/Optimizer.scala |  8 
 .../apache/spark/sql/catalyst/types/DataTypeUtils.scala | 15 +--
 .../org/apache/spark/sql/types/DataTypeExpression.scala |  7 +++
 .../scala/org/apache/spark/sql/types/DecimalType.scala  | 17 -
 6 files changed, 36 insertions(+), 32 deletions(-)





[spark] branch master updated: [SPARK-44128][BUILD] Upgrade netty to 4.1.93

2023-06-28 Thread dongjoon

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new d2054fc358d [SPARK-44128][BUILD] Upgrade netty to 4.1.93
d2054fc358d is described below

commit d2054fc358d4727d13e3fb7cb26c35b68f6a3fb8
Author: panbingkun 
AuthorDate: Wed Jun 28 11:05:20 2023 -0700

[SPARK-44128][BUILD] Upgrade netty to 4.1.93

### What changes were proposed in this pull request?
This PR aims to upgrade Netty from 4.1.92 to 4.1.93.

### Why are the changes needed?
1. v4.1.92 vs. v4.1.93:

https://github.com/netty/netty/compare/netty-4.1.92.Final...netty-4.1.93.Final

2. The new version brings several bug fixes, e.g.:
- Reset byte buffer in loop for AbstractDiskHttpData.setContent ([#13320](https://github.com/netty/netty/pull/13320))
- OpenSSL MAX_CERTIFICATE_LIST_BYTES option supported ([#13365](https://github.com/netty/netty/pull/13365))
- Adapt to DirectByteBuffer constructor in Java 21 ([#13366](https://github.com/netty/netty/pull/13366))
- HTTP/2 encoder: allow HEADER_TABLE_SIZE greater than Integer.MAX_VALUE ([#13368](https://github.com/netty/netty/pull/13368))
- Upgrade to latest netty-tcnative to fix memory leak ([#13375](https://github.com/netty/netty/pull/13375))
- H2/H2C server stream channels deactivated while write still in progress ([#13388](https://github.com/netty/netty/pull/13388))
- Channel#bytesBefore(un)writable off by 1 ([#13389](https://github.com/netty/netty/pull/13389))
- HTTP/2 should forward shutdown user events to active streams ([#13394](https://github.com/netty/netty/pull/13394))
- Respect the number of bytes read per datagram when using recvmmsg ([#13399](https://github.com/netty/netty/pull/13399))

3. The release notes are as follows:
- https://netty.io/news/2023/05/25/4-1-93-Final.html

4. Why not upgrade to the `4.1.94.Final` version?
The return type of the `threadCache()` method of the Netty inner class used by `arrow-memory-netty` 12.0.1 changed (from `PoolThreadCache` to `PoolArenasCache`), which is a breaking change. Let's wait for the upgrade of `arrow-memory-netty` before moving to `4.1.94.Final`.

The reference is as follows:

https://github.com/apache/arrow/blob/6af660f48472b8b45a5e01b7136b9b040b185eb1/java/memory/memory-netty/src/main/java/io/netty/buffer/PooledByteBufAllocatorL.java#L164

https://github.com/netty/netty/blob/da1a448d5bc4f36cc1744db93fcaf64e198db2bd/buffer/src/main/java/io/netty/buffer/PooledByteBufAllocator.java#L732-L736

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Pass GA.

Closes #41681 from panbingkun/upgrade_netty.

Authored-by: panbingkun 
Signed-off-by: Dongjoon Hyun 
---
 dev/deps/spark-deps-hadoop-3-hive-2.3 | 36 +--
 pom.xml   |  6 +-
 2 files changed, 23 insertions(+), 19 deletions(-)

diff --git a/dev/deps/spark-deps-hadoop-3-hive-2.3 b/dev/deps/spark-deps-hadoop-3-hive-2.3
index 153f5b57ce5..8630040abad 100644
--- a/dev/deps/spark-deps-hadoop-3-hive-2.3
+++ b/dev/deps/spark-deps-hadoop-3-hive-2.3
@@ -183,24 +183,24 @@ metrics-jmx/4.2.18//metrics-jmx-4.2.18.jar
 metrics-json/4.2.18//metrics-json-4.2.18.jar
 metrics-jvm/4.2.18//metrics-jvm-4.2.18.jar
 minlog/1.3.0//minlog-1.3.0.jar
-netty-all/4.1.92.Final//netty-all-4.1.92.Final.jar
-netty-buffer/4.1.92.Final//netty-buffer-4.1.92.Final.jar
-netty-codec-http/4.1.92.Final//netty-codec-http-4.1.92.Final.jar
-netty-codec-http2/4.1.92.Final//netty-codec-http2-4.1.92.Final.jar
-netty-codec-socks/4.1.92.Final//netty-codec-socks-4.1.92.Final.jar
-netty-codec/4.1.92.Final//netty-codec-4.1.92.Final.jar
-netty-common/4.1.92.Final//netty-common-4.1.92.Final.jar
-netty-handler-proxy/4.1.92.Final//netty-handler-proxy-4.1.92.Final.jar
-netty-handler/4.1.92.Final//netty-handler-4.1.92.Final.jar
-netty-resolver/4.1.92.Final//netty-resolver-4.1.92.Final.jar
-netty-transport-classes-epoll/4.1.92.Final//netty-transport-classes-epoll-4.1.92.Final.jar
-netty-transport-classes-kqueue/4.1.92.Final//netty-transport-classes-kqueue-4.1.92.Final.jar
-netty-transport-native-epoll/4.1.92.Final/linux-aarch_64/netty-transport-native-epoll-4.1.92.Final-linux-aarch_64.jar
-netty-transport-native-epoll/4.1.92.Final/linux-x86_64/netty-transport-native-epoll-4.1.92.Final-linux-x86_64.jar
-netty-transport-native-kqueue/4.1.92.Final/osx-aarch_64/netty-transport-native-kqueue-4.1.92.Final-osx-aarch_64.jar
-netty-transport-native-kqueue/4.1.92.Final/osx-x86_64/netty-transport-native-kqueue-4.1.92.Final-osx-x86_64.jar
-netty-transport-native-unix-common/4.1.92.Final//netty-transport-native-unix-common-4.1.92.Final.jar
-netty-transport/4.1.92.Final//netty-transpor

[spark] branch master updated: [SPARK-43757][CONNECT] Change client compatibility from allow list to deny list

2023-06-28 Thread hvanhovell

hvanhovell pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 1fcd537a37b [SPARK-43757][CONNECT] Change client compatibility from allow list to deny list
1fcd537a37b is described below

commit 1fcd537a37b2457092e20f8034f23917a8ae2ffa
Author: Zhen Li 
AuthorDate: Wed Jun 28 10:37:38 2023 -0400

[SPARK-43757][CONNECT] Change client compatibility from allow list to deny list

### What changes were proposed in this pull request?
Expand the client compatibility check to include all sql APIs.

### Why are the changes needed?
Enhance the API compatibility coverage

### Does this PR introduce _any_ user-facing change?
No, except it fixes a few wrong types and hides a few helper methods internally.

### How was this patch tested?
Existing tests.

Closes #41284 from zhenlineo/compatibility-check-allowlist.

Authored-by: Zhen Li 
Signed-off-by: Herman van Hovell 
---
 .../apache/spark/sql/KeyValueGroupedDataset.scala  |   6 +-
 .../scala/org/apache/spark/sql/SparkSession.scala  |   2 +-
 .../sql/streaming/StreamingQueryException.scala|   3 +-
 .../sql/streaming/StreamingQueryManager.scala  |   3 +-
 .../CheckConnectJvmClientCompatibility.scala   | 327 ++---
 5 files changed, 225 insertions(+), 116 deletions(-)
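The allow-list-to-deny-list switch can be illustrated with a toy check (hypothetical names; Spark's real check in `CheckConnectJvmClientCompatibility` builds on MiMa problem filters). With a deny list, every detected difference is a violation by default and only known, intentional differences are excluded — broader coverage than an allow list that inspects only enumerated APIs:

```java
import java.util.List;
import java.util.Set;

// Toy deny-list compatibility check: report every detected problem
// except those explicitly accepted on the deny list.
public class DenyListCheck {
    static List<String> findViolations(List<String> problems, Set<String> denyList) {
        return problems.stream()
            .filter(p -> !denyList.contains(p)) // deny-listed differences are intentional
            .toList();
    }

    public static void main(String[] args) {
        List<String> problems = List.of(
            "missing method: SparkSession.newSession",
            "missing method: KeyValueGroupedDataset.queryExecution");
        Set<String> deny = Set.of("missing method: KeyValueGroupedDataset.queryExecution");
        // Only the problem NOT on the deny list is reported.
        System.out.println(findViolations(problems, deny));
    }
}
```

The payoff is that newly added server-side APIs are flagged automatically until the client catches up, instead of being silently skipped.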

diff --git a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala
index 20c130b83cb..e67ef1c0fa7 100644
--- a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala
+++ b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala
@@ -38,7 +38,7 @@ import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout, OutputMode
  *
  * @since 3.5.0
  */
-abstract class KeyValueGroupedDataset[K, V] private[sql] () extends Serializable {
+class KeyValueGroupedDataset[K, V] private[sql] () extends Serializable {
 
   /**
* Returns a new [[KeyValueGroupedDataset]] where the type of the key has 
been mapped to the
@@ -462,7 +462,7 @@ abstract class KeyValueGroupedDataset[K, V] private[sql] () extends Serializable
   UdfUtils.coGroupFunctionToScalaFunc(f))(encoder)
   }
 
-  protected def flatMapGroupsWithStateHelper[S: Encoder, U: Encoder](
+  protected[sql] def flatMapGroupsWithStateHelper[S: Encoder, U: Encoder](
   outputMode: Option[OutputMode],
   timeoutConf: GroupStateTimeout,
   initialState: Option[KeyValueGroupedDataset[K, S]],
@@ -923,7 +923,7 @@ private class KeyValueGroupedDatasetImpl[K, V, IK, IV](
 agg(aggregator)
   }
 
-  override protected def flatMapGroupsWithStateHelper[S: Encoder, U: Encoder](
+  override protected[sql] def flatMapGroupsWithStateHelper[S: Encoder, U: Encoder](
   outputMode: Option[OutputMode],
   timeoutConf: GroupStateTimeout,
   initialState: Option[KeyValueGroupedDataset[K, S]],
diff --git a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala
index 45e7dca38d7..54e9102c55c 100644
--- a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala
+++ b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala
@@ -429,7 +429,7 @@ class SparkSession private[sql] (
*
* @since 3.4.0
*/
-  object implicits extends SQLImplicits(this)
+  object implicits extends SQLImplicits(this) with Serializable
   // scalastyle:on
 
   def newSession(): SparkSession = {
diff --git a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryException.scala b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryException.scala
index d5e9982dfbf..512c94f5c70 100644
--- a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryException.scala
+++ b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryException.scala
@@ -36,7 +36,8 @@ class StreamingQueryException private[sql] (
 message: String,
 errorClass: String,
 stackTrace: String)
-extends SparkThrowable {
+extends Exception(message)
+with SparkThrowable {
 
   override def getErrorClass: String = errorClass
 
diff --git a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryManager.scala b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryManager.scala
index 775921ff579..13bbf470639 100644
--- a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQuer

[spark] branch master updated (f26bdb7bfde -> d14a6ecd9e1)

2023-06-28 Thread maxgekk

maxgekk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


from f26bdb7bfde [SPARK-44222][BUILD][PYTHON] Upgrade `grpc` to 1.56.0
 add d14a6ecd9e1 [SPARK-40850][SQL] Fix test case interpreted queries may execute Codegen

No new revisions were added by this update.

Summary of changes:
 .../src/test/scala/org/apache/spark/sql/catalyst/plans/PlanTest.scala  | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)





[spark] branch master updated (1c8c47cb55d -> f26bdb7bfde)

2023-06-28 Thread dongjoon

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


 from 1c8c47cb55d [SPARK-43914][SQL] Assign names to the error class _LEGACY_ERROR_TEMP_[2433-2437]
 add f26bdb7bfde [SPARK-44222][BUILD][PYTHON] Upgrade `grpc` to 1.56.0

No new revisions were added by this update.

Summary of changes:
 .github/workflows/build_and_test.yml   | 4 ++--
 connector/connect/common/src/main/buf.gen.yaml | 4 ++--
 dev/create-release/spark-rm/Dockerfile | 2 +-
 dev/requirements.txt   | 4 ++--
 pom.xml| 2 +-
 project/SparkBuild.scala   | 2 +-
 python/docs/source/getting_started/install.rst | 4 ++--
 python/setup.py| 2 +-
 8 files changed, 12 insertions(+), 12 deletions(-)

