This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6e5d1db9058d [SPARK-46919][BUILD][CONNECT] Upgrade `grpcio*` and `grpc-java` to 1.62.x
6e5d1db9058d is described below

commit 6e5d1db9058de62a45f35d3f41e028a72f688b70
Author: yangjie01 <yangji...@baidu.com>
AuthorDate: Tue Mar 12 08:00:37 2024 -0700

    [SPARK-46919][BUILD][CONNECT] Upgrade `grpcio*` and `grpc-java` to 1.62.x
    
    ### What changes were proposed in this pull request?
    This PR aims to upgrade `grpcio*` from 1.59.3 to [1.62.0](https://pypi.org/project/grpcio/1.62.0/) and `grpc-java` from 1.59.0 to 1.62.2 for Apache Spark 4.0.0.
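
    A quick way to sanity-check which client versions pip actually resolved (a minimal local sketch, not part of this patch; it only uses the public `grpc.__version__` attribute and `importlib.metadata`):

    ```python
    from importlib.metadata import version

    import grpc

    # grpcio exposes its version directly; grpcio-status is easiest to query
    # via importlib.metadata using its distribution name.
    print("grpcio        :", grpc.__version__)
    print("grpcio-status :", version("grpcio-status"))
    ```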
    
    ### Why are the changes needed?
    gRPC 1.60.0 added dualstack IPv4 and IPv6 backend support:
    
    - Implemented dualstack IPv4 and IPv6 backend support, as per draft gRFC A61. xDS support is currently guarded by the `GRPC_EXPERIMENTAL_XDS_DUALSTACK_ENDPOINTS` env var (see the sketch below).
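
    For context only, a minimal, hypothetical Python sketch of how that toggle and a gRPC channel fit together; the env var name comes from the release note above, while the endpoint `[::1]:15002` and the timeout are made up for illustration and are not part of this patch:

    ```python
    import os

    import grpc

    # Per the gRPC 1.60.0 release note, xDS dualstack support is experimental and
    # guarded by this env var; it must be set before channels are created.
    os.environ.setdefault("GRPC_EXPERIMENTAL_XDS_DUALSTACK_ENDPOINTS", "true")

    # A plain channel can already target either address family; "[::1]:15002" is
    # a hypothetical IPv6 loopback endpoint used purely for illustration.
    with grpc.insecure_channel("[::1]:15002") as channel:
        try:
            grpc.channel_ready_future(channel).result(timeout=5)
            print("channel is ready")
        except grpc.FutureTimeoutError:
            print("no server listening on the example endpoint")
    ```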
    
    Note that in `grpc-java` 1.61.0 the dependency scope of `grpc-protobuf` on `grpc-protobuf-lite` changed from `compile` to `runtime`, so the `connect` module now has to declare an explicit dependency on `grpc-protobuf-lite` and exclude `protobuf-javalite`, because `SparkConnectService` uses `io.grpc.protobuf.lite.ProtoLiteUtils` (see the `connector/connect/server/pom.xml` change in this commit).
    
    - https://github.com/grpc/grpc-java/pull/10756/files
    
    The relevant release notes are as follows:
    - https://github.com/grpc/grpc/releases/tag/v1.60.0
    - https://github.com/grpc/grpc/releases/tag/v1.60.1
    - https://github.com/grpc/grpc/releases/tag/v1.61.0
    - https://github.com/grpc/grpc/releases/tag/v1.61.1
    - https://github.com/grpc/grpc/releases/tag/v1.62.0
    - https://github.com/grpc/grpc-java/releases/tag/v1.60.0
    - https://github.com/grpc/grpc-java/releases/tag/v1.60.1
    - https://github.com/grpc/grpc-java/releases/tag/v1.61.0
    - https://github.com/grpc/grpc-java/releases/tag/v1.61.1
    - https://github.com/grpc/grpc-java/releases/tag/v1.62.2
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Pass GitHub Actions
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #44929 from LuciferYang/grpc-16.
    
    Lead-authored-by: yangjie01 <yangji...@baidu.com>
    Co-authored-by: YangJie <yangji...@baidu.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .github/workflows/build_and_test.yml           |  4 ++--
 .github/workflows/maven_test.yml               |  2 +-
 connector/connect/common/src/main/buf.gen.yaml |  4 ++--
 connector/connect/server/pom.xml               | 11 +++++++++++
 dev/create-release/spark-rm/Dockerfile         |  2 +-
 dev/infra/Dockerfile                           |  2 +-
 dev/requirements.txt                           |  4 ++--
 pom.xml                                        |  2 +-
 project/SparkBuild.scala                       |  2 +-
 python/docs/source/getting_started/install.rst |  4 ++--
 python/setup.py                                |  2 +-
 11 files changed, 25 insertions(+), 14 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 4f2be1c04f98..faa495fe5dfc 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -252,7 +252,7 @@ jobs:
     - name: Install Python packages (Python 3.9)
      if: (contains(matrix.modules, 'sql') && !contains(matrix.modules, 'sql-')) || contains(matrix.modules, 'connect')
       run: |
-        python3.9 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting 'lxml==4.9.4' 'grpcio==1.59.3' 'grpcio-status==1.59.3' 'protobuf==4.25.1'
+        python3.9 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting 'lxml==4.9.4' 'grpcio==1.62.0' 'grpcio-status==1.62.0' 'protobuf==4.25.1'
         python3.9 -m pip list
     # Run the tests.
     - name: Run tests
@@ -702,7 +702,7 @@ jobs:
         python3.9 -m pip install 'sphinx==4.5.0' mkdocs 'pydata_sphinx_theme>=0.13' sphinx-copybutton nbsphinx numpydoc jinja2 markupsafe 'pyzmq<24.0.0' \
           ipython ipython_genutils sphinx_plotly_directive 'numpy>=1.20.0' pyarrow pandas 'plotly>=4.8' 'docutils<0.18.0' \
           'flake8==3.9.0' 'mypy==1.8.0' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' 'black==23.9.1' \
-          'pandas-stubs==1.2.0.53' 'grpcio==1.59.3' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
+          'pandas-stubs==1.2.0.53' 'grpcio==1.62.0' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0' \
           'sphinxcontrib-applehelp==1.0.4' 'sphinxcontrib-devhelp==1.0.2' 'sphinxcontrib-htmlhelp==2.0.1' 'sphinxcontrib-qthelp==1.0.3' 'sphinxcontrib-serializinghtml==1.1.5'
         python3.9 -m pip list
     - name: Python linter
diff --git a/.github/workflows/maven_test.yml b/.github/workflows/maven_test.yml
index d63066a521f9..34fa9a8b7768 100644
--- a/.github/workflows/maven_test.yml
+++ b/.github/workflows/maven_test.yml
@@ -179,7 +179,7 @@ jobs:
       - name: Install Python packages (Python 3.11)
        if: (contains(matrix.modules, 'sql#core')) || contains(matrix.modules, 'connect')
         run: |
-          python3.11 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting 'grpcio==1.59.3' 'grpcio-status==1.59.3' 'protobuf==4.25.1'
+          python3.11 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting 'grpcio==1.62.0' 'grpcio-status==1.62.0' 'protobuf==4.25.1'
           python3.11 -m pip list
       # Run the tests.
       - name: Run tests
diff --git a/connector/connect/common/src/main/buf.gen.yaml b/connector/connect/common/src/main/buf.gen.yaml
index 902247453579..9b0b07932eae 100644
--- a/connector/connect/common/src/main/buf.gen.yaml
+++ b/connector/connect/common/src/main/buf.gen.yaml
@@ -22,14 +22,14 @@ plugins:
     out: gen/proto/csharp
   - plugin: buf.build/protocolbuffers/java:v21.7
     out: gen/proto/java
-  - plugin: buf.build/grpc/ruby:v1.59.3
+  - plugin: buf.build/grpc/ruby:v1.62.0
     out: gen/proto/ruby
   - plugin: buf.build/protocolbuffers/ruby:v21.7
     out: gen/proto/ruby
    # Building the Python build and building the mypy interfaces.
   - plugin: buf.build/protocolbuffers/python:v21.7
     out: gen/proto/python
-  - plugin: buf.build/grpc/python:v1.59.3
+  - plugin: buf.build/grpc/python:v1.62.0
     out: gen/proto/python
   - name: mypy
     out: gen/proto/python
diff --git a/connector/connect/server/pom.xml b/connector/connect/server/pom.xml
index 9f10adf03dcf..f093f4f19622 100644
--- a/connector/connect/server/pom.xml
+++ b/connector/connect/server/pom.xml
@@ -196,6 +196,17 @@
       <artifactId>grpc-protobuf</artifactId>
       <version>${io.grpc.version}</version>
     </dependency>
+    <dependency>
+      <groupId>io.grpc</groupId>
+      <artifactId>grpc-protobuf-lite</artifactId>
+      <version>${io.grpc.version}</version>
+      <exclusions>
+        <exclusion>
+          <groupId>com.google.protobuf</groupId>
+          <artifactId>protobuf-javalite</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
     <dependency>
       <groupId>io.grpc</groupId>
       <artifactId>grpc-services</artifactId>
diff --git a/dev/create-release/spark-rm/Dockerfile b/dev/create-release/spark-rm/Dockerfile
index eeddfdf9d009..2cd50999c4cc 100644
--- a/dev/create-release/spark-rm/Dockerfile
+++ b/dev/create-release/spark-rm/Dockerfile
@@ -37,7 +37,7 @@ ENV DEBCONF_NONINTERACTIVE_SEEN true
 # These arguments are just for reuse and not really meant to be customized.
 ARG APT_INSTALL="apt-get install --no-install-recommends -y"
 
-ARG PIP_PKGS="sphinx==4.5.0 mkdocs==1.1.2 numpy==1.20.3 
pydata_sphinx_theme==0.13.3 ipython==7.19.0 nbsphinx==0.8.0 numpydoc==1.1.0 
jinja2==3.1.2 twine==3.4.1 sphinx-plotly-directive==0.1.3 
sphinx-copybutton==0.5.2 pandas==1.5.3 pyarrow==3.0.0 plotly==5.4.0 
markupsafe==2.0.1 docutils<0.17 grpcio==1.59.3 protobuf==4.21.6 
grpcio-status==1.59.3 googleapis-common-protos==1.56.4"
+ARG PIP_PKGS="sphinx==4.5.0 mkdocs==1.1.2 numpy==1.20.3 
pydata_sphinx_theme==0.13.3 ipython==7.19.0 nbsphinx==0.8.0 numpydoc==1.1.0 
jinja2==3.1.2 twine==3.4.1 sphinx-plotly-directive==0.1.3 
sphinx-copybutton==0.5.2 pandas==1.5.3 pyarrow==3.0.0 plotly==5.4.0 
markupsafe==2.0.1 docutils<0.17 grpcio==1.62.0 protobuf==4.21.6 
grpcio-status==1.62.0 googleapis-common-protos==1.56.4"
 ARG GEM_PKGS="bundler:2.3.8"
 
 # Install extra needed repos and refresh.
diff --git a/dev/infra/Dockerfile b/dev/infra/Dockerfile
index 528e8b11e432..64adf33e6742 100644
--- a/dev/infra/Dockerfile
+++ b/dev/infra/Dockerfile
@@ -96,7 +96,7 @@ RUN pypy3 -m pip install numpy 'six==1.16.0' 'pandas<=2.2.1' scipy coverage matp
 
 ARG BASIC_PIP_PKGS="numpy pyarrow>=15.0.0 six==1.16.0 pandas<=2.2.1 scipy plotly>=4.8 mlflow>=2.8.1 coverage matplotlib openpyxl memory-profiler>=0.61.0 scikit-learn>=1.3.2"
 # Python deps for Spark Connect
-ARG CONNECT_PIP_PKGS="grpcio==1.59.3 grpcio-status==1.59.3 protobuf==4.25.1 googleapis-common-protos==1.56.4"
+ARG CONNECT_PIP_PKGS="grpcio==1.62.0 grpcio-status==1.62.0 protobuf==4.25.1 googleapis-common-protos==1.56.4"
 
 # Add torch as a testing dependency for TorchDistributor and DeepspeedTorchDistributor
 RUN python3.9 -m pip install $BASIC_PIP_PKGS unittest-xml-reporting $CONNECT_PIP_PKGS && \
diff --git a/dev/requirements.txt b/dev/requirements.txt
index 6fcd04b6d44a..d6530d8ce282 100644
--- a/dev/requirements.txt
+++ b/dev/requirements.txt
@@ -51,8 +51,8 @@ black==23.9.1
 py
 
 # Spark Connect (required)
-grpcio>=1.59.3
-grpcio-status>=1.59.3
+grpcio>=1.62.0
+grpcio-status>=1.62.0
 googleapis-common-protos>=1.56.4
 
 # Spark Connect python proto generation plugin (optional)
diff --git a/pom.xml b/pom.xml
index 49a951405408..b74569650bf8 100644
--- a/pom.xml
+++ b/pom.xml
@@ -289,7 +289,7 @@
     <!-- Version used in Connect -->
     <connect.guava.version>33.0.0-jre</connect.guava.version>
     <guava.failureaccess.version>1.0.2</guava.failureaccess.version>
-    <io.grpc.version>1.59.0</io.grpc.version>
+    <io.grpc.version>1.62.2</io.grpc.version>
     <mima.version>1.1.3</mima.version>
     <tomcat.annotations.api.version>6.0.53</tomcat.annotations.api.version>
 
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 9ca53f46eec0..a50442e778d9 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -91,7 +91,7 @@ object BuildCommons {
   // SPARK-41247: needs to be consistent with `protobuf.version` in `pom.xml`.
   val protoVersion = "3.25.1"
   // GRPC version used for Spark Connect.
-  val grpcVersion = "1.59.0"
+  val grpcVersion = "1.62.2"
 }
 
 object SparkBuild extends PomBuild {
diff --git a/python/docs/source/getting_started/install.rst b/python/docs/source/getting_started/install.rst
index 1011948591bf..ce29cf626be0 100644
--- a/python/docs/source/getting_started/install.rst
+++ b/python/docs/source/getting_started/install.rst
@@ -159,8 +159,8 @@ Package                    Supported version Note
 `pandas`                   >=1.4.4                   Required for pandas API on Spark and Spark Connect; Optional for Spark SQL
 `pyarrow`                  >=4.0.0                   Required for pandas API on Spark and Spark Connect; Optional for Spark SQL
 `numpy`                    >=1.21                    Required for pandas API on Spark and MLLib DataFrame-based API; Optional for Spark SQL
-`grpcio`                   >=1.59.3                  Required for Spark Connect
-`grpcio-status`            >=1.59.3                  Required for Spark Connect
+`grpcio`                   >=1.62.0                  Required for Spark Connect
+`grpcio-status`            >=1.62.0                  Required for Spark Connect
 `googleapis-common-protos` >=1.56.4                  Required for Spark Connect
 ========================== ========================= ======================================================================================
 
diff --git a/python/setup.py b/python/setup.py
index 8bac5141c82e..ec7240107d1b 100755
--- a/python/setup.py
+++ b/python/setup.py
@@ -133,7 +133,7 @@ if in_spark:
 _minimum_pandas_version = "1.4.4"
 _minimum_numpy_version = "1.21"
 _minimum_pyarrow_version = "4.0.0"
-_minimum_grpc_version = "1.59.3"
+_minimum_grpc_version = "1.62.0"
 _minimum_googleapis_common_protos_version = "1.56.4"
 
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
