[ https://issues.apache.org/jira/browse/SPARK-45996?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-45996.
----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 43894
[https://github.com/apache/spark/pull/43894]

> Show proper dependency requirement messages for Spark Connect
> -------------------------------------------------------------
>
>                 Key: SPARK-45996
>                 URL: https://issues.apache.org/jira/browse/SPARK-45996
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 4.0.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> {code}
> ./bin/pyspark --remote local
> {code}
> We should improve the error messages below.
> {code}
> /.../pyspark/shell.py:57: UserWarning: Failed to initialize Spark session.
>   warnings.warn("Failed to initialize Spark session.")
> Traceback (most recent call last):
>   File "/.../pyspark/shell.py", line 52, in <module>
>     spark = SparkSession.builder.getOrCreate()
>   File "/.../pyspark/sql/session.py", line 476, in getOrCreate
>     from pyspark.sql.connect.session import SparkSession as RemoteSparkSession
>   File "/.../pyspark/sql/connect/session.py", line 53, in <module>
>     from pyspark.sql.connect.client import SparkConnectClient, ChannelBuilder
>   File "/.../pyspark/sql/connect/client/__init__.py", line 22, in <module>
>     from pyspark.sql.connect.client.core import *  # noqa: F401,F403
>   File "/.../pyspark/sql/connect/client/core.py", line 51, in <module>
>     import google.protobuf.message
> ModuleNotFoundError: No module named 'google'
> {code}
> {code}
> /.../pyspark/shell.py:57: UserWarning: Failed to initialize Spark session.
>   warnings.warn("Failed to initialize Spark session.")
> Traceback (most recent call last):
>   File "/.../pyspark/shell.py", line 52, in <module>
>     spark = SparkSession.builder.getOrCreate()
>   File "/.../pyspark/sql/session.py", line 476, in getOrCreate
>     from pyspark.sql.connect.session import SparkSession as RemoteSparkSession
>   File "/.../pyspark/sql/connect/session.py", line 53, in <module>
>     from pyspark.sql.connect.client import SparkConnectClient, ChannelBuilder
>   File "/.../pyspark/sql/connect/client/__init__.py", line 22, in <module>
>     from pyspark.sql.connect.client.core import *  # noqa: F401,F403
>   File "/.../pyspark/sql/connect/client/core.py", line 52, in <module>
>     from grpc_status import rpc_status
> ModuleNotFoundError: No module named 'grpc_status'
> {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org