[ 
https://issues.apache.org/jira/browse/SPARK-47311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-47311:
---------------------------------
    Description: 
{code}
scala> sql("create table t(i int) using json")
24/03/06 16:09:44 WARN DataSourceManager: Skipping the lookup of Python Data Sources due to the failure.
org.apache.spark.SparkException:
Error from python worker:
  /opt/homebrew/Caskroom/miniconda/base/bin/python3: Error while finding module specification for 'pyspark.daemon' (ModuleNotFoundError: No module named 'pyspark')
{code}

When PySpark is not in the Python path at all, Spark logs this warning once for every session initialization.
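
A minimal sketch of the intended direction (the helper and names below are hypothetical, not the actual DataSourceManager code): catch the lookup failure, log it once at DEBUG instead of WARN, and fall back to an empty result so session initialization stays quiet when PySpark is simply not installed.

{code}
import org.slf4j.LoggerFactory

// Hypothetical helper, not the real Spark change: demote the "Skipping the
// lookup of Python Data Sources" message and emit it at most once.
object QuietPythonDataSourceLookup {
  private val log = LoggerFactory.getLogger(getClass)
  @volatile private var failureLogged = false

  // `lookup` stands in for the real Python Data Source lookup call.
  def lookupOrEmpty[T](lookup: => Seq[T]): Seq[T] = {
    try {
      lookup
    } catch {
      case e: Exception =>
        if (!failureLogged) {
          failureLogged = true
          // DEBUG instead of WARN: a missing 'pyspark' module is expected on
          // installations that never use Python Data Sources, so it should not
          // be printed for every Spark session initialization.
          log.debug("Skipping the lookup of Python Data Sources due to the failure.", e)
        }
        Seq.empty[T]
    }
  }
}
{code}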

  was:
{code}
scala> sql("create table t(i int) using json")
24/03/06 16:09:44 WARN DataSourceManager: Skipping the lookup of Python Data Sources due to the failure.
org.apache.spark.SparkException:
Error from python worker:
  /opt/homebrew/Caskroom/miniconda/base/bin/python3: Error while finding module specification for 'pyspark.daemon' (ModuleNotFoundError: No module named 'pyspark')
{code}

This is too much.


> Suppress Python exceptions where PySpark is not in the Python path
> ------------------------------------------------------------------
>
>                 Key: SPARK-47311
>                 URL: https://issues.apache.org/jira/browse/SPARK-47311
>             Project: Spark
>          Issue Type: Test
>          Components: PySpark, SQL
>    Affects Versions: 4.0.0
>            Reporter: Hyukjin Kwon
>            Priority: Minor
>
> {code}
> scala> sql("create table t(i int) using json")
> 24/03/06 16:09:44 WARN DataSourceManager: Skipping the lookup of Python Data Sources due to the failure.
> org.apache.spark.SparkException:
> Error from python worker:
>   /opt/homebrew/Caskroom/miniconda/base/bin/python3: Error while finding module specification for 'pyspark.daemon' (ModuleNotFoundError: No module named 'pyspark')
> {code}
> When PySpark is not in the Python path at all, Spark logs this warning once for every session initialization.



