HyukjinKwon opened a new pull request, #45414:
URL: https://github.com/apache/spark/pull/45414

   ### What changes were proposed in this pull request?
   
   This PR proposes to suppress the Python exception warnings shown when PySpark is not in the Python path.
   
   ### Why are the changes needed?
   
   The `pyspark` library itself might be missing when users run Scala/Java or R Spark applications. In that case, these warning messages can be too noisy.
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes, it will hide the warning message when a user meets all of the conditions below:
   - Runs a Scala- or R-only Spark application
   - Has a Python installation, but PySpark is not in its Python path
     - Either because `SPARK_HOME` is undefined for some reason,
     - Or because of other environment problems.
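   The condition above can be illustrated with a minimal Python sketch. This is not the actual Spark code, only a hedged illustration of the general pattern: probe whether `pyspark` is importable and stay silent when it is not, instead of printing a warning or traceback. The function name `pyspark_available` is hypothetical.

   ```python
   # Minimal sketch (not the actual Spark implementation): check for
   # PySpark quietly, so Scala/R-only applications see no warning.
   import importlib.util


   def pyspark_available() -> bool:
       """Return True if `pyspark` is importable; never warn when it is not."""
       try:
           return importlib.util.find_spec("pyspark") is not None
       except (ImportError, ValueError):
           # Suppress the noisy error for environments without PySpark,
           # e.g. when SPARK_HOME is undefined.
           return False
   ```

   Callers can then branch on the boolean result rather than letting the import failure surface to the user.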
   
   ### How was this patch tested?
   
   Manually tested.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   
   No.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

