Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22770#discussion_r226515461
  
    --- Diff: core/src/main/scala/org/apache/spark/api/python/PythonWorkerFactory.scala ---
    @@ -31,15 +32,15 @@ import org.apache.spark.security.SocketAuthHelper
     import org.apache.spark.util.{RedirectThread, Utils}
     
     private[spark] class PythonWorkerFactory(pythonExec: String, envVars: Map[String, String])
    -  extends Logging {
    +  extends Logging { self =>
     
       import PythonWorkerFactory._
     
       // Because forking processes from Java is expensive, we prefer to launch a single Python daemon,
       // pyspark/daemon.py (by default) and tell it to fork new workers for our tasks. This daemon
       // currently only works on UNIX-based systems now because it uses signals for child management,
       // so we can also fall back to launching workers, pyspark/worker.py (by default) directly.
    -  val useDaemon = {
    +  private val useDaemon = {
    --- End diff --
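
    For context, the `self =>` added in this hunk is Scala's self-alias: it gives the enclosing instance a stable name that nested classes and anonymous threads can use where a bare `this` would refer to the inner scope instead. A minimal sketch (illustrative names, not code from this PR):

    ```scala
    class Outer { self =>
      private val tag = "outer"

      class Inner {
        private val tag = "inner"
        // `this.tag` resolves to Inner's field; `self.tag` reaches Outer's.
        def describe(): String = s"${self.tag} / ${this.tag}"
      }
    }

    object Demo extends App {
      val outer = new Outer
      println(new outer.Inner().describe()) // prints "outer / inner"
    }
    ```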
    
    Why are we fixing this? It doesn't look directly related to the fix, and this class is already private.
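
    To make the visibility point concrete, here is a rough sketch (hypothetical package and names, not code from this PR): members of a `private[spark]` class are already unreachable from outside the `org.apache.spark` package, so adding member-level `private` mostly just narrows access within Spark itself.

    ```scala
    package org.apache.spark.example {
      // The class is visible only inside org.apache.spark, so even an
      // unmodified `val` cannot be seen by code outside that package.
      private[spark] class Factory {
        val flagA: Boolean = true         // already hidden from external code
        private val flagB: Boolean = true // additionally hidden from other Spark classes
      }
    }

    package other {
      object Client {
        // new org.apache.spark.example.Factory // would not compile:
        // Factory is not accessible from outside org.apache.spark
      }
    }
    ```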

