GitHub user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3274#discussion_r20399877
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -139,14 +139,15 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // subprocess there already reads directly from our stdin, so we should avoid spawning a
         // thread that contends with the subprocess in reading from System.in.
         val isWindows = Utils.isWindows
    -    val isPySparkShell = sys.env.contains("PYSPARK_SHELL")
    +    val isSubprocess = sys.env.contains("IS_SUBPROCESS")
         if (!isWindows) {
           val stdinThread = new RedirectThread(System.in, process.getOutputStream, "redirect stdin")
           stdinThread.start()
    -      // For the PySpark shell, Spark submit itself runs as a python subprocess, and so this JVM
    -      // should terminate on broken pipe, which signals that the parent process has exited. In
    -      // Windows, the termination logic for the PySpark shell is handled in java_gateway.py
    -      if (isPySparkShell) {
    +      // Spark submit (JVM) may can runs as a subprocess, and so this JVM should terminate on
    --- End diff ---
    
    Typo: "may can runs" should be "can run". I'll fix this when I merge it.
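    
    For context, a minimal, self-contained sketch of the pattern this hunk relies on: a daemon
    thread pipes our stdin into the child process, and when this JVM is itself a subprocess
    (IS_SUBPROCESS set), EOF or a broken pipe on stdin signals that the parent has exited, so
    the child is torn down. The simplified RedirectThread and the `cat` child below are
    illustrative stand-ins (the real org.apache.spark.util.RedirectThread and driver JVM are
    more involved), not the actual bootstrapper code:
    
        import java.io.{IOException, InputStream, OutputStream}
    
        // Simplified stand-in for org.apache.spark.util.RedirectThread: copies bytes
        // from `in` to `out` until EOF, or until a broken pipe ends the stream.
        class RedirectThread(in: InputStream, out: OutputStream, name: String)
          extends Thread(name) {
          setDaemon(true)
          override def run(): Unit = {
            try {
              val buf = new Array[Byte](1024)
              var len = in.read(buf)
              while (len != -1) {
                out.write(buf, 0, len)
                out.flush()
                len = in.read(buf)
              }
            } catch {
              case _: IOException => // broken pipe: treat like EOF and let the thread exit
            }
          }
        }
    
        object BootstrapSketch {
          def main(args: Array[String]): Unit = {
            val process = new ProcessBuilder("cat").start() // stand-in for the driver subprocess
            val stdinThread = new RedirectThread(System.in, process.getOutputStream, "redirect stdin")
            stdinThread.start()
            if (sys.env.contains("IS_SUBPROCESS")) {
              // This JVM is itself a subprocess: EOF/broken pipe on stdin means the
              // parent exited, so once the redirect thread finishes, kill the child too.
              stdinThread.join()
              process.destroy()
            }
            System.exit(process.waitFor())
          }
        }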

