Andrew Or created SPARK-3140:
--------------------------------

             Summary: PySpark start-up throws confusing exception
                 Key: SPARK-3140
                 URL: https://issues.apache.org/jira/browse/SPARK-3140
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 1.0.2
            Reporter: Andrew Or
            Priority: Critical


Currently we read the PySpark gateway port from the stdout of the spark-submit 
subprocess. However, if there is stdout interference, e.g. spark-submit echoes 
something unexpected to stdout, we print the following:

{code}
Exception: Launching GatewayServer failed! (Warning: unexpected output 
detected.)
{code}

Throwing the exception in that case is fine. However, we also throw the *same* 
exception when there is no output at all from the subprocess. This is very 
confusing, because the message implies that the subprocess is outputting 
something (possibly whitespace, which is not visible) when it is actually not.
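
A minimal sketch of the distinction being asked for (the launch command, 
variable names, and messages below are illustrative only, not the actual 
python/pyspark/java_gateway.py code): read the first line of the subprocess's 
stdout and report an empty read separately from a garbled one.

{code}
from subprocess import Popen, PIPE

# Illustrative only: the real launch command and error handling live in
# python/pyspark/java_gateway.py.
proc = Popen(["spark-submit", "pyspark-shell"], stdout=PIPE, stdin=PIPE)

first_line = proc.stdout.readline().decode("utf-8", "replace")
try:
    gateway_port = int(first_line)
except ValueError:
    if not first_line.strip():
        # The subprocess printed nothing, so it likely died before writing
        # the port. Say so instead of blaming "unexpected output".
        raise Exception("Launching GatewayServer failed! "
                        "(No output detected from spark-submit.)")
    else:
        # The subprocess printed something that is not a port number.
        raise Exception("Launching GatewayServer failed! "
                        "(Warning: unexpected output detected: %r)" % first_line)
{code}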


