[ https://issues.apache.org/jira/browse/SPARK-12607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15080499#comment-15080499 ]

SM Wang edited comment on SPARK-12607 at 1/3/16 9:57 PM:
---------------------------------------------------------

Sure. I think the problem is with the while loop's delimiter setting (-d '') 
or with the launcher class's behavior in the MSYS64 environment.

Here is the section of the script in version 1.4.0 where I added some echo 
commands (marked with the +++ prefix).

{noformat}
CMD=()
while IFS= read -d '' -r ARG; do
        echo "+++ Parsed Arguments in while loop: $ARG"
  CMD+=("$ARG")
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")
echo "+++ Launcher Command" "$RUNNER" -cp "$LAUNCH_CLASSPATH" 
org.apache.spark.launcher.Main "$@"

echo "+++ First Element: ${CMD[0]}"
echo "+++ Command Array: ${CMD[@]}"

if [ "${CMD[0]}" = "usage" ]; then
  "${CMD[@]}"
else
  exec "${CMD[@]}"
fi
{noformat}
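
For reference, -d '' makes read use the NUL byte as its delimiter, so the loop 
only fills CMD if the launcher's stdout is NUL-separated. Here is a minimal 
sketch of that pattern, using printf as a stand-in for the launcher and made-up 
arguments (not the real launcher output):

{noformat}
# Stand-in for the launcher: printf emits each argument followed by a NUL byte.
CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(printf '%s\0' java -cp example.jar org.example.Main)

echo "Array size: ${#CMD[@]}"    # prints 4
echo "First element: ${CMD[0]}"  # prints java
{noformat}

If the launcher output contains no NUL bytes at all, the first read hits 
end-of-file without finding a delimiter and returns a non-zero status, so the 
loop body never runs and the array stays empty, which matches what I observed 
below.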

The output from "run-example SparkPi" is as follows:

{panel}
+++ Launcher Command /apps/jdk1.7.0_80/bin/java -cp 
/apps/tmp/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --master 
local[*] --class org.apache.spark.examples.SparkPi 
/apps/tmp/spark-1.4.0-bin-hadoop2.4/lib/spark-examples-1.4.0-hadoop2.4.0.jar
+++ First Element:
+++ Command Array:
{panel}

As you can see, the command array is empty.

However, when I ran the launcher command manually, I got the following:
{panel}
C:/msys64/apps/jdk1.7.0_80\bin\java -cp 
"C:/msys64/apps/tmp/spark-1.4.0-bin-hadoop2.4\conf\;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\spark-assembly-1.4.0-hadoop2.4.0.jar;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\datanucleus-api-jdo-3.2.6.jar;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\datanucleus-core-3.2.10.jar;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\datanucleus-rdbms-3.2.9.jar"
 -Xms512m -Xmx512m "-XX:MaxPermSize=128m" org.apache.spark.deploy.SparkSubmit 
--master local[*] --class org.apache.spark.examples.SparkPi 
C:/msys64/apps/tmp/spark-1.4.0-bin-hadoop2.4/lib/spark-examples-1.4.0-hadoop2.4.0.jar
{panel}

When I changed the delimiter to *-d ' '* (a space between the quotes), I was 
able to get a non-empty command array.
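
My understanding is that with -d ' ' each read stops at a space instead of a 
NUL byte, so a plain space-separated command line gets split into array 
elements even when no NUL bytes are present. A small sketch with a made-up 
input line:

{noformat}
# With -d ' ' each read stops at a space, so a plain space-separated
# line is split into elements even without any NUL bytes.
CMD=()
while IFS= read -d ' ' -r ARG; do
  CMD+=("$ARG")
done <<< "java -cp example.jar org.example.Main"

echo "Array size: ${#CMD[@]}"    # prints 3; the last token is dropped
                                 # because read hits EOF before a space
echo "First element: ${CMD[0]}"  # prints java
{noformat}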

This is why I think the issue is either with the delimiter setting or with the 
launcher, which may not be producing a command string that contains the 
delimiter expected by the read built-in in the while loop.
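
One way to confirm which side is at fault would be to dump the raw bytes the 
launcher writes under MSYS64; if there are no \0 separators between the 
arguments, the launcher output (or some console/line-ending translation) is the 
culprit rather than the read loop. A sketch of such a check, using the same 
variables as the script and assuming od is available in the MSYS64 shell:

{noformat}
# Dump the launcher's raw output; with -d '' the read loop needs
# literal NUL bytes (shown by od -c as \0) between the arguments.
"$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@" | od -c | head
{noformat}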

Hope this helps.

Thank you for looking into this.



> spark-class produced null command strings for "exec"
> ----------------------------------------------------
>
>                 Key: SPARK-12607
>                 URL: https://issues.apache.org/jira/browse/SPARK-12607
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.4.0, 1.4.1, 1.5.2
>         Environment: MSYS64 on Windows 7 64 bit
>            Reporter: SM Wang
>
> When using the run-example script in 1.4.0 to run the SparkPi example, I 
> found that it did not print any text to the terminal (e.g., stdout, stderr). 
> After further investigation I found the while loop for producing the exec 
> command from the launcher class produced a null command array.
> This discrepancy was also observed on 1.5.2 and 1.4.1.  The behavior in 1.3.1 
> seems to be correct.


