Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3916#discussion_r26079694
  
    --- Diff: bin/spark-class ---
    @@ -110,83 +39,48 @@ else
         exit 1
       fi
     fi
    -JAVA_VERSION=$("$RUNNER" -version 2>&1 | grep 'version' | sed 's/.* version "\(.*\)\.\(.*\)\..*"/\1\2/; 1q')
    -
    -# Set JAVA_OPTS to be able to load native libraries and to set heap size
    -if [ "$JAVA_VERSION" -ge 18 ]; then
    -  JAVA_OPTS="$OUR_JAVA_OPTS"
    -else
    -  JAVA_OPTS="-XX:MaxPermSize=128m $OUR_JAVA_OPTS"
    -fi
    -JAVA_OPTS="$JAVA_OPTS -Xms$OUR_JAVA_MEM -Xmx$OUR_JAVA_MEM"
    -
    -# Load extra JAVA_OPTS from conf/java-opts, if it exists
    -if [ -e "$SPARK_CONF_DIR/java-opts" ] ; then
    -  JAVA_OPTS="$JAVA_OPTS `cat "$SPARK_CONF_DIR"/java-opts`"
    -fi
    -
    -# Attention: when changing the way the JAVA_OPTS are assembled, the change must be reflected in CommandUtils.scala!
    -
    -TOOLS_DIR="$FWDIR"/tools
    -SPARK_TOOLS_JAR=""
    -if [ -e "$TOOLS_DIR"/target/scala-$SPARK_SCALA_VERSION/spark-tools*[0-9Tg].jar ]; then
    -  # Use the JAR from the SBT build
    -  export SPARK_TOOLS_JAR="`ls "$TOOLS_DIR"/target/scala-$SPARK_SCALA_VERSION/spark-tools*[0-9Tg].jar`"
    -fi
    -if [ -e "$TOOLS_DIR"/target/spark-tools*[0-9Tg].jar ]; then
    -  # Use the JAR from the Maven build
    -  # TODO: this also needs to become an assembly!
    -  export SPARK_TOOLS_JAR="`ls "$TOOLS_DIR"/target/spark-tools*[0-9Tg].jar`"
    -fi
     
    -# Compute classpath using external script
    -classpath_output=$("$FWDIR"/bin/compute-classpath.sh)
    -if [[ "$?" != "0" ]]; then
    -  echo "$classpath_output"
    -  exit 1
    -else
    -  CLASSPATH="$classpath_output"
    -fi
    +# Look for the launcher. In non-release mode, add the compiled classes directly to the classpath
    +# instead of looking for a jar file.
    +SPARK_LAUNCHER_CP=
    +if [ -f $SPARK_HOME/RELEASE ]; then
    +  LAUNCHER_DIR="$SPARK_HOME/lib"
    +  num_jars="$(ls -1 "$LAUNCHER_DIR" | grep "^spark-launcher.*\.jar$" | wc -l)"
    --- End diff --
    
    While I understand what the article in the link you posted says, I fail to see how it applies to this line. The `grep` command already filters for exactly the file names we're looking for; any name that would trigger the cases described in the article would either be filtered out by the pattern, or would mean the build directory is already broken and doesn't match the expected layout.
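    For what it's worth, the count could also be obtained without parsing `ls` output at all, by expanding a shell glob into an array. This is only a hedged sketch of that alternative, not part of the patch; `count_launcher_jars` and the demo directory are hypothetical names:

    ```shell
    #!/usr/bin/env bash
    # Count jars matching a pattern via a glob instead of "ls | grep | wc -l".
    count_launcher_jars() {
      local dir="$1"
      local -a jars=()
      # nullglob makes an unmatched pattern expand to nothing (rather than
      # the literal pattern string), so the array length is an exact count.
      shopt -s nullglob
      jars=("$dir"/spark-launcher*.jar)
      shopt -u nullglob
      echo "${#jars[@]}"
    }

    # Demo against a throwaway directory (hypothetical file names).
    demo_dir="$(mktemp -d)"
    touch "$demo_dir/spark-launcher_2.10-1.3.0.jar" "$demo_dir/unrelated.txt"
    count_launcher_jars "$demo_dir"
    rm -rf "$demo_dir"
    ```

    This sidesteps the word-splitting concerns from the article entirely, though as noted above the `grep` pattern already constrains what can reach `wc -l`.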

