GitHub user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/11701#issuecomment-197529534
  
    Hmmm. I think this is because of this code in `load-spark-env.sh`:
    
    ```
    if [ -z "$SPARK_SCALA_VERSION" ]; then
      USER_SCALA_VERSION_SET=0
      ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-2.11"
      ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-2.10"
    
      if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
        echo -e "Presence of build for both scala versions(SCALA 2.10 and SCALA 
2.11) detected." 1>&2
        echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION=2.11 
in spark-env.sh.' 1>&2
        exit 1
      fi
    
      if [ -d "$ASSEMBLY_DIR2" ]; then
        export SPARK_SCALA_VERSION="2.11"
      else
        export SPARK_SCALA_VERSION="2.10"
      fi
    else
        USER_SCALA_VERSION_SET=1
    fi
    ```
    
    Since the assembly build is being skipped, it's defaulting to Scala 2.10 and failing to find the classes.
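
    A quick way to unblock a local run (a sketch, assuming the classes were actually built for Scala 2.11) is to set the version explicitly, as the script's own error message suggests, so the directory probe is skipped:

    ```
    # Pin the Scala version so load-spark-env.sh takes the user-set branch
    # instead of probing the assembly target directories.
    export SPARK_SCALA_VERSION=2.11
    ./bin/spark-shell
    ```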

