[ https://issues.apache.org/jira/browse/SPARK-33048?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-33048:
------------------------------------

    Assignee: Kousuke Saruta  (was: Apache Spark)

> Fix SparkBuild.scala to recognize build settings for Scala 2.13
> ---------------------------------------------------------------
>
>                 Key: SPARK-33048
>                 URL: https://issues.apache.org/jira/browse/SPARK-33048
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.0.1, 3.1.0
>            Reporter: Kousuke Saruta
>            Assignee: Kousuke Saruta
>            Priority: Major
>
> In SparkBuild.scala, the variable 'scalaBinaryVersion' is hardcoded to '2.12', 
> so the environment variable 'SPARK_SCALA_VERSION' also resolves to '2.12' even 
> when building with Scala 2.13.
> This causes some test suites (e.g. SparkSubmitSuite) to fail.
> {code}
> ===== TEST OUTPUT FOR o.a.s.deploy.SparkSubmitSuite: 'user classpath first in driver' =====
> 20/10/02 08:55:30.234 redirect stderr for command /home/kou/work/oss/spark-scala-2.13/bin/spark-submit INFO Utils: Error: Could not find or load main class org.apache.spark.launcher.Main
> 20/10/02 08:55:30.235 redirect stderr for command /home/kou/work/oss/spark-scala-2.13/bin/spark-submit INFO Utils: /home/kou/work/oss/spark-scala-2.13/bin/spark-class: line 96: CMD: bad array subscript
> {code}
> The reason for this error is that the environment variables 'SPARK_JARS_DIR' and 
> 'LAUNCH_CLASSPATH' are defined in bin/spark-class as follows, so a wrong 
> SPARK_SCALA_VERSION makes them point at directories that do not exist.
> {code}
> SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
> LAUNCH_CLASSPATH="${SPARK_HOME}/launcher/target/scala-$SPARK_SCALA_VERSION/classes:$LAUNCH_CLASSPATH"
> {code}
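> A minimal sketch of the kind of fix this implies (not necessarily the exact 
> change made in SparkBuild.scala): derive the binary version from sbt's 
> 'scalaVersion' setting via the standard CrossVersion.partialVersion helper 
> instead of hardcoding '2.12'. The setting-key name 'scalaBinaryVer' below is 
> hypothetical.
> {code}
> // Hypothetical sbt setting: derive "2.12" / "2.13" from the full scalaVersion
> // (e.g. "2.13.3") rather than hardcoding the binary version.
> val scalaBinaryVer = settingKey[String]("Scala binary version used in target paths")
>
> scalaBinaryVer := {
>   CrossVersion.partialVersion(scalaVersion.value) match {
>     case Some((major, minor)) => s"$major.$minor"  // "2.13" for any Scala 2.13.x
>     case None => sys.error(s"Cannot parse scalaVersion: ${scalaVersion.value}")
>   }
> }
> {code}
> With the binary version derived consistently, directories such as 
> assembly/target/scala-2.13/jars line up with the paths bin/spark-class 
> constructs from SPARK_SCALA_VERSION.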


