Repository: spark
Updated Branches:
  refs/heads/master 4c059ebc6 -> c7967c604


[SPARK-24418][BUILD] Upgrade Scala to 2.11.12 and 2.12.6

## What changes were proposed in this pull request?

Scala is upgraded to `2.11.12` and `2.12.6`.

We used `loadFiles()` in `ILoop` as a hook to initialize Spark before the 
REPL sees any files in Scala `2.11.8`. However, this was a hack: it was never 
intended to be a public API, and it was removed in Scala `2.11.12`.

From the discussion in the Scala community 
(https://github.com/scala/bug/issues/10913), we can use `initializeSynchronous` 
to initialize Spark instead. This PR implements the Spark initialization there.
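In sketch form (condensed from the `SparkILoopInterpreter` change in this 
commit; the `initializeSpark` callback is the function `SparkILoop` already 
defines):

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.{IMain, JPrintWriter}

class SparkILoopInterpreter(settings: Settings, out: JPrintWriter,
    initializeSpark: () => Unit) extends IMain(settings, out) {

  // Initialize Spark *after* the interpreter itself is up, but *before*
  // the REPL loads any init files, so the SparkContext is visible to them.
  override def initializeSynchronous(): Unit = {
    super.initializeSynchronous()
    initializeSpark()
  }
}
```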

However, in Scala `2.11.12`'s `ILoop.scala`, the first thing `def startup()` 
calls is `printWelcome()`. As a result, Scala calls `printWelcome()` and 
`splash` before calling `initializeSynchronous`.

Thus, the Spark shell lets users type commands first and only shows the Spark 
UI URL afterwards. It still works, but it changes the Spark shell interface 
as follows.

```
➜  apache-spark git:(scala-2.11.12) ✗ ./bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0-SNAPSHOT
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_161)
Type in expressions to have them evaluated.
Type :help for more information.

scala> Spark context Web UI available at http://192.168.1.169:4040
Spark context available as 'sc' (master = local[*], app id = local-1528180279528).
Spark session available as 'spark'.

scala>
```

It seems there is no easy way to inject the Spark initialization code in the 
proper place, as Scala doesn't provide a hook for it. Maybe som-snytt can 
comment on this.

The following command was used to update the dependency manifest files:
```shell
./dev/test-dependencies.sh --replace-manifest
```
## How was this patch tested?

Existing tests

Author: DB Tsai <d_t...@apple.com>

Closes #21495 from dbtsai/scala-2.11.12.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c7967c60
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c7967c60
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c7967c60

Branch: refs/heads/master
Commit: c7967c6049327a03b63ea7a3b0001a97d31e309d
Parents: 4c059eb
Author: DB Tsai <d_t...@apple.com>
Authored: Tue Jun 26 09:48:52 2018 +0800
Committer: jerryshao <ss...@hortonworks.com>
Committed: Tue Jun 26 09:48:52 2018 +0800

----------------------------------------------------------------------
 LICENSE                                         | 12 +++++-----
 dev/deps/spark-deps-hadoop-2.6                  | 10 ++++----
 dev/deps/spark-deps-hadoop-2.7                  | 10 ++++----
 dev/deps/spark-deps-hadoop-3.1                  | 10 ++++----
 pom.xml                                         |  8 +++----
 .../org/apache/spark/repl/SparkILoop.scala      | 24 ++++++++------------
 .../spark/repl/SparkILoopInterpreter.scala      | 18 +++++++++++++--
 7 files changed, 50 insertions(+), 42 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/LICENSE
----------------------------------------------------------------------
diff --git a/LICENSE b/LICENSE
index cc1f580..6f5d945 100644
--- a/LICENSE
+++ b/LICENSE
@@ -243,18 +243,18 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     (BSD licence) ANTLR ST4 4.0.4 (org.antlr:ST4:4.0.4 - http://www.stringtemplate.org)
     (BSD licence) ANTLR StringTemplate (org.antlr:stringtemplate:3.2.1 - http://www.stringtemplate.org)
     (BSD License) Javolution (javolution:javolution:5.5.1 - http://javolution.org)
-     (BSD) JLine (jline:jline:0.9.94 - http://jline.sourceforge.net)
+     (BSD) JLine (jline:jline:2.14.3 - https://github.com/jline/jline2)
     (BSD) ParaNamer Core (com.thoughtworks.paranamer:paranamer:2.3 - http://paranamer.codehaus.org/paranamer)
     (BSD) ParaNamer Core (com.thoughtworks.paranamer:paranamer:2.6 - http://paranamer.codehaus.org/paranamer)
     (BSD 3 Clause) Scala (http://www.scala-lang.org/download/#License)
        (Interpreter classes (all .scala files in repl/src/main/scala
        except for Main.Scala, SparkHelper.scala and ExecutorClassLoader.scala),
        and for SerializableMapWrapper in JavaUtils.scala)
-     (BSD-like) Scala Actors library (org.scala-lang:scala-actors:2.11.8 - http://www.scala-lang.org/)
-     (BSD-like) Scala Compiler (org.scala-lang:scala-compiler:2.11.8 - http://www.scala-lang.org/)
-     (BSD-like) Scala Compiler (org.scala-lang:scala-reflect:2.11.8 - http://www.scala-lang.org/)
-     (BSD-like) Scala Library (org.scala-lang:scala-library:2.11.8 - http://www.scala-lang.org/)
-     (BSD-like) Scalap (org.scala-lang:scalap:2.11.8 - http://www.scala-lang.org/)
+     (BSD-like) Scala Actors library (org.scala-lang:scala-actors:2.11.12 - http://www.scala-lang.org/)
+     (BSD-like) Scala Compiler (org.scala-lang:scala-compiler:2.11.12 - http://www.scala-lang.org/)
+     (BSD-like) Scala Compiler (org.scala-lang:scala-reflect:2.11.12 - http://www.scala-lang.org/)
+     (BSD-like) Scala Library (org.scala-lang:scala-library:2.11.12 - http://www.scala-lang.org/)
+     (BSD-like) Scalap (org.scala-lang:scalap:2.11.12 - http://www.scala-lang.org/)
     (BSD-style) scalacheck (org.scalacheck:scalacheck_2.11:1.10.0 - http://www.scalacheck.org)
     (BSD-style) spire (org.spire-math:spire_2.11:0.7.1 - http://spire-math.org)
     (BSD-style) spire-macros (org.spire-math:spire-macros_2.11:0.7.1 - http://spire-math.org)

http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/dev/deps/spark-deps-hadoop-2.6
----------------------------------------------------------------------
diff --git a/dev/deps/spark-deps-hadoop-2.6 b/dev/deps/spark-deps-hadoop-2.6
index 723180a..96e9c27 100644
--- a/dev/deps/spark-deps-hadoop-2.6
+++ b/dev/deps/spark-deps-hadoop-2.6
@@ -122,7 +122,7 @@ jersey-server-2.22.2.jar
 jets3t-0.9.4.jar
 jetty-6.1.26.jar
 jetty-util-6.1.26.jar
-jline-2.12.1.jar
+jline-2.14.3.jar
 joda-time-2.9.3.jar
 jodd-core-3.5.2.jar
 jpam-1.1.jar
@@ -172,10 +172,10 @@ parquet-jackson-1.10.0.jar
 protobuf-java-2.5.0.jar
 py4j-0.10.7.jar
 pyrolite-4.13.jar
-scala-compiler-2.11.8.jar
-scala-library-2.11.8.jar
-scala-parser-combinators_2.11-1.0.4.jar
-scala-reflect-2.11.8.jar
+scala-compiler-2.11.12.jar
+scala-library-2.11.12.jar
+scala-parser-combinators_2.11-1.1.0.jar
+scala-reflect-2.11.12.jar
 scala-xml_2.11-1.0.5.jar
 shapeless_2.11-2.3.2.jar
 slf4j-api-1.7.16.jar

http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/dev/deps/spark-deps-hadoop-2.7
----------------------------------------------------------------------
diff --git a/dev/deps/spark-deps-hadoop-2.7 b/dev/deps/spark-deps-hadoop-2.7
index ea08a00..4a6ee02 100644
--- a/dev/deps/spark-deps-hadoop-2.7
+++ b/dev/deps/spark-deps-hadoop-2.7
@@ -122,7 +122,7 @@ jersey-server-2.22.2.jar
 jets3t-0.9.4.jar
 jetty-6.1.26.jar
 jetty-util-6.1.26.jar
-jline-2.12.1.jar
+jline-2.14.3.jar
 joda-time-2.9.3.jar
 jodd-core-3.5.2.jar
 jpam-1.1.jar
@@ -173,10 +173,10 @@ parquet-jackson-1.10.0.jar
 protobuf-java-2.5.0.jar
 py4j-0.10.7.jar
 pyrolite-4.13.jar
-scala-compiler-2.11.8.jar
-scala-library-2.11.8.jar
-scala-parser-combinators_2.11-1.0.4.jar
-scala-reflect-2.11.8.jar
+scala-compiler-2.11.12.jar
+scala-library-2.11.12.jar
+scala-parser-combinators_2.11-1.1.0.jar
+scala-reflect-2.11.12.jar
 scala-xml_2.11-1.0.5.jar
 shapeless_2.11-2.3.2.jar
 slf4j-api-1.7.16.jar

http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/dev/deps/spark-deps-hadoop-3.1
----------------------------------------------------------------------
diff --git a/dev/deps/spark-deps-hadoop-3.1 b/dev/deps/spark-deps-hadoop-3.1
index da87402..e0b560c 100644
--- a/dev/deps/spark-deps-hadoop-3.1
+++ b/dev/deps/spark-deps-hadoop-3.1
@@ -122,7 +122,7 @@ jersey-server-2.22.2.jar
 jets3t-0.9.4.jar
 jetty-webapp-9.3.20.v20170531.jar
 jetty-xml-9.3.20.v20170531.jar
-jline-2.12.1.jar
+jline-2.14.3.jar
 joda-time-2.9.3.jar
 jodd-core-3.5.2.jar
 jpam-1.1.jar
@@ -192,10 +192,10 @@ protobuf-java-2.5.0.jar
 py4j-0.10.7.jar
 pyrolite-4.13.jar
 re2j-1.1.jar
-scala-compiler-2.11.8.jar
-scala-library-2.11.8.jar
-scala-parser-combinators_2.11-1.0.4.jar
-scala-reflect-2.11.8.jar
+scala-compiler-2.11.12.jar
+scala-library-2.11.12.jar
+scala-parser-combinators_2.11-1.1.0.jar
+scala-reflect-2.11.12.jar
 scala-xml_2.11-1.0.5.jar
 shapeless_2.11-2.3.2.jar
 slf4j-api-1.7.16.jar

http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index 4b4e6c1..90e64ff 100644
--- a/pom.xml
+++ b/pom.xml
@@ -155,7 +155,7 @@
     <commons.math3.version>3.4.1</commons.math3.version>
     <!-- managed up from 3.2.1 for SPARK-11652 -->
     <commons.collections.version>3.2.2</commons.collections.version>
-    <scala.version>2.11.8</scala.version>
+    <scala.version>2.11.12</scala.version>
     <scala.binary.version>2.11</scala.binary.version>
     <codehaus.jackson.version>1.9.13</codehaus.jackson.version>
     <fasterxml.jackson.version>2.6.7</fasterxml.jackson.version>
@@ -740,13 +740,13 @@
       <dependency>
         <groupId>org.scala-lang.modules</groupId>
        <artifactId>scala-parser-combinators_${scala.binary.version}</artifactId>
-        <version>1.0.4</version>
+        <version>1.1.0</version>
       </dependency>
       <!-- SPARK-16770 affecting Scala 2.11.x -->
       <dependency>
         <groupId>jline</groupId>
         <artifactId>jline</artifactId>
-        <version>2.12.1</version>
+        <version>2.14.3</version>
       </dependency>
       <dependency>
         <groupId>org.scalatest</groupId>
@@ -2755,7 +2755,7 @@
     <profile>
       <id>scala-2.12</id>
       <properties>
-        <scala.version>2.12.4</scala.version>
+        <scala.version>2.12.6</scala.version>
         <scala.binary.version>2.12</scala.binary.version>
       </properties>
       <build>

http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala
----------------------------------------------------------------------
diff --git a/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala b/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala
index e69441a..a44051b 100644
--- a/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala
+++ b/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala
@@ -36,7 +36,7 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
   def this() = this(None, new JPrintWriter(Console.out, true))
 
   override def createInterpreter(): Unit = {
-    intp = new SparkILoopInterpreter(settings, out)
+    intp = new SparkILoopInterpreter(settings, out, initializeSpark)
   }
 
   val initializationCommands: Seq[String] = Seq(
@@ -73,11 +73,15 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
     "import org.apache.spark.sql.functions._"
   )
 
-  def initializeSpark() {
-    intp.beQuietDuring {
-      savingReplayStack { // remove the commands from session history.
-        initializationCommands.foreach(processLine)
+  def initializeSpark(): Unit = {
+    if (!intp.reporter.hasErrors) {
+      // `savingReplayStack` removes the commands from session history.
+      savingReplayStack {
+        initializationCommands.foreach(intp quietRun _)
       }
+    } else {
+      throw new RuntimeException(s"Scala $versionString interpreter encountered " +
+        "errors during initialization")
     }
   }
 
@@ -101,16 +105,6 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
   /** Available commands */
   override def commands: List[LoopCommand] = standardCommands
 
-  /**
-   * We override `loadFiles` because we need to initialize Spark *before* the REPL
-   * sees any files, so that the Spark context is visible in those files. This is a bit of a
-   * hack, but there isn't another hook available to us at this point.
-   */
-  override def loadFiles(settings: Settings): Unit = {
-    initializeSpark()
-    super.loadFiles(settings)
-  }
-
   override def resetCommand(line: String): Unit = {
     super.resetCommand(line)
     initializeSpark()

http://git-wip-us.apache.org/repos/asf/spark/blob/c7967c60/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala
----------------------------------------------------------------------
diff --git a/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala b/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala
index e736607..4e63816 100644
--- a/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala
+++ b/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala
@@ -21,8 +21,22 @@ import scala.collection.mutable
 import scala.tools.nsc.Settings
 import scala.tools.nsc.interpreter._
 
-class SparkILoopInterpreter(settings: Settings, out: JPrintWriter) extends IMain(settings, out) {
-  self =>
+class SparkILoopInterpreter(settings: Settings, out: JPrintWriter, initializeSpark: () => Unit)
+    extends IMain(settings, out) { self =>
+
+  /**
+   * We override `initializeSynchronous` to initialize Spark *after* `intp` is properly initialized
+   * and *before* the REPL sees any files in the private `loadInitFiles` functions, so that
+   * the Spark context is visible in those files.
+   *
+   * This is a bit of a hack, but there isn't another hook available to us at this point.
+   *
+   * See the discussion in Scala community https://github.com/scala/bug/issues/10913 for detail.
+   */
+  override def initializeSynchronous(): Unit = {
+    super.initializeSynchronous()
+    initializeSpark()
+  }
 
   override lazy val memberHandlers = new {
     val intp: self.type = self

