GitHub user som-snytt commented on a diff in the pull request:
https://github.com/apache/spark/pull/21495#discussion_r194294485
--- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala ---
@@ -21,8 +21,22 @@ import scala.collection.mutable
GitHub user som-snytt commented on the issue:
https://github.com/apache/spark/pull/21495
@dbtsai 2.11 looks similar to 2.12. Do you mean you want the same technique
on 2.10? I would not expect to find a single hook for all versions
GitHub user som-snytt commented on a diff in the pull request:
https://github.com/apache/spark/pull/21495#discussion_r193938805
--- Diff: repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala ---
@@ -21,8 +21,22 @@ import scala.collection.mutable
GitHub user som-snytt commented on the issue:
https://github.com/apache/spark/pull/21495
The Scala REPL change at startup was to read user input while the
single-threaded compiler was initializing on the main thread.
There is a `SplashReader` that collects that input; its
GitHub user som-snytt commented on a diff in the pull request:
https://github.com/apache/spark/pull/21369#discussion_r191064975
--- Diff: core/src/main/scala/org/apache/spark/util/collection/ExternalAppendOnlyMap.scala ---
@@ -585,17 +591,25 @@ class ExternalAppendOnlyMap[K, V, C
GitHub user som-snytt commented on the issue:
https://github.com/apache/spark/pull/17982
Thanks for the effort. I'll take a hack at it soon. If it's hopeless, I'll at
least try to track developments with the new REPL API.
GitHub user som-snytt commented on the issue:
https://github.com/apache/spark/pull/17982
The contract was never well-defined.
`sbt` overrides `createInterpreter` and runs its `initialCommands` before
returning. So that runs before the REPL finishes init in `loopPostInit`
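That ordering can be sketched with hypothetical stand-in classes (`Loop` and `SbtLikeLoop` are illustrative names, not the real `ILoop` or sbt's subclass): overriding the factory method means the commands run before the loop's own post-init step.

```scala
import scala.collection.mutable.ListBuffer

// Hypothetical sketch of the ordering hazard: Loop stands in for the REPL's
// loop, SbtLikeLoop for a subclass (as sbt does with createInterpreter) that
// runs its initial commands inside the factory method -- i.e. before the
// loop's post-init step ever executes.
class Loop {
  val log = ListBuffer.empty[String]
  def createInterpreter(): Unit = log += "interpreter created"
  def loopPostInit(): Unit = log += "post-init"
  def start(): Unit = { createInterpreter(); loopPostInit() }
}

class SbtLikeLoop(initialCommands: Seq[String]) extends Loop {
  override def createInterpreter(): Unit = {
    super.createInterpreter()
    // commands run here, before start() ever reaches loopPostInit()
    initialCommands.foreach(c => log += s"ran: $c")
  }
}
```

Calling `start()` on a `SbtLikeLoop` logs the command entries before "post-init", which is exactly the undefined-contract gap described.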
GitHub user som-snytt commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-60188566
I had a version for SI-7747 that moved the scripting logic to a
separate compiler phase which could be replaced by a user plugin. (But that
also moved the repl
GitHub user som-snytt commented on the pull request:
https://github.com/apache/spark/pull/1929#issuecomment-52670339
FWIW, I remembered that my fix included overriding the ClassPath (in the
JavaPlatform) of the REPL compiler. So it was a bit more nuanced in handling
the replacement
GitHub user som-snytt commented on the pull request:
https://github.com/apache/spark/pull/1929#issuecomment-52230440
This is similar to what I tried last year, but I recently saw:
https://github.com/scala/scala/pull/3884
which says the mechanisms are in flux in 2.11