If we decide to initialize Spark in `initializeSynchronous()` in Scala 2.11.12, 
the console output will look like the following, which is odd:

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_161)
Type in expressions to have them evaluated.
Type :help for more information.

scala> Spark context Web UI available at http://192.168.1.169:4040
Spark context available as 'sc' (master = local[*], app id = 
local-1528180279528).
Spark session available as 'spark'.
scala>
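
The ordering above can be illustrated with a tiny sketch (the object and 
method names are mine, hypothetical, not Spark's or the Scala REPL's actual 
internals): in the 2.11.8-style startup a loadFiles()-like hook runs before 
printWelcome, while in 2.11.12 printWelcome runs first and the surviving 
initializeSynchronous() hook only afterward.

```scala
// Hypothetical sketch of the two startup orderings; names are illustrative
// only, not the actual Scala REPL or Spark classes.
object ReplOrderingDemo {
  // 2.11.8-style: a loadFiles()-like hook fires before printWelcome, so the
  // Spark context lines print ahead of the welcome banner.
  def startupOld(log: String => Unit): Unit = {
    log("Spark context Web UI available at http://192.168.1.169:4040") // init hook
    log("Welcome to Scala 2.11.8")                                     // printWelcome
  }

  // 2.11.12-style: printWelcome runs first; the initializeSynchronous-style
  // hook fires only afterward, giving the odd output shown above.
  def startupNew(log: String => Unit): Unit = {
    log("Welcome to Scala 2.11.12")                                    // printWelcome
    log("Spark context Web UI available at http://192.168.1.169:4040") // init hook
  }

  def main(args: Array[String]): Unit = startupNew(println)
}
```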

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

> On Jun 7, 2018, at 5:49 PM, Holden Karau <hol...@pigscanfly.ca> wrote:
> 
> Tests can just be changed to accept either output too :p
> 
> On Thu, Jun 7, 2018, 5:19 PM Dean Wampler <deanwamp...@gmail.com> wrote:
> Do the tests expect a particular console output order? That would annoy them. 
> ;) You could sort the expected and output lines, then diff...
> 
> Dean Wampler, Ph.D.
> VP, Fast Data Engineering at Lightbend
> Author: Programming Scala, 2nd Edition 
> <http://shop.oreilly.com/product/0636920033073.do>, Fast Data Architectures 
> for Streaming Applications 
> <http://www.oreilly.com/data/free/fast-data-architectures-for-streaming-applications.csp>,
>  and other content from O'Reilly
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
> https://github.com/deanwampler
> 
> On Thu, Jun 7, 2018 at 5:09 PM, Holden Karau <hol...@pigscanfly.ca> wrote:
> If the difference is the order of the welcome message I think that should be 
> fine.
> 
> On Thu, Jun 7, 2018, 4:43 PM Dean Wampler <deanwamp...@gmail.com> wrote:
> I'll point the Scala team to this issue, but it's unlikely to get fixed any 
> time soon.
> 
> dean
> 
> 
> On Thu, Jun 7, 2018 at 4:27 PM, DB Tsai <d_t...@apple.com> wrote:
> Thanks Felix for bringing this up.
> 
> Currently, in Scala 2.11.8, we initialize Spark by overriding loadFiles() 
> before the REPL sees any file, since there is no good hook in Scala to load 
> our initialization code.
> 
> In Scala 2.11.12 and newer versions of Scala 2.12.x, the loadFiles() method 
> was removed.
> 
> Alternatively, in newer versions of Scala we can override 
> initializeSynchronous(), as suggested by Som Snytt; I have a working PR with 
> this approach, https://github.com/apache/spark/pull/21495, and it should 
> work for older versions of Scala too.
> 
> However, in newer versions of Scala, the first thing the REPL calls is 
> printWelcome, so with this approach the welcome message is shown first and 
> the Spark UI URL only afterward. This causes console-output inconsistencies 
> between different versions of Scala.
> 
> We can also initialize Spark in printWelcome, which feels more hacky. It 
> will only work in newer versions of Scala since, in older versions, 
> printWelcome is called at the end of the initialization process. If we 
> decide to go this route, users basically cannot use Scala versions older 
> than 2.11.9.
> 
> I think this is also a blocker for moving to Scala 2.12.x, since the newer 
> 2.12.x versions have the same issue.
> 
> In my opinion, Scala should fix the root cause and provide a stable hook 
> for third-party developers to initialize their custom code.
> 
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
> 
> > On Jun 7, 2018, at 6:43 AM, Felix Cheung <felixcheun...@hotmail.com> wrote:
> > 
> > +1
> > 
> > Spoke to Dean as well and mentioned the problem with 2.11.12: 
> > https://github.com/scala/bug/issues/10913
> > 
> > _____________________________
> > From: Sean Owen <sro...@gmail.com>
> > Sent: Wednesday, June 6, 2018 12:23 PM
> > Subject: Re: Scala 2.12 support
> > To: Holden Karau <hol...@pigscanfly.ca>
> > Cc: Dean Wampler <deanwamp...@gmail.com>, Reynold Xin 
> > <r...@databricks.com>, dev <dev@spark.apache.org>
> > 
> > 
> > If it means no change to 2.11 support, seems OK to me for Spark 2.4.0. The 
> > 2.12 support is separate and has never been mutually compatible with 2.11 
> > builds anyway. (I also hope, suspect that the changes are minimal; tests 
> > are already almost entirely passing with no change to the closure cleaner 
> > when built for 2.12)
> > 
> > On Wed, Jun 6, 2018 at 1:33 PM Holden Karau <hol...@pigscanfly.ca> wrote:
> > Just chatted with Dean at the summit, and it sounds like from Adriaan 
> > there is a fix in 2.13 for the API change issue that could be backported 
> > to 2.12, so how about we try to get this ball rolling?
> > 
> > It sounds like it would also need a closure cleaner change, which could 
> > be backwards compatible; but since it's such a core component and we might 
> > want to be cautious with it, we could use the old cleaner code when 
> > building for 2.11 and the new code for 2.12, so we don't break anyone.
> > 
> > How do folks feel about this?
> > 
