[ https://issues.apache.org/jira/browse/SPARK-13710?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15209484#comment-15209484 ]

Michel Lemay commented on SPARK-13710:
--------------------------------------

Spark's coordinates for Curator did not change: it still depends on Curator 2.4.0, which in turn depends on ZooKeeper and jline 0.9.94.
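
As a sanity check on that path, here is a hedged build.sbt-style sketch — I'm assuming curator-recipes as the Curator artifact; the point is only that jline 0.9.94 arrives transitively and could be excluded if it turned out to be the copy winning at runtime:

{noformat}
// Sketch only: the stale jline arrives roughly as
//   Curator 2.4.0 -> ZooKeeper -> jline:jline:0.9.94
// If that copy were the one colliding, an sbt exclusion would drop it:
libraryDependencies += ("org.apache.curator" % "curator-recipes" % "2.4.0")
  .exclude("jline", "jline")
{noformat}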

The problem is in Scala itself. It went from a local copy of the jline sources in 2.10.x: https://github.com/scala/scala/tree/2.10.x/src/jline

to a standard Maven dependency in 2.11.x:
https://github.com/scala/scala/blob/2.11.x/build.sbt#L74
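
To make the packaging difference concrete, here's an illustrative sketch (the exact jline version in the 2.11.x build is my assumption; see the build.sbt link above for the real line):

{noformat}
// 2.10.x: jline is compiled from sources vendored under src/jline,
//         so its classes live in shaded packages (scala.tools.jline.*)
//         and can't clash with anything else on the classpath.
// 2.11.x: jline is an ordinary Maven dependency, e.g.
libraryDependencies += "jline" % "jline" % "2.12.1"
//         whose classes live in the plain jline.* namespace and can
//         collide with the jline 0.9.94 that ZooKeeper drags in.
{noformat}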

However, I'm not sure exactly how it plays out at runtime, but I guess it's something along the lines of:
- the Spark jar is loaded in the JVM
- the Scala runtime is loaded in the JVM, but its jline resources are discarded because one copy is already there (this wasn't happening with Scala 2.10.x, since the resources were named or located somewhere else)
- Spark tries to initialize the Scala REPL, which fails because it loads the wrong jline/jansi


I'm no expert in JVM classloading and the like, though, so your guess is probably better than mine.
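
If it helps, here is a minimal Scala sketch to test that theory from inside spark-shell (or any REPL on the same classpath): it asks the JVM which jar actually supplied each class. The class names are taken from the stacktrace below; treat this as a diagnostic sketch, not a fix.

{noformat}
// Print the jar a class was actually loaded from (a null code source
// means bootstrap classpath / unknown).
def whereIs(className: String): Unit = {
  try {
    val src = Class.forName(className).getProtectionDomain.getCodeSource
    println(s"$className -> ${if (src == null) "bootstrap/unknown" else src.getLocation}")
  } catch {
    case t: Throwable => println(s"$className -> not loadable: $t")
  }
}

whereIs("jline.Terminal") // exists in both jline 0.9.94 and jline 2.x, so its
                          // source jar shows which copy wins on the classpath
whereIs("scala.tools.jline_embedded.TerminalFactory") // the shaded copy from the stacktrace
{noformat}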

> Spark shell shows ERROR when launching on Windows
> -------------------------------------------------
>
>                 Key: SPARK-13710
>                 URL: https://issues.apache.org/jira/browse/SPARK-13710
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>            Reporter: Masayoshi TSUZUKI
>            Priority: Minor
>
> On Windows, when we launch {{bin\spark-shell.cmd}}, it shows an ERROR message and a stacktrace.
> {noformat}
> C:\Users\tsudukim\Documents\workspace\spark-dev3>bin\spark-shell
> [ERROR] Terminal initialization failed; falling back to unsupported
> java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
>         at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
>         at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
>         at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
>         at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
>         at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
>         at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
>         at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
>         at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
>         at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:61)
>         at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
>         at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
>         at scala.util.Try$.apply(Try.scala:192)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
>         at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
>         at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
>         at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
>         at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
>         at scala.collection.immutable.Stream.collect(Stream.scala:435)
>         at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>         at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
>         at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
>         at org.apache.spark.repl.Main$.doMain(Main.scala:64)
>         at org.apache.spark.repl.Main$.main(Main.scala:47)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:737)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
> 16/03/07 13:05:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Spark context available as sc (master = local[*], app id = local-1457323533704).
> SQL context available as sqlContext.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
>       /_/
> Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> sc.textFile("README.md")
> res0: org.apache.spark.rdd.RDD[String] = README.md MapPartitionsRDD[1] at textFile at <console>:25
> scala> sc.textFile("README.md").count()
> res1: Long = 97
> {noformat}
> Spark-shell itself seems to work fine during my simple operation check.



