[ https://issues.apache.org/jira/browse/SPARK-13710?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15210120#comment-15210120 ]

Sean Owen commented on SPARK-13710:
-----------------------------------

I think your exclusion is legitimate for curator-recipes. It shouldn't affect 
2.10 in any event, because the 2.10 REPL embeds its own copy of jline. For 
2.11, we're going to have conflicts if the assembly somehow resolves jline 
0.9.x, which can happen under closest-first (nearest-wins) dependency 
resolution rules. Feel free to make a PR.
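
For context, the jline 0.9.x in question typically enters the dependency graph 
transitively, via the ZooKeeper dependency of curator-recipes, while the Scala 
2.11 REPL needs jline 2.x. Below is a minimal sketch of the kind of exclusion 
being discussed, written in sbt syntax (hypothetical: Spark's actual build is 
Maven-based, where the equivalent is an exclusion element on the 
curator-recipes dependency, and the version below is only illustrative):

{code}
// Hypothetical sbt sketch of the exclusion discussed above: keep
// curator-recipes, but drop the jline 0.9.x it pulls in transitively via
// ZooKeeper, so that only the jline 2.x needed by the Scala 2.11 REPL
// ends up in the assembly. Coordinates and version are illustrative.
libraryDependencies +=
  "org.apache.curator" % "curator-recipes" % "2.4.0" exclude("jline", "jline")
{code}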

> Spark shell shows ERROR when launching on Windows
> -------------------------------------------------
>
>                 Key: SPARK-13710
>                 URL: https://issues.apache.org/jira/browse/SPARK-13710
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>            Reporter: Masayoshi TSUZUKI
>            Priority: Minor
>
> On Windows, when we launch {{bin\spark-shell.cmd}}, it shows an ERROR message and a stack trace.
> {noformat}
> C:\Users\tsudukim\Documents\workspace\spark-dev3>bin\spark-shell
> [ERROR] Terminal initialization failed; falling back to unsupported
> java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
>         at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
>         at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
>         at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
>         at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
>         at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
>         at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
>         at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
>         at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
>         at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:61)
>         at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
>         at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
>         at scala.util.Try$.apply(Try.scala:192)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
>         at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
>         at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
>         at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
>         at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
>         at scala.collection.immutable.Stream.collect(Stream.scala:435)
>         at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>         at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
>         at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
>         at org.apache.spark.repl.Main$.doMain(Main.scala:64)
>         at org.apache.spark.repl.Main$.main(Main.scala:47)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:737)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
> 16/03/07 13:05:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Spark context available as sc (master = local[*], app id = local-1457323533704).
> SQL context available as sqlContext.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
>       /_/
> Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> sc.textFile("README.md")
> res0: org.apache.spark.rdd.RDD[String] = README.md MapPartitionsRDD[1] at textFile at <console>:25
> scala> sc.textFile("README.md").count()
> res1: Long = 97
> {noformat}
> Spark-shell itself seems to work fine during my simple operation check.
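
To confirm which jline the assembly actually resolved, here is a hedged 
diagnostic sketch that can be pasted into spark-shell (it assumes an unshaded 
jline jar is on the classpath so that jline.Terminal resolves; getCodeSource 
can be null for bootstrap-loaded classes, hence the Option):

{code}
// Hypothetical check from within spark-shell: print the jar that provides
// jline's Terminal class, to see whether the classpath carries jline 0.9.x
// or the jline 2.x the Scala 2.11 REPL expects.
Option(classOf[jline.Terminal].getProtectionDomain.getCodeSource)
  .map(_.getLocation)
  .foreach(println)
{code}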


