[ https://issues.apache.org/jira/browse/SPARK-2348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14166566#comment-14166566 ]

AaronLin commented on SPARK-2348:
---------------------------------

I encountered this issue when building Spark 1.1.0 with sbt (Windows 7 OS). I solved it
by changing one line in spark-class2.cmd:
--------------old-------------
set JAVA_OPTS=-XX:MaxPermSize=128m %OUR_JAVA_OPTS% -Xms%OUR_JAVA_MEM% -Xmx%OUR_JAVA_MEM%
-----------------new--------------
set JAVA_OPTS=%OUR_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Dscala.usejavacp=true -Xms%OUR_JAVA_MEM% -Xmx%OUR_JAVA_MEM%
--------------end----------------

It works.
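
For the original report (a user-level 'classpath' variable breaking spark-shell), a small wrapper that clears the variable before launching should also avoid the error without editing spark-class2.cmd. This is only a sketch, not something shipped with Spark: the wrapper name is made up, and it assumes it sits next to spark-shell.cmd in Spark's bin directory.

--------------sketch: run-spark-shell.cmd (hypothetical)--------------
@echo off
rem Keep the environment change local to this script.
setlocal
rem Drop any inherited CLASSPATH so the JVM only sees the classpath
rem that spark-class2.cmd computes itself.
set CLASSPATH=
rem Launch the real shell script from the same directory, passing arguments through.
call "%~dp0spark-shell.cmd" %*
endlocal
--------------end----------------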

> In Windows, having an environment variable named 'classpath' gives an error
> -----------------------------------------------------------------------
>
>                 Key: SPARK-2348
>                 URL: https://issues.apache.org/jira/browse/SPARK-2348
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>         Environment: Windows 7 Enterprise
>            Reporter: Chirag Todarka
>            Assignee: Chirag Todarka
>
> Operating System:: Windows 7 Enterprise
> If an environment variable named 'classpath' is set, then starting
> 'spark-shell' gives the error below::
> <mydir>\spark\bin>spark-shell
> Failed to initialize compiler: object scala.runtime in compiler mirror not found.
> ** Note that as of 2.8 scala does not assume use of the java classpath.
> ** For the old behavior pass -usejavacp to scala, or if using a Settings
> ** object programatically, settings.usejavacp.value = true.
> 14/07/02 14:22:06 WARN SparkILoop$SparkILoopInterpreter: Warning: compiler accessed before init set up.  Assuming no postInit code.
> Failed to initialize compiler: object scala.runtime in compiler mirror not found.
> ** Note that as of 2.8 scala does not assume use of the java classpath.
> ** For the old behavior pass -usejavacp to scala, or if using a Settings
> ** object programatically, settings.usejavacp.value = true.
> Exception in thread "main" java.lang.AssertionError: assertion failed: null
>         at scala.Predef$.assert(Predef.scala:179)
>         at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:202)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:929)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>         at java.lang.reflect.Method.invoke(Unknown Source)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


