Hi All,

I just installed Spark on my laptop and am trying to get spark-shell to
work. Here is the error I see:

C:\spark\bin>spark-shell
Exception in thread "main" java.util.NoSuchElementException: key not found: CLASSPATH
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at scala.collection.AbstractMap.apply(Map.scala:58)
        at org.apache.spark.deploy.SparkSubmitDriverBootstrapper$.main(SparkSubmitDriverBootstrapper.scala:49)
        at org.apache.spark.deploy.SparkSubmitDriverBootstrapper.main(SparkSubmitDriverBootstrapper.scala)
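
From the MapLike frames, it looks like the bootstrapper reads CLASSPATH
straight out of the environment map, so line 49 is presumably doing
something like this (my guess from the stack trace, not the actual Spark
source):

    // sys.env is an immutable Map[String, String]; calling apply() on a
    // key that is absent throws
    // java.util.NoSuchElementException: key not found: CLASSPATH
    val classpath = sys.env("CLASSPATH")

    // a lookup that tolerated a missing variable would look like:
    val classpathOrEmpty = sys.env.getOrElse("CLASSPATH", "")

So it seems the CLASSPATH environment variable is simply not visible to
the bootstrapper process.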


The classpath seems to be right:

C:\spark\bin>compute-classpath.cmd
;;C:\spark\bin\..\conf;C:\spark\bin\..\lib\spark-assembly-1.1.0-hadoop2.3.0.jar;;C:\spark\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\spark\bin\..\lib\datanucleus-core-3.2.2.jar;C:\spark\bin\..\lib\datanucleus-rdbms-3.2.1.jar

Manually exporting the classpath to include the assembly jar doesn't help
either.
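
(By "exporting" I mean something along these lines, with the jar path
taken from the compute-classpath output above:)

C:\spark\bin>set CLASSPATH=C:\spark\conf;C:\spark\lib\spark-assembly-1.1.0-hadoop2.3.0.jar
C:\spark\bin>spark-shell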

What could be wrong with this installation? Scala and SBT are installed,
on the PATH, and working fine.

Appreciate your help.
Regards,
Sunita
